Adding a `Content-Signal` directive to `robots.txt` — per the Content Signals draft spec — drops the Lighthouse SEO score from 100 to 92 across all pages. The directive looks like:

```
Content-Signal: ai-train=yes, search=yes, ai-input=yes
```
The `robots.txt` validator appears to flag any unrecognised field as invalid, which feeds into the SEO score penalty.
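For context, a complete `robots.txt` along these lines triggers the behaviour (the paths and sitemap URL here are placeholders, not from a real site):

```
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml

Content-Signal: ai-train=yes, search=yes, ai-input=yes
```

Removing the `Content-Signal` line restores the score to 100.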
A couple of questions:
- Is there an intention to recognise `Content-Signal` once the spec matures, or would that be out of scope for Lighthouse?
- More broadly, should unrecognised directives in `robots.txt` be treated as a hard failure, or would a warning be more appropriate given that the format is extensible by design?
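To make the second question concrete, here is a rough sketch (not Lighthouse's actual implementation; the directive list and `Finding` shape are invented for illustration) of what a lenient check could look like: well-formed but unrecognised `field: value` lines become warnings, and only lines with no field/value shape at all are errors.

```typescript
// Hypothetical lenient robots.txt check -- a sketch, not Lighthouse code.
// Known-directive list is illustrative and deliberately incomplete.
const KNOWN_DIRECTIVES = new Set([
  'user-agent', 'allow', 'disallow', 'sitemap', 'crawl-delay',
]);

type Finding = {
  line: number;
  directive: string;
  severity: 'error' | 'warning';
};

function checkRobotsTxt(content: string): Finding[] {
  const findings: Finding[] = [];
  content.split('\n').forEach((raw, i) => {
    const line = raw.split('#')[0].trim(); // strip comments
    if (!line) return;
    const colon = line.indexOf(':');
    if (colon === -1) {
      // No "field: value" shape at all -- a genuine syntax error.
      findings.push({ line: i + 1, directive: line, severity: 'error' });
      return;
    }
    const field = line.slice(0, colon).trim().toLowerCase();
    if (!KNOWN_DIRECTIVES.has(field)) {
      // Unknown but well-formed field (e.g. Content-Signal):
      // warn instead of failing, since the format is extensible.
      findings.push({ line: i + 1, directive: field, severity: 'warning' });
    }
  });
  return findings;
}
```

Under this scheme the `Content-Signal` line would surface as a warning in the audit details without dragging the score down.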
Happy to provide a minimal reproduction if useful.