robots.txt SEO audit fails on unrecognised Content-Signal directive (draft spec) #16983

@IndrajeetPatil

Description


Adding a Content-Signal directive to robots.txt — per the Content Signals draft spec — drops the Lighthouse SEO score from 100 to 92 across all pages. The directive looks like:

Content-Signal: ai-train=yes, search=yes, ai-input=yes

The robots.txt validator appears to flag any unrecognised field as invalid, and that failure feeds directly into the SEO score penalty.

A couple of questions:

  • Is there an intention to recognise Content-Signal once the spec matures, or would that be out of scope for Lighthouse?
  • More broadly, should unrecognised directives in robots.txt be treated as a hard failure, or would a warning be more appropriate given that the format is extensible by design?
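To make the second question concrete, a lenient check could route unknown fields to a warnings list instead of the failure path. This is only a sketch of that idea; the directive list and function name below are my assumptions, not Lighthouse's actual validator:

```python
# Sketch of a lenient robots.txt audit: unknown directives produce
# warnings rather than hard failures. KNOWN_DIRECTIVES is an assumed
# set, not the list Lighthouse actually recognises.
KNOWN_DIRECTIVES = {
    "user-agent", "allow", "disallow", "sitemap", "crawl-delay",
}

def audit_robots_txt(text):
    """Return (errors, warnings) for a robots.txt body."""
    errors, warnings = [], []
    for lineno, raw in enumerate(text.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments/whitespace
        if not line:
            continue
        if ":" not in line:
            errors.append((lineno, "missing ':' separator"))
            continue
        name = line.split(":", 1)[0].strip().lower()
        if name not in KNOWN_DIRECTIVES:
            # Extensible format: surface it, but don't fail the audit.
            warnings.append((lineno, f"unrecognised directive '{name}'"))
    return errors, warnings
```

Under this scheme, Content-Signal would surface as a warning in the report without dropping the score.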

Happy to provide a minimal reproduction if useful.
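Roughly, a robots.txt like the following should be enough to trigger the audit failure (the User-agent/Allow lines are just placeholders; the Content-Signal line is the one that gets flagged):

```
User-agent: *
Allow: /

Content-Signal: ai-train=yes, search=yes, ai-input=yes
```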
