Per-check failure index

Missing or Misconfigured robots.txt

A misconfigured robots.txt can block search engines from your entire site without you noticing.

Audited

30

Failing

0

Passing

30

What this check looks for

robots.txt lives at /robots.txt and tells crawlers which paths they can fetch. The common own-goal: a `Disallow: /` left over from staging that blocks the whole site. The other common miss: no Sitemap directive, so Google has to discover the sitemap on its own. At a minimum: allow what you want indexed, disallow admin/api paths, and add `Sitemap: https://yourdomain.com/sitemap.xml` at the bottom.
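The rules above can be sanity-checked before deploying. A minimal sketch using Python's standard-library parser, assuming a hypothetical domain `yourdomain.com` and the admin/api paths mentioned above:

```python
# Check robots.txt rules with Python's stdlib parser (urllib.robotparser).
from urllib.robotparser import RobotFileParser

# A sane robots.txt: allow the public site, block admin/api, declare the sitemap.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /api/
Sitemap: https://yourdomain.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://yourdomain.com/"))        # True: public pages crawlable
print(rp.can_fetch("*", "https://yourdomain.com/admin/"))  # False: admin blocked

# The staging leftover this check catches: Disallow: / blocks everything.
bad = RobotFileParser()
bad.parse(["User-agent: *", "Disallow: /"])
print(bad.can_fetch("*", "https://yourdomain.com/"))       # False: whole site blocked
```

Running this locally against your own rules is a quick way to confirm a deploy won't ship the `Disallow: /` own-goal.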

Failing (0)

Every audited site passes this check. That's rare.

Passing (30)

Sorted by overall audit score, best first.

Audit your own site for this check

Free, no account, no credit card. Same 12-check engine that scored every site on this page.

Run a free audit