2026-02-16
Googlebot needs access to your CSS and JavaScript files to render pages properly. Blocking these resources can hurt rankings.
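One way to guard against this is to allow stylesheets and scripts explicitly for Googlebot; a minimal sketch, where the directory names are assumptions about your site layout:

```
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
```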
The `Sitemap: https://example.com/sitemap.xml` directive helps crawlers discover all of your site's pages.
Block technical pages such as /admin/, /api/, /tmp/, and internal search and filtering pages.
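Putting that advice together, a sketch of a robots.txt that blocks the technical sections (the search path and filter query parameter are hypothetical examples):

```
User-agent: *
Disallow: /admin/
Disallow: /api/
Disallow: /tmp/
Disallow: /search
Disallow: /*?filter=
```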
The robots.txt testing tool in Search Console shows which URLs are blocked.
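You can also sanity-check rules locally with Python's standard-library parser. Note that `urllib.robotparser` matches plain path prefixes only and does not understand Google's `*`/`$` wildcards; the rules and URLs below are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules; urllib.robotparser treats Disallow values
# as simple path prefixes.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) returns True if the URL is allowed.
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
```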
The `*` character is a wildcard matching any sequence of characters, and `$` anchors the end of the URL. `Disallow: /*.pdf$` therefore blocks every PDF on the site, including useful ones.
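Google and most major crawlers also honor `Allow`, which can carve an exception out of a wildcard rule; the /downloads/ path here is an assumption:

```
User-agent: *
Disallow: /*.pdf$
Allow: /downloads/*.pdf$
```

When `Allow` and `Disallow` both match a URL, Google applies the most specific (longest) rule, so the PDFs under /downloads/ stay crawlable.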
Common robots.txt mistakes:
- Blocking the sitemap itself in robots.txt
- An empty `User-agent` line, so the rules apply to nothing
- Spaces and typos in paths
- Placing the file anywhere other than the site root (it must live at /robots.txt)
See also: Sitemap Generator, Keyword Density