2026-02-08
Robots.txt is a plain-text file in the root of a website that tells search engine bots which pages they may crawl and which they may not. The file always lives at the domain root, e.g. `example.com/robots.txt`.
| Directive | Description |
|---|---|
| User-agent | Which bot the rules that follow apply to |
| Disallow | Blocks crawling of a path |
| Allow | Permits crawling of a path (overrides a broader Disallow) |
| Sitemap | Absolute URL of the sitemap |
| Crawl-delay | Minimum delay between requests (not honored by all bots, including Googlebot) |
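Putting the directives together, a complete file (with hypothetical paths and sitemap URL) might look like this:

```
# Applies to all crawlers
User-agent: *
# The more specific Allow takes precedence over the Disallow below
Allow: /admin/help
Disallow: /admin/
# Minimum seconds between requests (not honored by all bots)
Crawl-delay: 10

# Sitemap is independent of any User-agent group
Sitemap: https://example.com/sitemap.xml
```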
- Block all bots from the entire site: `Disallow: /`
- Block only the admin area: `Disallow: /admin/`
- Allow everything: leave `Disallow:` empty
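Written out as complete files, the three rules above look like this:

```
# Block all bots from the entire site
User-agent: *
Disallow: /

# Block only the admin area
User-agent: *
Disallow: /admin/

# Allow everything
User-agent: *
Disallow:
```

Note that in a real file you would pick one of these groups per bot, not all three at once.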
To build the file with the generator:

1. Choose which bots to set rules for.
2. Specify blocked and allowed paths.
3. Add a link to your sitemap.
4. Copy the generated file and upload it to your site root.
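After uploading, you can sanity-check the rules with Python's standard `urllib.robotparser` module. This sketch parses a hypothetical rule set from a string; in practice you would point it at your live file with `set_url()` and `read()`:

```python
from urllib import robotparser

# Hypothetical rule set; for a live site use:
#   rp.set_url("https://example.com/robots.txt"); rp.read()
rules = """\
User-agent: *
Allow: /admin/help
Disallow: /admin/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/admin/"))      # blocked
print(rp.can_fetch("*", "https://example.com/page"))        # allowed
print(rp.can_fetch("*", "https://example.com/admin/help"))  # allowed (Allow wins)
```

Python's parser applies the first matching rule, which is why the `Allow` line is listed before the broader `Disallow` here; Google instead uses the most specific (longest) match, so this ordering works for both.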
- Robots.txt is a recommendation, not a restriction; malicious bots may ignore it.
- Do not use it to hide confidential data.
- Google may index a URL even with Disallow if there are links pointing to it.
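Because a disallowed URL can still be indexed from external links, keeping a page out of search results is usually done with a `noindex` directive instead, for example:

```
<meta name="robots" content="noindex">
```

Note that the page must remain crawlable for this to work: if robots.txt blocks the URL, the crawler never sees the tag.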
See also: Sitemap Generator, Meta Tag Generator, Heading Checker