Robots.txt Generator

Optimize your site's SEO by guiding search engine crawlers.

Live Preview
User-agent: *
Allow: /
Disallow: /admin
Disallow: /api
Disallow: /private

In-Depth Guide

Everything you need to know

SEO Optimization

Guide search engine crawlers like Googlebot to index the right parts of your site, while protecting sensitive areas from exposure.
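To see how a crawler would interpret rules like those in the preview above, Python's standard-library `urllib.robotparser` can evaluate a ruleset directly. This is a minimal sketch: the rules and paths are hypothetical, and note that Python's parser checks rules in the order they appear (first match wins), so the `Disallow` lines are listed before the blanket `Allow`.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical ruleset similar to the live preview above.
# Python's parser uses first-match ordering, so Disallow
# lines are placed before the catch-all Allow.
rules = """\
User-agent: *
Disallow: /admin
Disallow: /api
Disallow: /private
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Public content is crawlable; protected areas are not.
print(parser.can_fetch("Googlebot", "/blog/post"))    # True
print(parser.can_fetch("Googlebot", "/admin/login"))  # False
```

Real crawlers like Googlebot use longest-match precedence rather than first-match order, so rule ordering matters less in production than it does for this parser.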

Crawler Efficiency

Save crawl budget by instructing bots to ignore non-essential paths or redundant parameters that don't need indexing.
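As an illustration, a ruleset along these lines keeps crawlers away from internal search results and parameterized duplicate URLs. The paths and parameter names here are hypothetical, and wildcard patterns (`*`, `$`) are an extension honored by Googlebot and most major crawlers, not part of the original robots.txt convention.

```
User-agent: *
# Internal search pages rarely need indexing
Disallow: /search/
# Parameterized duplicates of existing pages
Disallow: /*?sort=
Disallow: /*?sessionid=
```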

Common Directives

User-agent

The bot the rules apply to ('*' matches all bots)

Disallow

Prevents crawling of specific paths

Allow

Explicitly permits crawling within otherwise disallowed paths

Sitemap

Points bots to your sitemap for better discovery
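Putting the four directives together, a complete file might look like the sketch below. The domain and paths are placeholders; the key pattern is that Allow carves an exception out of a broader Disallow, and Sitemap takes an absolute URL.

```
User-agent: *
Disallow: /private/
Allow: /private/press-kit/
Sitemap: https://example.com/sitemap.xml
```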