Robots.txt Generator
Optimize your site's SEO by guiding search engine crawlers.
In-Depth Guide
Everything you need to know
SEO Optimization
Guide search engine crawlers like Googlebot to the parts of your site you want crawled, while keeping sensitive areas out of their reach. Note that robots.txt controls crawling, not indexing: a disallowed page can still appear in results if other sites link to it.
Crawler Efficiency
Save crawl budget by instructing bots to ignore non-essential paths or redundant parameters that don't need indexing.
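For example, a site might free up crawl budget with rules like these (the paths and parameter are hypothetical; wildcard patterns such as `*` and `$` are supported by major crawlers per RFC 9309, though not by every bot):

```text
# Keep bots out of low-value or duplicate-content areas
User-agent: *
Disallow: /search
Disallow: /cart/
Disallow: /*?sort=
```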
Common Directives
User-agent - The bot the rule applies to ('*' matches all bots)
Disallow - Prevents crawling of specific paths
Allow - Explicitly permits crawling within disallowed paths
Sitemap - Points bots to your sitemap for better discovery
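A quick way to sanity-check these directives is Python's standard-library `urllib.robotparser`. The sketch below parses a hypothetical robots.txt that uses all four directives and asks whether specific URLs may be fetched (the domain and paths are made up for illustration; the Allow rule is listed first because this parser applies the first matching rule):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt combining the four common directives
robots_txt = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A path under Disallow is blocked for all bots
print(parser.can_fetch("*", "https://example.com/admin/settings"))

# The narrower Allow rule re-opens a nested path
print(parser.can_fetch("*", "https://example.com/admin/public/page"))

# Paths not mentioned at all are crawlable by default
print(parser.can_fetch("*", "https://example.com/blog/post"))
```

This is useful as a regression check before deploying a generated file: feed the generator's output to `parse()` and assert the URLs you care about resolve the way you intended.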