Generate robots.txt files. Control search engine crawler access to your site.
Select which crawlers the rules apply to (all bots, Googlebot, Bingbot, etc.) and add allow/disallow directives for specific URL paths.
Add your XML sitemap URL so crawlers can discover it automatically. Optionally set a crawl-delay to throttle crawling frequency (Bing and Yandex honor this directive; Google ignores it).
Copy the generated robots.txt content and upload it as a plain-text file to your domain root (yourdomain.com/robots.txt). Crawlers apply the rules the next time they fetch the file; Google, for example, typically caches robots.txt for up to 24 hours.
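A file generated from the steps above might look like the following sketch — the paths and sitemap URL are placeholders, not output from any specific site:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /admin/help.html

# Bingbot honors Crawl-delay (seconds between requests); Googlebot ignores it
User-agent: Bingbot
Crawl-delay: 10

# Sitemap location (must be an absolute URL)
Sitemap: https://yourdomain.com/sitemap.xml
```

Rules are grouped by User-agent; the most specific matching group applies, and within a group the longest matching path rule wins for Google and Bing.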
Robots.txt generator online — create properly formatted robots.txt files for your website instantly for free. Configure user-agent directives, allow and disallow rules, sitemap references, and crawl-delay settings for search engine crawlers. Control which pages Googlebot, Bingbot, and other crawlers can access. Keep crawlers out of private pages, admin areas, and duplicate content — note that robots.txt controls crawling, not indexing, and the file is publicly readable, so use a noindex directive or authentication to keep sensitive pages out of search results entirely. Essential for technical SEO and efficient search engine crawling. Copy the generated file and upload it to your domain root.