Robots.txt Generator

Generate robots.txt files to control search engine crawler access to your site.


How to Use Robots.txt Generator

1. Configure user-agent rules

Select which crawlers the rules apply to (all bots, Googlebot, Bingbot, etc.) and add allow/disallow directives for specific URL paths.
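As a sketch, a rule group produced in this step might look like the following (the paths are illustrative placeholders):

```text
User-agent: Googlebot
Disallow: /admin/
Allow: /
```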

2. Set sitemap and crawl delay

Add your XML sitemap URL so crawlers can discover it automatically. Optionally set a crawl-delay to control crawling frequency.
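These two settings map to directives like the ones below (domain is a placeholder). Sitemap is a standalone directive that applies to the whole file, while Crawl-delay sits inside a user-agent group and is measured in seconds:

```text
Sitemap: https://yourdomain.com/sitemap.xml
Crawl-delay: 10
```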

3. Copy and upload

Copy the generated robots.txt content and upload it as a plain-text file named robots.txt to your domain root (yourdomain.com/robots.txt). Crawlers fetch it before crawling your site, but they typically cache it, so changes can take up to a day to be picked up.
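Putting the three steps together, a complete generated file might look like this (the domain and paths are placeholders for your own):

```text
User-agent: *
Disallow: /admin/
Allow: /
Crawl-delay: 10

Sitemap: https://yourdomain.com/sitemap.xml
```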

About Robots.txt Generator

Robots.txt generator online — create properly formatted robots.txt files for your website instantly for free. Configure user-agent directives, allow and disallow rules, sitemap references, and crawl-delay settings for search engine crawlers. Control which pages Googlebot, Bingbot, and other crawlers can access. Discourage crawling of private pages, admin areas, and duplicate content (note that robots.txt controls crawling, not indexing — use a noindex meta tag to keep a page out of search results). Essential for technical SEO and efficient search engine crawling. Copy the generated file and upload it to your domain root.

Key Features

  • Generate properly formatted robots.txt files with user-agent, allow, and disallow directives
  • Configure rules for specific search engine crawlers including Googlebot, Bingbot, and others
  • Set sitemap URL references to help search engines discover your XML sitemap automatically
  • Add crawl-delay directives to control how frequently bots crawl your website pages
  • One-click copy of generated robots.txt content ready to upload to your website root directory
  • Free robots.txt generation with proper syntax validation and formatting for all major crawlers

Frequently Asked Questions

How to create a robots.txt file for my website?
Use our generator to configure allow/disallow rules, add your sitemap URL, and copy the properly formatted robots.txt file. Upload it to your domain root directory.
How to block search engines from indexing certain pages?
Add a Disallow directive for the URL path you want to block. For example, "Disallow: /admin/" prevents crawlers from accessing your admin pages.
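For example, a group blocking two hypothetical private sections for all compliant crawlers:

```text
User-agent: *
Disallow: /admin/
Disallow: /private/
```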
How to add a sitemap URL to robots.txt?
Enter your sitemap URL in the sitemap field. The generator adds a "Sitemap: https://yourdomain.com/sitemap.xml" directive that helps search engines find your sitemap.
How to configure robots.txt for Googlebot specifically?
Set the user-agent to "Googlebot" and add allow/disallow rules that apply only to Google. You can have different rules for different search engine crawlers.
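A sketch with two groups (the paths are placeholders). Note that a crawler follows only the most specific group that matches its name, so Googlebot reads its own group and ignores the `*` group entirely:

```text
# Rules read only by Googlebot
User-agent: Googlebot
Disallow: /search/

# Rules for every other crawler
User-agent: *
Allow: /
```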
How to set crawl delay in robots.txt?
Enter a crawl-delay value in seconds. This tells crawlers to wait between requests, reducing server load. Note: Googlebot does not officially support crawl-delay.
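For instance, this hypothetical group asks Bingbot to wait roughly 10 seconds between requests:

```text
User-agent: Bingbot
Crawl-delay: 10
```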
How to allow all pages in robots.txt?
Set "User-agent: *" with "Allow: /" to permit all crawlers to access all pages. Add your sitemap URL for complete crawling coverage.
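The permissive configuration described above (domain is a placeholder):

```text
User-agent: *
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
```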
How to block all crawlers in robots.txt?
Set "User-agent: *" with "Disallow: /" to block all compliant search engine crawlers from accessing any page on your site. Use this carefully — it hides your site's content from search engines, and your pages will usually drop out of search results over time.
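The full-block configuration is just two lines:

```text
User-agent: *
Disallow: /
```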
How to test if robots.txt is working?
Upload the generated file to your domain root. Then check it in Google Search Console's robots.txt report to verify that Google fetched and parsed your rules as expected.
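You can also check your rules locally before uploading, using Python's standard-library robots.txt parser. This sketch parses a hypothetical rule set directly from a string; against a live site you would instead call `parser.set_url("https://yourdomain.com/robots.txt")` followed by `parser.read()`:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the examples above.
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a given user-agent may fetch a given URL.
print(parser.can_fetch("*", "https://yourdomain.com/admin/login"))  # False
print(parser.can_fetch("*", "https://yourdomain.com/blog/post"))    # True
```

Rules are matched against the URL path, so this gives a quick sanity check that a Disallow line actually covers the pages you meant to block.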