
control crawlers. one file.

generate a robots.txt file with custom rules, sitemap URL, and crawl-delay. presets for common configurations.

robots.txt preview
User-agent: *
Allow: /
free use available

What is robots.txt Generator?

robots.txt Generator creates a properly formatted robots.txt file for your website. Choose which crawlers to allow or block, set crawl delays, add sitemap URLs, and use presets for common configurations including blocking AI crawlers.

A robots.txt file tells search engine bots and web crawlers which parts of your site they can access. Getting the syntax wrong can accidentally block Google from indexing your site, or let in bots you meant to restrict.
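The difference is often a single character. These two rules look nearly identical but have opposite effects:

```
User-agent: *
Disallow:        # empty value blocks nothing — the whole site is crawlable

User-agent: *
Disallow: /      # one extra character blocks the entire site
```

This is exactly the kind of mistake a generator helps you avoid.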

This tool generates syntactically correct robots.txt files with clear rules. Copy the output and save it as robots.txt in your website's root directory.

How to Use robots.txt Generator

  1. Choose a preset or start blank

     Select a common preset like 'Allow all crawlers' or 'Block AI crawlers', or start with a blank configuration.

  2. Add rules

     Specify user-agent rules for different crawlers, and set which paths to allow or disallow.

  3. Add sitemap URL

     Enter your sitemap URL so crawlers can find it.

  4. Copy or download

     Copy the generated robots.txt content or download it as a file.
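Putting the steps together, a typical generated file might look like this (the path and sitemap URL below are placeholders for illustration):

```
# Allow all crawlers, but keep them out of /admin/
User-agent: *
Disallow: /admin/

# Point crawlers at the sitemap (placeholder URL)
Sitemap: https://example.com/sitemap.xml
```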

Common Use Cases

New website setup

Generate a robots.txt file as part of your initial SEO setup.

Blocking AI crawlers

Block GPTBot, CCBot, and other AI training crawlers from scraping your content.
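A preset like this typically lists known AI training crawlers by user-agent. The list below covers some widely known ones, but new crawlers appear regularly, so treat it as a starting point rather than a complete list:

```
# Block common AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Everything else can crawl normally
User-agent: *
Allow: /
```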

Staging site protection

Block all crawlers from indexing your staging or development site.
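The staging preset is the simplest possible file: one rule that disallows everything for every crawler.

```
# Block all crawlers from the entire site
User-agent: *
Disallow: /
```

Keep in mind that robots.txt is advisory — well-behaved crawlers honor it, but it is not access control. For anything truly private, pair it with HTTP authentication.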

Selective crawling

Allow search engines but block specific paths like admin pages or API routes.
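For example (the paths below are placeholders; adjust them to your site's structure):

```
# Search engines may crawl the site, except admin pages and API routes
User-agent: *
Disallow: /admin/
Disallow: /api/

# Ask polite crawlers to wait between requests (not all crawlers honor this)
Crawl-delay: 10
```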

SEO audits

Generate a new robots.txt to replace a misconfigured one that is blocking legitimate crawlers.


