Robots.txt Generator - Build Robots.txt Online
User-agent: *
Allow: /
Disallow: /admin
Disallow: /api
Disallow: /private
Disallow: /tmp
About the Robots.txt Generator
A robots.txt file tells crawlers which of your site's paths are open and which are off-limits. Writing the syntax by hand is easy for one or two rules, but it gets tedious once you need to manage multiple user-agents, block AI training bots, and list sitemaps.
This visual builder lets you create robots.txt files through a form interface. Start with a preset, then customise individual rules. The live preview updates as you type so you always see exactly what will be generated.
Presets
- Allow All. Opens your entire site to all crawlers.
- Block All. Disallows all crawlers from all paths. Useful for staging sites.
- Block AI Crawlers. Allows regular search engines but blocks 10 known AI training crawlers including GPTBot, CCBot, and Claude-Web; see the example after this list.
- Standard SEO. Allows everything except common private directories like /admin, /api, /private, and /tmp.
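
As a rough sketch, the Block AI Crawlers preset produces output along these lines. The exact agent list depends on the tool's current defaults; GPTBot, CCBot, and Claude-Web are the three named above, and the final group keeps the site open to everything else:

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Claude-Web
Disallow: /

User-agent: *
Allow: /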
How to Use
Click a preset to start, then add or edit rules as needed. Each rule block has a user-agent field, allow paths, disallow paths, and an optional crawl-delay. Add sitemap URLs at the bottom. Copy the output from the preview pane on the right and save it as robots.txt at the root of your site.
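
For instance, a single rule block with a crawl delay and one sitemap entry would generate something like the following; the paths and the sitemap URL here are placeholders for illustration:

User-agent: *
Allow: /
Disallow: /admin
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml

Note that Crawl-delay is a non-standard directive: some crawlers respect it, but others, including Googlebot, ignore it.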