Robots.txt Generator

Generate a valid robots.txt file to control how search engine crawlers access your website. Configure rules and download instantly.

robots.txt Preview
User-agent: *
Allow: /

What is robots.txt?

The robots.txt file tells search engine crawlers which pages or sections of your website they may or may not crawl. It must live at the root of your domain (e.g., https://example.com/robots.txt); crawlers do not look for it in subdirectories. Use Disallow rules to keep crawlers out of admin panels, staging paths, or duplicate content, and add a Sitemap directive (conventionally at the end of the file) so crawlers can discover all of your pages. Note that robots.txt is advisory: well-behaved crawlers honor it, but it does not block access, so it is no substitute for authentication on sensitive pages.
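You can verify that your rules behave as intended before deploying. A minimal sketch using Python's standard-library urllib.robotparser, with a hypothetical rule set (the example.com paths and sitemap URL are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block /admin/, allow everything else,
# and advertise a sitemap at the end of the file.
rules = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Check whether a generic crawler ("*") may fetch specific URLs.
print(rp.can_fetch("*", "https://example.com/"))        # True
print(rp.can_fetch("*", "https://example.com/admin/"))  # False
print(rp.site_maps())  # ['https://example.com/sitemap.xml']
```

Running this kind of check against your generated file catches accidental over-blocking (e.g., a Disallow rule that matches more paths than you expected) before crawlers ever see it.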