Robots.txt Generator

Create a perfect robots.txt file to guide search engine crawlers and control which pages they can access.

What is a Robots.txt File?

A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It's used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google.

To keep a page out of Google, use noindex directives or password-protect the page.
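As an illustration (not the only way to do it), a noindex rule can be delivered either as a meta tag in the page's HTML or as an HTTP response header:

```html
<!-- In the page's <head>: tells crawlers not to index this page -->
<meta name="robots" content="noindex">
```

The equivalent HTTP response header is `X-Robots-Tag: noindex`, which also works for non-HTML files such as PDFs. Note that a crawler must be able to fetch the page to see the noindex rule, so don't also block it in robots.txt.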

Robots.txt Syntax Explained

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/sitemap.xml

User-agent specifies which crawler the rules apply to (* matches all crawlers). Disallow blocks crawlers from a path, while Allow re-permits a specific file or subpath inside a disallowed directory. Sitemap tells crawlers where to find your XML sitemap.
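To see these rules in action, you can check URLs against them with Python's standard-library robots.txt parser. This is a minimal sketch using the example rules above with a hypothetical example.com site; note that Python's parser applies rules in file order (first match wins), unlike Google's longest-match precedence, so the Allow line is placed first here.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the example above.
# Allow is listed first because urllib.robotparser uses
# first-match semantics, not Google's longest-match.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://example.com/wp-admin/"))                # False
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("*", "https://example.com/blog/post"))                # True
```

Unmatched paths (like /blog/post above) default to allowed, which is also how real crawlers treat them.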

Common Use Cases

- Blocking admin or login areas (such as /wp-admin/) from crawlers
- Keeping internal search results and filtered URLs out of the crawl
- Preventing crawling of staging or development directories
- Pointing crawlers to your XML sitemap
- Reducing crawl load on large sites with many low-value URLs
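A robots.txt file covering several common cases might look like this (the paths are hypothetical examples, not recommendations for every site):

```
User-agent: *
Disallow: /search/
Disallow: /staging/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml
```

Rules apply by path prefix, so Disallow: /search/ blocks every URL under that directory.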

Related Tools

XML Sitemap Generator

Create XML sitemaps

Robots.txt Tester

Test your robots.txt

Website SEO Analyzer

Full SEO analysis