Mega Dev Tools

Robots.txt Generator

Generate a robots.txt file to control which parts of your website search engine crawlers may visit. (Note that robots.txt controls crawling, not indexing: a blocked page can still appear in results if other sites link to it.)

Generated robots.txt

Copy this content to your robots.txt file

User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Allow: /
Allow: /products/
Allow: /categories/

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/products-sitemap.xml

How to Use

1. Upload to Your Website

Save the generated content as "robots.txt" and upload it to your website's root directory, so it is reachable at https://yoursite.com/robots.txt. Crawlers only look for the file at the root of the host; a robots.txt in a subdirectory is ignored.

2. Test Your robots.txt

Use the robots.txt report in Google Search Console (which replaced the standalone robots.txt Tester) to verify that your file parses correctly and blocks the intended paths.
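You can also check the rules locally before deploying. A minimal sketch using Python's standard-library urllib.robotparser against the generated file above (the example.com URLs are placeholders; this parser ignores Sitemap lines):

```python
from urllib.robotparser import RobotFileParser

# The generated robots.txt content from above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
Disallow: /checkout/
Disallow: /account/
Allow: /
Allow: /products/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Blocked paths return False; allowed paths return True.
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://example.com/products/shoes"))  # True
```

This catches typos such as a missing leading slash or a misspelled directive before the file goes live.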

3. Common Patterns

Use Disallow: / to block crawlers from the entire site, or list specific paths to block. A more specific Allow rule can override a broader Disallow for the paths it matches; Google resolves conflicts by applying the longest (most specific) matching rule. Always add your Sitemap URL(s) so crawlers can discover your pages.
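The "block everything except one section" pattern can be sketched the same way. One caveat, as an assumption worth noting: Python's stdlib parser applies the first matching rule rather than the longest one as Google does, so the specific Allow is listed before the broad Disallow here to make both interpretations agree:

```python
from urllib.robotparser import RobotFileParser

# Block the whole site except /public/.
STRICT = """\
User-agent: *
Allow: /public/
Disallow: /
"""

parser = RobotFileParser()
parser.parse(STRICT.splitlines())

print(parser.can_fetch("*", "/public/page.html"))  # True
print(parser.can_fetch("*", "/private/data"))      # False
```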