Robots.txt Tester

SEO

Fetch and analyze robots.txt rules. Test which paths are allowed or blocked for crawlers.

Use via API
Robots.txt Tester — cURL
curl -X POST "https://krawly.io/api/v1/tools/robots-txt-tester/" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{"url": "https://example.com"}'
What is Robots.txt Tester?

Robots.txt Tester fetches and parses the robots.txt file from any domain. It lists every rule, sitemap URL, and user-agent group, then tests common paths to report whether each is allowed or blocked.
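The tool's own parser isn't shown here, but the same kind of check can be sketched with Python's standard-library `urllib.robotparser`. The sample robots.txt below is made up for illustration; note that the stdlib parser applies rules in file order (first match wins), unlike Google's longest-match semantics, so the Allow line is placed before the broader Disallow.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration.
ROBOTS_TXT = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Crawl-delay: 5
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Path testing against the rules:
print(rp.can_fetch("*", "https://example.com/admin/public/page"))  # True
print(rp.can_fetch("*", "https://example.com/admin/secret"))       # False

# Sitemap discovery and crawl-delay (site_maps() needs Python 3.8+):
print(rp.site_maps())     # ['https://example.com/sitemap.xml']
print(rp.crawl_delay("*"))  # 5
```

To test a live site instead of an inline string, `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` fetches and parses the real file.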

Use Cases

  • Test if important pages are blocked
  • Find sitemap URLs from robots.txt
  • Audit crawl directives
  • Debug crawling issues

Key Features

  • Rule parsing (allow/disallow/crawl-delay)
  • Sitemap discovery
  • Path testing against rules
  • Raw file view with copy

Frequently Asked Questions