Robots.txt Validator & Tester

Validate robots.txt syntax, inspect parsed rule groups, and test whether a URL path would be allowed or blocked for any user-agent.

Robots.txt Content
An editor pane holds the robots.txt being checked. A sample file is pre-loaded.

Validation Results
No syntax errors or warnings found.

URL Tester
Example result: BLOCKED (matched Disallow: /admin/ on line 4).

Parsed Rule Groups
User-agent: Googlebot (line 2)
  Allow: /public/ (line 3)
  Disallow: /admin/ (line 4)
  Disallow: /api/ (line 5)
  Crawl-delay: 10 (line 6)
User-agent: * (line 8)
  Disallow: /private/ (line 9)
  Allow: / (line 10)

Sitemaps
Sitemap: https://example.com/sitemap.xml (line 12)

How It Works

1. Paste your robots.txt

Paste the contents of your robots.txt file into the editor. A sample file is pre-loaded so you can start right away.
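Reconstructed from the parsed rule groups shown above, the pre-loaded sample looks roughly like this (the comment on line 1 and the blank lines are assumptions made so the reported line numbers work out):

```
# Sample robots.txt
User-agent: Googlebot
Allow: /public/
Disallow: /admin/
Disallow: /api/
Crawl-delay: 10

User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://example.com/sitemap.xml
```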

2. Review validation results

The tool checks syntax, groups rules by User-agent, and flags errors and warnings.
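A minimal sketch of what such a check can look like, assuming a simple line-by-line pass (this is an illustration, not the tool's actual engine; the directive names and messages are assumptions):

```python
# Directives this sketch recognizes; real validators may accept more.
KNOWN = {"user-agent", "allow", "disallow", "crawl-delay", "sitemap"}

def validate(text):
    """Return a list of (line_no, message) issues for a robots.txt body."""
    problems = []
    in_group = False  # becomes True once a User-agent line opens a group
    for no, raw in enumerate(text.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        if ":" not in line:
            problems.append((no, "missing ':' separator"))
            continue
        field, _, value = line.partition(":")
        field = field.strip().lower()
        if field not in KNOWN:
            problems.append((no, f"unknown directive '{field}'"))
        elif field == "user-agent":
            in_group = True
        elif field in {"allow", "disallow", "crawl-delay"} and not in_group:
            problems.append((no, f"'{field}' before any User-agent line"))
    return problems
```

Sitemap lines are allowed outside any group, while Allow, Disallow, and Crawl-delay must follow a User-agent line.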

3. Test specific URLs

Enter a user-agent name and URL path to check whether that path would be allowed or blocked.
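The same kind of check can be reproduced with Python's standard library, shown here against the sample file above. Note this is only an analogy for the tool's matching: `urllib.robotparser` applies rules in file order (first match wins), whereas some crawlers, such as Google's, use longest-path-match precedence, so results can differ on overlapping rules.

```python
from urllib.robotparser import RobotFileParser

SAMPLE = """\
User-agent: Googlebot
Allow: /public/
Disallow: /admin/
Disallow: /api/
Crawl-delay: 10

User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(SAMPLE.splitlines())

# Googlebot is blocked from /admin/ but allowed into /public/
print(rp.can_fetch("Googlebot", "/admin/settings"))    # False
print(rp.can_fetch("Googlebot", "/public/page.html"))  # True
# Other crawlers fall through to the wildcard (*) group
print(rp.can_fetch("SomeBot", "/private/notes"))       # False
# Crawl-delay is read per matched group
print(rp.crawl_delay("Googlebot"))                     # 10
```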


Building on or debugging infrastructure?

Plan your stack and deployment before it becomes a problem — run a Discovery session.

Built & Maintained by Varstatt

Varstatt is a one-person product studio run by Jurij Tokarski, product engineer since 2011. These tools are free and open — no signup, no catch.