Robots.txt Validator

Validate robots.txt syntax and test URL paths against crawl rules. Check which pages are allowed or blocked for any user-agent.

How It Works

1. Paste your robots.txt

Paste the contents of your robots.txt file into the editor. A sample is pre-loaded so you can get started quickly.

2. Review validation results

The tool checks syntax, groups rules by User-agent, and flags errors and warnings.

3. Test specific URLs

Enter a user-agent name and a URL path to check whether that path would be allowed or blocked for that agent.
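The same allowed/blocked check can be sketched with Python's standard-library robots.txt parser. The sample rules below are hypothetical, not the tool's pre-loaded sample; note that urllib.robotparser applies rules in file order (first match wins), so the more specific Allow line is placed before the broader Disallow:

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration.
ROBOTS_TXT = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/

User-agent: BadBot
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Allow: /admin/public/ matches first, so this path is permitted.
print(parser.can_fetch("*", "/admin/public/page.html"))  # True
# Only Disallow: /admin/ matches this path.
print(parser.can_fetch("*", "/admin/secret.html"))       # False
# BadBot's group blocks everything.
print(parser.can_fetch("BadBot", "/anything"))           # False
```

Crawlers that follow RFC 9309 (such as Googlebot) instead pick the longest matching rule regardless of order, which is worth keeping in mind when testing edge cases.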

Built & Maintained by Varstatt

Varstatt is a one-person product studio run by Jurij Tokarski, product engineer since 2011. These tools are free and open — no signup, no catch.