---
title: Robots.txt Validator — Syntax Check & URL Path Tester
url: https://varstatt.com/toolkit/robots
description: Validate robots.txt syntax and test URL paths against crawl rules. Check which pages are allowed or blocked for any user-agent.
section: Developer Toolkit (https://varstatt.com/toolkit)
tags: seo
related: Sitemap (https://varstatt.com/toolkit/sitemap), OG Tags (https://varstatt.com/toolkit/og)
---
# Robots.txt Validator

Validate robots.txt syntax and test URL paths against crawl rules. Check which pages are allowed or blocked for any user-agent.

## How It Works

1. **Paste your robots.txt** — Paste the contents of your robots.txt file into the editor. A sample is pre-loaded to help you get started.
2. **Review validation results** — The tool checks syntax, groups rules by User-agent, and flags errors and warnings.
3. **Test specific URLs** — Enter a user-agent name and URL path to check whether that path would be allowed or blocked.
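Step 2 can be illustrated with a minimal parsing sketch: group rule lines under the `User-agent` lines that precede them, collect `Sitemap` entries separately, and warn on unrecognized directives. This is an illustrative outline of the idea, not the tool's actual implementation.

```python
# Directives the validator recognizes (per the FAQ below).
KNOWN = {"user-agent", "allow", "disallow", "sitemap", "crawl-delay", "host"}

def parse_robots(text):
    groups, sitemaps, warnings = {}, [], []
    agents, in_rules = [], False
    for lineno, raw in enumerate(text.splitlines(), 1):
        line = raw.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        if ":" not in line:
            warnings.append(f"line {lineno}: missing ':' separator")
            continue
        field, value = (p.strip() for p in line.split(":", 1))
        field = field.lower()
        if field not in KNOWN:
            warnings.append(f"line {lineno}: unknown directive '{field}'")
        elif field == "user-agent":
            if in_rules:  # a User-agent line after rules starts a new group
                agents, in_rules = [], False
            agents.append(value.lower())
            groups.setdefault(value.lower(), [])
        elif field == "sitemap":
            sitemaps.append(value)  # Sitemap applies file-wide, not per group
        else:
            in_rules = True
            for a in agents:
                groups[a].append((field, value))
    return groups, sitemaps, warnings
```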

## FAQ

### What directives are supported?

`User-agent`, `Allow`, `Disallow`, `Sitemap`, `Crawl-delay`, and `Host`. Unknown directives trigger a warning.

### How does URL matching work?

Patterns support `*` (wildcard) and `$` (end anchor) per the robots.txt specification. The most specific matching rule wins — the longest matching pattern takes priority, and if an Allow and a Disallow rule match with equal specificity, Allow takes precedence.

### Does Disallow: / block everything?

Yes. `Disallow: /` blocks all paths for the specified user-agent. To allow everything, use `Allow: /` or an empty `Disallow:` directive.
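For example, a file that blocks one crawler entirely while allowing everyone else might look like this (the crawler name is illustrative):

```txt
# Block the hypothetical "ExampleBot" entirely; everyone else may crawl.
User-agent: ExampleBot
Disallow: /

User-agent: *
Disallow:
```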

### Is this sent to a server?

No. All parsing and validation runs in your browser. To validate your site's Open Graph tags, try the [OG Tag Validator](https://varstatt.com/toolkit/og).

## Usage

This tool runs entirely in the browser — visit the URL above to use it.

Prefill inputs via URL parameters:

- `https://varstatt.com/toolkit/robots?input=...`
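To build a prefill link programmatically, percent-encode the robots.txt contents into the `input` parameter so newlines and special characters survive the query string. A minimal sketch (the parameter name comes from the list above; the encoding shown is standard URL percent-encoding):

```python
from urllib.parse import quote

def prefill_url(robots_text):
    # Encode every character outside the unreserved set, including '/' and '\n'.
    return "https://varstatt.com/toolkit/robots?input=" + quote(robots_text, safe="")

url = prefill_url("User-agent: *\nDisallow: /tmp/")
```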

## Related Tools

- [Sitemap](https://varstatt.com/toolkit/sitemap)
- [OG Tags](https://varstatt.com/toolkit/og)
