Free SEO Tool
Robots.txt Tester & Validator
Test whether a URL path is allowed or blocked by robots.txt rules. Fetch a live robots.txt file, choose a crawler, and check the exact matching rule.
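The same allowed/blocked check can be reproduced programmatically. Below is a minimal sketch using Python's standard urllib.robotparser and a hypothetical example.com domain; note that this parser applies a simpler first-match model and ignores the * and $ wildcards, so its answers can differ from Googlebot's longest-match behavior.

from urllib.robotparser import RobotFileParser

# Fetch and parse a live robots.txt (hypothetical domain).
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# Ask whether a specific crawler may fetch a specific URL.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/"))
print(rp.can_fetch("Googlebot", "https://example.com/blog/hello-world"))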
Robots.txt Content
Paste your robots.txt rules here or fetch them from a live domain.
Test Rules
Choose a user-agent and check how a specific path is treated.
Googlebot is selected by default because most users want to test Google crawling.
Run a test
Choose a user-agent and path, then click Test Rules.
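A crawler does not read every group in the file: it uses the group whose User-agent line most specifically matches its own name and falls back to the * group otherwise. The sketch below illustrates that selection with two hypothetical groups; real crawlers match product tokens, so treat this as an approximation.

# Hypothetical rule groups keyed by the User-agent value (lowercased).
groups = {
    "*": ["Disallow: /tmp/"],
    "googlebot": ["Disallow: /wp-admin/"],
}

def select_group(crawler_name, groups):
    name = crawler_name.lower()
    # Prefer the longest non-wildcard User-agent value the crawler name starts with.
    candidates = [ua for ua in groups if ua != "*" and name.startswith(ua)]
    if candidates:
        return groups[max(candidates, key=len)]
    return groups.get("*", [])

print(select_group("Googlebot", groups))        # ['Disallow: /wp-admin/']
print(select_group("Googlebot-Image", groups))  # also resolves to the Googlebot group
print(select_group("Bingbot", groups))          # ['Disallow: /tmp/'] from the * group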
Syntax notes
Common example
Block admin pages but allow the AJAX endpoint:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
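The Allow line wins here because crawlers such as Googlebot apply the most specific (longest) matching rule, with Allow beating Disallow on a tie. The sketch below illustrates that precedence for plain path prefixes only (wildcards are ignored); the rules list simply mirrors the example above.

# Rules from the example above, as (directive, path-prefix) pairs.
rules = [
    ("disallow", "/wp-admin/"),
    ("allow", "/wp-admin/admin-ajax.php"),
]

def is_allowed(path, rules):
    # Collect every rule whose pattern is a prefix of the path.
    matches = [(len(p), kind == "allow") for kind, p in rules if path.startswith(p)]
    if not matches:
        return True  # no rule matches: crawling is allowed by default
    # Longest pattern wins; on a tie, Allow beats Disallow.
    return max(matches)[1]

print(is_allowed("/wp-admin/admin-ajax.php", rules))  # True: the longer Allow rule wins
print(is_allowed("/wp-admin/options.php", rules))     # False: only the Disallow rule matches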
Wildcard usage
Use * to match any sequence of characters and $ to anchor a rule to the end of the URL path.
Disallow: /*?replytocom=
Disallow: /private$
Allow: /public/*
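These wildcard patterns behave roughly like small regular expressions. The sketch below shows one way to translate a pattern for testing purposes; it is an illustration of the matching idea, not any crawler's actual implementation.

import re

def pattern_matches(pattern, path):
    # '*' matches any sequence of characters; a trailing '$' anchors the end of the path.
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    regex = "^" + regex + ("$" if anchored else "")
    return re.search(regex, path) is not None

print(pattern_matches("/*?replytocom=", "/blog/post?replytocom=42"))  # True
print(pattern_matches("/private$", "/private"))                       # True
print(pattern_matches("/private$", "/private/archive"))               # False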
Important reminder
robots.txt controls crawling, not security. Sensitive content should not rely on robots.txt alone.
Use robots.txt for crawl guidance, not protection.
