New Robots.txt Syntax Checker: a validator for robots.txt files


Robots.txt Checker

Robots.txt files (often erroneously called robot.txt, in the singular) are created by webmasters to mark (disallow) the files and directories of a website that search engine spiders (and other kinds of robots) should not access.
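For example, a robots.txt file sits at the root of the site and looks something like this (the paths below are purely illustrative):

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    User-agent: Googlebot
    Disallow: /private/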


This robots.txt checker is a “validator” that analyzes the syntax of a robots.txt file to see whether its format is valid as established by the Robots Exclusion Standard (please read the documentation and the tutorial to learn the basics) or whether it contains errors.

1. Simple usage: to check your robots.txt file format, just enter the full URL of the robots.txt file you want to analyze (for example: http://www.yourdomain.com/robots.txt) and hit Enter.

2. Powerful: the checker finds syntax errors, “logic” errors, and mistyped words, and it gives you useful optimization tips (a rough sketch of what such checks involve follows this list).


3. Accurate: the validation process takes into account both the Robots Exclusion Standard rules and spider-specific (Google, Inktomi, etc.) extensions.
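To give a rough idea of what this kind of line-by-line validation involves, here is a minimal sketch in Python. It is not the checker's actual implementation; the check_robots_txt helper, its field list, and its error messages are all illustrative assumptions.

    # A minimal sketch of a robots.txt syntax check -- NOT the tool
    # described above, just an illustration of line-by-line validation
    # against the Robots Exclusion Standard plus a few widely used
    # spider-specific extensions.
    import sys
    import urllib.request

    # Core fields from the Robots Exclusion Standard, plus common
    # extensions (Crawl-delay, Sitemap) recognized by major spiders.
    KNOWN_FIELDS = {"user-agent", "disallow", "allow", "crawl-delay", "sitemap"}

    def check_robots_txt(url):
        """Fetch a robots.txt file and report basic syntax problems."""
        with urllib.request.urlopen(url) as response:
            text = response.read().decode("utf-8", errors="replace")

        errors = []
        seen_user_agent = False
        for lineno, raw in enumerate(text.splitlines(), start=1):
            line = raw.split("#", 1)[0].strip()  # drop comments, whitespace
            if not line:
                continue  # blank lines separate record groups
            if ":" not in line:
                errors.append(f"line {lineno}: missing ':' separator: {raw!r}")
                continue
            field, _, value = line.partition(":")
            field = field.strip().lower()
            if field not in KNOWN_FIELDS:
                # Catches mistyped words such as "Disalow".
                errors.append(f"line {lineno}: unknown field {field!r} (typo?)")
            elif field == "user-agent":
                seen_user_agent = True
            elif field in ("disallow", "allow") and not seen_user_agent:
                # A "logic" error: rules must follow a User-agent line.
                errors.append(f"line {lineno}: {field} before any User-agent record")
        return errors

    if __name__ == "__main__":
        for problem in check_robots_txt(sys.argv[1]):
            print(problem)

Run against a live site (python check_robots.py http://www.yourdomain.com/robots.txt), a sketch like this would flag unknown fields such as a mistyped “Disalow” and rules that appear before any User-agent record.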

Source: New Robots.txt Syntax Checker: a validator for robots.txt files

Technorati tags: Robots, Web Development, Validation, Utility, Ruhani Rabin, RuhaniRabin.com

Some of the links in this post may be affiliate links. Read the FTC Disclaimer.
