Parsero is a free Python script that automatically audits the robots.txt file of a web server. In just a few seconds it gathers a good deal of the information you need when auditing a website: the paths an administrator has asked crawlers to stay away from, and whether those paths are actually reachable.
parsero [-h] [-u URL] [-o] [-sb] [-f FILE]
-h, --help  show this help message and exit
-u URL      Type the URL which will be analyzed
-o          Show only the "HTTP 200" status code
-sb         Search in Bing indexed Disallows
-f FILE     Scan a list of domains from a file
cyborg@cyborg:~$ parsero -u google.com -o

 ____
|  _ \ __ _ _ __ ___  ___ _ __ ___
| |_) / _` | '__/ __|/ _ \ '__/ _ \
|  __/ (_| | |  \__ \  __/ | | (_) |
|_|   \__,_|_|  |___/\___|_|  \___/

Starting Parsero v0.81 at 09/04/15 17:24:42
Parsero scan report for google.com
http://google.com/movies? 200 OK
http://google.com/ads/search? 200 OK

[+] 253 links have been analyzed and 2 of them are available!!!

Finished in 37.18 seconds.
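The core idea is simple enough to sketch. The short Python script below is an illustration of the technique, not Parsero's actual source; the function name and target URL are placeholders. It downloads robots.txt, extracts each Disallow path, and prints the entries that respond with HTTP 200, mirroring what the -o option reports:

import urllib.request
import urllib.error

def audit_robots(base_url):
    # Fetch the robots.txt file from the target web server.
    robots = urllib.request.urlopen(base_url + "/robots.txt").read().decode()
    for line in robots.splitlines():
        # Keep only the Disallow entries.
        if not line.lower().startswith("disallow:"):
            continue
        path = line.split(":", 1)[1].strip()
        if not path or "*" in path:
            continue  # skip empty rules and wildcard patterns for simplicity
        url = base_url + path
        try:
            status = urllib.request.urlopen(url).getcode()
        except urllib.error.HTTPError as err:
            status = err.code
        if status == 200:
            # The "hidden" path is publicly reachable: report it,
            # much as Parsero does when run with -o.
            print(url, "200 OK")

audit_robots("http://google.com")  # placeholder target

Anything this loop flags is a path the administrator asked crawlers to avoid but which is still publicly reachable, which is exactly the kind of finding you are looking for during an audit.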