[New Tool] Parsero v0.71 – Attacking Robots.txt Files Released

Parsero is a free script written in Python which reads the robots.txt file of a web server and looks at the Disallow entries. The Disallow entries are the URL paths of directories or files hosted on the web server that the administrators don't want to be indexed by crawlers.
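For example, a robots.txt file might contain Disallow entries like the ones below (the paths are purely illustrative):

User-agent: *
Disallow: /admin/
Disallow: /backups/
Disallow: /private.txt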

Sometimes the paths listed in the Disallow entries are directly accessible by users (without using a search engine) simply by visiting the URL and the path, and sometimes they are not accessible at all. Because administrators commonly write many Disallow entries, some of which are reachable and some of which are not, you can use Parsero to check the HTTP status code of each Disallow entry and automatically determine whether these directories are available. A minimal Python sketch of the idea follows.
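The sketch below is a simplified illustration of the technique only, not Parsero's actual code: fetch robots.txt, extract the Disallow paths, request each one, and print its HTTP status code. The target www.example.com is a placeholder.

import urllib.error
import urllib.request

target = "http://www.example.com"  # placeholder target, not from the original post

# Fetch robots.txt and collect the Disallow paths
robots = urllib.request.urlopen(target + "/robots.txt").read().decode("utf-8", "ignore")
paths = [line.split(":", 1)[1].strip()
         for line in robots.splitlines()
         if line.lower().startswith("disallow:")]

# Request each disallowed path and report its HTTP status code
for path in paths:
    if not path:
        continue  # a bare "Disallow:" means nothing is disallowed
    try:
        code = urllib.request.urlopen(target + path).getcode()  # redirects are followed by default
    except urllib.error.HTTPError as e:
        code = e.code  # 403, 404, etc. raise HTTPError
    print(code, target + path)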

 

HTTP Status Codes:

200 OK – The request has succeeded.
403 Forbidden – The server understood the request, but is refusing to fulfill it.
404 Not Found – The server hasn't found anything matching the Request-URI.
302 Found – The requested resource resides temporarily under a different URI.

 

Usage:

$ python3.3 parsero.py -h
usage: parsero.py [-h] [-u URL] [-o] [-sb]

optional arguments:
  -h, --help  show this help message and exit
  -u URL      Type the URL which will be analyzed
  -o          Show only the "HTTP 200" status code
  -sb         Search in Bing indexed Disallows
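For example, to analyze a site and show only the Disallow entries that return HTTP 200 (the domain below is just a placeholder):

$ python3.3 parsero.py -u www.example.com -o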

 

More Information: here

Download Parsero v0.71

Thanks to our friend Javier Nieto for sharing this tool with us. 😉

MaxiSoler

www.artssec.com @maxisoler