Description: Web penetration testing has benefited from sites publishing a ready-made list of sensitive areas they do not want crawled: robots.txt. The presenter pulled and analyzed the robots.txt files from numerous sites to determine the most common user-agents and disallowed locations. From the results, he derived a better list of directories to use with tools like DirBuster and for improved reconnaissance (a minimal sketch of this kind of analysis appears after the listing below). More details are available at http://bsidespgh.com/2015/speakers.php
Tags:
Disclaimer: We are an infosec video aggregator and this video is linked from an external website. The original author may differ from the user re-posting/linking it here. Please do not assume the authors are the same without verifying.
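As a rough illustration of the approach described in the talk (not the presenter's actual tooling), the following Python sketch fetches robots.txt from a list of sites and tallies the most common User-agent values and Disallow'd paths; the SITES list and the counts printed at the end are illustrative assumptions.

    #!/usr/bin/env python3
    # Minimal sketch: collect robots.txt files from a set of sites and tally
    # the most common User-agent values and Disallow'd paths across them.
    from collections import Counter
    from urllib.request import urlopen
    from urllib.error import URLError

    # Hypothetical sample sites; the talk drew on a much larger corpus.
    SITES = ["https://example.com", "https://example.org"]

    user_agents = Counter()
    disallowed_paths = Counter()

    for site in SITES:
        try:
            with urlopen(f"{site}/robots.txt", timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except (URLError, OSError):
            continue  # skip sites without a reachable robots.txt

        for line in body.splitlines():
            line = line.split("#", 1)[0].strip()  # drop comments
            if ":" not in line:
                continue
            field, _, value = line.partition(":")
            field, value = field.strip().lower(), value.strip()
            if field == "user-agent" and value:
                user_agents[value] += 1
            elif field == "disallow" and value:
                disallowed_paths[value] += 1

    # The top Disallow entries form a candidate wordlist for tools like DirBuster.
    print("Most common user-agents:", user_agents.most_common(10))
    print("Most common disallowed paths:", disallowed_paths.most_common(25))

The aggregated Disallow paths can be written out one per line and fed directly to DirBuster or similar directory brute-forcing tools as a custom wordlist.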