Description: Web penetration testing has benefited from sites publishing robots.txt, a ready-made list of sensitive areas they don't want crawled. I pulled and analyzed the robots.txt files from numerous sites to determine the most common user-agents and disallowed locations. From the results, I derived a better list of directories to use with tools like DirBuster and for better reconnaissance.
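The aggregation idea the talk describes can be sketched roughly as follows: parse each robots.txt body for User-agent and Disallow entries, then count paths across sites to rank candidate directories for a wordlist. This is a minimal illustration, not the speaker's actual tooling; the function name and sample text are invented for the example.

```python
# Sketch: extract user-agents and Disallow paths from a robots.txt body,
# then aggregate paths across sites to rank common sensitive directories.
# parse_robots and the sample content are illustrative assumptions.
from collections import Counter

def parse_robots(text):
    """Return (user_agents, disallowed_paths) found in one robots.txt body."""
    agents, paths = [], []
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            agents.append(value)
        elif field == "disallow" and value:
            paths.append(value)
    return agents, paths

# One sample file; in practice you would repeat this over many sites
# and feed every path into a single Counter.
sample = """
User-agent: *
Disallow: /admin/
Disallow: /backup/
User-agent: BadBot
Disallow: /
"""
agents, paths = parse_robots(sample)
counts = Counter(paths)
print(agents)
print(counts.most_common())
```

Sorting the aggregated `Counter` by frequency yields exactly the kind of prioritized directory list the talk proposes feeding to a brute-forcing tool.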
For More Information Please Visit: - http://bsidespgh.com/
Disclaimer: We are an infosec video aggregator, and this video is linked from an external website. The original author may be different from the user re-posting/linking it here. Please do not assume the authors are the same without verifying.