User-Directed Spidering

This is a more sophisticated and controlled technique, which is usually preferable to automated spidering. Here, the user walks through the application in the normal way using a standard browser, attempting to navigate through all of the application’s functionality. As he does so, the resulting traffic is passed through a tool combining an intercepting proxy and a spider, which monitors all requests and responses. The tool builds up a map of the application, incorporating all of the URLs visited by the browser. It also parses the application’s responses in the same way as a normal application-aware spider and updates the site map with any additional content and functionality it discovers. The spiders within Burp Suite and WebScarab can be used in this way.
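
The mechanics are easier to see with a short sketch. The following is a minimal addon for mitmproxy, used here as a convenient stand-in for the proxy/spider tools named above; the addon hooks are mitmproxy’s documented API, but the site-map structure and the output file name are illustrative assumptions. It records every URL the browser actually requests and also parses links out of HTML responses, so content that is linked but never clicked still appears in the resulting site map.

```python
# sitemap_addon.py -- run with:  mitmdump -s sitemap_addon.py
# Minimal user-directed spidering sketch: the user browses normally while the
# browser is proxied through mitmproxy; this addon records every URL actually
# requested and also parses links out of HTML responses, so content that is
# linked but never clicked still appears in the site map.
from html.parser import HTMLParser
from urllib.parse import urljoin

from mitmproxy import http


class LinkExtractor(HTMLParser):
    """Collects href/src/action attribute values from an HTML document."""

    def __init__(self) -> None:
        super().__init__()
        self.links: set[str] = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src", "action") and value:
                self.links.add(value)


class SiteMap:
    def __init__(self) -> None:
        self.visited: set[str] = set()      # URLs the user actually requested
        self.discovered: set[str] = set()   # URLs parsed out of responses

    def response(self, flow: http.HTTPFlow) -> None:
        url = flow.request.pretty_url
        self.visited.add(url)

        content_type = flow.response.headers.get("content-type", "")
        if "text/html" in content_type:
            parser = LinkExtractor()
            parser.feed(flow.response.get_text(strict=False) or "")
            for link in parser.links:
                self.discovered.add(urljoin(url, link))

    def done(self) -> None:
        # Dump everything seen when mitmdump shuts down.
        with open("sitemap.txt", "w") as f:
            for url in sorted(self.visited | self.discovered):
                marker = "VISITED " if url in self.visited else "LINKED  "
                f.write(f"{marker}{url}\n")


addons = [SiteMap()]
```

With the browser’s proxy settings pointed at mitmproxy, the user simply browses the application; sitemap.txt accumulates both the pages actually visited and the links discovered in their responses.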

Compared with the basic spidering approach, this technique carries numerous benefits:

■ Where the application uses unusual or complex mechanisms for navigation, the user can follow these using a browser in the normal way. Any functions and content accessed by the user will be processed by the proxy/spider tool.

■ The user controls all data submitted to the application and can ensure that data validation requirements are met.

■ The user can log in to the application in the usual way and ensure that the authenticated session remains active throughout the mapping process. If any action performed results in session termination, the user can log in again and continue browsing (a simple check for this is sketched after this list).

■ Any dangerous functionality, such as deleteUser.jsp, will be fully enumerated and incorporated into the site map, because links to it will be parsed out of the application’s responses. But the user can use his discretion in deciding which functions to actually request or carry out (the sketch after this list blocks requests to such functions as a safeguard).
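
The last two points can also be supported with a little automation. The sketch below is again a mitmproxy addon; the dangerous-path patterns, the login path, and the log messages are illustrative assumptions rather than anything application-specific. It refuses to forward requests whose paths match functions the tester has marked as dangerous, and prints a warning whenever a response redirects to the login page, which usually indicates that the authenticated session has been terminated and the user should log in again before continuing.

```python
# guard_addon.py -- run alongside the site-map addon:
#   mitmdump -s sitemap_addon.py -s guard_addon.py
# Dangerous functions are still discovered (links to them are parsed out of
# responses by the site-map addon), but requests to them are answered locally
# so the user cannot trigger them by accident; redirects to the login page
# are flagged as a likely sign of session termination.
import logging
import re

from mitmproxy import http

# Illustrative patterns for functions the tester has chosen not to invoke.
DANGEROUS_PATHS = [
    re.compile(r"/deleteUser\.jsp", re.IGNORECASE),
]

# Illustrative path of the application's login page.
LOGIN_PATH = "/login"


class MappingGuard:
    def request(self, flow: http.HTTPFlow) -> None:
        if any(p.search(flow.request.path) for p in DANGEROUS_PATHS):
            logging.warning("Blocked dangerous request: %s", flow.request.pretty_url)
            # Setting flow.response here answers the request locally instead
            # of forwarding it to the server.
            flow.response = http.Response.make(
                403, b"Blocked by mapping guard", {"Content-Type": "text/plain"}
            )

    def response(self, flow: http.HTTPFlow) -> None:
        # A redirect to the login page usually means the session was terminated.
        if flow.response.status_code in (301, 302, 303, 307, 308):
            location = flow.response.headers.get("location", "")
            if LOGIN_PATH in location:
                logging.warning(
                    "Possible session termination: %s redirected to %s",
                    flow.request.pretty_url, location,
                )


addons = [MappingGuard()]
```

Keeping the guard separate from the site-map addon means the dangerous functions still end up in the map via links parsed from other responses, while the user retains the final say over which of them ever get requested.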

Figure: IEWatch performing HTTP and HTML analysis from within the browser


Next: Discovering Hidden Content