A while back, I wrote an article with tips for evaluating a website when conducting online research.
While all of the information in the original article still holds true, I thought it was a good time to post another article with some additional investigative resources and techniques. The first article covered three tools to help obtain background information on a website:
- Whois Lookup
- Internet Archive's Wayback Machine
- Reverse Image Search
Below are three more tools and techniques for gaining further insight into a website.
Map a site with Visual Site Mapper
Visualizing the layout of a website can help identify useful sections of a complex site that may have otherwise been overlooked. Visual Site Mapper can quickly crawl and display a graphical layout of a website’s internal link structure.
Simply enter the URL of the website you want to map and press the “Get Map” button. Each node in the graph represents a page of the website, and hovering over a node highlights every other page that it links to.
Link websites with Google Analytics code
Google Analytics is a popular service that allows webmasters to track statistics such as the number of website visitors per day, visitor location, and time spent on the site. Webmasters are able to do this by including a piece of tracking code on the website. The tracking code includes a unique Google user account number that can link multiple unrelated sites to a common owner.
The tracking ID can be found in a webpage’s source code, which can be viewed by right-clicking on the page and selecting “View Page Source”. The ID starts with “UA” followed by a string of numbers, in the format UA-XXXXXXX-X.
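Rather than scanning the source by eye, the ID can be pulled out with a short regular expression. The sketch below matches the classic “UA-” Universal Analytics format described above; the sample HTML and the ID in it are made up for illustration.

```python
# Hedged sketch: extract Universal Analytics tracking IDs from saved
# page source. Note that newer GA4 properties use "G-" measurement IDs
# instead of the classic "UA-" format matched here.
import re

UA_PATTERN = re.compile(r"UA-\d{4,10}-\d{1,4}")

def find_tracking_ids(page_source):
    """Return all unique UA tracking IDs found in a page's HTML source."""
    return sorted(set(UA_PATTERN.findall(page_source)))

# Hypothetical snippet of page source containing a tracking call.
html = """
<script>
  ga('create', 'UA-12345678-1', 'auto');
</script>
"""
print(find_tracking_ids(html))  # ['UA-12345678-1']
```

Running this across the saved source of several suspect sites, then comparing the resulting IDs, is the manual version of what lookup services automate.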
Once you have obtained the ID, you can enter it into a lookup tool that aggregates website information, such as SpyOnWeb.com, to identify any other websites linked to that account number. It is also possible to skip the above step, and just search the website URL directly in SpyOnWeb to obtain the Google tracking ID for the site, and other website information.
A more in-depth write-up on using tracking codes to link websites can be found here. An interesting real-world case study using this technique as part of an investigation into news websites can also be found here.
Scan a website for malicious content
The web can be a dangerous place. VirusTotal provides an interface for scanning a website URL against more than 60 antivirus engines and URL/domain blocklisting services, along with other URL analysis features. A user can select a file from their computer to be scanned or enter a URL in the browser interface, and an API is also available to submit searches programmatically.
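As a sketch of the programmatic route, the snippet below constructs (but does not send) a URL-submission request against VirusTotal's public v3 API, which accepts a POST to `/api/v3/urls` with an `x-apikey` header. The API key is a placeholder; consult VirusTotal's API documentation for current endpoints, quotas, and response handling before relying on this.

```python
# Illustrative sketch: build a VirusTotal v3 URL-scan request using only
# the standard library. The request is constructed but not sent, and the
# API key is a placeholder you must replace with your own.
from urllib.parse import urlencode
from urllib.request import Request

API_KEY = "YOUR_VT_API_KEY"  # placeholder: substitute your real key

def build_scan_request(target_url):
    """Build (but do not send) a POST request submitting a URL for analysis."""
    data = urlencode({"url": target_url}).encode()
    return Request(
        "https://www.virustotal.com/api/v3/urls",
        data=data,
        headers={"x-apikey": API_KEY},
        method="POST",
    )

req = build_scan_request("http://example.com/")
print(req.get_method(), req.full_url)
```

Passing the built request to `urllib.request.urlopen` would submit the URL for analysis; the JSON response includes an analysis ID that can then be polled for the scan results.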