Firefox plugin for spider emulation & checking for malware on websites
There was a thread back in 2008 where someone was considering writing a Firefox plugin that would emulate a search engine spider. I could have used that for my hobby blog, but I also have a professional interest: I do incident response for a university.
I have a slightly different application for a web browser plugin. One of our websites was compromised and began hosting malware. The site's developers asked me to scan it, but I only have a vulnerability scanner. They reviewed the pages stored in the MySQL database, but what they really want is a scan of the whole site for malware. I know that the Finjan (now M86) Secure Browsing and Dr.Web Firefox plugins perform some sort of page analysis before displaying a page. I thought that by combining one of them with a website-spidering plugin, I could get a picture of the overall health of the site. If this is too convoluted and there is an easier way, I will confess to being no web expert and plead ignorance.
Not sure what you're looking for here re. "emulating a spider."
A crawler/robot/spider simply issues requests for resources, just like your browser does. The difference is that your browser also renders those resources for viewing, whereas a crawler does not.
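To make that concrete, here is a minimal Python sketch (the URL and user-agent string are hypothetical): a spider's request is just an ordinary HTTP GET, and the "emulation" amounts to sending it and never rendering the response.

```python
from urllib.request import Request, urlopen

# A spider's request is the same plain HTTP GET a browser sends; the only
# difference is that nothing renders the response body afterwards.
req = Request("https://example.com/", headers={"User-Agent": "my-spider/1.0"})
# body = urlopen(req).read()  # uncomment to actually fetch the resource
print(req.get_method(), req.full_url)
```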
As for the site believed to be compromised, does the developer not have copies of the clean code, so as to simply do file comparisons?
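If a clean copy does exist, the comparison itself is trivial. A minimal sketch using Python's standard filecmp module (the directory names are hypothetical): altered files show up as modified, and injected files such as dropped web shells show up as added.

```python
import filecmp

def report_changes(cmp: filecmp.dircmp) -> list[str]:
    """List files that differ from the clean copy or exist only on the live site."""
    findings = ["modified: " + name for name in cmp.diff_files]
    findings += ["added: " + name for name in cmp.right_only]
    for sub in cmp.subdirs.values():  # recurse into common subdirectories
        findings += report_changes(sub)
    return findings

# Usage (hypothetical paths):
# print(report_changes(filecmp.dircmp("clean_site/", "live_site/")))
```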
As Deepsand suggests above, if you are just looking to mitigate the attack, uploading fresh backups to replace the compromised content, followed by internal virus scans and external vulnerability scans would be the best course. If you want to maintain a chain of evidence, though, you would need to first create a forensic duplicate of the server.
Usually, the best spider for finding malware on a site is Googlebot: it's slow, but it is exhaustive. Beyond that, you could probably hack together a scanner by driving wget (a command-line retriever that can crawl recursively, included in most if not all Linux distros) from Perl, or write a spider in Java to compare the site to your clean source code. In general, though, I think something like that would have to be a custom program, because no two sites are really alike.
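A custom spider along those lines can be quite small. Here is a sketch in Python (rather than Perl or Java) that walks same-host links breadth-first; the fetch function is injected, so the crawl logic stays independent of whether wget, urllib, or something else actually retrieves the pages.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkParser(HTMLParser):
    """Collect href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, limit=100):
    """Breadth-first crawl of pages on the same host as start_url.

    `fetch` is any callable returning the HTML for a URL (e.g. a thin
    wrapper around urllib.request.urlopen); injecting it also makes the
    sketch testable offline.
    """
    host = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen or urlparse(url).netloc != host:
            continue  # skip already-visited pages and off-site links
        seen.add(url)
        parser = LinkParser()
        parser.feed(fetch(url))
        queue += [urljoin(url, link) for link in parser.links]
    return sorted(seen)
```

The URL list it returns could then be fed into the file-comparison step against the clean source tree.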
Typically, I do most of these checks manually when I review a site for malware, using a variety of plugins: among other things, changing my referrer to a Google search results URL and changing my user agent to Googlebot. That is because some attacks are hidden from the webmaster by serving the malware only to visitors who arrive at the site from Google, while others show the altered content only to Googlebot.
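Those two checks can also be scripted. A sketch with Python's urllib (the URL and search query are hypothetical): request the same page once as an ordinary visitor arriving from Google and once as Googlebot, then diff the two responses against a plain fetch to spot cloaked content.

```python
from urllib.request import Request, urlopen

url = "http://example.edu/index.php"  # hypothetical page under review

# A visitor who appears to arrive from a Google search results page.
as_visitor = Request(url, headers={
    "User-Agent": "Mozilla/5.0",
    "Referer": "https://www.google.com/search?q=example",
})

# Googlebot's published user-agent string.
as_bot = Request(url, headers={
    "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                  "+http://www.google.com/bot.html)",
})

# pages = [urlopen(r).read() for r in (as_visitor, as_bot)]  # then diff them
print(as_visitor.get_header("Referer"))
```

Any content that appears in only one of the responses is a strong hint that the site is cloaking something from the webmaster.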
grateful4godgrace, there are several FF extensions to check websites for vulnerabilities.
SQL Inject Me
There's also an Inline Code Finder extension for Firebug.
Last edited by Bernd; 08-16-2010 at 06:00 AM.