
Anti-Virus Testing and Consumer Reports

Consumer Reports recently came under heavy fire from some in the anti-virus industry for creating some 5,500 new virus variants to see how well a dozen leading products fared in detecting the new nasties. More than 100 security experts and executives from companies like Microsoft and HP as well as anti-virus vendors F-Secure, Kaspersky, McAfee, Sophos, Symantec and Trend Micro signed their names to a declaration denouncing Consumer Reports' methods, stating that it is "not necessary and ... not useful to write computer viruses to learn how to protect against them."

Some of the signatories noted -- via various media reports about the scandal -- that with so many viruses already in circulation today (estimates vary from 100,000 to 180,000) it was hardly necessary for Consumer Reports to gin up new ones that could, in theory, be leaked into the wild.

Today, however, I read a rather thoughtful article written by Juergen Schmidt, an editor with the German technology magazine Heise Security. Schmidt picks apart what he sees as the source of the industry's angst on this. He argues that testing anti-virus products against known viruses is a non-starter because the real battle against malicious worms and viruses these days is against previously unknown threats, of which he says about 250 emerge each day.

From the article: "The commandment 'Thou shalt not create new viruses' is a sensible self-imposed commitment by the manufacturers of anti-virus software, which prevents them from creating an atmosphere of threat to promote their products. In contrast, meaningful comparative testing of anti-virus software requires that testers work with self-generated virus variants. Anyone condemning such tests in general is certainly not doing so in the interests of the user."

Schmidt says that in light of the poor job most anti-virus programs do at spotting new threats (without the benefit of code snippets), it is clearly necessary to test anti-virus software using previously unseen malware.

"Known viruses no longer represent any great danger for users with anti-virus software -- pretty much every product will recognize them reliably. The real danger lies with the estimated 250 new malware programs that are released every day. And recognizing these as a threat is where many anti-virus products still fail miserably."

As I have noted here before, many malware authors are increasingly outpacing the security vendors by automagically updating the genetic makeup of their creations before anti-virus companies have time to ship updates. As a result, we have an industry whose business is predicated on 10 percent to 20 percent of its customers being successfully attacked before it can even begin to respond, according to some estimates.
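A toy sketch of why exact signatures lose this race: a one-byte change to a sample defeats a detector that matches on the file's hash, even though the behavior is unchanged. The sample bytes and the hash-based "signature" scheme here are invented for illustration only; real engines and real repacking kits are far more elaborate.

```python
import hashlib

def signature(sample: bytes) -> str:
    # A naive "signature": a hash of the sample's exact bytes.
    return hashlib.sha256(sample).hexdigest()

# Hypothetical sample and the signature a vendor ships for it.
known_sample = b"MALWARE-PAYLOAD-v1"
known_signatures = {signature(known_sample)}

# The author mutates the sample trivially (here: one appended junk byte);
# real kits repack or re-encrypt, but the effect on an exact match is the same.
mutated_sample = known_sample + b"\x00"

print(signature(known_sample) in known_signatures)    # True  -> still detected
print(signature(mutated_sample) in known_signatures)  # False -> evades the old signature
```

The vendor must see the mutated sample before it can ship a new signature, which is exactly the response lag the estimates above describe.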

If you'd care to see a slick, Web-based method some criminals use to evade fresh anti-virus signatures, check out this story I wrote a few months back about a Russian hacking ring.

But don't take my word for it: Go ahead and submit any new nasty you receive in your e-mail inbox to VirusTotal, a free online service that scans files against more than two dozen of the top anti-virus applications. If the nasty is new enough -- i.e., released in the last four to eight hours -- my experience has been that maybe a quarter of those anti-virus products will flag it as malicious or suspicious. For a sobering look at the rate of detection failures, check out VirusTotal's graph here. In the past seven days, only 261 of the 24,190 infected files submitted to VirusTotal for scanning -- slightly more than 1 percent -- were detected as malicious by all of the anti-virus vendors.

In a blog post two weeks ago I included the results of a VirusTotal scan of a worm that was exploiting a freshly patched flaw in Microsoft Windows. More than 12 hours after the thing had surfaced, roughly half of the anti-virus products failed to detect the sample as malicious.

Yes, the anti-virus industry has in large part gotten better with its "heuristic" detection methods, which can spot brand-new viruses by linking them to previously identified variants. But this technology has a long, long way to go, in my opinion. Anti-virus testing is hard and easy to mess up, and maybe virus-writing for the sake of testing anti-virus products isn't the best way to determine the most agile products (I certainly don't agree with some of Consumer Reports' rankings). But it seems to me that we need a better way to advance the debate about improving these products' performance.
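The heuristic idea -- flag a new file because it resembles a known family rather than matching it exactly -- can be sketched with a toy byte n-gram similarity score. The samples, features and threshold below are invented for illustration; real engines use far richer static and behavioral analysis.

```python
def ngrams(data: bytes, n: int = 4) -> set:
    # The set of all length-n byte substrings of the sample.
    return {data[i:i + n] for i in range(len(data) - n + 1)}

def similarity(a: bytes, b: bytes) -> float:
    # Jaccard similarity of the two samples' n-gram sets (0.0 to 1.0).
    ga, gb = ngrams(a), ngrams(b)
    return len(ga & gb) / len(ga | gb) if ga | gb else 0.0

# Hypothetical "feature strings" standing in for real binaries.
known     = b"decrypt-loop;drop-file;registry-run-key;smtp-spam-engine"
variant   = b"decrypt-loop;drop-file;registry-run-key;irc-backdoor"
unrelated = b"hello world, an ordinary text file"

THRESHOLD = 0.5  # arbitrary cutoff chosen for this sketch

print(similarity(known, variant) > THRESHOLD)    # True  -> variant flagged as suspicious
print(similarity(known, unrelated) > THRESHOLD)  # False -> clean file passes
```

The hard part, of course, is picking features and thresholds that catch genuinely new variants without drowning users in false positives -- which is roughly where the commenters below say the industry still struggles.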

The most innovative idea I've seen so far came in a presentation from Paul Vixie and David Dagon at the DefCon hacker conference in Las Vegas this year. Vixie and Dagon proposed creating a massive malware repository to which all of the anti-virus vendors would automatically submit new samples. It remains to be seen whether the industry considers this a worthwhile endeavor -- and if so, whether it can set aside its notions of competitive advantage to invest any energy or resources in the idea.
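A minimal sketch of what such a shared repository might look like, assuming a simple content-hash-deduplicated store. The class and its interface are hypothetical illustrations of the concept, not Vixie and Dagon's actual proposal.

```python
import hashlib

class MalwareRepository:
    """Toy shared sample store: vendors submit samples, duplicates are
    collapsed by content hash, and any vendor can pull samples it lacks."""

    def __init__(self):
        self.samples = {}  # sha256 hex digest -> (first submitter, sample bytes)

    def submit(self, vendor: str, sample: bytes) -> bool:
        digest = hashlib.sha256(sample).hexdigest()
        if digest in self.samples:
            return False  # already known -- deduplicated
        self.samples[digest] = (vendor, sample)
        return True

    def pull_new(self, known_digests: set) -> list:
        # Return every stored sample whose digest the caller hasn't seen.
        return [s for d, (_, s) in self.samples.items() if d not in known_digests]

repo = MalwareRepository()
repo.submit("vendor-a", b"sample-1")
repo.submit("vendor-b", b"sample-1")  # duplicate, silently collapsed
repo.submit("vendor-b", b"sample-2")
print(len(repo.samples))  # 2 unique samples
```

Even in this toy form, the design question the industry faces is visible: every vendor's detection improves only if every vendor actually submits, which is where competitive advantage gets in the way.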

By Brian Krebs  |  August 29, 2006; 7:11 PM ET
Categories:  From the Bunker  


Excellent post.

Posted by: jcanto | August 30, 2006 1:58 AM | Report abuse

That last suggestion just sounds so darn open source. I like it.

Posted by: openSource | August 30, 2006 2:35 AM | Report abuse

it is not necessary to create new viruses in order to test how well anti-virus products handle new viruses... as has already been pointed out retrospective testing does this and if you look at the results of a retrospective test you'll find they already do a good job of showing the failings of anti-virus products...

one thing jurgen schmidt doesn't realize in his criticism of retrospective testing (that the virus writers may modify their viruses in ways that affect detection by one scanner but not another) is that the same holds true for the viruses created in a lab for consumer reports' test...

additionally, because retrospective testing uses viruses from the real world it more accurately represents how anti-virus products perform against new viruses in the real world...

Posted by: kurt wismer | August 30, 2006 2:53 AM | Report abuse

As a Test Engineer for well over 30 years, I have some ethical as well as technical issues here. Testing is about ethical and accurate measurements. In order to get accurate measurements, you have to have standards better than what you are testing. I somehow doubt that virus samples developed in a few weeks' time could even come close to measuring the abilities of the software. The idea of taking 3-month-old software and definitions and putting it against current malware that actually is in the open and spreading would seem to be a much better evaluation of the capabilities of the software. And using existing software, the software vendors can use this to actually accomplish improvements in their software, as opposed to something that is contrived and that I seriously doubt would follow actual trends that exist in the real world.
That, and creating active viruses just does not sit right, on a very ethical level.

Posted by: PennWritre aka Ed | August 30, 2006 8:03 AM | Report abuse


AV-Comparatives, an independent Austrian antivirus test lab, recently analyzed the reliability and effectiveness of the proactive protection module in Kaspersky® Anti-Virus 6.0. Tests were conducted using an in-house collection of 6,329 samples, including 3,175 backdoors, 2,720 other Trojans, 294 email worms, 124 script viruses and 16 other viruses. During testing, only the proactive defense module of Kaspersky Anti-Virus was enabled.

The recent results from AV-Comparatives showed that Kaspersky Anti-Virus 6.0 provides exceptionally effective proactive protection - having detected 99% of the AV-Comparatives collection.

Posted by: Valdis | August 30, 2006 9:28 AM | Report abuse

PenWritre: does exactly that.

Posted by: anonymouse | August 30, 2006 10:18 AM | Report abuse

Hear Hear. I'm glad to see some people in the media picking up the opposing view. The frenzied and frankly unprofessional ("I wanted to hit my head against a brick wall", ranted one spokesman) outcry from the AV vendors smacked of damage-control to me. Editors at the SANS institute (a security training outfit) were skeptical of the AV vendors' protests from the start.

That being said, CR is not as strong in computer software testing as they are in other fields. A lot of this has to do with the constraints of the magazine, where only a couple of pages can be devoted to their reviews, so detail about their methodology can only be obtained by writing them and asking for it. The AV vendors seem to have made no such effort before launching their offensive.

At first glance, the 5000 new variant test seems comparatively ad hoc when contrasted with the work of industry-supported (so-called "independent") AV testing firms. Of course, CR's conclusions were similar to the av-comparatives results cited above -- so perhaps the testing wasn't so bad after all.

Posted by: Travis Finucane | August 30, 2006 11:08 AM | Report abuse

@travis finucane
"The frenzied and frankly unprofessional ("I wanted to hit my head against a brick wall", ranted one spokesman) outcry from the AV vendors smacked of damage-control to me. Editors at the SANS institute (a security training outfit) were skeptical of the AV vendors' protests from the start."

if, as you observed, the retrospective tests by independent test organizations give results similar to the consumer reports test, then the outcry over the consumer reports test cannot be about damage control concerning the results, as they should then be doing the same thing for the retrospective tests...

therefore the outcry is not over the results of the test but the way the test was performed...

Posted by: kurt wismer | August 30, 2006 12:45 PM | Report abuse

The AV vendors are really concerned about the size of Consumer Reports' subscriber base and the number of TV/radio/print media outlets who regularly draw stories from CR. Bad publicity from CR is MUCH more threatening to consumer sales than critiques from security outfits like SANS or av-comparatives. It's all about money and marketing.

The AV vendors' complaints are similar to those voiced by Suzuki a decade ago, when their Samurai jeep was put through real-world flip-over tests.

Posted by: Ken L | August 30, 2006 1:06 PM | Report abuse

>I certainly don't agree with some of Consumer Reports' rankings<

Brian Krebs, what are your personal antivirus rankings/recommendations?

Posted by: John Johnson | August 30, 2006 1:28 PM | Report abuse

My personal favorite is NOD32 from Eset. I use it on two of my machines and find it to be fast, light on system resources, and very quick to update w/ new definitions. But that one wasn't even considered in the CR report.

Posted by: Bk | August 30, 2006 2:12 PM | Report abuse

I concur with Brian, ESET's NOD32 is total wiz-bang. I wonder if CR could be asked to run the same test with ESET's product line. I would love to see the results compared next to the other clowns, errr...Ummmm AV providers.


Posted by: DOUGman | August 30, 2006 2:46 PM | Report abuse

it should be interesting to note that although nod32 wasn't tested, the folks at eset still denounced consumer reports for their testing methodology...

once again, the damage control theory doesn't hold water...

Posted by: kurt wismer | August 30, 2006 2:48 PM | Report abuse

The weakest link in the virus-antivirus wars is and will continue to be the user who leaves his/her machine poorly protected, either through laziness or technical sloppiness. Whether product A is better than product B by some percent is irrelevant to this problem, which will continue to allow spammers to make use of all the machines they need to spew out their product.

Posted by: jsi | August 30, 2006 3:08 PM | Report abuse

I never see reviews of the anti-virus software produced by the Norwegian company Norman. It was recommended to me by a long-time local computer shop owner b/c it uses a "sandbox" approach to dealing with potential attacks. I have been using it for a few months now and to date am satisfied. Have others here had good or bad experiences with it? Details on the sandbox approach are here:

Posted by: tdb | August 30, 2006 4:44 PM | Report abuse

go to and you'll see tests that include norman's product...

Posted by: kurt wismer | August 30, 2006 6:12 PM | Report abuse

I was a long-time NOD32 user, but recently I switched to Kaspersky after getting hit several times with some new viruses that NOD32 failed to detect but Kaspersky had no problem detecting.
NOD32 started recognizing those files a day or two later, after I sent them the files several times. And generally Kaspersky releases definition updates much faster and more often than NOD32.

Posted by: I | August 30, 2006 6:33 PM | Report abuse

I tried Kaspersky's trial -- on two different computers -- and had the same result: it killed my networking, and for some reason I could no longer get online. It wasn't the firewall, and I shut off everything else I could think of. I notified Kaspersky about it and they said they'd heard of similar issues with their product.

Posted by: Bk | August 30, 2006 6:42 PM | Report abuse

I use an Apple computer, and no virus can touch the Apple operating system, just Microsoft.

Posted by: William D. Tomany | August 30, 2006 6:53 PM | Report abuse

Well, along with the last part of the article, if we had a HUGE repository for all of the thus-found virii so that all the different AV programs out there could be as well armed as possible, then why not? If all of the Con. Rep. virii made it out into the wild, then YES, the ethical arguments against making them in the first place would be retrospectively agreed with. But until then, if all of the fresh new virii were documented and submitted, then 5,500 new virii would be accounted for. That's a good hunk of mal-code that can now be found, and now protected against. 200-plus new pieces of malware and viruses being put out a day? Well, then isn't that like a month's worth of code that we can learn from BEFORE the same thing is designed "in the wild"?

just my .02$

Posted by: naMretupmoC | August 30, 2006 9:36 PM | Report abuse


the ethical problem with consumer reports creating viruses isn't just that they might get out... that's certainly part of the problem, but that's dependent on some accident occurring...

another part of the problem, and one that happens regardless of consumer reports' viruses escaping or not, is that it gives credence to the arguments put forward by those who have no business handling viruses (because they're too careless or irresponsible or whatever) that they have a legitimate reason to create viruses...

thus, even if consumer reports' viruses don't escape they're still contributing to the problem...

Posted by: kurt wismer | August 30, 2006 11:33 PM | Report abuse

I'm not worried about CR having their viruses escape. They have a lot to lose if they are careless and nothing to gain. The AV companies, in reality, are in the totally opposite position.

Posted by: Bill Robins | August 30, 2006 11:52 PM | Report abuse

An additional argument against what CR did is that there is no real proof the variants they created are even legitimate threats. Looking at the mass quantity of samples they generated, I do not fathom how they could have validated that each new sample maintained malicious behavior. Should the AV companies be expected to detect harmless snippets of code? Did CR actually execute each of their samples to see if non-signature-based behavior analysis would detect or prevent malicious behavior? Did they prove that each sample they created actually performs maliciously? The retrospective testing performed by the AV companies actually tests against malicious software.

I simply fail to see how CR's testing can truly be conclusive. In their testing (if not all the samples are threats), the better performing products might actually be those more likely to have false-positives.

Posted by: Bob Lecht | August 31, 2006 1:50 AM | Report abuse

Fair play, altho i've fallen in love with that new virus that came out, the one with the ### polymorphic rootkit ###.

Now that is cool.

Posted by: Anonymous | August 31, 2006 2:29 AM | Report abuse

The Consumer Reports web site has come up with a smart idea to get credit card information using their own scandalous material. I subscribed to for a monthly fee of US$ 4.95 to be able to see their antivirus "tests". There is no other possibility to get it -- of course, except a yearly subscription. Do you know what message I got when I tried to cancel my subscription in order to prevent them from charging my credit card the next month? "Sorry, an error occurred while validating your account information ...".

This organization's domain name rather should be; by the way, they don't have any idea about antivirus tests at all.

As far as I know, makes similar tests without creating any new viruses. It's enough to "freeze" an antivirus for a couple of months and afterwards test it on the real viruses which have appeared during this particular period of time. Did Consumer Reports, after the tests, send the samples they created to the experts of any antivirus companies? If not, then Consumer Reports' actions border on criminal activities.

Posted by: Consumer | August 31, 2006 4:36 AM | Report abuse

Any suggestions for anti-virus software for macs?

Posted by: Virii | September 1, 2006 1:27 AM | Report abuse

Don't look for Mac advice from CR; they've bought into the "Macs are just as vulnerable as PCs" fantasy hook, line and sinker. Sad, really, since CR prides itself on giving "objective advice" -- advice that is always absent in any paragraph involving Macs.

Posted by: Judge C. Crater | September 5, 2006 5:04 PM | Report abuse

As for testing with unbiased, quality results, unlike VirusBulletin and others who cater to their advertisers, which are AV companies! Besides the fact that other supposed "good" test companies don't have enough samples for a decent test to begin with, and their standards of setup, and whether they use the same maximum settings available in each tested anti-virus product, are questionable at best from some of these characters. That is why I personally recommend, and know from my own independent long-term tests, that these sites are the MOST accurate and unbiased, and the test-setup factors mentioned are key to them getting at the truth with real scientific methods which provide real info -- info which unfortunately is kept from most end-users by the crafty PR work and ads of the bigger AV corporate giants. Sad indeed. Here are my choices:

Personally I think teaming up Kaspersky and NOD32 makes for a nearly unbeatable combo, with a few other tools to go along with them to fill the gaps, all compatible together without sabotaging one's system or causing major conflict, as long as only one is monitoring and the other is on-demand for manual scans.

As for Consumer Reports, my hat's off to them for stepping up and having the brass balls, when all these other self-righteous hypocrites do, or learn to do, something very similar if not exactly the same in private or in a lab. This is like saying it is illegal to trade viruses, although it is not; AVers just have a moral problem with it if anyone but THEM does it, since they think there can be no independent researchers -- only them, with certs and credentials and their company logo to protect them like a big shield saying "we can do it, you can't, end of story." Do they have special morals training that the rest of the intelligent population lacks, or do they go to some ethics class that we wouldn't understand? Please give me a break. This is hypocritical: you know you do it and I know you do it, so why try to paint others as idiots if they have the background knowledge to do it in a safe environment under controls and safeguards? After all, you guys apparently need all the help you can get, but please, if I hear the "it's against my ethics" argument one more time I may puke and die from it. The truth hurts, and the fact is, before safe labs and even after, I am quite sure some AV companies have slipped up themselves and released test samples over the net, or into storage later to be transferred accidentally off-site. So please, let us all be honest and not preach; you are researchers and techs, not priests or members of the Church of Scientology. =p Just tone down the ethics speech -- I know it by heart, since I hear you all recite it like it were the "Pledge of Allegiance." If they did things under strict lab conditions, then it is no different from a lab of yours with hot samples, except that you didn't make them (or at least we can assume so). What is the difference -- do they turn to the dark side when making their own samples for testing?!
Guess what: I have done it too, and I am not ashamed or afraid to say it, and without releasing anything "into the wild" onto some poor unsuspecting soul, so I did nothing either morally or legally wrong in my country, and I don't care what your moral gut feeling is. Moral gut feelings simply get in the way of the logical scientific work here and are simply opinions, yours and mine, and meaningless as such. Remember the old saying: opinions are like a--holes, everyone has one. I think you get my point. To the researchers and the excellent AV companies and teams in them: keep up the good work. :) I hope others take a lesson from Consumer Reports' idea, at least as a more progressive approach to testing antivirus engines, signatures and heuristics, to maybe stay a bit ahead of the game rather than lagging behind with the "old school" approach. I look forward to the innovations in the industry that this may yet bring, for those willing to try something a bit too taboo for the rest to accept. Of course, others could be doing it and telling no one. Cough....|for internal use only|....cough....excuse me, clearing my throat. ;-)


Posted by: Azothoz | September 9, 2006 6:53 PM | Report abuse

The comments to this entry are closed.


© 2010 The Washington Post Company