Your antivirus tool has many different opportunities to protect your PC from attack. It can block access to a malicious URL entirely, wipe out the downloaded code on sight, recognize and prevent malicious behavior, or even roll back the system changes made by a malware attack. It's easy to create a lab test that checks just one of these layers, for example, the ability to block malicious URLs or recognize malware by signature. Such a test is informative, but doesn't give the whole picture. AV-Comparatives runs a continual Real-World Protection test that lets each antivirus use all of its weapons against live malware. The latest summary of this test's results reveals a broad range of effectiveness.

Dynamic Testing

The full report goes into great detail about the exact test methodology. Briefly, researchers install 20 or more antivirus products on identical PCs. Every day they gather the latest malicious URLs and test whether each product protects the system. If the antivirus asks the user whether to block or allow any action, they always choose allow. An antivirus that successfully fends off compromise even when the user makes the wrong choice still gets full credit. If making the wrong choice leads to compromise, it gets half credit.

During May and June, the company's researchers ran over 4,000 such tests. They also checked for false positives—legitimate URLs or programs wrongly identified as malicious by the antivirus software. A product with more than the average number of false positives can lose points.
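The scoring rules described above can be sketched in a few lines of code. This is purely illustrative: the full, half, and zero credit values follow the article, but the aggregation formula and the false-positive handling AV-Comparatives actually uses may differ, so treat this as a simplified model rather than the lab's real methodology.

```python
def case_credit(blocked: bool, needed_user_choice: bool) -> float:
    """Credit for one malicious-URL test case, per the rules above."""
    if blocked:
        return 1.0   # protected, even if the tester chose "allow"
    if needed_user_choice:
        return 0.5   # wrong user choice led to compromise: half credit
    return 0.0       # compromised outright: no credit

def protection_rate(cases) -> float:
    """cases: list of (blocked, needed_user_choice) tuples.
    Returns the percentage of total possible credit earned."""
    return 100.0 * sum(case_credit(b, u) for b, u in cases) / len(cases)

# Example: three blocks, one user-dependent compromise, one outright compromise
sample = [(True, False), (True, True), (True, False),
          (False, True), (False, False)]
print(protection_rate(sample))  # 70.0
```

A product's final rating would then also factor in false positives, which (as the article notes) can pull an otherwise high-scoring product down a rank.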

Winners

Nine products successfully protected against 99 percent or more of the samples. Eight of them, including Bitdefender and Kaspersky, earned the top rating, Advanced+. Due to false positives, F-Secure got knocked down one rank to Advanced.

Avast and Baidu made impressive comebacks. In the previous summary, both failed to achieve even the Standard rating. This time around Baidu rated Standard and Avast rose to Advanced.

Losers

AV-Comparatives ran this test under Windows 7, and included the optional Microsoft Security Essentials as a baseline. Had it been given an official rating, it wouldn't have reached the Standard level. Along with AhnLab and ThreatTrack VIPRE, Microsoft rated merely Tested.

McAfee, Trend Micro, and eScan all earned Advanced+ in the previous report. This time around they would have rated Advanced based solely on detection rate, but false positives dragged all three down to a Standard rating.

Also-Rans

Not every security vendor chooses to participate in testing by AV-Comparatives. G DATA's people don't approve of the way the testing system rates protection that depends on user interaction, so they've opted out. Symantec has long contended the file detection test performed by AV-Comparatives is irrelevant. Since this test is included in the all-or-nothing package of tests, Symantec hasn't participated for years.

This time around, AV-Comparatives roped in Symantec and G DATA for testing, for informational purposes. The real-world test is exactly the kind of test Symantec believes should be universal, as it exercises the whole product. Symantec would have earned Advanced+ in this test, with a very high detection rate and few false positives. G DATA would have managed an Advanced rating.

Testing the actual effectiveness of antivirus products is a tough job. Doing it right requires dedication and creativity. It's not surprising that this particular test has won a number of awards from European governments and organizations. The testing experts at AV-Comparatives are doing a great job.
