
File Detection Test August 2009

Detection of malicious software, including a false alarm test and an on-demand scanning speed test

Release date: 2009-09-20
Revision date: 2009-09-19
Test period: August 2009
Number of test cases: 1,562,092
Online with cloud connectivity: no
Update allowed: yes
False alarm test included: yes
Platform/OS: Microsoft Windows
Language: English


The File Detection Test is one of the most deterministic factors in evaluating the effectiveness of an anti-virus engine. These test reports are released twice a year and include a false alarm test. For further details, please refer to the methodology documents as well as the information provided on our website. In this test, the following 16 up-to-date security products were tested using 1,562,092 prevalent malware samples.

Tested Products

  • avast
  • AVG
  • AVIRA
  • BitDefender
  • eScan
  • ESET
  • F-Secure
  • G DATA
  • Kaspersky
  • Kingsoft
  • McAfee
  • Microsoft
  • Norman
  • Sophos
  • Symantec
  • TrustPort

Test Procedure

Each test system runs Microsoft Windows XP SP3 with the respective security product installed, last updated on the 10th of August 2009. The malware sets were also frozen on the 10th of August 2009. All products had Internet access during the test and were tested using default settings. To ensure that all file-recognition capabilities were used, we enabled the scan of all files, the scan of archives and the scan for PUA in all products.

On each test system the malware set is scanned. The detections made by the security product are noted and analysed. Although no samples were executed during this test, we considered cases where malware would be recognized on-access, but not on-demand. The test is thus called File Detection Test (as opposed to the earlier On-Demand Tests), as on-access scanning is taken into consideration.


The test-set is split into two parts. The percentages below refer to SET B, which contains only malware from the last seven months (nearly 1.6 million samples). SET A, which contains malware from December 2007 to December 2008, is usually covered very well (>97%) by all the tested products.

Ranking System

Detection rate clusters/groups (given by the testers after consulting statistical methods):

  • 97 - 100%
  • 93 - 97%
  • 87 - 93%

False alarm groups:

  • Few (0-15 FPs)
  • Many (over 15 FPs)

Test Results

The test-set used contained 1,562,092 recent/prevalent samples from the last few weeks/months. We estimate the remaining error margin on the final percentages to be below 0.2%.
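As a rough sanity check on that figure, the sketch below computes the binomial standard error of a detection rate measured over this many samples; the stated margin is presumably more conservative, allowing for systematic effects beyond pure sampling noise. The detection rate used is illustrative.

```python
# Rough sanity check of the stated <0.2% error margin, assuming a simple
# binomial model in which each sample is independently detected or missed.
import math

n = 1_562_092   # number of test cases in the set
p = 0.95        # an illustrative detection rate

std_err = math.sqrt(p * (1 - p) / n)   # standard error of the measured proportion
ci_half_width = 1.96 * std_err         # 95% confidence half-width

print(f"standard error:    {std_err:.4%}")        # ~0.0174%
print(f"95% CI half-width: {ci_half_width:.4%}")  # ~0.0342%
```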

Total detection rates (clustered in groups)

Please also consider the false alarm rates when looking at the file detection rates below.

1. G DATA: 99.8%

Graph of missed samples (lower is better)

The results of our on-demand tests are usually applicable also for the on-access scanner (if configured the same way), but not for on-execution protection technologies (like HIPS, behaviour blockers, etc.).

A good detection rate is still one of the most important, deterministic and reliable features of an anti-virus product. Additionally, most products provide at least some kind of HIPS, behaviour-based or other functionality to block (or at least warn about the possibility of) malicious actions, e.g. during the execution of malware, when all other on-access and on-demand detection/protection mechanisms have failed.

Please do not miss the second part of this report (to be published in a few months), containing the retrospective test, which evaluates how well products detect new/unknown malware. Further test reports (cleaning test, performance test, PUA detection test, dynamic test, etc.) covering other aspects of the various products will be released on our website soon.

Although we provide various tests covering different aspects of anti-virus software, users are advised to evaluate the software themselves and form their own opinion of it. Test data and reviews only provide guidance on aspects that users cannot evaluate themselves. We suggest and encourage readers to also research independent test results from other well-known and established testing organizations, in order to get a better overview of the detection and protection capabilities of the various products across different test scenarios and test-sets.

Scanning Speed Test

Anti-Virus products have different scanning speeds for various reasons. It has to be taken into account how reliable the detection rate of an Anti-Virus product is: whether it uses code emulation, whether it is able to detect difficult polymorphic viruses, whether it performs deep heuristic scan analysis and active rootkit scans, how deep and thorough its unpacking and unarchiving support is, whether it runs additional security scans, and so on.

Most products have technologies to decrease scan times on subsequent scans by skipping previously scanned files. As we want to measure the scan speed when files are really scanned for malware, and not the speed of skipping files, those technologies are not taken into account here. In our opinion, some products should inform users more clearly about performance-optimized scans and then let them decide whether they prefer a short performance-optimized scan (which does not re-check all files, with the potential risk of overlooking infected files) or a full-security scan.
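As an illustration of the skipping behaviour described above, here is a minimal sketch of a fingerprint cache that lets a scanner skip files it has already scanned and that have not changed since. The names and structure are invented for illustration, not any vendor's actual implementation.

```python
import os

# path -> (modification time, size) fingerprint recorded at the last scan
scan_cache = {}

def needs_scan(path):
    """Return True if the file is new or has changed since it was last scanned."""
    stat = os.stat(path)
    fingerprint = (stat.st_mtime, stat.st_size)
    if scan_cache.get(path) == fingerprint:
        return False  # unchanged: a performance-optimized scan would skip it
    scan_cache[path] = fingerprint
    return True       # new or modified: it must be (re-)scanned
```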

The following graph shows the throughput rate in MB/sec (higher is faster) of the various Anti-Virus products when scanning our whole set of clean files (the set used for the false-alarm testing) on-demand with highest settings. The scanning throughput rate will vary based on the set of clean files, the settings and the hardware used.

The average scanning throughput rate (scanning speed) is calculated as the size of the clean-set in MB divided by the time needed to finish the scan in seconds. The scanning throughput rate of this test cannot be compared with future tests or with other tests, as it varies with the set of files, the hardware used, etc.
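In code form, the calculation from the paragraph above looks like this; the figures are invented for illustration.

```python
# Throughput = size of the clean-set in MB / scan duration in seconds.
clean_set_size_mb = 20_000     # illustrative: a ~20 GB clean-file set
scan_duration_sec = 2_500      # illustrative: scan finished in ~42 minutes

throughput = clean_set_size_mb / scan_duration_sec
print(f"scanning throughput: {throughput:.1f} MB/sec")  # 8.0 MB/sec
```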

The scanning speed tests were done under Windows XP SP3, on identical machines with an Intel Core 2 Duo E8300 (2.83 GHz) CPU, 2 GB RAM and SATA II disks.

False Positive (False Alarm) Test Result

In order to better evaluate the quality of the detection capabilities of anti-virus products, we also provide a false alarm test. False alarms can sometimes cause as much trouble as a real infection. Please consider the false alarm rate when looking at the detection rates, as a product that is prone to causing false alarms achieves higher detection scores more easily.

1. Bitdefender, eScan, F-Secure: 4 FPs (few FPs)
2. Avast, Microsoft: 5 FPs
3. AVG, Kaspersky: 8 FPs
7. Avira: 21 FPs (many FPs)
10. Norman, TrustPort: 42 FPs

McAfee without in-the-cloud had 26 false alarms.

Details about the discovered false alarms (including their assumed prevalence) can be seen in a separate report available at:

Summary Result

A product that is successful at detecting a high percentage of malicious files but suffers from false alarms may not necessarily be better than a product which detects fewer malicious files but generates fewer false alarms.

The following chart shows the combined file detection rates and false alarms.

Award levels reached in this File Detection Test

AV-Comparatives provides a 3-level ranking system (STANDARD, ADVANCED and ADVANCED+). As this report also contains the raw detection rates, and not only the awards, users who do not care about false alarms can rely on the detection-rate score alone if they want to.

The awards are not based on detection rates alone; false positives found in our set of clean files are also considered. A product that is successful at detecting a high percentage of malware but suffers from false alarms may not necessarily be better than a product which detects fewer malware samples but generates fewer FPs.

* these products got lower awards due to false alarms
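To make the award logic concrete, the following sketch maps a detection rate and a false-alarm count to an award level, using the clusters from the Ranking System section. The exact cluster-to-award mapping and the one-level demotion for many false alarms are assumptions for illustration, not AV-Comparatives' published formula.

```python
# Illustrative only: the cluster-to-award mapping and the one-level demotion
# for "many FPs" are assumptions, not AV-Comparatives' published formula.
AWARDS = ["STANDARD", "ADVANCED", "ADVANCED+"]

def award(detection_rate, false_positives):
    if detection_rate >= 97:      # 97 - 100% cluster
        level = 2
    elif detection_rate >= 93:    # 93 - 97% cluster
        level = 1
    elif detection_rate >= 87:    # 87 - 93% cluster
        level = 0
    else:
        return "TESTED"           # below all listed clusters (assumption)
    if false_positives > 15:      # "Many (over 15 FPs)"
        level = max(level - 1, 0) # assumed demotion by one award level
    return AWARDS[level]

print(award(99.8, 8))    # -> ADVANCED+
print(award(99.0, 42))   # -> ADVANCED (lower award due to false alarms)
```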


As almost all products nowadays run in real life with highest protection settings by default, or switch automatically to highest settings in case of a detected infection, we tested all products with highest settings (except Sophos and F-Secure).

Below are some notes about the settings used (the scan of all files etc. is always enabled) and some technologies that need to be explained:

  • AVG, BitDefender, eScan, ESET, Kingsoft, Microsoft, Norman: run with highest settings by default.
  • avast: switches (in case of an infection) automatically to highest settings by default.
  • G DATA: runs (depending on the hardware) with highest settings by default.
  • AVIRA, Kaspersky, Symantec, TrustPort: asked to be tested with all extended categories enabled and with heuristics set to high/advanced. We therefore recommend that users also consider setting the heuristics to high/advanced, as these products do not use their highest settings by default.
  • F-Secure, Sophos: asked to be tested and awarded based on their default settings (without deep heuristics / suspicious detections). We therefore suggest that users consider not setting the heuristics to high (except in the case of an existing infection).
  • McAfee: uses an in-the-cloud technology (Artemis / Active Protection) which is enabled by default and works while an Internet connection is available. Artemis was tested at the same time as the other products were updated, so it did not have any time advantage over them. For informational purposes, we also noted the results without the in-the-cloud technology (offline).

McAfee VirusScan Plus 2009 and above come with the "in-the-cloud" Artemis technology turned on by default. For corporate users or home users using older McAfee products without "Active Protection", as well as all other users, it may be important to know what McAfee's baseline minimum detection rate would be should the Internet connection not be available. McAfee's detection rate without an Internet connection was 92.6%.

Copyright and Disclaimer

This publication is Copyright © 2009 by AV-Comparatives ®. Any use of the results, etc. in whole or in part, is ONLY permitted after the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but no representative of AV-Comparatives can accept liability for the correctness of the test results. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering the test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

(September 2009)