
Retrospective / Proactive Test May 2010


Heuristic and behavioural protection against new/unknown malicious software


Language: English
Release date: 2010-06-06
Revision date: 2010-06-05
Test Period: February 2010
Number of Testcases: 27,271
Online with cloud connectivity: no
Update allowed: no
False Alarm Test included: yes
Platform/OS: Microsoft Windows

Tested Products

Test Procedure

Anti-Virus products often claim to have high proactive detection capabilities – far higher than those reached in this test. This is not just a self-promotional statement; products may well reach the stated percentages, but this depends on the duration of the test period, the size of the sample set and the samples used. The data shows how good the proactive detection capabilities of the scanners were in detecting new threats. Users should not be alarmed if products achieve low percentages in a retrospective test. If the anti-virus software is always kept up-to-date, it will be able to detect more samples. To see what the detection rates of the Anti-Virus products look like with updated signatures and programs, have a look at our regular on-demand detection tests. Only the on-demand detection capability was tested. Some products may have had the ability to detect some samples e.g. on execution or through other monitoring tools, such as behaviour blockers. Those kinds of additional protection technologies are considered by AV-Comparatives in e.g. dynamic tests.

This test report is the second part of the February 2010 test. The report is published at the beginning of June due to the large amount of work required, the deeper analysis and the preparation of the retrospective test-set. Many new viruses and other types of malware appear every day, which is why it is important that Anti-Virus products not only provide new updates as often and as fast as possible, but are also able to detect such threats in advance (even without executing them) with generic and/or heuristic techniques. Even though nowadays most Anti-Virus products provide daily, hourly or cloud updates, without heuristic/generic methods there is always a time-frame in which the user is not reliably protected.

The products used the same updates and signatures they had on the 10th February 2010, and the same highest detection settings were used as in the February test. This test therefore shows the proactive detection capabilities that the products had at that time. We used new malware that appeared between the 11th and 18th February 2010. The following 20 products were tested:

Testcases

We tried to include in the test-set only prevalent real-world malware that had not been seen before the 10th February 2010, by consulting telemetry / cloud data collected and shared within the AV industry. Consulting that data was quite interesting for us, as it showed that, while some vendors had already seen some malware many months or even years ago, the same malware hashes appeared in some other vendors' clouds only recently.

Ranking System

The awards are given by the testers after consulting a number of statistical methods, including hierarchical clustering. We based our decisions on the following scheme:

                    Proactive Protection Rates
                    Under 50%   Cluster 3   Cluster 2   Cluster 1
None - Few FPs      TESTED      STANDARD    ADVANCED    ADVANCED+
Many FPs            TESTED      TESTED      STANDARD    ADVANCED
Very many FPs       TESTED      TESTED      TESTED      STANDARD
Crazy many FPs      TESTED      TESTED      TESTED      TESTED
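
To make the scheme more concrete, the sketch below shows how protection rates could be grouped by hierarchical clustering and how an award could then be looked up in the matrix above. It is a minimal illustration only: the report does not name the exact statistical tools or parameters, so the use of Python/SciPy, the Ward linkage, the number of clusters and all function names (rate_clusters, award, AWARDS) are our assumptions, not AV-Comparatives' actual tooling.

```python
# Minimal, hypothetical sketch of the ranking logic described above.
# Assumes Python with NumPy/SciPy; the linkage method and cluster count are guesses.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Award matrix from the scheme above.
# Column index: 0 = "Under 50%", 1 = Cluster 3, 2 = Cluster 2, 3 = Cluster 1.
AWARDS = {
    "none-few":   ["TESTED", "STANDARD", "ADVANCED", "ADVANCED+"],
    "many":       ["TESTED", "TESTED",   "STANDARD", "ADVANCED"],
    "very many":  ["TESTED", "TESTED",   "TESTED",   "STANDARD"],
    "crazy many": ["TESTED", "TESTED",   "TESTED",   "TESTED"],
}

def rate_clusters(rates, n_clusters=3):
    """Group 1-D protection rates (in percent) into n_clusters groups using
    agglomerative clustering (Ward linkage -- an assumed choice)."""
    z = linkage(np.asarray(rates, dtype=float).reshape(-1, 1), method="ward")
    return fcluster(z, t=n_clusters, criterion="maxclust")  # labels 1..n_clusters

def award(column, fp_level):
    """Look up the award for a protection-rate column (0..3, see AWARDS)
    and a false-alarm level (a key of AWARDS)."""
    return AWARDS[fp_level][column]

# Example: a product in the best protection cluster but with many false
# alarms is downgraded from ADVANCED+ to ADVANCED.
print(award(3, "many"))  # -> ADVANCED
```

The example at the end mirrors the note further below that some products received lower awards due to false alarms.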

Test Results

The results show the proactive (generic/heuristic) on-demand detection capabilities of the scan engines against new malware. This test is performed on-demand; it is NOT an on-execution/behavioural test. The percentages are rounded to the nearest whole number. Do not take the results as an absolute assessment of quality – they just give an idea of who detected more, and who less, in this specific test. To see how these anti-virus products perform with updated signatures, please have a look at our on-demand tests of February and August. Readers should look at the results and form an opinion based on their own needs. All the tested products were already selected from a group of very good scanners, and if used correctly and kept up-to-date, users can feel safe with any of them.

False Positive (False Alarm) Test Result

To better evaluate the quality of the detection capabilities, the false alarm rate has to be taken into account too. A false alarm (or false positive) occurs when an Anti-Virus product flags an innocent file as infected when it is not. False alarms can sometimes cause as much trouble as a real infection.

The false-alarm test results were already included in the February test report. For details, please read the report, False Alarm Test February 2010.

    Product                          Number of FPs
 1. eScan                              1   very few FPs (0-3)
 2. F-Secure                           2
 3. Bitdefender, ESET, Microsoft       3
 4. Sophos                             4   few FPs (4-15)
 5. G DATA, Kaspersky                  5
 6. PC Tools                           8
 7. Trustport                          9
 8. AVG                               10
 9. Avast, Avira, Symantec            11
10. Trend Micro                       38   many FPs (over 15)
11. Panda                             47
12. McAfee                            61
13. Norman                            64
14. Kingsoft                          67
15. K7                               193
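
The category boundaries used in this list (0-3, 4-15, over 15 false alarms) can be written as a small helper; the function name and wording are ours, for illustration only:

```python
def fp_category(false_positives: int) -> str:
    """Map a false-alarm count to the categories used in the list above."""
    if false_positives <= 3:
        return "very few FPs (0-3)"
    if false_positives <= 15:
        return "few FPs (4-15)"
    return "many FPs (over 15)"

# Examples from the list: eScan (1) -> very few, Sophos (4) -> few, K7 (193) -> many.
print(fp_category(1), fp_category(4), fp_category(193))
```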

Summary Result

The results show the proactive protection capabilities of the various products against new malware. The percentages are rounded to the nearest whole number.

Below you can see the proactive protection results over our set of new and prevalent malware files/families that appeared in the field (27,271 malware samples):

Product                  Blocked   Compromised   Proactive Protection Rate   False Alarms   Cluster
TrustPort                 17181       10090                63%                 few              1
G DATA                    16635       10636                61%                 few              1
Microsoft                 16090       11181                59%                 very few         1
Kaspersky                 16090       11181                59%                 few              1
AVIRA                     14454       12817                53%                 few              1
ESET NOD32, F-Secure      14181       13090                52%                 very few         1
BitDefender, eScan        13636       13636                50%                 very few         1

Panda                     17181       10090                63%                 many             2
K7                        13636       13636                50%                 many             2
Symantec                  11727       15544                43%                 few              2
AVG                        9272       17999                34%                 few              2
Sophos                     8727       18544                32%                 few              2
Avast                      7909       19362                29%                 few              2

McAfee                    10363       16908                38%                 many             3
Norman                     7363       19908                27%                 many             3
Trend Micro                7090       20181                26%                 many             3
PC Tools                   4636       22635                17%                 few              3

Kingsoft                   3000       24271                11%                 many
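
For clarity, the percentage column is simply the share of the 27,271 test samples that a product blocked, rounded to the nearest whole number. A small illustrative calculation (the function name is ours, not from the report):

```python
def protection_rate(blocked: int, compromised: int) -> int:
    """Proactive protection rate in percent, rounded to the nearest whole number."""
    total = blocked + compromised  # 27,271 samples in this test
    return round(100 * blocked / total)

# Example from the table: TrustPort blocked 17,181 and missed 10,090 samples.
print(protection_rate(17181, 10090))  # -> 63
```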

Award levels reached in this Heuristic / Behavioural Test

The following awards are for the results reached in the proactive/behavioural test, considering not only the protection rates against new malware, but also the false alarm rates:

* these products got lower awards due to false alarms

Notes

To avoid some frequently asked questions, below are some notes about the settings used for some products (scanning of all files etc. is always enabled); in the cases below, the highest settings were not used at the vendors' request:

  • F-Secure, Sophos: asked to be tested and awarded based on their default settings (i.e. without using their advanced heuristics / suspicious detection settings).
  • AVG, AVIRA: asked us not to enable/consider the informational warnings for packers as detections.

Copyright and Disclaimer

This publication is Copyright © 2010 by AV-Comparatives ®. Any use of the results, etc. in whole or in part, is ONLY permitted after the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss, which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but a liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(June 2010)