
False Alarm Test March 2015


Appendix to the Anti-Virus Comparative March 2015


Release date: 2015-05-01
Revision date: 2015-04-30
Test period: March 2015
Online with cloud connectivity: yes
Update allowed: yes
False Alarm Test included: no
Platform/OS: Microsoft Windows

Introduction

This report is an appendix to the File Detection Test March 2015, listing details about the false alarms discovered in that test.

With AV testing it is important to measure not only detection capabilities but also reliability. One aspect of reliability is the ability to recognize clean files as such and not to produce false alarms (false positives). No product is immune to false positives (FPs), but some produce more than others, and our goal is to find out which programs do best in this respect. There is no complete collection of all legitimate files in existence, so no "ultimate" false-positive test can be done. What can be done, and is reasonable, is to create and use an independently collected set of clean files. If, with such a set, one product has e.g. 30 FPs and another only 5, it is likely that the first product is more prone to FPs than the other. This does not mean that the product with 5 FPs has no more than 5 FPs globally; it is the relative number that matters.

Tested Products

Test Procedure

In order to give users more information about the false alarms, we try to rate their prevalence. Files which are digitally signed are considered more important. A file with e.g. prevalence "level 1" and a valid digital signature is therefore upgraded to the next level (e.g. prevalence "level 2"). Files which according to several telemetry sources had zero prevalence were provided to the vendors so that they can fix them, but were removed from the set and not counted as false alarms.
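For illustration, these two rules can be sketched in Python as follows (a minimal sketch, not the actual test tooling; the data layout and the function names are made up for the example):

MAX_LEVEL = 5  # prevalence levels range from 1 (lowest) to 5 (highest), see the table below

def adjusted_prevalence(level, has_valid_signature):
    """Digitally signed files are considered more important: a valid
    signature upgrades the file to the next prevalence level."""
    if has_valid_signature:
        return min(level + 1, MAX_LEVEL)
    return level

def filter_test_set(candidate_files):
    """Files with zero prevalence (according to telemetry) are reported to
    the vendors but removed from the set and not counted as false alarms."""
    kept = []
    for f in candidate_files:
        if f["prevalence_level"] == 0:   # zero prevalence: report to vendor, exclude from set
            continue
        f["prevalence_level"] = adjusted_prevalence(
            f["prevalence_level"], f["valid_signature"])
        kept.append(f)
    return kept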

The prevalence is given in five categories, indicated in the report by colour codes:

Level 1: Probably fewer than a hundred users. Individual cases, old or rarely used files, very low prevalence.
Level 2: Probably several hundreds of users. Initial distribution of such files was probably much higher, but current usage on actual systems is lower (despite their presence), which is why even well-known software may now have a prevalence of only some hundreds or thousands of users.
Level 3: Probably several thousands of users.
Level 4: Probably several tens of thousands (or more) of users.
Level 5: Probably several hundreds of thousands or millions of users. Such cases are likely to be seen much less frequently in a false alarm test done at a specific time, as such files are usually either whitelisted or would be noticed and fixed very quickly.
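Read as a rough mapping from the presumed number of affected users to a level, the categories above could be expressed as follows (the numeric boundaries are only an illustrative reading of the descriptions, not official cut-offs):

def prevalence_level(presumed_users):
    """Map a presumed number of affected users to prevalence levels 1-5.
    The boundaries are an illustrative reading of the table, not official cut-offs."""
    if presumed_users < 100:        # fewer than a hundred users
        return 1
    if presumed_users < 1_000:      # several hundreds of users
        return 2
    if presumed_users < 10_000:     # several thousands of users
        return 3
    if presumed_users < 100_000:    # several tens of thousands (or more) of users
        return 4
    return 5                        # hundreds of thousands or millions of users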

Most false alarms will probably fall into the first two levels most of the time. In our opinion, anti-virus products should not produce false alarms on any sort of clean file, regardless of how many users are currently affected. While some AV vendors may play down the risk of false alarms and play up the risk of malware, we are not going to rate products based on the supposed prevalence of their false alarms. We already allow a certain number of false alarms (currently 10) inside our clean set before we start penalizing scores, and in our opinion products which produce a higher number of false alarms are also more likely to produce false alarms on more prevalent files (or in other sets of clean files). The prevalence data we give about clean files is for informational purposes only. The listed prevalence can differ within the report, depending on which file/version the false alarm occurred on, and/or how many files of the same kind were affected.
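The allowance mentioned above can be sketched as follows (a minimal sketch assuming that only false alarms beyond the allowance of 10 count against a product; how the excess actually affects the final rating is defined in the main File Detection Test report, not here):

FP_ALLOWANCE = 10  # number of false alarms tolerated in the clean set before penalization

def penalized_fp_count(total_fps, allowance=FP_ALLOWANCE):
    """Return the number of false alarms exceeding the allowance.
    Illustrative only; the actual award calculation is in the main report."""
    return max(0, total_fps - allowance)

# Example: a product with 94 false alarms (as in the results below) would have
# 84 false alarms counted beyond the allowance.
penalized_fp_count(94)  # -> 84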

Testcases

All listed false alarms were encountered at the time of testing. False alarms caused by unencrypted data blocks in anti-virus-related files were not counted. If a product had several false alarms belonging to the same software, they are counted here as only one false alarm. Cracks, keygens and other highly questionable tools, as well as FPs distributed/shared primarily by vendors (which may number in the several thousands) or by other non-independent sources, are not counted here as false positives.
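The rule that several false alarms belonging to the same software are counted as one can be sketched as follows (the data layout is hypothetical; raw detections are simply grouped by the software they belong to):

from collections import defaultdict

def count_false_alarms(detections):
    """Count false alarms, with several detections on the same software counted once.
    Each detection is a (software_name, file_path) pair; the format is hypothetical."""
    by_software = defaultdict(list)
    for software, file_path in detections:
        by_software[software].append(file_path)
    return len(by_software)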

Test Results

Rank  Product(s)                               False alarms
 1.   ESET, Trend Micro                                  1   very few FPs
 2.   Panda                                              3
 3.   Fortinet                                           6   few FPs
 4.   Sophos, Tencent                                    8
 5.   Bitdefender, Kaspersky                             9
 6.   McAfee                                            10
 7.   AVG                                               14
 8.   Emsisoft                                          16   many FPs
 9.   BullGuard, eScan, F-Secure, Lavasoft              19
10.   Quick Heal                                        28
11.   Avira                                             44
12.   ThreatTrack                                       50
13.   Avast                                             77
14.   Baidu                                             94   very many FPs

Copyright and Disclaimer

This publication is Copyright © 2015 by AV-Comparatives®. Any use of the results, etc. in whole or in part, is ONLY permitted after the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(May 2015)