
False Alarm Test September 2016

Date: September 2016
Language: English
Last revision: October 10th 2016

Appendix to the Anti-Virus Comparatives September 2016


Release date: 2016-10-15
Revision date: 2016-10-10
Test period: September 2016
Online with cloud connectivity: yes
Update allowed: yes
False Alarm Test included: no
Platform/OS: Microsoft Windows

Introduction

This report is an appendix to the File Detection Test September 2016 and lists details of the discovered false alarms.

In AV testing, it is important to measure not only detection capabilities but also reliability. One aspect of reliability is the ability to recognize clean files as such and not produce false alarms (false positives). No product is immune to false positives (FPs), but some produce more than others, and our goal is to find out which programs do best in this respect. There is no complete collection of all legitimate files in existence, so no “ultimate” FP test can be done. What can reasonably be done is to create and use an independently collected set of clean files. If, with such a set, one product has e.g. 30 FPs and another only 5, the first product is likely more prone to FPs than the other. This does not mean that the product with 5 FPs has only 5 FPs globally; it is the relative number that matters.

Tested Products

Test Procedure

In order to give users more information about the false alarms, we try to rate their prevalence. Files which were digitally signed are considered more important. For this reason, a file with e.g. prevalence “level 1” and a valid digital signature is upgraded to the next level (e.g. prevalence “level 2”). Files which, according to several telemetry sources, had zero prevalence were provided to the vendors so that they could fix them, but were removed from the set and not counted as false alarms.

The prevalence is given in five categories:

Level | Presumed number of affected users | Comments
1 | Probably fewer than a hundred users | Individual cases, old or rarely used files, very low prevalence
2 | Probably several hundreds of users | Initial distribution of such files was probably much higher, but current usage on actual systems is lower (despite their presence), which is why even well-known software may now affect only some hundreds or thousands of users
3 | Probably several thousands of users |
4 | Probably several tens of thousands (or more) of users |
5 | Probably several hundreds of thousands or millions of users | Such cases are likely to be seen much less frequently in a false alarm test done at a specific time, as such files are usually either whitelisted or noticed and fixed very quickly
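The prevalence handling described above (levels 1–5, with digitally signed files upgraded one level, and zero-prevalence files removed from the set) can be sketched as follows. This is an illustrative assumption of the logic only; the function name, the cap at level 5, and the use of `None` for excluded files are choices made for this sketch, not AV-Comparatives' actual tooling:

```python
MAX_LEVEL = 5  # highest prevalence category in the report

def effective_prevalence(level, digitally_signed):
    """Return the prevalence level reported for a false-alarmed file.

    Digitally signed files are considered more important, so they are
    upgraded one level (capped at the maximum level). A level of 0
    (zero prevalence in telemetry) means the file is removed from the
    test set and not counted as a false alarm.
    """
    if level == 0:
        return None  # excluded from the set, reported to the vendor only
    if digitally_signed:
        return min(level + 1, MAX_LEVEL)
    return level
```

Under this sketch, a signed file rated level 1 would be reported at level 2, while an unsigned file keeps its telemetry-based level unchanged.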

Most false alarms will probably fall into the first two levels most of the time. In our opinion, anti-virus products should not produce false alarms on any sort of clean file, regardless of how many users are currently affected. While some AV vendors may play down the risk of false alarms and play up the risk of malware, we do not rate products based on the supposed prevalence of their false alarms. We already allow a certain number of false alarms (currently 10) inside our clean set before we start penalizing scores, and in our opinion products which produce a higher number of false alarms are also more likely to produce false alarms on more prevalent files (or in other sets of clean files). The prevalence data we give about clean files is for informational purposes only. The listed prevalence can differ within the report, depending on which file/version the false alarm occurred on, and/or how many files of the same kind were affected.
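The scoring allowance mentioned above (up to 10 false alarms in the clean set tolerated before scores are penalized) could be expressed as a minimal sketch. The function name and the idea of counting only the excess FPs are assumptions for illustration; the actual scoring formula is not given in this appendix:

```python
FP_ALLOWANCE = 10  # false alarms tolerated before penalization (per the report)

def excess_false_alarms(fp_count):
    """Number of false alarms beyond the tolerated allowance.

    Only this excess would count against a product's score under
    this illustrative model.
    """
    return max(0, fp_count - FP_ALLOWANCE)
```

With this sketch, a product with 6 FPs (e.g. F-Secure in this test) incurs no penalty, while one with 19 FPs exceeds the allowance by 9.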

Testcases

All listed false alarms were encountered at the time of testing. False alarms caused by unencrypted data blocks in anti-virus-related files were not counted. If a product had several false alarms belonging to the same software, they are counted here as only one false alarm. Cracks, keygens and other highly questionable tools, as well as FPs distributed/shared primarily by vendors (which may number in the several thousands) or other non-independent sources, are not counted here as false positives.

Test Results

Some products using third-party engines/signatures may have fewer or more false alarms than the licensed engine produces on its own, e.g. due to different internal settings, additional checks/engines/clouds/signatures, whitelist databases, the time delay between the release of the original signatures and their availability for third-party products, additional quality assurance of signatures before release, etc.

False positives (FPs) are an important measure of AV quality. A single FP report from a customer can result in a large amount of engineering and support work to resolve the issue. Sometimes it can even lead to important data loss or system unavailability. Even “insignificant” FPs (or FPs on old applications) deserve mention and attention, because they are likely to result from the product's detection rules; it merely happened that the FP fell on an insignificant file. The same detection logic is probably still in the product and could cause an FP on a more significant file. Thus, such FPs still deserve mention and still deserve to be penalised. Below you will find the false alarms we observed in our independent set of clean files. (In the original report, false alarms on digitally signed files are highlighted in red.)

Rank | Product(s) | FPs | Remark
1. | ESET, Fortinet, Trend Micro | 0 | very few FPs
2. | Bitdefender, Lavasoft | 2 | few FPs
3. | Avira, BullGuard, eScan, Kaspersky, Sophos, ThreatTrack | 3 |
4. | Tencent | 4 |
5. | Emsisoft, McAfee | 5 |
6. | F-Secure, Quick Heal | 6 |
7. | Microsoft | 12 | many FPs
8. | AVG | 19 |
9. | Avast | 28 |

Details about the discovered false alarms

ESET, Fortinet and Trend Micro had zero false alarms on the used set of clean files.

Bitdefender: 2 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
Herold package | Gen:Variant.Symmi.67713 | Level 1
Sony package | Gen:Variant.Razy.30991 | Level 4

Lavasoft: 2 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
Herold package | Gen:Variant.Symmi.67713 | Level 1
Sony package | Gen:Variant.Razy.30991 | Level 4

Avira: 3 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
AutoIt package | TR/SelfDel.ec1900 | Level 2
Igel package | HEUR/APC | Level 1
Vuex package | TR/Agent.89584.12 | Level 4

BullGuard: 3 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
Herold package | Gen:Variant.Symmi.67713 | Level 1
Rapid package | Gen:Variant.Symmi.64277 | Level 1
Sony package | Gen:Variant.Razy.30991 | Level 4

eScan: 3 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
Herold package | Gen:Variant.Symmi.67713 (DB) | Level 1
Rapid package | Gen:Variant.Symmi.64277 (DB) | Level 1
Sony package | Gen:Variant.Razy.30991 (DB) | Level 4

Kaspersky Lab: 3 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
A1 package | Trojan.Win32.Llac.lbpa | Level 4
AutoIt package | Trojan.Win32.SelfDel.cfzt | Level 2
WinTuning package | UDS:DangerousObject.Multi.Generic | Level 1

Sophos: 3 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
FreeDM package | Mal/Generic-S | Level 1
PersonDJ package | Mal/Zbot-UM | Level 3
Profe package | Mal/Generic-S | Level 1

ThreatTrack: 3 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
Herold package | Gen:Variant.Symmi.67713 | Level 1
Rapid package | Gen:Variant.Symmi.64277 | Level 1
Sony package | Gen:Variant.Razy.30991 | Level 4

Tencent: 4 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
Crossfire package | Gen:Variant.Mikey.53043 | Level 5
Herold package | Gen:Variant.Symmi.67713 | Level 1
Rapid package | Gen:Variant.Symmi.64277 | Level 1
Sony package | Gen:Variant.Razy.30991 | Level 4

Emsisoft: 5 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
GxTrans package | Trojan.Generic.7464985 | Level 1
Herold package | Gen:Variant.Symmi.67713 | Level 1
Orange package | Trojan.Sinowal.Gen.1 | Level 1
Rapid package | Gen:Variant.Symmi.64277 | Level 1
Sony package | Gen:Variant.Razy.30991 | Level 4

McAfee: 5 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
BTRV package | RDN/Generic.com | Level 3
Pegasys package | Artemis!c5e21bed1b70 | Level 3
Settlers package | Artemis!32c50b75be89 | Level 3
TCHunt package | Artemis!5f94359c18d6 | Level 3
Vuex package | Artemis!94a9fa418324 | Level 4

F-Secure: 6 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
FinePrint package | Trojan:W32/Gen4135.1fc23018e8!Online | Level 2
GxTrans package | Trojan.Generic.7464985 | Level 1
Herold package | Gen:Variant.Symmi.67713 (DB) | Level 1
InstantPlayer package | Trojan:W32/BitCoinMiner.J | Level 1
Orange package | Trojan.Sinowal.Gen.1 | Level 1
Rapid package | Gen:Variant.Symmi.64277 (DB) | Level 1

Quick Heal: 6 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
Crossfire package | EE:Malwr.Heur.Mikey.53043 | Level 5
GXTrans package | EE:Malware.Generic.7464985 | Level 1
Herold package | EE:Malwr.Heur.Symmi.67713 | Level 1
Orange package | EE:Trojan.Sinowal.Gen.1 | Level 1
Rapid package | EE:Malwr.Heur.Symmi.64277 | Level 1
Sony package | EE:Malwr.Heur.Razy.30991 | Level 4

Microsoft: 12 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
Amok package | Trojan:Win32/Rundas!plock | Level 1
CDDVDBurner package | Trojan:Win32/Rundas!plock | Level 1
eZip package | Trojan:Win32/Rundas!plock | Level 2
MinScout package | Trojan:Win32/Dynamer!ac | Level 1
PEbuilder package | Trojan:Win32/Dynamer!ac | Level 1
SL package | Trojan:Win32/Dynamer!ac | Level 1
Snow package | Trojan:Win32/Rundas!plock | Level 1
Star package | Trojan:Win32/Dynamer!ac | Level 2
SUSD package | Trojan:Win32/Rundas!plock | Level 1
Wetterstation package | Trojan:Win32/Dynamer!ac | Level 1
WildTangent package | Trojan:Win32/Dorv.D!rfn | Level 4
xCAT package | Trojan:Win32/Dynamer!ac | Level 2

AVG: 19 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
ARCAD package | Win32/Herz.A | Level 1
Atomic package | Win32/DH{IyQl?} | Level 3
Brother package | Generic_s.HNM | Level 2
Casino package | Crypt_s.LAG | Level 1
Clipsave package | Generic_s.IGI | Level 1
CoffeeFTP package | Win32/DH{CA?} | Level 2
Delay package | Luhe.Fiha.A | Level 5
DigitaleBibliothek package | PSW.Banker7.OSM | Level 1
Divx package | BackDoor.Generic19.AIUS | Level 4
EOC package | Atros3.AWYT | Level 2
HP package | Crypt5.AWRU | Level 5
IBM package | Generic_s.HVM | Level 3
Kinstone package | Win32/Herz.B | Level 1
MyWinLocker package | Generic37.BELF | Level 5
Norton package | Generic_r.MFR | Level 3
Presto package | Generic_s.HNM | Level 4
Roboform package | Generic_s.ILT | Level 1
Sygate package | Win32/Herz.B | Level 2
WildTangent package | Generic_r.IGQ | Level 5

Avast: 28 False Alarms
False alarm found in some parts of | Detected as | Supposed prevalence
3COM package | FileRepMalware | Level 2
Acer package | FileRepMalware | Level 2
ActualWindowsManager package | Win32:Evo-gen [Susp] | Level 1
Adobe package | Win32:Evo-gen [Susp] | Level 4
Cluster package | Win32:Evo-gen [Susp] | Level 1
ColorEfex package | Win32:Evo-gen [Susp] | Level 1
DateInTray package | Win32:Evo-gen [Susp] | Level 1
DirectX package | Win32:Malware-gen | Level 2
EuroRoute package | Win32:Evo-gen [Susp] | Level 1
FLV package | Win32:Evo-gen [Susp] | Level 1
HP package | FileRepMalware | Level 5
ISO2USB package | FileRepMetagen [Malware] | Level 4
JBTray package | Win32:Evo-gen [Susp] | Level 1
LetsTrade package | Win32:Evo-gen [Susp] | Level 3
LiteStep package | FileRepMalware | Level 3
Logik package | Win32:Evo-gen [Susp] | Level 1
Matrox package | Win32:Evo-gen [Susp] | Level 3
Money package | Win32:Evo-gen [Susp] | Level 2
MP3pooler package | Win32:Evo-gen [Susp] | Level 1
MyHints package | Win32:Evo-gen [Susp] | Level 1
RibbonCreator package | Win32:Dropper-gen [Drp] | Level 2
SafetyBrowser package | Win32:Malware-gen | Level 3
StarOffice package | Win32:Evo-gen [Susp] | Level 1
TrendMicro package | Win32:Evo-gen [Susp] | Level 3
TurboSliders package | Win32:Evo-gen [Susp] | Level 1
VOO3OC package | Win32:Evo-gen [Susp] | Level 1
Vuex package | FileRepMetagen [Malware] | Level 4

Copyright and Disclaimer

This publication is Copyright © 2016 by AV-Comparatives ®. Any use of the results, etc. in whole or in part, is ONLY permitted after the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but no representative of AV-Comparatives can accept liability for the correctness of the test results. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering the test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(October 2016)