
False Alarm Test March 2021

Date: March 2021
Language: English
Last revision: April 12th, 2021

Appendix to the Malware Protection Test March 2021


Release date: 2021-04-15
Revision date: 2021-04-12
Test period: March 2021
Online with cloud connectivity: yes
Update allowed: yes
False Alarm Test included: yes
Platform/OS: Microsoft Windows

Introduction

This report is an appendix to the Malware Protection Test March 2021 listing details about the discovered False Alarms.

In AV testing, it is important to measure not only detection capabilities but also reliability. One aspect of reliability is the ability to recognize clean files as such and not to produce false alarms (false positives). No product is immune to false positives (FPs), but some produce more than others. False-positive tests measure which programs do best in this respect, i.e. which best distinguish clean files from malicious files. There is no complete collection of all legitimate files in existence, so no "ultimate" FP test can be done. What can reasonably be done is to create and use an independently collected set of clean files. If, when using such a set, one product has e.g. 15 FPs and another only 2, it is likely that the first product is more prone to FPs than the second. This does not mean that the product with 2 FPs has only 2 FPs globally; it is the relative number that matters.

Tested Products

Test Procedure

In order to give the user more information about the false alarms, we try to rate their prevalence. Files which are digitally signed are considered more important. For this reason, a file with the lowest prevalence level (Level 1) and a valid digital signature is upgraded to the next level (e.g. prevalence "Level 2"). Extinct files which, according to several telemetry sources, had zero prevalence were provided to the vendors so that they can fix them, but were removed from the set and not counted as false alarms.

The prevalence is given in five categories:

Level  Presumed number of affected users                             Comments
1      Probably fewer than a hundred users                           Individual cases, old or rarely used files, unknown prevalence
2      Probably several hundreds of users                            Initial distribution of such files was probably much higher, but current usage on actual systems is lower (despite their presence); this is why even well-known software may now have a prevalence of only some hundreds or thousands of users.
3      Probably several thousands of users
4      Probably several tens of thousands (or more) of users
5      Probably several hundreds of thousands or millions of users   Such cases are likely to be seen much less frequently in a false alarm test done at a specific time, as such files are usually either whitelisted or would be noticed and fixed very quickly.

Most false alarms will probably (hopefully) fall into the first two levels. In our opinion, anti-virus products should not produce false alarms on any sort of clean file, regardless of how many users are currently affected. While some AV vendors may play down the risk of false alarms and play up the risk of malware, we do not rate products based on the supposed prevalence of their false alarms. We already allow a certain number of false alarms (currently 10) within our clean set before we start penalizing scores, and in our opinion, products which produce a higher number of false alarms are also more likely to produce false alarms on more prevalent files (or in other sets of clean files). The prevalence data we give for clean files is for informational purposes only. The listed prevalence can differ within the report, depending on the file/version on which the false alarm occurred, and/or how many files of the same kind were affected.
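The prevalence-rating procedure described above (a signed file at the lowest level is bumped up one level; extinct files are excluded) can be sketched as follows. This is only an illustration of the stated rules; the function name and parameters are our own, not from the report:

```python
# Minimal sketch of the prevalence-rating rule described in the report:
# - files with zero telemetry prevalence ("extinct") are excluded entirely,
# - a digitally signed file at the lowest level (1) is upgraded to Level 2.

def rate_prevalence(telemetry_level, digitally_signed):
    """Return the reported prevalence level (1-5), or None if the file
    is excluded from the test set."""
    if telemetry_level == 0:
        # Extinct file: reported to the vendor, but removed from the set
        # and not counted as a false alarm.
        return None
    level = max(1, min(telemetry_level, 5))  # clamp to the five categories
    if level == 1 and digitally_signed:
        level = 2  # signed files are considered more important
    return level
```

For example, an unsigned file seen by almost no users stays at Level 1, while the same file with a valid digital signature would be reported as Level 2.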

Testcases

All listed false alarms were encountered at the time of testing. False alarms caused by unencrypted data blocks in anti-virus-related files were not counted. If a product had several false alarms belonging to the same application, it is counted here as only one false alarm. Cracks, keygens, and other highly questionable tools, as well as FPs distributed/shared primarily by vendors (which may number in the several thousands) or other non-independent sources, are not counted here as false positives.
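The counting rules above (several alarms on one application count once; questionable tools are excluded) can be sketched like this. The category labels and function name are hypothetical, chosen only to illustrate the stated rules:

```python
# Minimal sketch of the FP counting rules described in the report:
# alarms on the same application are deduplicated, and highly
# questionable tools (e.g. cracks, keygens) are not counted.

EXCLUDED_CATEGORIES = {"crack", "keygen"}  # hypothetical labels

def count_false_alarms(alarms):
    """alarms: iterable of (application, category) pairs."""
    apps = {app for app, category in alarms
            if category not in EXCLUDED_CATEGORIES}
    return len(apps)

alarms = [
    ("Delphi", "clean"),
    ("Delphi", "clean"),       # second alarm on the same application
    ("SomeKeygen", "keygen"),  # excluded category, not counted
]
# count_false_alarms(alarms) yields 1
```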

Test Results

There may be a variation in the number of false positives produced by two different programs that use the same engine (principal detection component). For example, Vendor A may license its detection engine to Vendor B, but Vendor A’s product may have more or fewer false positives than Vendor B’s product. This can be due to factors such as different internal settings being implemented, differences in other components and services such as additional or differing secondary engines/signatures/whitelist databases/cloud services/quality assurance, and possible time delay between the release of the original signatures and the availability of the signatures for third-party products.

False positives (FPs) are an important measure of AV quality. Furthermore, the test is useful and necessary to discourage vendors from optimizing products to score well in tests by looking at the context; this is why clean files are mixed in and tested in the same way as the malware samples. One FP report from a customer can result in a large amount of engineering and support work to resolve the issue. Sometimes this can even lead to significant data loss or system unavailability. Even "insignificant" FPs (or FPs on older applications) deserve mention and attention, because such FPs are likely the result of broad detection rules; it just happened that the FP occurred on an insignificant file. The same detection logic is probably still in the product and could cause an FP on a more significant file. Thus, such FPs still deserve mention and still deserve to be penalised. Below you will find some information about the false alarms we observed in our independent set of clean files. Red entries highlight false alarms on files that were digitally signed.

The detection names shown were taken mostly from pre-execution scan logs (where available). If a threat was blocked on/during/after execution (or no clear detection name was seen), we state “Blocked” in the column “Detected as”.
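The naming rule above can be sketched as a tiny helper (hypothetical, not from the report): use the pre-execution scan-log name when one is available, otherwise fall back to "Blocked".

```python
# Minimal sketch of how the "Detected as" column is filled: the
# pre-execution scan-log name is preferred; if the file was only
# blocked on/during/after execution, "Blocked" is shown instead.

def detected_as(scan_log_name=None):
    """Return the display name for the 'Detected as' column."""
    return scan_log_name if scan_log_name else "Blocked"
```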

Rank  Vendor(s)                         FPs   Remark
1.    ESET                              0     no / very few FPs
2.    Avast, AVG, Kaspersky, Total AV   1
3.    Avira, G DATA                     2     few FPs
4.    Trend Micro                       3
5.    Bitdefender, Microsoft, VIPRE     4
6.    McAfee                            6
7.    Total Defense                     9
8.    NortonLifeLock                    22    many FPs
9.    K7, Malwarebytes                  46    very many FPs
10.   Panda                             65

Details about the discovered false alarms

Avast – 1 False Alarm
False alarm found in some parts of Detected as Supposed prevalence
Delphi package Win32:Malware-gen

 

AVG – 1 False Alarm
False alarm found in some parts of Detected as Supposed prevalence
Delphi package Win32:Malware-gen

 

Kaspersky – 1 False Alarm
False alarm found in some parts of Detected as Supposed prevalence
CheckSig package HEUR:Trojan-Ransom.Win32.Gen.gen

 

Total AV – 1 False Alarm
False alarm found in some parts of Detected as Supposed prevalence
Swiftswing package HEUR/AGEN.1133632

 

Avira – 2 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Swiftswing package HEUR/AGEN.1133632
Unchecky package Blocked

 

G DATA – 2 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Delphi package Trojan.GenericKD.43956185
FolderLock package Win32.Trojan.VB.NC

 

Trend Micro – 3 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
ESET package Blocked
Safety package Blocked
Seamonkey package Blocked

 

Bitdefender – 4 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Delphi package Trojan.GenericKD.43956185
Notepad package Blocked
Stempel package Blocked
Tiscali package Blocked

 

Microsoft – 4 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Delphi package Trojan:Win32/Wacatac.B!ml
Elevate package Trojan:Win32/Occamy.C21
Language package Blocked
Seamonkey package Program:Win32/Wacapew.C!ml

 

VIPRE – 4 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Delphi package Trojan.GenericKD.43956185
Makro package Blocked
Stempel package Blocked
Tiscali package Blocked

 

McAfee – 6 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Anlagenverbinder package Blocked
ArchiCrypt package Blocked
Delphi package Blocked
GT4T package Blocked
Safety package Blocked
SpyDetector package Blocked

 

Total Defense – 9 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Anyvideo package Blocked
ATAsec package Blocked
Delphi package Trojan.GenericKD.43956185
FTPcopy package Blocked
Maxx package Blocked
PerfMenu package Blocked
Puzzle package Blocked
Tiscali package Blocked
WinRAR package Blocked

 

NortonLifeLock – 22 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Avance package Trojan.Gen
BrothersInArms package Heur.AdvML.C
Delphi package Heur.AdvML.C
Dimio package Heur.AdvML.C
DirectX package Trojan.Gen.X
Divx package Heur.AdvML.B
Earth package Trojan.Gen.2
Easo package Trojan.Gen
Fasthide package Trojan.Gen.2
FineReader package Heur.AdvML.B
Hardcopy package Trojan.Gen.2
Kaspersky package Trojan.Tooso!gen
Konwerter package Trojan.ADH
Lame package Heur.AdvML.B
LockOn package Heur.AdvML.B
MKV package Heur.AdvML.M
NetworkFile package Heur.AdvML.B
OpenOffice package Heur.AdvML.B
Pixa package Heur.AdvML.B
Syspad package Trojan.ADH.2
Traffic package Heur.AdvML.B
XiceCube package Heur.AdvML.C

 

K7 – 46 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Acrobat package Riskware ( 0040eff71 )
AMD package Riskware ( 0040eff71 )
Anyvideo package Riskware ( 0040eff71 )
Asterisk package Riskware ( 0040eff71 )
AYC package Riskware ( 0040eff71 )
Christmas package Riskware ( 0040eff71 )
CSE package Riskware ( 0040eff71 )
DB2EXE package Riskware ( 0040eff71 )
Deputy package Trojan ( 0057128d1 )
Doom package Riskware ( 0040eff71 )
Driver package Riskware ( 0040eff71 )
DVB package Trojan ( 004b897a1 )
EasyPhoto package Riskware ( 0040eff71 )
eBay package Riskware ( 0040eff71 )
EEP package Trojan ( 004b9db51 )
ESET package Riskware ( 0040eff71 )
GamePack package Riskware ( 0040eff71 )
Gothic package Trojan ( 004ba4bb1 )
Hardpage package Riskware ( 0040eff71 )
HarryPotter package Riskware ( 0040eff71 )
HDCleaner package Riskware ( 0040eff71 )
HP package Riskware ( 0040eff71 )
IconXtractor package Riskware ( 0040eff71 )
iLivid package Riskware ( 0040eff71 )
Internet package Riskware ( 0040eff71 )
Lexicon package Riskware ( 0040eff71 )
Makro package Riskware ( 0040eff71 )
Maxx package Riskware ( 0040eff71 )
MyUninstaller package Riskware ( 0040eff71 )
Need4Speed package Riskware ( 0040eff71 )
Nero package Riskware ( 0040eff71 )
Office package Riskware ( 0040eff71 )
OperaTor package Trojan-Downloader ( 005706151 )
PanoramaStudio package Riskware ( 0040eff71 )
PMS package Trojan ( 0056ed541 )
Purr package Riskware ( 0040eff71 )
RC package Riskware ( 0040eff71 )
Recorder package Riskware ( 0040eff71 )
Recovery package Riskware ( 0040eff71 )
Safety package Riskware ( 0040eff71 )
Slot package Trojan ( 0052b5941 )
SpyDetector package Riskware ( 0040eff71 )
Todo package Riskware ( 0040eff71 )
VideoConverter package Riskware ( 0040eff71 )
WSFTP package Trojan ( 004b89a21 )
Zapfer package Riskware ( 0040eff71 )

 

Malwarebytes – 46 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Adel package Malware.AI.4122649945
AOWS package Generic.Malware/Suspicious
ArchPros package Malware.Heuristic.106
Bestellformular package Worm.Agent
Blinkx package Malware.Heuristic.106
BMS package MachineLearning/Anomalous.93%
Cable package Malware.AI.3450574278
CPU package MachineLearning/Anomalous.100%
CPY package MachineLearning/Anomalous.100%
Cubes package Trojan.MalPack.Krunchy
Delphi package Generic.Malware/Suspicious
Doom package MachineLearning/Anomalous.96%
Drivers package Trojan.Dropper
Elevate package Malware.AI.1024729222
ESET package Generic.Malware/Suspicious
FileWorks package MachineLearning/Anomalous.94%
Gemini package Blocked
Glace package Backdoor.Bot
Miranda package Malware.AI.1459280784
MT package MachineLearning/Anomalous.100%
MyCar package MachineLearning/Anomalous.100%
MyUninstaller package Blocked
Nokia package Malware.AI.4188472001
PersonalDesktop package Malware.Heuristic.106
Prosto package Blocked
Puzzle package Blocked
PwGenerator package Malware.Heuristic.105
RC package Malware.Heuristic.106
Recorder package Blocked
RemoteMouse package Malware.Heuristic.106
Seamonkey package Generic.Malware/Suspicious
SEP package MachineLearning/Anomalous.94%
Sfix package Blocked
Spacestrike package Malware.AI.994356642
SpyDetector package Malware.Heuristic.1004
SpyWall package MachineLearning/Anomalous.97%
Stempel package Malware.Heuristic.106
Sun package Heuristics.Shuriken
Todo package MachineLearning/Anomalous.93%
TU package MachineLearning/Anomalous.100%
Uget package Heuristics.Shuriken
VideoConverter package Blocked
Weather package MachineLearning/Anomalous.100%
WebDe package Heuristics.Shuriken
WinPrepare package Blocked
Wizard package MachineLearning/Anomalous.94%

Malwarebytes is a new entry in our tests – it is to be expected that their number of false alarms will be much lower next time.

Panda – 65 False Alarms
False alarm found in some parts of Detected as Supposed prevalence
Adel package Blocked
Adresso package Blocked
AmericanConquest package Blocked
Backup package Blocked
Blinkx package Blocked
BMC package Blocked
BrothersInArms package Blocked
BTM package Blocked
BurnAware package Blocked
ButtonShop package Blocked
CheckSig package Blocked
CPU package Blocked
CPY package Blocked
Delphi package Blocked
DrSaver package Blocked
Downtown package Blocked
Easo package Trojan
Erste package Blocked
Feedback package Blocked
FMOD package Blocked
Gemini package Blocked
Glace package Trojan  
Gothic package Blocked
Groowe package Blocked
Harbor package Blocked
iLift package Blocked
Katalog package Blocked
Language package Blocked
Lazarus package Blocked
Lottofee package Blocked
Lottoziehung package Blocked
MT package Blocked
MyCar package Blocked
Notepad package Blocked
OleManager package Blocked
Packer package Blocked
Password package Blocked
PDCconverter package Blocked
PersonalDesktop package Blocked
PMS package Blocked
Ports package Blocked
Puzzle package Blocked
PwGenerator package Blocked
RC package Blocked
Recovery package Blocked
Reminder package Blocked
RemoteMouse package Blocked
ReplayParser package Blocked
Robot package Blocked
ScreenCamera package Blocked
ScreenRecorder package Blocked
SEP package Blocked
ServiceCenter package Blocked
SipGate package Blocked
Sobolsoft package Blocked
SpyWall package Blocked
StringVorstellung package Blocked
TCPview package Blocked
Todo package Blocked
TU package Blocked
VideoConverter package Blocked
Weather package Blocked
WinRAR package Blocked
XiceCube package Blocked
Ytlinks package Blocked

Copyright and Disclaimer

This publication is Copyright © 2021 by AV-Comparatives ®. Any use of the results, etc. in whole or in part, is ONLY permitted after the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss, which might occur as result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but a liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(April 2021)