
False Alarm Test September 2024

Appendix to the Malware Protection Test September 2024

Release date: 2024-10-15
Last revision: 2024-10-10
Test period: September 2024
Language: English
Online with cloud connectivity: Yes
Update allowed: Yes
False Alarm Test included: Yes
Platform/OS: Microsoft Windows

Introduction

This report is an appendix to the Malware Protection Test September 2024 and lists details about the discovered false alarms.

In AV testing, it is important to measure not only detection capabilities but also reliability. One aspect of reliability is the ability to recognize clean files as such, and not to produce false alarms (false positives). No product is immune to false positives (FPs), but some produce more than others. False-positives tests measure which programs do best in this respect, i.e. which best distinguish clean files from malicious files. There is no complete collection of all legitimate files in existence, so no “ultimate” FP test is possible. What can reasonably be done is to create and use an independently collected set of clean files. If, when using such a set, one product has e.g. 15 FPs and another only 2, the first product is likely more prone to FPs than the second. This does not mean the product with 2 FPs has only 2 FPs globally; it is the relative number that matters.

In our view, antivirus products should not generate false alarms on any clean files, irrespective of the number of users affected. While some antivirus vendors may downplay the risk of false alarms and exaggerate the risk of malware, we do not base product ratings solely on the supposed prevalence of false alarms. We currently tolerate a certain number of false alarms (currently 10) within our clean set before penalizing scores. Products that yield a higher number of false alarms are also more likely to trigger false alarms on more prevalent files or in other sets of clean files. The prevalence data we provide for clean files is purely informational; the listed prevalence may vary within the report, depending on factors such as which file/version triggered the false alarm or how many files of the same kind were affected.

There can be disparities in the number of false positives produced by two different programs that use the same detection engine. For instance, Vendor A may license its detection engine to Vendor B, yet Vendor A’s product may exhibit more or fewer false positives than Vendor B’s product. Such discrepancies can stem from various factors, including differences in internal settings; additional or differing secondary engines, signatures, whitelist databases, cloud services, or quality assurance; and potential delays in making signatures available to third-party products.

Sometimes a few vendors attempt to dispute why some clean or non-malicious software/files are blocked or detected. Explanations may include: the software being unknown or too new and still awaiting whitelisting; detection of non-current/old versions because a newer version is available; limited usage within their user base; the complete absence of user reports about false positives (suggesting to them that the false positives are non-existent); bugs in the clean software (e.g. an application crashing under certain circumstances); errors or missing information in the End User License Agreement making the software illegal in some countries (such as a missing/unclear disclosure of data transmission); subjective user-interface usability issues (e.g. a missing option to close the program in the system tray); software being available only in specific languages (e.g. Chinese); the assumption that a file must be malware because other vendors detect it according to a multiscanning service (copycat behaviour we unfortunately observe increasingly often); or issues with unrelated software from the same vendor/distributor many years ago. If such rules were applied consistently, almost every clean piece of software would be flagged as malware at some point. Dispute reasons like these often lack validity and are therefore rejected.

Antivirus products could enhance user control and understanding by offering options such as filtering based on language or EULA validity, and by providing clear explanations for detections rather than blanket classification as malware. This would empower users to manage and understand detection reasons more effectively. Ultimately, it is not about which specific file is misclassified, but that it is misclassified at all. Achieving a high malware score is effortless if it is done with lax signatures/heuristics at the expense of false positives. Although we list the prevalence of the affected files here, the same detection rules that cause FPs on rare files can equally cause a major FP incident if the detection signatures/heuristics are not properly fixed or adapted.

All listed false alarms were encountered at the time of testing. False alarms caused by unencrypted data blocks in anti-virus-related files were not counted. If a product had several false alarms belonging to the same application, they are counted here as only one false alarm. Cracks, keygens, and other highly questionable tools are not counted here as false positives.
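The counting rules above can be sketched as follows. This is an illustrative reading of the rules, not AV-Comparatives’ actual tooling; the record fields (`application`, `questionable`) are hypothetical names chosen for the sketch.

```python
# Illustrative sketch of the FP-counting rules described above.
# The record fields ("application", "questionable") are hypothetical.

def count_false_alarms(alarms: list[dict]) -> int:
    """Count false alarms: several alarms belonging to the same
    application count as one, and highly questionable tools
    (cracks, keygens, ...) are excluded entirely."""
    applications = {
        alarm["application"]
        for alarm in alarms
        if not alarm.get("questionable", False)
    }
    return len(applications)

alarms = [
    {"application": "ExampleApp", "file": "setup.exe"},
    {"application": "ExampleApp", "file": "updater.exe"},   # same app: counted once
    {"application": "SomeKeygen", "questionable": True},    # excluded from the count
]
```

Deduplicating by application rather than by file mirrors the report’s per-application counting; a product flagging five files of one suite still scores a single false alarm.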

In order to give the user more information about the false alarms, we try to rate their prevalence. Files which were digitally signed are considered more important. Because of this, a file with the lowest prevalence level (Level 1) and a valid digital signature is upgraded to the next level (e.g. prevalence Level 2). Extinct files which, according to several telemetry sources, had zero prevalence were provided to the vendors so that they could fix them, but were removed from the set and not counted as false alarms.
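The level-adjustment rule described above can be expressed as a minimal sketch; the function name and signature are illustrative, not AV-Comparatives’ code.

```python
# Minimal sketch of the prevalence-level adjustment described above:
# a Level-1 file carrying a valid digital signature is upgraded one
# level. Names are illustrative, not AV-Comparatives' code.

MAX_LEVEL = 5  # the report uses five prevalence categories

def reported_prevalence(level: int, validly_signed: bool) -> int:
    """Return the prevalence level shown in the report tables."""
    if validly_signed and level == 1:
        return min(level + 1, MAX_LEVEL)
    return level
```

Only the lowest level is upgraded, so a signed Level-3 file keeps its level; the signature bonus matters exactly where prevalence evidence is weakest.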

The prevalence is given in five categories (shown as coloured labels in the report):

Level 1 – Probably fewer than a hundred users. Individual cases, old or rarely used files, very low prevalence.
Level 2 – Probably several hundreds of users. Initial distribution of such files was probably much higher, but current usage on actual systems is lower, which is why even well-known software may now have a prevalence of only some hundreds or thousands of users.
Level 3 – Probably several thousands of users.
Level 4 – Probably several tens of thousands (or more) of users.
Level 5 – Probably several hundreds of thousands or millions of users. Such cases are likely to be seen much less frequently in a false alarm test done at a specific time, as such files are usually either whitelisted or would be noticed and fixed very fast.

Most false alarms will probably (hopefully) fall into the first two levels.

False Positives (FPs) serve as a critical measurement for assessing antivirus quality. Moreover, such testing is necessary to prevent vendors from optimizing products solely to perform well in tests. Hence, false alarms are assessed and tested in the same manner as malware tests. A single FP report from a customer can trigger a significant amount of engineering and support work to resolve the issue, sometimes resulting in data loss or system unavailability. Even seemingly insignificant FPs (or FPs on older applications) warrant attention because they may still indicate underlying issues in the product that could potentially cause FPs on more significant files. Below, you’ll find information about the false alarms observed in our independent set of clean files. Entries highlighted in red denote false alarms on files that were digitally signed.

Tested Products

Test Results

In order to better evaluate the quality of the file detection capabilities (ability to distinguish good files from malicious files) of anti-virus products, we provide a false alarm test. False alarms can sometimes cause as much trouble as a real infection. Please consider the false alarm rate when looking at the detection rates, as a product which is prone to false alarms may achieve higher detection rates more easily. In this test, a representative set of clean files was scanned and executed (as done with malware).

1. Kaspersky, Trend Micro – 1 FP (very few FPs)
2. ESET – 2 FPs (few FPs)
3. G DATA – 3 FPs
4. Avast, AVG – 4 FPs
5. Bitdefender, Quick Heal, Total Defense – 5 FPs
6. Microsoft – 8 FPs
7. McAfee, TotalAV – 13 FPs (many FPs)
8. Avira – 15 FPs
9. F-Secure – 17 FPs
10. Panda – 28 FPs (very many FPs)
11. Norton – 32 FPs

A product that is successful at detecting a high percentage of malicious files but suffers from false alarms may not be necessarily better than a product which detects fewer malicious files, but which generates fewer false alarms.

The detection names presented were primarily obtained from pre-execution scan logs, where available. If a threat was blocked during or after execution, or if no clear detection name was identified, we indicate “Blocked” in the “Detected as” column.

Details about the discovered false alarms

 
Kaspersky – 1 False Alarm
False alarm found in some parts of – Detected as – Supposed prevalence
Winchloe package – Packed.Win32.PolyCrypt.b

 

Trend Micro – 1 False Alarm
False alarm found in some parts of – Detected as – Supposed prevalence
Photozoom package – Malicious

 

 
ESET – 2 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Fairplay package – Suspicious
Haruhost package – a variant of WinGo/Agent_AGen.BH

 

 
G Data – 3 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Bingofolies package – Win32.Trojan.PSE.1GIU4QX (Engine B)
Cassa package – IL:Trojan.MSILZilla.38224 (Engine A)
P2pover package – Win32.Trojan.Agent.EH (Engine B)

 

 
Avast – 4 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Agenteguardiao package – Win32:Malware-gen
Ddnsgo package – Win32:Evo-gen [Trj]
Obsidium package – Win32:Malware-gen
Staffexpresswait package – Win32:Evo-gen [Trj]

 

 
AVG – 4 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Agenteguardiao package – Win32:Malware-gen
Ddnsgo package – Win32:Evo-gen [Trj]
Obsidium package – Win32:Malware-gen
Staffexpresswait package – Win32:Evo-gen [Trj]

 

 
Bitdefender – 5 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Cassa package – IL:Trojan.MSILZilla.38224
Tibia package – Trojan.GenericKD.72528881
Tyrk package – Gen:Variant.Zusy.554928
Wutheringwaves package – Gen:Variant.Lazy.365837
Yearbook package – Gen:Variant.Lazy.557199

 

 
Total Defense – 5 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Cassa package – IL:Trojan.MSILZilla.38224
Tibia package – Trojan.GenericKD.72528881
Tyrk package – Gen:Variant.Zusy.554928
Wutheringwaves package – Gen:Variant.Lazy.365837
Yearbook package – Gen:Variant.Lazy.557199

 

 
Quick Heal – 5 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Easyburn package – Behaviour Detection
Greenstone package – Behaviour Detection
Linkws package – Browsing Protection
New net srl package – Behaviour Detection
Pyxis package – Behaviour Detection

 

 
Microsoft – 8 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Boosterx package – Trojan:Win32/Malgent!MSR
Dmw package – Trojan:Win32/Sdum!MSR
Firewall package – Trojan:Win32/Malgent!MSR
Hp package – Trojan:Win32/Malgent
Imagine package – Trojan:Win32/Wacatac.B!ml
Sms package – Trojan:Win32/AgentTesla!ml
Staffexpresswait package – Trojan:Win32/Wacatac.B!ml
Teracopy package – Trojan:Win32/CryptInject!MSR

 

 
McAfee – 13 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Acterapide package – Real Protect-LS!f1d48e7a3f4c
Agenteguardiao package – ti!CF66AC1AA2F5
Cassa package – ti!F3179F88F99A
Genisys package – Real Protect-LS!a3a2c62e2cbf
Imagine package – ti!61625DB92277
Integerscaler package – GenericRXWQ-YP!74F742149B41
Legendsofchantra package – Real Protect-LS!f80706b89810
Login package – Real Protect-LS!791801f9acec
Obsidium package – ti!11FD016F4A5B
Sms package – Real Protect-LS!f65368e670e7
Staffexpresswait package – ti!94A912F0C0FE
Winchloe package – ti!5B10FDD7FEF7
Wutheringwaves package – Real Protect-LS!3f22cc39fad6

 

 
TotalAV – 13 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Acterapide package – HEUR/APC
Cassa package – HEUR/AGEN.1306272
Comax package – HEUR/AGEN.1315672
Ddnsgo package – HEUR/APC
Dmw package – TR/Redcap.qqzfz
Enpower package – TR/Kryptik.lnxmp
Haruhost package – TR/Redcap.fnzru
Kensington package – TR/Redcap.gdkzo
Legendsofchantra package – HEUR/APC
Login package – TR/AVI.TrojanX.cxwgu
Rostelecom package – TR/ATRAPS.Gen
Selax package – HEUR/AGEN.1372167
Staffexpresswait package – HEUR/AGEN.1365763

 

 
Avira – 15 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Acterapide package – HEUR/APC
Cassa package – HEUR/AGEN.1306272
Comax package – HEUR/AGEN.1315672
Comsytec package – TR/Crypt.XPACK.Gen
Ddnsgo package – HEUR/APC
Dmw package – TR/Redcap.qqzfz
Enpower package – TR/Kryptik.lnxmp
Haruhost package – TR/Redcap.fnzru
Kensington package – TR/Redcap.gdkzo
Legendsofchantra package – Drop.Win32.Score.APC.HEUR/APC.100
Login package – TR/AVI.TrojanX.cxwgu
New Net srl package – BDS/Redcap.fszsf
Rostelecom package – TR/ATRAPS.Gen
Selax package – HEUR/AGEN.1372167
Staffexpresswait package – HEUR/AGEN.1365763

 

 
F-Secure – 17 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
7zsfx package – Trojan:W32/Agent.DSNN
Acterapide package – HEUR/APC
Cassa package – Heuristic.HEUR/AGEN.1306272
Comax package – Heuristic.HEUR/AGEN.1315672
Comsytec package – Trojan.TR/Crypt.XPACK.Gen
Dmw package – Trojan.TR/Redcap.qqzfz
Enpower package – Trojan.TR/Kryptik.lnxmp
Fractalus package – Suspicious:W32/Malware!DeepGuard.p
Getmp3 package – Packed:MSIL/SmartIL.A
Haruhost package – Trojan.TR/Redcap.fnzru
Kensington package – Trojan.TR/Redcap.gdkzo
Login package – Trojan.TR/AVI.TrojanX.cxwgu
Maxx package – Suspicious:W32/Malware!DeepGuard.pg
Rostelecom package – Trojan.TR/ATRAPS.Gen
Samurize package – Trojan-Downloader:JS/TeslaCrypt.C
Selax package – Heuristic.HEUR/AGEN.1372167
Staffexpresswait package – Trojan.TR/Dropper.Gen6

 

 
Panda – 28 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Agenteguardiao package – Suspicious
Autoruns package – Suspicious
Autoshutdown package – Suspicious
Bingofolies package – Suspicious
Broadworks package – Suspicious
Cassa package – Suspicious
Ccunwash package – Malware
Cefa aviation sas package – Suspicious
Dmw package – Suspicious
Fairplay package – Suspicious
Foxit package – Trojan
Greenstone package – Malware
Hardphoneprocess package – Suspicious
Imagine package – Suspicious
Journaltrace package – Suspicious
Legendsofchantra package – Suspicious
Meldemax package – Security Risk
New net srl package – Malware
Obsidium package – Suspicious
Optimik package – Suspicious
Proconnect package – Suspicious
Rostelecom package – Suspicious
Seetrol package – Suspicious
Smartleap package – Suspicious
Sms package – Suspicious
Staffexpresswait package – Suspicious
Subtitler package – Malware
Ukrainegta package – Suspicious

 

 
Norton – 32 False Alarms
False alarm found in some parts of – Detected as – Supposed prevalence
Acterapide package – Heur.AdvML.B
Alpx package – Heur.AdvML.B
Androidmultitool package – Heur.AdvML.B
Ascend package – Heur.AdvML.B
Cassa package – Scr.Malcode!gdn30
Ccleaner package – Heur.AdvML.B
Cddvdburner package – Heur.AdvML.B
Comfort package – Heur.AdvML.B
Comsytec package – Bloodhound.W32.EP
Ddnsgo package – Heur.AdvML.B
Dmw package – Heur.AdvML.B
Enpower package – Heur.AdvML.B
Exelent package – Heur.AdvML.B
Fghjaz package – Heur.AdvML.B
Imagine package – Heur.AdvML.B
Kjhn package – Heur.AdvML.B
Launcher package – Heur.AdvML.B
Login package – Heur.AdvML.B
Lossantos package – Heur.AdvML.B
Mafiacity package – Heur.AdvML.B
Maxx package – Heur.AdvML.B
PCW package – ISB.Downloader!gen52
Smartleap package – Heur.AdvML.B
Sms package – Heur.AdvML.B
Staffexpresswait package – Heur.AdvML.B
Syslog package – Heur.AdvML.B
Trans package – Heur.AdvML.B
Vlctkeoxe package – Heur.AdvML.B
Winchloe package – Heur.AdvML.B
Wutheringwaves package – Heur.AdvML.B
Y73 package – Heur.AdvML.C
Yearbook package – Heur.AdvML.B

 

Copyright and Disclaimer

This publication is Copyright © 2024 by AV-Comparatives ®. Any use of the results, etc. in whole or in part, is ONLY permitted after the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss, which might occur as result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but a liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(October 2024)
