
Potentially-Unwanted-Application Test November 2009

like Adware, Spyware, Rogue Software

Release date: 2009-11-30
Revision date: 2009-11-30
Test period: November 2009
Language: English
Online with cloud connectivity: yes
Update allowed: yes
False alarm test included: no
Platform/OS: Microsoft Windows

Introduction

The problem of Adware, Spyware and other fraudulent software has grown considerably over the past years. Such applications are not typical malware, and classifying them is not always easy; they are usually covered by the term “potentially unwanted applications” (PUA). Depending on cultural background or differing jurisdiction, certain “potentially unwanted applications” are accepted or even wanted in some countries, which is why legal disputes sometimes arise over whether a piece of software can be considered malware or not. The term “potentially unwanted” covers this grey area. Our malware test-sets usually do not include this kind of threat, but users may want to get an idea of how well their Anti-Virus product detects potentially unwanted software. Overall, the coverage of PUAs appears to be similar to the coverage of malware.

The PUA test-set used for this test contains around 1.1 million samples, gathered from January 2009 to October 2009. It includes only PE files and covers mainly Adware (mostly Virtumonde, browser hijackers, etc.), Spyware (keyloggers, etc.) and Rogue Software (mostly FakeAVs and other misleading applications). We decided not to include dialers, potentially dangerous tools and other grayware, not only because it would increase the test-set by several million further samples (and therefore take even longer to scan), but also because the inclusion and classification of such grayware applications is even more debatable.
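As an illustration of how such a test-set might be assembled, the sketch below filters a directory tree down to PE files whose timestamp falls inside the stated collection window. This is a minimal, hypothetical example: the directory name and the use of the filesystem modification time as the “gathered” date are assumptions, and the actual AV-Comparatives tooling is not described in this report.

```python
import os
from datetime import datetime

# Hypothetical sketch of a PE-only, date-windowed sample filter. The window
# matches the collection period stated above; treating the filesystem
# modification time as the "gathered" date is an assumption made for this
# illustration.
WINDOW_START = datetime(2009, 1, 1)
WINDOW_END = datetime(2009, 10, 31, 23, 59, 59)

def is_pe_file(path):
    """Treat a file as PE if it starts with the 'MZ' DOS magic bytes."""
    try:
        with open(path, "rb") as f:
            return f.read(2) == b"MZ"
    except OSError:
        return False

def collect_samples(root):
    """Yield paths of PE samples whose timestamp lies in the test window."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            gathered = datetime.fromtimestamp(os.path.getmtime(path))
            if WINDOW_START <= gathered <= WINDOW_END and is_pe_file(path):
                yield path

if __name__ == "__main__":
    # "pua-samples" is a hypothetical sample directory.
    print(sum(1 for _ in collect_samples("pua-samples")), "samples selected")
```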

The Adware/Spyware/Rogues (Potentially Unwanted Applications – PUA) test-sets were frozen on 10th October 2009. The system and the products were updated and frozen on 4th November 2009.

We tested all products with their highest settings (except F-Secure and Sophos, at their own request).

The results of our on-demand tests are usually also applicable to the on-access scanner (if it is configured the same way), but not to on-execution protection technologies (like HIPS, behaviour blockers, etc.).

A good detection rate is still one of the most important, deterministic and reliable features of an antivirus product. Additionally, most products provide at least some kind of HIPS, behaviour-based or other functionality to block (or at least warn about) malicious actions, e.g. during the execution of malware, when all other on-access and on-demand detection/protection mechanisms have failed.

AV-Comparatives also publishes other test reports which cover different aspects and features of the products; please have a look at our website for further information. Even though we deliver various tests and show different aspects of Anti-Virus software, users are advised to evaluate the software themselves and form their own opinion about it. Test data and reviews only provide guidance on aspects that users cannot evaluate by themselves. We suggest and encourage readers to also consult the independent test results provided by other well-known and established testing organizations, in order to get a better overview of the detection and protection capabilities of the various products across different test scenarios and test-sets.

Please try the products on your own system before making a purchase decision based on these tests. There are also some other program features and important factors (e.g. price, ease of use/management, compatibility, graphical user interface, language, HIPS / behaviour blocker functions, etc.) to consider.

Tested Products

Test Results

Graph of missed samples (lower is better)

Summary results

Detection rates for “potentially unwanted software”:

1. G DATA, Trustport: 99.8%
2. Avira, McAfee: 98.9%
3. Bitdefender, eScan, F-Secure, Symantec: 98.6%
4. Kaspersky: 96.7%
5. ESET: 96.5%
6. Avast: 96.3%
7. Sophos: 95.4%
8. Microsoft: 94.6%
9. AVG: 93.9%
10. Norman: 88.5%
11. Kingsoft: 87.1%
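Because the test-set contains roughly 1.1 million samples, the detection rates above translate into large absolute numbers of missed samples (the quantity plotted in the graph above). The sketch below converts a few of the rates into approximate missed-sample counts; it assumes the stated test-set size of about 1.1 million, since exact per-product sample counts are not published here.

```python
# Illustrative only: convert detection rates into approximate missed-sample
# counts. TEST_SET_SIZE is an assumption based on the "around 1.1 million
# samples" stated in the test-set description; exact per-product counts
# are not published in this report.
TEST_SET_SIZE = 1_100_000

detection_rates = {
    "G DATA / Trustport": 0.998,
    "Avira / McAfee": 0.989,
    "Kingsoft": 0.871,
}

for product, rate in detection_rates.items():
    missed = round(TEST_SET_SIZE * (1 - rate))
    print(f"{product}: {rate:.1%} detected, ~{missed:,} samples missed")
```

At this scale, even the top rate of 99.8% corresponds to roughly 2,200 undetected samples, which is why the graph of missed samples is labelled “lower is better”.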

McAfee VirusScan Plus 2009 and above comes with the “in-the-cloud” Artemis technology turned on by default. For corporate users or home users using older McAfee products without “Active Protection” – as well as all other users – it may be important to know what McAfee’s baseline detection rate would be should an Internet connection not be available. Without Internet connection, McAfee’s detection rate was 92.6%.

Award levels reached in this PUA Test

The Awards are based only on detection rates of unwanted programs like Adware, Spyware and Rogue AVs. For detection rates of malware like Trojans, backdoors, viruses, etc., as well as for false alarm rates of the products, please refer to the other test reports available on our website.

Copyright and Disclaimer

This publication is Copyright © 2009 by AV-Comparatives ®. Any use of the results, etc. in whole or in part, is ONLY permitted after the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(November 2009)