
Performance Test (AV-Products) November 2010

Date November 2010
Language English
Last Revision December 13th 2010

Impact of Anti-Virus Software on System Performance


Test period: November 2010
Online with cloud connectivity: yes
Update allowed: yes
False Alarm Test included: no
Platform/OS: Microsoft Windows
Methodology: see www.av-comparatives.org

Introduction

We want to make clear that the results in this report are intended to give only an indication of the impact on system performance (mainly by the real-time/on-access components) of the various Anti-Virus products in these specific tests. Users are encouraged to try out the software on their own PCs and form an opinion based on their own observations.

Tested Products

Please note that the results in this report apply only to the products/versions listed above, and should not be assumed to be comparable to, for example, the versions that the same vendors provide as part of a product suite. Also keep in mind that different vendors offer different features (and differing quantities of them) in their products. The following activities/tests were performed:

  • File copying
  • Archiving / Unarchiving
  • Encoding / Transcoding
  • Installing / Uninstalling applications
  • Launching applications
  • Downloading files
  • PC Mark Vantage Professional Testing Suite

Test Procedure

The tests were performed on an Intel Core 2 Duo E8300 machine with 2GB of RAM and SATAII hard disks. The performance tests were first done on a clean Microsoft Windows 7 Professional (32-Bit) system and then with the installed Anti-Virus software (with default settings).

The hard disk was defragmented before starting the various tests, and care was taken to minimize other factors that could influence the measurements and/or the comparability of the systems (network, temperature, etc.). The optimizing processes/fingerprinting used by the products were also taken into account – this means that the results represent the impact on a system that has already been in use for a while. The tests were repeated several times (with and without fingerprinting) in order to obtain mean values and to filter out measurement errors. After each run the workstation was defragmented and rebooted.

We simulated various file operations that a computer user would perform: copying* different types of clean files from one place to another, archiving and unarchiving files, encoding and transcoding** audio and video files, converting DVD files to iPod format, downloading files from the Internet, launching applications, etc. We used Windows automation software to replicate the activities and measure the times taken.
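The repeat-and-average timing approach described above can be sketched as follows. This is an illustrative Python sketch, not the automation tooling actually used in the test; the function and file names are hypothetical:

```python
import shutil
import statistics
import time
from pathlib import Path


def time_copy(src: Path, dst_dir: Path, runs: int = 5) -> float:
    """Copy `src` into `dst_dir` several times and return the mean duration.

    Repeating the operation and averaging, as in the test procedure,
    filters out measurement errors; performing an unmeasured warm-up run
    first would additionally let a product's fingerprinting/caching settle.
    """
    samples = []
    for i in range(runs):
        dst = dst_dir / f"copy_{i}_{src.name}"
        start = time.perf_counter()
        shutil.copyfile(src, dst)
        samples.append(time.perf_counter() - start)
        dst.unlink()  # remove the copy so every run starts from the same state
    return statistics.mean(samples)
```

Running the same measurement on a clean system and again with a security product installed gives the per-operation slowdown that the scores below summarize.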

We also used a third-party, industry-recognized performance testing suite (PC Mark Vantage Professional Edition) to measure the system impact during real-world product usage. Readers are invited to evaluate the various products themselves to see how they affect their own systems, since factors such as software conflicts, user preferences and different system configurations may lead to varying results.

Anti-Virus products need to load on systems at an early stage to provide security from the very beginning – this load has some impact on the time needed for a system to start up. Measuring boot times accurately is challenging. The most significant issue is defining exactly when the system is fully started, as many operating environments continue to perform start-up activities for some time after the system appears responsive to the user. It is also important to consider when the protection provided by the security solution being tested is fully active, as this could serve as a measure of boot completion as far as the security solution is concerned. Some Anti-Virus products load their services very late at boot (even minutes after start-up); users may notice that some time after the system has loaded, it becomes very slow for a few moments. The system thus appears to boot very quickly, but only because the product loads its services later – leaving the system insecure/vulnerable in the meantime. As we do not want to encourage such behaviour, we still do not measure boot times.

To support our concerns, we tested on an older system whether the products load all their protection modules before, for example, malware in the start-up folder is executed. All products failed this test except AVG and Sophos, which were the only two products that detected and blocked the malware before its execution after system start-up (by loading themselves at an early stage). In all other cases the malware was executed successfully and only detected later by the AV product, when it was already too late.

* We used 3GB of data from various file categories (pictures, movies, audio files, various MS Office documents, PDF files, applications/executables, Microsoft Windows 7 system files, archives, etc.).

** Converting MP3 files to WAV, MP3 to WMA

Test Results

Rank  Product(s)                             AVC Score  PC Mark Score  Impact Score
 1.   K7                                        85            96             9
 2.   Kingsoft, Sophos                          88            92            10
 3.   Avast, Microsoft                          88            91            11
 4.   AVG, eScan, ESET, F-Secure, Symantec      83            94            13
 5.   Avira                                     80            95            15
 6.   Trustport                                 85            89            16
 7.   McAfee, Panda                             80            92            18
 8.   G DATA                                    75            87            28
 9.   Kaspersky Lab                             70            90            30
10.   Norman                                    60            92            38
11.   Bitdefender                               63            88            39
12.   Trend Micro                               60            83            47
13.   PC Tools                                  38            85            67
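A pattern worth noting in the results: in every row the Impact Score (lower is better) equals 190 minus the sum of the AVC Score and the PC Mark Score. This is an observation derived from the published numbers, not a formula documented in the report; the Python check below reproduces it:

```python
# Each row: (products, AVC score, PC Mark score, impact score), as published.
results = [
    ("K7", 85, 96, 9),
    ("Kingsoft, Sophos", 88, 92, 10),
    ("Avast, Microsoft", 88, 91, 11),
    ("AVG, eScan, ESET, F-Secure, Symantec", 83, 94, 13),
    ("Avira", 80, 95, 15),
    ("Trustport", 85, 89, 16),
    ("McAfee, Panda", 80, 92, 18),
    ("G DATA", 75, 87, 28),
    ("Kaspersky Lab", 70, 90, 30),
    ("Norman", 60, 92, 38),
    ("Bitdefender", 63, 88, 39),
    ("Trend Micro", 60, 83, 47),
    ("PC Tools", 38, 85, 67),
]

# In every published row, impact == 190 - (AVC score + PC Mark score).
for name, avc, pcmark, impact in results:
    assert impact == 190 - (avc + pcmark), name
```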

Award levels reached in this Performance Test

We provide a 4-level ranking system: Tested, STANDARD, ADVANCED and ADVANCED+. All products were quite good, and reached at least the STANDARD level, which means they have an acceptable impact on system performance. ADVANCED means they have only a low impact on system performance and ADVANCED+ denotes products with even lower impact (according to the test results).

The following certification levels reflect the results reached in this performance test report. Please note that the performance test only tells you how much impact an Anti-Virus product may have on a system compared to other Anti-Virus products; it does not tell you anything about the effectiveness of the protection a product provides. To find out, for example, what the detection rates of the various Anti-Virus products are, please refer to our other tests, available at www.av-comparatives.org

Copyright and Disclaimer

This publication is Copyright © 2010 by AV-Comparatives ®. Any use of the results, etc. in whole or in part, is ONLY permitted with the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering the test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(December 2010)