Performance Test (AV-Products) October 2012


Impact of Anti-Virus Software on System Performance


Release date: 2012-10-12
Revision date: 2012-10-12
Test period: October 2012
Online with cloud connectivity: yes
Update allowed: yes
False-alarm test included: no
Platform/OS: Microsoft Windows
Methodology: see the AV-Comparatives website

Introduction

We want to make clear that the results in this report are intended to give only an indication of the impact of the various Anti-Virus products on system performance (mainly through their real-time/on-access components) in these specific tests. Users are encouraged to try out the software on their own PCs and form an opinion based on their own observations.

Tested Products

Please note that the results in this report apply only to the specific products/versions tested (e.g. 64-bit versions, particular product versions, etc.). Also, keep in mind that different vendors offer different features, and differing quantities of them, in their products.

The following activities/tests were performed on an up-to-date Windows 7 Professional SP1 64-bit system:

  • File copying
  • Archiving / Unarchiving
  • Encoding / Transcoding
  • Installing / Uninstalling applications
  • Launching applications
  • Downloading files
  • PCMark 7 Professional v1.0.4 testing suite

Test Procedure

The tests were performed on an Intel Core i5 750 machine with 4 GB of RAM and SATA II hard disks. The performance tests were run first on a clean, fully updated Windows 7 Professional SP1 64-bit system (English), and then again with the Anti-Virus software installed (with default settings). The tests were done with an active Internet connection to simulate the real-world impact of cloud services/features.

The hard disks were defragmented before starting the various tests, and care was taken to minimize other factors that could influence the measurements and/or the comparability of the systems. Optimization processes/fingerprinting used by the products were also considered; this means that the results represent the impact on a system which has already been in use for a while. The tests were repeated several times (with and without fingerprinting) in order to obtain mean values and filter out measurement errors. After each run the workstation was defragmented and rebooted.

We simulated various file operations that a typical computer user would perform: copying different types of clean files from one place to another (around 5 GB of data consisting of various file types and sizes: pictures, movies, audio files, various MS Office documents, PDF files, applications/executables, Microsoft Windows 7 system files, archives, etc.), archiving and unarchiving files, encoding and transcoding audio and video files (converting MP3 to WAV, MP3 to WMA, AVI to MPG and MPG to AVI), converting DVD files to iPod format, downloading files from the Internet, launching applications, and so on. We also used a third-party, industry-recognized performance testing suite (PCMark 7 Professional) to measure the system impact during real-world product usage.

Readers are invited to evaluate the various products themselves to see how they impact their own systems, as software conflicts, user preferences and different system configurations may lead to varying results.
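To make the measurement approach concrete, the sketch below shows what a single repeated file-copy timing run could look like. This is a minimal illustration in the spirit of the procedure described above, not AV-Comparatives' actual test harness; the source/destination paths and the number of runs are assumptions chosen for the example.

```python
import shutil
import statistics
import time
from pathlib import Path

# Illustrative assumptions -- not AV-Comparatives' real paths or parameters.
SOURCE_DIR = Path(r"C:\testdata")   # ~5 GB mix of documents, media, executables
DEST_DIR = Path(r"D:\copytarget")
RUNS = 5                            # repeat to obtain a mean and filter out outliers

def timed_copy(src: Path, dst: Path) -> float:
    """Copy the whole tree once and return the elapsed wall-clock time in seconds."""
    if dst.exists():
        shutil.rmtree(dst)          # start every run from a clean destination
    start = time.perf_counter()
    shutil.copytree(src, dst)
    return time.perf_counter() - start

if __name__ == "__main__":
    timings = [timed_copy(SOURCE_DIR, DEST_DIR) for _ in range(RUNS)]
    # In the real test the machine is defragmented and rebooted between runs;
    # here we simply report the mean of the repeated measurements.
    print("individual runs:", ", ".join(f"{t:.1f}s" for t in timings))
    print(f"mean copy time: {statistics.mean(timings):.1f}s")
```

The same pattern (run, reset, repeat, average) applies to the other measured activities, such as archiving/unarchiving or transcoding, with the copy step swapped for the operation under test.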

Security products need to load on systems at an early stage to provide security from the very beginning, and this load has some impact on the time needed for a system to start up. Measuring boot times accurately is challenging. The most significant issue is defining exactly when the system is fully started, as many operating environments continue to perform start-up activities for some time after the system appears responsive to the user. It is also important to consider when the protection provided by the security solution under test is fully active, as this could serve as a useful measure of boot completion as far as the security solution is concerned. Some Anti-Virus products load their services very late in the boot process (even minutes later); the system appears to boot very quickly, but users may notice it becoming very slow for some moments once the services finally load, and until then the system is also insecure/vulnerable. As we do not want to reward such behaviour, we still do not measure boot times.
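For readers who do want to treat "protection fully active" as the boot-completion criterion discussed above, one crude way to detect it on Windows is to poll the security product's service until it reports the RUNNING state. The sketch below illustrates the idea; the service name is a hypothetical placeholder, and polling `sc query` is our illustrative choice, not part of the test methodology.

```python
import subprocess
import time

# Hypothetical service name -- substitute the real service of the product under test.
SERVICE_NAME = "ExampleAVService"
POLL_INTERVAL = 0.5   # seconds between checks
TIMEOUT = 300         # give up after five minutes

def service_running(name: str) -> bool:
    """Return True if the given Windows service reports the RUNNING state."""
    result = subprocess.run(["sc", "query", name], capture_output=True, text=True)
    return "RUNNING" in result.stdout

def seconds_until_protection_active() -> float:
    """Poll from script start until the service runs; a crude boot-completion proxy."""
    start = time.perf_counter()
    while time.perf_counter() - start < TIMEOUT:
        if service_running(SERVICE_NAME):
            return time.perf_counter() - start
        time.sleep(POLL_INTERVAL)
    raise TimeoutError(f"{SERVICE_NAME} did not reach RUNNING within {TIMEOUT}s")

if __name__ == "__main__":
    print(f"protection active after {seconds_until_protection_active():.1f}s")
```

In practice such a script would have to be launched at logon and combined with an OS-level boot timestamp, which illustrates the definitional ambiguity described above.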

Test Results

Rank  Product             AVC Score   PCMark Score   Impact Score
 1.   Webroot                 88          99.8            2.2
 2.   ESET                    88          99.7            2.3
 3.   Avast, Microsoft        88          99.3            2.7
 4.   F-Secure                83          99.6            7.4
 5.   Kaspersky               83          98.1            8.9
 6.   AVG                     81          98.8           10.2
 7.   Panda                   80          98.8           11.2
 8.   McAfee                  78          97.8           14.2
 9.   Avira, Qihoo            76          99.3           14.7
10.   Bitdefender             76          98.7           15.3
11.   Sophos                  75          99.5           15.5
12.   eScan                   75          97.8           17.2
13.   BullGuard               73          95.7           21.3
14.   Fortinet                68          97.9           24.1
15.   PC Tools                68          95.3           26.7
16.   G DATA                  65          97.7           27.3
17.   GFI                     60          97.4           32.6
18.   Trend Micro             58          95.8           36.2
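As a reading aid: the products are ranked by impact score, and lower scores indicate less impact on system performance. Across all eighteen rows the published impact score equals (90 - AVC score) + (100 - PCMark score); this relationship is our inference from the numbers in the table, not a formula stated in this report, and the snippet below merely verifies the arithmetic.

```python
# Rows copied from the results table: (product, AVC score, PCMark score, impact score).
ROWS = [
    ("Webroot", 88, 99.8, 2.2), ("ESET", 88, 99.7, 2.3),
    ("Avast, Microsoft", 88, 99.3, 2.7), ("F-Secure", 83, 99.6, 7.4),
    ("Kaspersky", 83, 98.1, 8.9), ("AVG", 81, 98.8, 10.2),
    ("Panda", 80, 98.8, 11.2), ("McAfee", 78, 97.8, 14.2),
    ("Avira, Qihoo", 76, 99.3, 14.7), ("Bitdefender", 76, 98.7, 15.3),
    ("Sophos", 75, 99.5, 15.5), ("eScan", 75, 97.8, 17.2),
    ("BullGuard", 73, 95.7, 21.3), ("Fortinet", 68, 97.9, 24.1),
    ("PC Tools", 68, 95.3, 26.7), ("G DATA", 65, 97.7, 27.3),
    ("GFI", 60, 97.4, 32.6), ("Trend Micro", 58, 95.8, 36.2),
]

# Inferred relationship (an observation on the published data, not an official definition):
#   impact = (90 - AVC score) + (100 - PCMark score)
for product, avc, pcmark, impact in ROWS:
    assert abs((90 - avc) + (100 - pcmark) - impact) < 1e-9, product
print("all 18 impact scores match (90 - AVC) + (100 - PCMark)")
```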

Award levels reached in this Performance Test

The following award levels reflect the results reached in this performance test. Please note that the performance test only tells you how much impact an Anti-Virus product has on a system compared with other Anti-Virus products; it says nothing about the effectiveness of the protection a product provides.

Copyright and Disclaimer

This publication is Copyright © 2012 by AV-Comparatives®. Any use of the results, etc., in whole or in part, is ONLY permitted with the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering the test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(October 2012)