
Performance Test (Suite Products) November 2013

Language: English

Impact of Anti-Virus Software on System Performance


Release date: 2013-11-19
Revision date: 2013-11-19
Test period: November 2013
Online with cloud connectivity: yes
Update allowed: yes
False Alarm Test included: no
Platform/OS: Microsoft Windows
Methodology: see the AV-Comparatives website

Introduction

We want to make clear that the results in this report are intended only to give an indication of the impact of the various Internet security products on system performance (caused mainly by their real-time/on-access components) in these specific tests. Users are encouraged to try out the software on their own PCs and see how it performs on their own systems.

Tested Products

Please note that the results in this report apply only to the products/versions listed above (e.g. 64-Bit versions, product version, etc.). Also, keep in mind that different vendors offer different (and differing quantities of) features in their products.

The following activities/tests were performed on an up-to-date Windows 8 Pro 64-Bit system:

  • File copying
  • Archiving / Unarchiving
  • Encoding / Transcoding
  • Installing / Uninstalling applications
  • Launching applications
  • Downloading files
  • PC Mark 8 Professional Testing Suite

Test Procedure

The tests were performed on an Acer Aspire XC600 machine with an Intel Core i5-3330 CPU (3 GHz), 4 GB of RAM and SATA II hard disks. The performance tests were run first on a clean, fully updated Windows 8 Pro 64-Bit system (English) and then again with the Internet security software installed (with default settings). The tests were done with an active Internet connection to allow for the real-world impact of cloud services/features.

The hard disks were defragmented before starting the various tests, and care was taken to minimize other factors that could influence the measurements and/or the comparability of the systems. Optimizing processes/fingerprinting used by the products were also considered; this means that the results represent the impact on a system which has already been operated by the user for a while. The tests were repeated several times (with and without fingerprinting) in order to obtain mean values and filter out measurement errors. After each run, the workstation was defragmented and rebooted.

We simulated various file operations that a computer user would perform: copying different types of clean files from one place to another (using around 3 GB of data of various file types and sizes: pictures, movies, audio files, various MS Office documents, PDF files, applications/executables, Windows operating system files, archives, etc.), archiving and unarchiving files, encoding and transcoding audio and video files (converting MP3 files to WAV, MP3 to WMA and AVI to MP4), converting DVD files to iPod format, downloading files from the Internet, launching applications, and so on. We also used a third-party, industry-recognized performance testing suite (PC Mark 8 Professional) to measure the system impact during real-world product usage. Readers are invited to evaluate the various products themselves, to see how they impact their own systems; software conflicts, user preferences and different system configurations may lead to varying results.
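The repeat-and-average timing approach described above can be sketched in a few lines. This is our own illustrative harness, not the actual AV-Comparatives tooling; the payload size, run count and file names are arbitrary:

```python
import shutil
import statistics
import tempfile
import time
from pathlib import Path


def time_copy(src: Path, dst: Path, runs: int = 5) -> list[float]:
    """Time `runs` copies of src to dst, returning seconds per run."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        shutil.copyfile(src, dst)
        timings.append(time.perf_counter() - start)
        dst.unlink()  # remove the copy so each run starts from the same state
    return timings


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        src = Path(tmp) / "sample.bin"
        src.write_bytes(b"\0" * (10 * 1024 * 1024))  # 10 MB dummy payload
        times = time_copy(src, Path(tmp) / "copy.bin")
        print(f"mean {statistics.mean(times):.4f}s, "
              f"median {statistics.median(times):.4f}s over {len(times)} runs")
```

A real harness would run such operations once on the clean baseline system and once with each security product installed, comparing the mean timings; the median is also reported here because it is less sensitive to the outliers the report mentions filtering out.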

Security products need to load on systems at an early stage to provide security from the very beginning; this load has some impact on the time needed for a system to start up. Measuring boot times accurately is challenging. The most significant issue is defining exactly when the system is fully started, as many operating environments continue to perform start-up activities for some time after the system appears responsive to the user. It is also important to consider when the protection provided by the security solution being tested is fully active, as this could be a useful measure of boot completion as far as the security solution is concerned. Some security products load their services very late in the boot process (or even minutes afterwards). Users may notice that some time after the system has loaded, it becomes very slow for a little while; thus, it initially looks as though the system has loaded very quickly, but in fact the security product has simply loaded its services belatedly, leaving the system more vulnerable in the meantime. As we find such measurements misleading, we do not publish boot times in our reports.
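One way to make "fully started" operational is to pick an explicit readiness predicate (the desktop becoming responsive, or the product's real-time service answering) and poll for it. The helper below is a hypothetical sketch of that idea, not part of any published methodology; note that the same helper yields very different "boot times" depending on which predicate is passed in, which is exactly the ambiguity described above:

```python
import time
from typing import Callable, Optional


def wait_until(ready: Callable[[], bool], timeout: float = 60.0,
               interval: float = 0.1) -> Optional[float]:
    """Poll `ready` until it returns True; return the elapsed time in
    seconds, or None if the predicate never held within `timeout`."""
    start = time.perf_counter()
    while time.perf_counter() - start < timeout:
        if ready():
            return time.perf_counter() - start
        time.sleep(interval)
    return None


# Hypothetical usage: two different definitions of "boot complete".
# desktop_ready and protection_ready would be platform-specific checks
# (e.g. polling for the shell process vs. the AV product's service):
#   t_desktop = wait_until(desktop_ready)
#   t_protected = wait_until(protection_ready)
```

A product that starts its services belatedly would show a small gap between system start and `desktop_ready`, but a large one before `protection_ready` — the misleading case the report declines to publish numbers for.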

Test Results

These specific test results show the impact on system performance that the anti-virus products have, relative to the other products tested. The reported data gives only an indication and is not necessarily applicable in all circumstances, as many additional factors can play a part.

Rank  Product(s)                      AVC Score  PC Mark Score  Impact Score
 1.   Avira, Bitdefender, Sophos         90          99.0            1.0
 2.   Avast, F-Secure, Kaspersky Lab     90          98.8            1.2
 3.   AVG                                90          98.1            1.9
 4.   Qihoo                              88          99.0            3.0
 5.   Symantec                           90          96.8            3.2
 6.   ESET                               85          98.6            6.4
 7.   eScan, ThreatTrack                 85          97.6            7.4
 8.   BullGuard                          83          98.8            8.2
 9.   AhnLab, Tencent                    83          98.1            8.9
10.   Panda                              85          95.5            9.5
11.   Kingsoft                           83          97.2            9.8
12.   Emsisoft                           80          98.8           11.2
13.   Fortinet                           83          94.6           12.4
14.   G DATA                             80          97.2           12.8
15.   McAfee                             78          94.8           17.2
16.   Trend Micro                        75          95.3           19.7

The out-of-box system impact score with Windows Defender enabled on Microsoft Windows 8.1 is 5.5.

Copyright and Disclaimer

This publication is Copyright © 2013 by AV-Comparatives ®. Any use of the results, etc., in whole or in part, is ONLY permitted with the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering the test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(November 2013)