
Performance Test November 2018


Impact of Business Security Software on System Performance


Release date: 2018-12-17
Revision date: 2018-12-14
Test period: November 2018
Online with cloud connectivity: yes
Update allowed: yes
False Alarm Test included: no
Platform/OS: Microsoft Windows
Methodology: see the AV-Comparatives website

This report is an excerpt of the Business Security Test 2018 (August – November). The full report is available on the AV-Comparatives website.

Introduction

We want to make clear that the results in this report are intended only to give an indication of the impact on system performance (caused mainly by the real-time/on-access components) of the business security products in these specific tests. Users are encouraged to try out the software on their own PCs to see how it performs on their own systems.

Tested Products

We have tested the product that each manufacturer submitted for the protection tests in the Business Main Test Series. Please note that the results in this report apply only to the specific product versions tested (i.e. to the exact version numbers and to 64-bit systems). Also, keep in mind that different vendors offer different features in their products, and different numbers of them.

The following activities/tests were performed on an up-to-date Windows 10 RS4 64-bit system:

  • File copying
  • Archiving / unarchiving
  • Installing / uninstalling applications
  • Launching applications
  • Downloading files
  • Browsing Websites
  • PCMark 10 Professional testing suite

Test Procedure

The tests were performed on a Lenovo E560 machine with an Intel Core i5-6200U CPU, 8 GB of RAM and an SSD. We consider this machine configuration to be “high-end”. The performance tests were done first on a clean Windows 10 RS4 64-bit system (English), and then on the same system with the business security client software installed. The tests were done with an active Internet connection, to allow for the real-world impact of cloud services/features.

Care was taken to minimize other factors that could influence the measurements and/or the comparability of the systems. The optimization processes/fingerprinting used by the products were also taken into account – this means that the results represent the impact on a system which has already been operated by the user for a while. The tests were repeated several times (with and without fingerprinting) in order to get mean values and to filter out measurement errors. After each run, the workstation was reverted to the previously created system image and rebooted six times. We simulated various file operations that a computer user would execute: copying different types of clean files from one place to another, archiving and unarchiving files, downloading files from the Internet, and launching applications (opening documents).

We believe that increasing the number of iterations increases statistical precision; this is especially true for performance testing, as some noise is always present on real machines. We therefore perform each test multiple times and report the median as the result.
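
As an illustration of this repeat-and-take-median approach, here is a minimal timing harness. It is a sketch only: AV-Comparatives does not publish its tooling, and the function name and run count are assumptions.

```python
import statistics
import time

def measure_median(task, runs=5):
    """Run a test task several times and return the median duration.

    Taking the median over repeated runs filters out one-off noise
    (background services, caching effects) that any single run on a
    real machine would pick up. The run count here is illustrative.
    """
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        task()
        durations.append(time.perf_counter() - start)
    return statistics.median(durations)
```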

We also used a third-party, industry-recognized performance testing suite (PCMark 10 Professional) to measure the system impact during real-world product usage, running the predefined PCMark 10 Extended test. Readers are invited to evaluate the various products themselves, to see what impact they have on their own systems; software conflicts, user preferences and differing system configurations can all lead to varying results.

We used around 5 GB of data consisting of various file types and sizes (pictures, movies, audio files, MS Office documents, PDF documents, business applications/executables, Windows operating system files, archives, etc.).

Testcases

File copying: Some anti-virus products ignore some types of files by design/default (e.g. based on their file extensions), or use fingerprinting technologies, which may skip already-scanned files in order to increase speed (see the remarks under Test Procedure above). We copied a set of various common file types from one physical hard disk to another physical hard disk.
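
A file-copy measurement of this kind can be sketched as follows; the directory paths are placeholders, and in the actual test the source and destination are on two different physical disks. Running the function twice on the same data would show the first-run versus subsequent-run effect of fingerprinting.

```python
import shutil
import time
from pathlib import Path

def time_copy(src_dir, dst_dir):
    """Copy every file from src_dir to dst_dir and return the elapsed time.

    With an on-access scanner active, files are typically scanned as
    they are read and written, so the elapsed time includes the
    scanner's overhead.
    """
    start = time.perf_counter()
    for f in Path(src_dir).iterdir():
        if f.is_file():
            shutil.copy2(f, dst_dir)
    return time.perf_counter() - start
```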

Archiving and unarchiving: Archives are commonly used for file storage, and the impact of anti-virus software on the time taken to create new archives, or to unarchive files from existing archives, is likely to be of interest to most users. We archived a set of different file types that are commonly found on home and office workstations. The results already take into account the fingerprinting/optimization technologies of the anti-virus products, as users usually make archives of files they already have on their disks.
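
Timing the archiving and unarchiving steps could look like the sketch below; zip is used as a stand-in, since the report does not state which archive formats were used.

```python
import shutil
import time

def time_archive_cycle(data_dir, archive_base):
    """Time creating an archive from data_dir and extracting it again."""
    start = time.perf_counter()
    archive_path = shutil.make_archive(archive_base, "zip", data_dir)
    pack_seconds = time.perf_counter() - start

    start = time.perf_counter()
    shutil.unpack_archive(archive_path, data_dir + "_restored")
    unpack_seconds = time.perf_counter() - start
    return pack_seconds, unpack_seconds
```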

Installing/uninstalling applications: We installed several common applications in silent-install mode, then uninstalled them, and measured how long each took. We did not consider fingerprinting, because an application is usually installed only once.
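
Scripted silent installs can be timed along these lines; the installer file names and switches are examples only, since the silent switch differs per installer technology.

```python
import subprocess
import time

def time_install(command):
    """Run an unattended installer command and return its duration."""
    start = time.perf_counter()
    subprocess.run(command, check=True)  # raises if the installer fails
    return time.perf_counter() - start

# Example switches (vary per installer technology):
#   NSIS-style installer:  time_install(["setup.exe", "/S"])
#   Windows Installer:     time_install(["msiexec", "/i", "app.msi", "/quiet"])
```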

Launching applications: Microsoft Office (Word, Excel and PowerPoint) and PDF documents are very common. We opened, and later closed, various documents in Microsoft Office and in Adobe Acrobat Reader, measuring the time taken for the viewer or editor application to launch and afterwards to close. Although we list the results for both the first opening and subsequent openings, we consider the subsequent openings more important, as users normally perform this operation several times, and the optimization features of the anti-virus products take effect, minimizing their impact on the system.
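
Launch times for Office documents can be scripted via COM automation, for example with pywin32 as below. This is one possible harness, not the report's actual tooling; the document path is a placeholder.

```python
import time
import win32com.client  # pywin32; assumes Microsoft Word is installed

def time_word_open(doc_path):
    """Measure how long Word takes to start and open a document.

    On-access scanners typically check both the application binaries
    and the document itself, which is why first and subsequent
    openings can differ once fingerprinting has taken effect.
    """
    start = time.perf_counter()
    word = win32com.client.Dispatch("Word.Application")
    doc = word.Documents.Open(doc_path)
    elapsed = time.perf_counter() - start
    doc.Close(False)   # close without saving
    word.Quit()
    return elapsed
```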

Downloading files: The content of several common websites was fetched via wget, from both a local server and a public server.
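
A download measurement with wget reduces to timing the external command; the URL is a placeholder, and "-O NUL" discards the body on Windows (use "-O /dev/null" elsewhere).

```python
import subprocess
import time

def time_download(url):
    """Fetch a URL with wget and return the elapsed wall-clock time."""
    start = time.perf_counter()
    subprocess.run(["wget", "-q", "-O", "NUL", url], check=True)
    return time.perf_counter() - start
```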

Browsing websites: Common websites were opened with Google Chrome, and the time to completely load and display each website was measured. We measured only the time to navigate to the website, with an instance of the browser already running.
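
One way to script such a measurement is with Selenium, as sketched below; driver.get() returns once the page's load event has fired, which approximates "completely load and display". The setup assumes a matching ChromeDriver is installed.

```python
import time
from selenium import webdriver

def time_page_load(driver, url):
    """Navigate an already-running Chrome instance to a URL and time it."""
    start = time.perf_counter()
    driver.get(url)  # blocks until the page's load event fires
    return time.perf_counter() - start

driver = webdriver.Chrome()  # start the browser once, outside the timing
print(time_page_load(driver, "https://example.com"))
driver.quit()
```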

Ranking System

Slow: The mean value of the products in this cluster builds a clearly slower fourth cluster in the given subcategory.
Mediocre: The mean value of the products in this cluster builds a third cluster in the given subcategory.
Fast: The mean value of the products in this group is higher than the average of all scores in the given subcategory.
Very Fast: The mean value of the products in this group is lower than the average of all scores in the given subcategory.

Test Results

These test results show the impact on system performance that a security product has, compared with the other tested security products. The reported data give only an indication and are not necessarily applicable in all circumstances, as too many factors can play an additional part. The testers defined the categories Slow, Mediocre, Fast and Very Fast using statistical methods, and taking into consideration what would be noticed from the user’s perspective, or compared with the impact of the other security products. If some products are faster/slower than others in a single subtest, this is reflected in the results.

Overview of single AV-C performance scores

Vendor | File Copying (first / subsequent run) | Archiving / Unarchiving | Installing / Uninstalling Applications | Launching Applications (first / subsequent run) | Downloading Files | Browsing Websites
Avast | VF / VF | VF | VF | VF / VF | VF | VF
Bitdefender | VF / VF | VF | VF | VF / VF | VF | VF
CrowdStrike | VF / VF | VF | VF | VF / VF | VF | VF
Emsisoft | VF / VF | VF | M | F / VF | VF | VF
Endgame | VF / VF | VF | M | M / M | VF | VF
eScan | F / VF | VF | VF | VF / VF | F | M
ESET | VF / VF | VF | VF | VF / VF | VF | VF
FireEye | VF / VF | M | VF | VF / VF | F | VF
Fortinet | F / F | VF | VF | VF / VF | VF | VF
Kaspersky Lab | VF / VF | VF | VF | F / VF | VF | VF
McAfee | VF / VF | VF | F | M / VF | VF | VF
Microsoft | M / F | F | M | F / VF | F | VF
Panda | VF / VF | F | VF | VF / VF | VF | VF
Saint Security | VF / VF | VF | F | F / VF | VF | VF
Trend Micro | F / VF | F | F | F / VF | M | F
VIPRE | F / VF | VF | F | VF / VF | VF | VF

Key: VF = Very Fast, F = Fast, M = Mediocre, S = Slow

PCMark Tests

In order to provide an industry-recognized performance test, we used the PCMark 10 Professional Edition testing suite. Users running the PCMark 10 benchmark should take care to minimize all external factors that could affect the testing suite, and should strictly follow at least the suggestions documented in the PCMark manual, in order to get consistent and valid/useful results. Furthermore, the tests should be repeated several times to verify them. For more information about the various consumer-scenario tests included in PCMark, please read the whitepaper on its website.

“No security software” refers to a baseline system without any security software installed; this baseline scores 100 points in the PCMark 10 benchmark, and the product scores below are relative to it.

Baseline system: Intel Core i5-6200U machine with 8GB RAM and SSD drive

Summarized results

Users should weight the various subtests according to their needs. We applied a scoring system to sum up the various results. Please note that for the File Copying and Launching Applications subtests, we noted separately the results for the first run and for subsequent runs. For the AV-C score, we took the rounded mean values of first and subsequent runs for File Copying, whilst for Launching Applications we considered only the subsequent runs. “Very fast” gets 15 points, “fast” gets 10 points, “mediocre” gets 5 points and “slow” gets 0 points. This leads to the following results:

Rank | Product | AVC Score | PCMark Score | Impact Score
1 | Bitdefender | 90 | 98.2 | 1.8
2 | CrowdStrike | 90 | 98.1 | 1.9
3 | Avast | 90 | 98.0 | 2.0
4 | Kaspersky | 90 | 97.8 | 2.2
5 | ESET | 90 | 97.5 | 2.5
6 | VIPRE | 88 | 97.7 | 4.3
7 | Panda | 85 | 97.4 | 7.6
8 | McAfee | 85 | 97.1 | 7.9
9 | Fortinet | 85 | 96.7 | 8.3
10 | Saint Security | 85 | 96.4 | 8.6
11 | Emsisoft | 80 | 97.0 | 13.0
12 | FireEye | 75 | 97.9 | 17.1
13 | eScan | 73 | 96.3 | 20.7
14 | Endgame | 70 | 96.9 | 23.1
15 | Trend Micro | 63 | 95.1 | 31.9
16 | Microsoft | 63 | 94.0 | 33.0
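
The impact scores in the table are consistent with the following arithmetic: the distance of the AVC score from its 90-point maximum (six rated categories at 15 points each), plus the PCMark score's deviation from the 100-point baseline. This formula is inferred from the published numbers rather than stated explicitly in the report.

```python
# Points per category rating, as described above.
POINTS = {"very fast": 15, "fast": 10, "mediocre": 5, "slow": 0}

def impact_score(avc_score, pcmark_score):
    """Impact score as implied by the result table: deviation from the
    90-point AVC maximum plus deviation from the 100-point PCMark
    baseline (lower is better)."""
    return round((90 - avc_score) + (100 - pcmark_score), 1)

print(impact_score(90, 98.2))  # Bitdefender -> 1.8
print(impact_score(63, 94.0))  # Microsoft  -> 33.0
```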

Copyright and Disclaimer

This publication is Copyright © 2018 by AV-Comparatives ®. Any use of the results, etc. in whole or in part, is ONLY permitted after the explicit written agreement of the management board of AV-Comparatives prior to any publication. AV-Comparatives and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but liability for the correctness of the test results cannot be taken by any representative of AV-Comparatives. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the services provided by the website, test documents or any related data.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

AV-Comparatives
(December 2018)