Performance Test Methodology

Operating system

Microsoft Windows; the exact version will be noted in each test report.

Aim of the test

The test aims to compare how much the tested security products slow down everyday tasks, such as launching applications, while protecting the system.

Target Audience

The test is of interest to all users of Windows PCs who are concerned that security products may reduce the performance of their system.

Definition of Performance

Performance is defined here as the speed with which a PC running a particular security product carries out a particular task, relative to an otherwise identical PC without any security software. Perceived performance, i.e. whether the user subjectively feels that the system is being slowed down, is also important. In some cases, the impact of security software on an action such as opening a program may in itself have a negligible effect on the user’s actual productivity during a working day, but may nonetheless irritate or frustrate the user; this can then indirectly cause him or her to be less productive. As we do not condone using a PC without antivirus software, the relative scores of the products tested should be considered more important than the difference between the fastest product and a PC without any security software.

Scope of the test

The test measures the performance (speed) reduction caused by the tested programs when carrying out a number of everyday tasks, such as opening and saving files. It does not measure the impact of the security products on boot time: some products give the false impression of having loaded quickly by delaying the launch of their protection features, so that full protection is only available some time after the Windows Desktop appears; the appearance of speed comes at the cost of security. Please note that this test does not in any way test the malware detection/protection abilities of the participating products. AV-Comparatives carries out a number of separate malware protection tests, most notably the Real-World Protection Test, details of which can be found on our website, www.av-comparatives.org. We do not recommend choosing an antivirus product solely on the basis of low performance impact, as it must provide effective protection too.

Test Setup

The operating system is installed on a single test PC and updated. Care is taken to minimize other factors that could influence the measurements and/or their comparability; for example, Windows Update is configured so that it will not download or install updates during the test. Additionally, Microsoft Office and Adobe Reader are installed, so that the speed of opening/saving Microsoft Office and PDF files can be measured. Two further programs, both relating to performance testing, are installed: PC Mark 8 Professional, an industry-recognised performance-testing suite, and WinAutomation, which automates specified tasks (like a macro) and logs the time taken to complete them. WinAutomation is used to simulate the various file operations that a computer user would carry out.

Settings

Default settings are used for all consumer products.

Test procedure

All the tests are performed with an active Internet connection, to allow for the real-world impact of cloud services and features. The hard disks are defragmented before each individual test run. The optimizing processes/fingerprinting used by the products are also taken into account; this means that the results represent the impact on a system which has already been operated by the user for a while. The WinAutomation tests are repeated at least 4 times (twice with fingerprinting and twice without), in order to obtain mean values and filter out measurement errors. In the event of fluctuations, the tests are repeated additional times. After each run, the workstation is defragmented and rebooted. A simplified sketch of the repeat-and-average logic is shown below; the automated test procedures themselves are as follows:
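
For illustration only, the following minimal Python sketch shows how repeated task timings might be collected and averaged. It is a stand-in for the actual WinAutomation harness: run_task is a placeholder for any of the automated operations described below, and the sketch omits the distinction between runs with and without fingerprinting.

import statistics
import time

def measure(run_task, repetitions=4):
    # Run the task several times and keep the mean, to filter out
    # measurement errors (the real procedure also splits runs into
    # with-fingerprinting and without-fingerprinting groups).
    timings = []
    for _ in range(repetitions):
        start = time.perf_counter()
        run_task()
        timings.append(time.perf_counter() - start)
    return statistics.mean(timings)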

File copying

We copy a set of various common file types from one physical hard disk to another physical hard disk.
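
A minimal Python sketch of such a timed copy is shown below for illustration; the drive letters and the use of Python's standard library in place of WinAutomation are assumptions made purely for the example.

import shutil
import time
from pathlib import Path

SOURCE = Path("D:/testset")   # assumed: file set on the first physical disk
TARGET = Path("E:/copied")    # assumed: folder on the second physical disk

def timed_copy():
    # Copy every file in the set and return the elapsed wall-clock time.
    TARGET.mkdir(parents=True, exist_ok=True)
    start = time.perf_counter()
    for f in SOURCE.iterdir():
        if f.is_file():
            shutil.copy2(f, TARGET / f.name)
    return time.perf_counter() - start

print(f"copy took {timed_copy():.2f} s")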

Archiving and unarchiving

We archive a set of different file types that are commonly found on home and office workstations. The results already take the fingerprinting/optimization technologies of the anti-virus products into account, as users generally make archives of files they already have on their disks.
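
The sketch below illustrates one way such a timed archive/unarchive cycle could be measured; the file location and the use of ZIP compression are assumptions for the example, not our actual tooling.

import time
import zipfile
from pathlib import Path

FILES = list(Path("C:/testset").glob("*"))   # assumed location of the file set

# Time the archiving step.
start = time.perf_counter()
with zipfile.ZipFile("archive.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for f in FILES:
        if f.is_file():
            zf.write(f, f.name)
archive_time = time.perf_counter() - start

# Time the unarchiving step.
start = time.perf_counter()
with zipfile.ZipFile("archive.zip") as zf:
    zf.extractall("extracted")
unarchive_time = time.perf_counter() - start

print(f"archive {archive_time:.2f} s, unarchive {unarchive_time:.2f} s")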

Encoding/transcoding

We encode and transcode some multimedia files with FFmpeg and HandBrakeCLI.
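
As an illustration, a timed FFmpeg transcode might be driven as in the sketch below; the input file, codec and output name are placeholders, not our actual test media.

import subprocess
import time

start = time.perf_counter()
# Transcode a placeholder input file to H.264; -y overwrites any
# existing output so repeated runs behave identically.
subprocess.run(
    ["ffmpeg", "-y", "-i", "input.avi", "-c:v", "libx264", "output.mp4"],
    check=True, capture_output=True)
print(f"transcode took {time.perf_counter() - start:.2f} s")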

Installing/uninstalling applications

We install several popular applications using their silent install mode, then uninstall them, measuring how long each operation takes. We do not consider fingerprinting here, because an application is usually installed only once.
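
For illustration, the sketch below times a silent MSI install and uninstall via msiexec; the package name is a placeholder.

import subprocess
import time

def timed(cmd):
    # Run a command to completion and return the elapsed time.
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

# /i installs, /x uninstalls, /qn suppresses the UI (silent mode);
# "app.msi" stands in for the real installer package.
print("install:  ", timed(["msiexec", "/i", "app.msi", "/qn"]))
print("uninstall:", timed(["msiexec", "/x", "app.msi", "/qn"]))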

Launching applications

We open, and later close, some large document files in Microsoft Office and some large PDF files in Adobe Acrobat Reader. The time taken for the viewer or editor application to launch, and afterwards to close, is measured. Although we list the results for the first opening and for the subsequent openings, we consider the subsequent openings more important: users normally perform this operation several times, so the anti-virus products’ optimizations take effect and minimize their impact on the system.
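
The distinction between the first opening and the subsequent openings can be summarized as in the sketch below; the timings shown are invented for illustration, and collecting them is assumed to happen in the automation harness.

def summarize(open_times):
    # The first opening typically includes on-access scanning of a file
    # the product has never seen; subsequent openings benefit from
    # fingerprinting, so they are averaged separately.
    first, later = open_times[0], open_times[1:]
    return {"first_open": first, "subsequent_mean": sum(later) / len(later)}

print(summarize([4.8, 1.9, 2.0, 1.8]))   # illustrative timings in seconds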

Downloading files

Large files are downloaded from a local server using a GUI-less browser that sends HTTP requests in the background. Additionally, the content of several popular websites is fetched via wget, also from a local server.
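
A minimal sketch of a timed download appears below; the server address and file name are assumptions, and the real test uses a GUI-less browser and wget rather than Python.

import time
import urllib.request

URL = "http://192.168.0.10/largefile.bin"   # assumed local test server

start = time.perf_counter()
with urllib.request.urlopen(URL) as response:
    data = response.read()
print(f"downloaded {len(data)} bytes in {time.perf_counter() - start:.2f} s")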

PC Mark Test

PC Mark 8 Professional is run to measure the system impact during real-world product usage.

RATINGS

To produce an overall rating for each product, we combine the scores from the PC Mark test and from our own automated tests, as follows. The range of times taken to complete each of the automated tasks is clustered into the categories Very Fast, Fast, Mediocre and Slow. Points are then awarded for each task, with products in the Very Fast category gaining 15 points, Fast getting 10 points, Mediocre 5 points and Slow zero points. Where a task comprises several runs, a rounded-up average of the points awarded is taken to produce the task score. For the File Copying test, the score for the first run (with a brand-new image) is taken, along with the average of the subsequent runs (to allow for e.g. fingerprinting by the security product). For the Launching Applications test, only the scores of the subsequent runs are used to create the average. As there are 6 individual tasks, a product can get up to 90 points overall. The number of points is then added to the PC Mark score, which goes up to 100.
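
As a worked example, the following Python sketch applies this scheme; the per-task category assignments and the PC Mark figure are invented purely for illustration.

import math

POINTS = {"Very Fast": 15, "Fast": 10, "Mediocre": 5, "Slow": 0}

# File Copying combines the first-run score with the subsequent-run score
# using a rounded-up average; the categories below are invented examples.
file_copying = math.ceil((POINTS["Fast"] + POINTS["Very Fast"]) / 2)   # 13
other_tasks = [POINTS[c] for c in
               ["Very Fast", "Fast", "Very Fast", "Mediocre", "Very Fast"]]
automated_total = file_copying + sum(other_tasks)   # maximum 90 for 6 tasks

pcmark_score = 92.4   # assumed PC Mark result (scaled up to 100)
print("overall rating:", automated_total + pcmark_score)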

SUMMARY

Detection: No
False Positives: No
Cloud connectivity: Yes
Updates allowed: Yes
Default configuration: Yes (for consumer products)