
Performance Test Methodology

Operating system

Microsoft Windows; the exact version will be noted in each test report.

Aim of the test

The test aims to compare how much the tested security products slow down everyday tasks, such as launching applications, while protecting the system.

Target Audience

The test is of interest to all users of Windows PCs who are concerned that security products may reduce the performance of their system.

Definition of Performance

Performance is defined here as the speed with which a PC running a particular security product carries out a particular task, relative to an otherwise identical PC without any security software. Perceived performance, i.e. whether the user subjectively feels that the system is being slowed down, is also important. In some cases, the impact of security software on an action such as opening a program may in itself have a negligible effect on the user’s actual productivity during a working day, but may nonetheless irritate or frustrate the user; this can then indirectly make him or her less productive. As we do not condone using a PC without antivirus software, the relative scores of the products tested should be considered more important than the difference between the fastest product and a PC without any security software.

Scope of the test

The test measures the performance (speed) reduction caused by the tested programs when carrying out a number of everyday tasks, such as opening and saving files. It does not measure the impact of the security products on boot time, as some products give a false impression of loading quickly: they delay launching their protection features so that the boot process appears unaffected. Full protection is thus only provided some time after the Windows Desktop appears, and the apparent speed comes at the cost of security. Please note that this test does not in any way test the malware detection/protection abilities of the participating products. AV-Comparatives carries out a number of separate malware protection tests, most notably the Real-World Protection Test, details of which can be found on our website, www.av-comparatives.org. We do not recommend choosing an antivirus product solely on the basis of low performance impact, as it must provide effective protection too.

Test Setup

The operating system is installed on a single test PC and updated. Care is taken to minimize other factors that could influence the measurements and/or the comparability of the system; for example, Windows Update is configured so that it will not download or install updates during the test. Additionally, Microsoft Office and Adobe Reader are installed, so that the speed of opening/saving Microsoft Office and PDF files can be measured. Two further programs, both relating to performance testing, are installed: PC Mark Professional, an industry-recognised performance-testing suite, and our own automation, which runs specified tasks and logs the time taken to complete them. The automation is used to simulate the various file operations that a computer user would execute.

Settings

Default settings are used for all consumer products.

Test procedure

All the tests are performed with an active Internet connection, to allow for the real-world impact of cloud services and features. Optimization processes/fingerprinting used by the products are also taken into account; this means that the results represent the impact on a system which has already been operated by the user for a while. The tests are repeated at least 10 times (five times with fingerprinting and five times without) in order to obtain mean values and filter out measurement errors. In the event of fluctuations, the tests are repeated additional times. After each run, the workstation is rebooted. A minimal sketch of this repeat-and-average protocol is shown below, followed by descriptions of the individual automated test procedures.
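The following sketch is purely illustrative: the actual automation, its reboot handling and its fluctuation criteria are not detailed here, so the function names, run counts and the spread check are assumptions.

```python
import statistics
import time

def time_task(task, runs=5):
    """Time repeated executions of a task; return the mean and the spread.

    Illustrative sketch only; not the actual test automation.
    """
    durations = []
    for _ in range(runs):
        start = time.perf_counter()
        task()                                  # the operation under test
        durations.append(time.perf_counter() - start)
        # In the real procedure, the workstation is rebooted after each run.
    mean = statistics.mean(durations)
    spread = statistics.stdev(durations) if runs > 1 else 0.0
    # A large spread (fluctuation) would trigger additional repetitions.
    return mean, spread
```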

File copying

We copied a set of various common file types from one physical hard disk to another physical hard disk. Some anti-virus products might ignore some files by design/default (e.g. based on their file type), or use fingerprinting technologies, which may skip already-scanned files in order to increase speed.
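As a rough illustration of this measurement (the paths and file set below are placeholders, not the actual test data), a cross-disk copy could be timed like this:

```python
import shutil
import time
from pathlib import Path

def time_copy(files, dest_dir):
    """Copy a set of files to a directory on another physical disk
    and return the elapsed time in seconds."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    start = time.perf_counter()
    for f in files:
        shutil.copy2(f, dest / Path(f).name)   # copy, preserving timestamps
    return time.perf_counter() - start
```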

Archiving and unarchiving

Archives are commonly used for file storage, and the impact of anti-virus software on the time taken to create new archives, or to unarchive files from existing archives, may be of interest to most users. We archived a set of different file types that are commonly found on home and office workstations.
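By way of illustration, both operations could be timed as in the sketch below. Using ZIP via Python's standard library is an assumption here; the archiver actually used in the test is not specified in this section.

```python
import time
import zipfile
from pathlib import Path

def time_archive(files, archive_path):
    """Create a ZIP archive from a set of files; return elapsed seconds."""
    start = time.perf_counter()
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            zf.write(f, Path(f).name)
    return time.perf_counter() - start

def time_unarchive(archive_path, dest_dir):
    """Extract an existing archive; return elapsed seconds."""
    start = time.perf_counter()
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(dest_dir)
    return time.perf_counter() - start
```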

Installing applications

We installed several common applications using the silent install mode and measured how long the installations took. We did not consider fingerprinting, because an application is usually installed only once.
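A minimal sketch of such a timed unattended installation is shown below. "/S" is a common silent switch (used, for example, by NSIS installers), but the actual flag varies per installer and is an assumption here.

```python
import subprocess
import time

def time_silent_install(installer_path, silent_args=("/S",)):
    """Run an installer unattended; return the elapsed time in seconds."""
    start = time.perf_counter()
    # check=True raises if the installer exits with a non-zero status.
    subprocess.run([installer_path, *silent_args], check=True)
    return time.perf_counter() - start
```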

Launching applications

Microsoft Office (Word, Excel, PowerPoint) and PDF documents are very common. We opened, and later closed, various documents in Microsoft Office and in Adobe Acrobat Reader. The time taken for the viewer or editor application to launch was measured. Although we list the results for both the first opening and the subsequent openings, we consider the subsequent openings more important, as users normally perform this operation several times, and optimization by the anti-virus products takes place, minimizing their impact on the system.
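One way to approximate this measurement is sketched below, under stated assumptions: the real automation is not public, and the PyGetWindow-based polling here is an illustrative stand-in for "application window visible".

```python
import subprocess
import time

import pygetwindow as gw  # third-party; an assumption for window detection

def time_document_open(app_exe, document, window_title):
    """Open a document and measure until the application window appears."""
    start = time.perf_counter()
    proc = subprocess.Popen([app_exe, document])
    while not gw.getWindowsWithTitle(window_title):
        time.sleep(0.01)            # poll until the editor/viewer is visible
    elapsed = time.perf_counter() - start
    proc.terminate()                # the document is closed again afterwards
    return elapsed
```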

Downloading files

Common files are downloaded from a webserver on the Internet.
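For illustration, such a timed download could look like the following sketch (the URL and destination are placeholders):

```python
import time
import urllib.request

def time_download(url, dest):
    """Download a file over HTTP(S); return the elapsed time in seconds."""
    start = time.perf_counter()
    urllib.request.urlretrieve(url, dest)
    return time.perf_counter() - start
```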

Browsing websites

Common websites are opened with Google Chrome, and the time to completely load and display the website is measured. We only measure the time to navigate to the website when an instance of the browser has already been started.
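A hedged sketch of such a measurement using Selenium (an assumption; the test's actual tooling is not specified) might look like this. driver.get() returns once the document has finished loading, which approximates "completely load and display":

```python
import time

from selenium import webdriver  # assumption: Selenium driving Chrome

def time_page_load(driver, url):
    """Navigate an already-running Chrome instance to a URL and time it."""
    start = time.perf_counter()
    driver.get(url)                 # blocks until the page has loaded
    return time.perf_counter() - start

driver = webdriver.Chrome()         # the browser is started before measuring
print(time_page_load(driver, "https://www.av-comparatives.org"))
driver.quit()
```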

PC Mark Test

PC Mark Professional is run to measure the system impact during real-world product usage.

Ratings

To produce an overall rating for each product, we combine the scores from the PC Mark test and from our own automated tests, as follows. The range of times taken to complete each of the automated tasks is clustered into the categories Very Fast, Fast, Mediocre and Slow. Points are then awarded for each task, with products in the Very Fast category gaining 15 points, Fast getting 10 points, Mediocre 5 points and Slow zero points. A rounded-up average is taken of the points awarded to produce a final score. For the File Copying test, the mean score for the first run (with a brand-new image) is taken, along with the average of the subsequent runs (to allow for e.g. fingerprinting by the security product). For the Launching Applications test, only the scores of the subsequent runs are used to create the average. As there are six individual tasks, a product can get up to 90 points overall. The number of points is then added to the PC Mark score, which goes up to 100.
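As a worked illustration of this arithmetic, the sketch below combines the six task categories with a PC Mark score; the category assignments and the PC Mark value in the example are hypothetical.

```python
# Points per category, as defined in the rating scheme above.
CATEGORY_POINTS = {"Very Fast": 15, "Fast": 10, "Mediocre": 5, "Slow": 0}

def overall_rating(task_categories, pc_mark_score):
    """Combine the six task categories (up to 90 points) with the
    PC Mark score (up to 100) into the overall rating."""
    task_points = sum(CATEGORY_POINTS[c] for c in task_categories)
    return task_points + pc_mark_score

# Example: five "Very Fast" tasks, one "Fast" task, PC Mark score of 97.
# Task points: 5 * 15 + 10 = 85; overall rating: 85 + 97 = 182.
print(overall_rating(["Very Fast"] * 5 + ["Fast"], 97))  # -> 182
```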