AV-Comparatives has released its latest Performance Test Report for Windows security products, carried out under Microsoft Windows 7. Nineteen products were tested to measure their impact on system performance. But keep in mind: protection is much more important than speed!
Anti-Spam Test (Consumer Products) 2016
Spam can be defined as unsolicited emails sent en masse. These may be sent for advertising purposes, in which case they may be seen as irritating but harmless. Others attempt to deceive the recipient into sending money to a scammer: typical examples are pretending to be a friend or relative who has lost their wallet abroad and needs money to get home, or claiming that by paying a relatively small administration fee, the recipient will receive a much larger sum as lottery winnings. Still other malicious spam emails may contain links to phishing pages or malware, or simply include malware as an attachment.
In this first Anti-Spam Test Report, AV-Comparatives examined 13 consumer anti-spam products to assess their ability to filter out spam emails.
Microsoft-prevalence-based analysis of the File Detection Test
This Microsoft-prevalence-based analysis report supplements AV-Comparatives’ already-published main report of the March 2016 File Detection Test. No additional testing has been performed; rather, the existing test results have been re-analysed from a different perspective, to consider what impact the missed samples are likely to have on customers according to Microsoft’s telemetry data.
Microsoft commissioned this supplementary report. In it, customer impact is measured according to prevalence: essentially, some malware samples pose a greater threat to the average user than others because they are more widespread. An impact heatmap can be found at https://impact.av-comparatives.org
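As a rough illustration of the prevalence-weighted idea, the sketch below weights each missed sample by the number of machines on which it was observed; all sample names and counts are invented for illustration and are not taken from the report or from Microsoft’s telemetry.

```python
# Minimal sketch of a prevalence-weighted impact estimate.
# Sample identifiers and machine counts are hypothetical examples,
# not data from Microsoft telemetry or the AV-Comparatives report.

missed_samples = [
    {"name": "sample_a", "machines_seen": 50_000},  # widespread miss
    {"name": "sample_b", "machines_seen": 120},     # rare miss
    {"name": "sample_c", "machines_seen": 15},      # very rare miss
]

# Weighting misses by prevalence: one widespread miss affects far more
# users than several rare ones, even though the plain miss count is the same.
estimated_impact = sum(s["machines_seen"] for s in missed_samples)
print(f"Estimated machines affected by the misses: {estimated_impact}")
```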
More information can also be found here.
Support Test 2016
Given the numerous risks to be found on the Internet today, effective anti-malware software is essential when going online. If a user is unable to install or activate their security program, or it is not working as expected, rapid help from an expert is called for. Arguably the quickest way of getting assistance is to pick up the phone and speak to one of the manufacturer’s support agents. The aim of the Support Test is to assess how quickly and effectively each vendor’s support service copes with typical questions.
File Detection Test March 2016
AV-Comparatives Summary Report 2015
The AV-Comparatives Summary Report 2015 is now available! Find out which products scored best in our public main-test series of 2015! http://www.av-comparatives.org/summary-reports/
Real-World Protection Test August-November 2015
The overall report of the Real-World Protection Test (August-November) is now available! It can be found here.
Vendors’ reaction to undetected malware samples
We had a closer look at vendors’ reactions to the samples they missed in the online File Detection Test of September.
100 days later, many of them had added detection for all missed malicious files we had sent to them after the test.
This means they now detect all threats in the test-set, giving them a 100% detection rate against the September test-set. On the other hand, six vendors added detection for only about 90% of their misses, which results in detection rates of between 99.2% and 99.9% against the September test-set.
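For illustration, the arithmetic behind these figures can be sketched as below; the starting detection rate used in the example is hypothetical and not a figure from the report.

```python
# Minimal sketch of how adding detection for a share of the misses
# translates into an updated overall detection rate.
# The 98% starting rate below is a hypothetical example.

def updated_detection_rate(initial_rate: float, share_of_misses_fixed: float) -> float:
    """Overall detection rate after adding detection for a share of the misses."""
    missed = 1.0 - initial_rate
    return initial_rate + missed * share_of_misses_fixed

# A vendor that initially detected 98% of the test-set and later added
# detection for 90% of its misses ends up at 98% + 0.9 * 2% = 99.8%.
print(f"{updated_detection_rate(0.98, 0.90):.1%}")  # -> 99.8%
```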
To compare this with the behaviour of vendors outside our public main-test series, who therefore do not receive the missed samples from us, we looked at how much one well-known vendor (with a similar number of misses) added over the same 100-day period; the figure was also around 90%.
This shows that some vendors are faster at adding detection for missed malware files, while others are slower or purely reactive (i.e. they wait until one of their users reports an infection or a missed sample) in adding detection for malicious samples, even when these are prevalent and confirmed as malicious. Some vendors claim that they get low scores in tests because the test-sets contain “non-malicious” or “non-prevalent” samples; this claim is shown to be inaccurate, as both they and vendors not taking part in tests eventually add detection (albeit with considerable delay) for missed malicious files that have been found in the field.