
Vendors’ reaction to undetected malware samples

We took a closer look at how vendors reacted to the samples they missed in the September online file-detection test.

100 days later, many of them had added detection for all missed malicious files we had sent to them after the test.

These vendors now detect all the threats in the test-set, giving them a 100% detection rate against the September test-set. Six vendors, however, added detection for only about 90% of their misses, leaving them with detection rates between 99.2% and 99.9% against the September test-set.
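To illustrate the arithmetic behind these figures, here is a minimal sketch. The test-set size and miss counts below are illustrative assumptions, not the actual AV-Comparatives numbers:

```python
# Hypothetical illustration of how fixing only ~90% of misses
# translates into an overall detection rate. The figures below
# (test-set size, miss count) are assumptions for illustration only.

def detection_rate(test_set_size: int, remaining_misses: int) -> float:
    """Overall detection rate against the test-set, as a percentage."""
    return 100.0 * (test_set_size - remaining_misses) / test_set_size

TEST_SET_SIZE = 100_000    # assumed size of the September test-set
initial_misses = 2_000     # assumed number of misses for one vendor

# The vendor adds detection for ~90% of its misses, leaving 10% undetected.
remaining = round(initial_misses * 0.10)

print(f"Remaining misses: {remaining}")
print(f"Detection rate:   {detection_rate(TEST_SET_SIZE, remaining):.1f}%")
# -> 99.8%, within the 99.2%-99.9% range mentioned above.
```

As the sketch shows, a vendor's final rate depends both on how many samples it originally missed and on what fraction of those misses it later fixes.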

For comparison with vendors outside our public main-test series (who therefore do not receive their missed samples from us), we checked how many of its missed files one well-known vendor with a similar number of misses had added detection for over the same 100 days: it was likewise around 90%.

This shows that some vendors are fast at adding detection for missed malware files, while others are slower or purely reactive, i.e. they wait until one of their users reports an infection or a missed sample before adding detection, even when the samples are prevalent and confirmed as malicious. Some vendors claim that they score low in tests because the tests include “non-malicious” or “non-prevalent” samples. This is shown not to be accurate: both the tested vendors and vendors not taking part in the tests eventually add detection (albeit with considerable delay) for the missed malicious files, which were found in the field.

Microsoft-prevalence-based analysis of the File Detection Test

This Microsoft-prevalence-based analysis report is supplementary to AV-Comparatives’ already-published main report of the September 2015 File-Detection Test. No additional testing has been performed; rather, the existing test results have been re-analysed from a different perspective, to consider what impact the missed samples are likely to have on customers, according to Microsoft’s telemetry data.

Microsoft commissioned this supplementary report. It is a prototype customer-impact report; improved versions may be provided for future File-Detection Test reports. In this report, customer impact is measured according to prevalence. Essentially, some malware samples pose a greater threat to the average user than others, because they are more widespread. An impact heatmap can be found at http://impact.av-comparatives.org
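As a rough illustration of what prevalence weighting means, here is a minimal sketch. The weighting scheme and all figures are assumptions for illustration; the report’s actual methodology is based on Microsoft telemetry and may differ:

```python
# Minimal sketch of a prevalence-weighted impact score. The sample
# data and weighting scheme are illustrative assumptions, not the
# methodology actually used in the report.

# Each missed sample with its (assumed) number of affected machines
# observed in telemetry.
missed_samples = {
    "sample_a": 15_000,   # widespread threat
    "sample_b": 120,      # rare threat
    "sample_c": 2_300,
}

# Assumed total machine count associated with the whole test-set.
total_prevalence = 500_000

# A plain detection rate counts every miss equally; a prevalence-weighted
# score counts a widespread miss far more heavily than a rare one.
missed_weight = sum(missed_samples.values())
impact_score = 100.0 * (1 - missed_weight / total_prevalence)

print(f"Prevalence-weighted protection: {impact_score:.2f}%")
# Two vendors with the same number of misses can thus have very
# different customer impact, depending on how widespread the misses are.
```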

More information can also be found here.
