Vendors’ reaction to undetected malware samples


We took a closer look at vendors’ reactions to the samples they missed in September’s online File-Detection Test.

100 days later, many of them had added detection for all the missed malicious files we had sent them after the test.

This means they now detect all threats in the test-set, resulting in a 100% detection rate against the September test-set. Six vendors, on the other hand, added detection for only about 90% of their misses, resulting in detection rates of between 99.2% and 99.9% of the September test-set.
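To see how fixing about 90% of the misses can still leave a vendor in the 99.2%–99.9% range, consider the arithmetic below. This is a minimal sketch with hypothetical starting figures (the report does not state each vendor’s original rate); the function name is ours.

```python
# Illustrative arithmetic (hypothetical numbers, not taken from the report):
# a vendor that originally detected 99.0% of the test-set missed 1.0% of it.
# Adding detection for 90% of those misses recovers 0.9 percentage points.

def updated_detection_rate(original_rate, fix_fraction):
    """Overall detection rate after adding detection for a fraction of the misses."""
    missed = 1.0 - original_rate
    return original_rate + missed * fix_fraction

print(f"{updated_detection_rate(0.990, 0.90):.1%}")  # 99.9%
```

A vendor starting lower, say at 96.0%, would end up at 99.6% after the same 90% fix rate, which is how different starting points spread the final rates across the 99.2%–99.9% band.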

To put this in context, we also looked at a well-known vendor outside our public main-test series (which therefore does not receive missed samples from us) with a similar number of misses; the proportion of its misses for which it added detection over the same 100 days was also around 90%.

This shows that some vendors are faster at adding detection for missed malware files, while others are slower or purely reactive (i.e. they wait until one of their users reports an infection or a missed sample) in adding detection, even for samples that are prevalent and confirmed as malicious. Some vendors occasionally claim that they score poorly in tests because they do not detect “non-malicious” or “non-prevalent” samples. This claim is shown to be inaccurate: both they, and even vendors not taking part in tests, eventually add detection (albeit with considerable delay) for missed malicious files that have been found in the field.

Microsoft-prevalence-based analysis of the File Detection Test


This Microsoft-prevalence-based analysis report is supplementary to AV-Comparatives’ main report, already published, of the September 2015 File-Detection Test. No additional testing has been performed; rather, the existing test results have been re-analysed from a different perspective, to consider the likely impact of the missed samples on customers, according to Microsoft’s telemetry data.

Microsoft commissioned this supplementary report. It is a prototype customer-impact report; improved versions might be provided for future File-Detection Test reports. In this report, customer impact is measured according to prevalence. Essentially, some malware samples pose a greater threat to the average user than others, because they are more widespread. An impact heatmap can be found at http://impact.av-comparatives.org.
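The idea of measuring impact by prevalence can be sketched as follows: each sample is weighted by how widespread it is, so a missed widespread sample lowers the score far more than a missed rare one. This is our own minimal illustration, not the report’s actual methodology; the function name and the prevalence counts are hypothetical.

```python
# Minimal sketch of prevalence-weighted scoring (hypothetical data):
# each sample carries a prevalence weight (e.g. the number of machines
# reporting it in telemetry), so the score reflects likely customer impact
# rather than treating every sample equally.

def prevalence_weighted_rate(samples):
    """samples: list of (detected: bool, prevalence: int) pairs."""
    total = sum(prev for _, prev in samples)
    detected = sum(prev for hit, prev in samples if hit)
    return detected / total

samples = [
    (True, 10_000),   # widespread sample, detected
    (True, 2_000),    # moderately common sample, detected
    (False, 5),       # rare sample, missed
]
print(f"{prevalence_weighted_rate(samples):.3%}")  # ~99.958%
```

With plain per-sample counting, this vendor would score only 2 out of 3 (66.7%); weighting by prevalence shows that the one miss affects almost no users, which is the perspective the prevalence-based analysis takes.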

More information is also available in this Microsoft blog post.

False Alarm Test Report September 2015


AV-Comparatives has released an appendix report for the False Alarm Test carried out during the File-Detection Test. The False Alarm Test report contains details of the false alarms encountered with the various products, such as the affected programs, the detection names, and the supposed prevalence (according to various telemetry data sources). You can download the appendix False Alarm Test report of September 2015 as a PDF here.
