
Keeping AV-Comparatives alive

For those who have not already read this information on our website:
Starting in 2008, AV-Comparatives will – like most other testers – no longer provide its services for free, as the expenses for the site and all the work involved are too high. Vendors will not pay for the big tests themselves (and the fee has, of course, no influence of any kind on the test results or the award system); the participation fee covers e.g. the right to use the achieved award for marketing purposes, as well as other services (such as receiving the false positives and missed malware after the tests, bug reports, etc.). The number of participants will be limited to about 18 vendors.

What’s coming next

Here is an overview of what is coming next at AV-Comparatives:

– the retrospective test (which will be released on 1st December) will be slightly changed/improved further. Some vendors have been suggesting test-method changes for about half a year, and as those methods are a) easier for us to carry out than what we have done so far, b) seem to be widely accepted by other vendors as well, and c) should not change much anyway, these new methods will be used in the upcoming retrospective test. The changes will be described in the report. At least for this retrospective test, the level ranges will likely remain unchanged.

– the summary report for 2007 will be released during December, containing a brief summary of all tests and some notes about the various products.

– in 2008 we will probably start to charge fees for the various services we provide. We will probably also drop some vendors from the regular tests and instead include other vendors which are better known and frequently requested. The number of tested products will probably be limited to 16-18.

– behavioral testing is becoming more and more important. There are currently discussions to find the best (vendor-independent) practice for such tests and how testers can perform them. As such tests are not trivial and require a lot of resources, it may take a while until we do them.

– if manpower and resources permit, we would also like to perform, from time to time, other tests showing how well products protect against malicious actions in various scenarios. First, however, we will continue to focus on maintaining and further improving the quality and methods of the current tests.

Summary to keep in mind

For people who are too lazy to read the information on the website and in the reports:

1) The products tested by AV-Comparatives are already a selection of very good scanners. For example, a minimum requirement for inclusion is a detection rate of at least 85%. Some big vendors (and many small vendors) do not reach this requirement and are therefore not included in our tests. Some (relatively unknown) products do not even detect 20% of the test-set.

2) A STANDARD rating is a good rating. It means that a product also provides a good detection rate for malware which is not on the Wildlist. As long as a product scores at least STANDARD and regularly passes the tests of Virus Bulletin or ICSA, you can feel pretty safe with it. ADVANCED and ADVANCED+ are higher ratings; depending on your surfing habits and needs, you may prefer a scanner with such a rating.

3) Do not look just at the percentages or the rankings. A product belonging to one category (STANDARD, ADVANCED, ADVANCED+) can be considered as good as the other products in the same category.

4) The detection rate of an anti-virus product is just one factor to consider when choosing an AV. Other important factors include: impact on system performance, support, compatibility, price, GUI, ease of management, other protection features offered by the product, etc. In other words: do not base your decision on detection results alone, and do not let other people decide what is best for you; try the various anti-virus products yourself on your PC by downloading an evaluation version.

5) Do not annoy or bash a vendor if it scores lower than you would have expected in one test (see also points 1 and 2). You can be sure that vendors will do their best to improve their product, and that their first goal is to protect their customers from the malware submitted to them by their customers (which of course has higher priority than samples submitted from other sources). Only if your product has e.g. often failed to protect you, or the support you needed did not help you, should you consider changing your AV. If you are happy with your current product and feel comfortable with it, there is probably no need to change it. Remember that AV-Comparatives does not recommend any specific product to you; what you get is just data and results, and the rest remains up to you.

6) Look at various tests, possibly from as many different (professional) testers as you can find, and see how the AVs perform in those tests over the long term. Some other testing institutions are listed in our links section.

7) There is no AV which offers 100% detection of all malware. So it may be a good idea to check your PC from time to time with online scanners from vendors other than the one you have installed. They may find something that your AV failed to detect – even if it scores e.g. 99.9% in tests.

Old blog entries removed

During the blog migration, some blog posts were lost/removed. We will try to recover at least the most interesting posts.

 
