What's the most effective way to evaluate or test antimalware products under extreme conditions? Even if I put a prospective product in my environment for a month, it may not encounter enough different samples for a valid test.
Ask the Expert
SearchSecurity expert Michael Cobb answers questions about enterprise application security and platform security.
Creating an effective test for an antimalware product is something of a Holy Grail quest. Although there are several methods for evaluating antimalware products, none truly reflects how a product performs in real life. As you state in your question, even running a product for a month may not expose it to a varied enough range of threats. Many infections result from direct or indirect user actions that allow malware onto a system, but how do you simulate the different ways users react when they encounter a suspicious file, a malware warning or an email attachment from an unknown source? It may in fact be a combination of actions over a period of time that leaves a system in a vulnerable state.
Advanced testing strategies do try to simulate real enterprise settings by testing against standard data sets with automated user profiles to imitate user interaction with security messages. However, this assumes that users' behavior and all of the variables affecting their computing environments can be predicted and reflected in automated profiles. Also, given the increasingly dynamic nature of malware, just keeping malware data sets up to date is an onerous task.
Some organizations think a cheap and easy way to run an antimalware test is to load a hard drive with all the malware they can find and then run their chosen antimalware product against it to see what it detects. In reality, such a test only measures raw detection accuracy and comes nowhere near replicating how malicious code behaves in a real-world environment. The scanning engine is only one of a range of technologies used by modern antivirus programs, and each needs to be tested: behavior analysis once the code is executed, and features that protect entry vectors such as email attachments and malicious links.
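Even that limited on-demand test reduces to a simple measurement: of the samples known to be malicious, how many did the scanner flag, and how many clean files did it flag by mistake? The following is a minimal sketch of that calculation; the sample names, labels and scanner verdicts are hypothetical stand-ins for the output of a real scan over a curated corpus.

```python
def detection_metrics(ground_truth, verdicts):
    """Compare scanner verdicts against known-good/known-bad labels.

    ground_truth: dict of sample name -> True if the sample is malicious
    verdicts:     dict of sample name -> True if the scanner flagged it
    Returns (detection_rate, false_positive_rate).
    """
    malicious = [s for s, bad in ground_truth.items() if bad]
    clean = [s for s, bad in ground_truth.items() if not bad]

    detected = sum(1 for s in malicious if verdicts.get(s, False))
    false_pos = sum(1 for s in clean if verdicts.get(s, False))

    detection_rate = detected / len(malicious) if malicious else 0.0
    fp_rate = false_pos / len(clean) if clean else 0.0
    return detection_rate, fp_rate


# Hypothetical corpus: three known-malicious samples, two known-clean files.
truth = {"a.exe": True, "b.dll": True, "c.js": True,
         "readme.txt": False, "report.pdf": False}
# Hypothetical scanner output: misses c.js and wrongly flags report.pdf.
flags = {"a.exe": True, "b.dll": True, "c.js": False,
         "readme.txt": False, "report.pdf": True}

dr, fpr = detection_metrics(truth, flags)
# dr is 2/3 (two of three malicious samples caught); fpr is 0.5
```

Note that this number alone says nothing about behavior analysis, entry-vector protection or remediation, which is exactly why a drive-full-of-samples test is incomplete.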
To test antimalware products more thoroughly, it is necessary to accurately replicate the infrastructure you're trying to protect. Detection differences can occur even between versions of the same scanner running on different operating systems, due to implementation differences. Also, when evaluating the overall effectiveness of antimalware, assess how a program handles healing a system when it finds malicious code. Does it surgically remove the malware, quarantine it or just send a message to the support desk saying, "There's a problem"? If even the most basic malware detections require manual intervention, that's a problem. At the other end of the spectrum, an antimalware system should send an alert if it encounters a problem it can't remediate on its own. Other tests that need to be completed include:
- A false-positives test: How many erroneous infection alerts are created on a clean system?
- A proactive test: How successful are its proactive techniques at spotting malware with no signature?
- An on-access test: How quickly does it scan files for malware when they're opened or saved to disk?
- A performance and usability test: How much does the product slow the system down, and how easy is its interface to use?
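The on-access test above is essentially a latency measurement: how long does the real-time scan hook delay a file being opened or saved? A minimal harness sketch follows; `scan_on_access()` is a hypothetical stub standing in for the product's real-time engine, and a real test would instead time file operations with the product's on-access protection enabled.

```python
import os
import tempfile
import time


def scan_on_access(data: bytes) -> bool:
    """Stub detector: flag buffers containing a hypothetical marker string.
    A real on-access engine inspects far more than a byte pattern."""
    return b"malicious-marker" in data


def timed_write(path: str, data: bytes):
    """Write a file through the 'on-access' scan and report the delay."""
    start = time.perf_counter()
    blocked = scan_on_access(data)
    if not blocked:
        with open(path, "wb") as f:
            f.write(data)
    return blocked, time.perf_counter() - start


sample = os.path.join(tempfile.gettempdir(), "clean_sample.bin")
blocked, latency = timed_write(sample, b"harmless bytes")
# blocked is False; latency is the write delay in seconds
```

Running many such timed writes over files of varying sizes and types, and comparing the latency distribution with the scanner disabled, gives a repeatable measure of on-access overhead.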
Such tests demand a great deal of manual work, and an organization would have to run them against every product within its budget, as benchmarking just one is useless. The only way to really compare one product against another is to monitor real usage over a period of time to understand each product's strengths and weaknesses and how external factors influence its effectiveness. The Anti-Malware Testing Standards Organization, founded to improve the objectivity, quality and relevance of antimalware testing methodologies, is a good place to start when looking for ways to create a standards-based test.
This was first published in July 2013