Column

Antimalware effectiveness put to the test

Neil Roiter

Antimalware vendors are loading up -- with traditional signature-based detection, heuristic detection, detection based on common attack characteristics and exploits of known vulnerabilities, application controls, host firewalls...whew!

But how effective is all of this? Recent tests from a couple of sources -- Virus Bulletin (VB) and Secunia -- didn't have all the answers, but the findings were interesting enough to make us wonder, yet again: How effective are these products, and how do you test that effectiveness?

"For a long time, I've viewed signature-based detection as a commodity. You could easily purchase very similar functionality from all the major vendors," said Ed Skoudis, co-founder and senior security consultant for InGuardians Inc. "The other stuff is where all the interesting detection happens, especially as signature-based detection grows less effective over time, because the bad guys are moving so fast."

"To write a product that does the detection is difficult enough; to come up with test that's valid, reproducible and accurate is a much different animal."
Andrew Hayter, antimalcode program managerICSA Labs

The annual VB100 certification test, which has been around since 1998, didn't tell us much except that AV vendors can shoot fish in a barrel -- in this case, a WildList virus sampling they surely all have signatures for. But other test results detailed in the October Bulletin -- on bots and worms, polymorphic viruses and, especially, Trojans -- were more revealing. While all the major vendors scored perfectly on the VB100 test, they missed 5-15% on the Trojans test.

The reason? This was a fresh batch of specimens, so the products had to depend on their other detection techniques, with disappointing results. John Hawes, VB technical consultant, said the Bulletin is moving rapidly to ensure that fresh samples are used for each evaluation going forward -- bots and worms as well -- to test the mettle of these endpoint security suites. Trojans are a particular challenge.

"Trojans are difficult because there are so many of them; 90-95% of new malware reported are Trojans," he said. "Huge, huge numbers of malware are coming out all the time, and keeping on top of it is quite a tricky task."

So, vendors have responded with combinations of tools in suites -- or what they optimistically call "integrated endpoint security solutions." These packages are a new phenomenon, really only emerging in late 2007, and they showed their immaturity when Skoudis and colleague Matt Carpenter conducted early comparative testing for Information Security magazine.

The testers conducted real-time and on-demand scans, running thousands of samples through each product's combined tools, with, at best, mixed results. A series of exploit protection tests against known vulnerabilities and a zero-day test against a buffer overflow vulnerability in a test application both yielded hits and misses. One would expect these products to perform well against exploits, especially given the emphasis vendors place on host-based intrusion prevention systems (HIPS), but exploit detection testing by Secunia says it ain't so.

Secunia's Internet Security Suite test was designed to test the exploit detection ability of a dozen different products. Secunia turned 144 malicious files and 156 malicious Web pages against XP SP2 with missing patches and a number of vulnerable programs. The results were dismal. Symantec was tops with 64 hits. The rest? Look at your hand. Count the fingers.
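
To put those results in perspective, here is a quick back-of-the-envelope calculation. The 300-case total and the 64 hits come from the test described above; the single-digit figure used for the rest of the field is an illustrative assumption, not a published Secunia number.

```python
# Back-of-the-envelope detection rates for the Secunia exploit test.
# The 300-case total and the 64 hits come from the test described above;
# the 5-hit figure for the rest of the field is illustrative only.
malicious_files = 144
malicious_pages = 156
total_cases = malicious_files + malicious_pages   # 300 exploit attempts

def detection_rate(hits, total=total_cases):
    """Share of exploit attempts a product flagged."""
    return hits / total

print(f"Top scorer, 64 hits: {detection_rate(64):.1%}")        # ~21.3%
print(f"Single-digit scorer, 5 hits: {detection_rate(5):.1%}")  # ~1.7%
```

Even the best performer in the field caught barely a fifth of the exploit attempts thrown at it.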

The implication of these and other tests is that antimalware products make us safer, but not safe.

A case in point was this year's DefCon Race to Zero contest, in which a team of three researchers from Mandiant used obfuscation techniques to get 10 well-known viruses and exploits -- including Slammer and the 20-year-old Stone virus -- past major AV scanners. It took them six hours.

Secunia chief technology officer Thomas Kristensen thinks things might be even worse if the bad guys tried harder.

"What makes people lucky is that bad guys still have quite a bit to learn," he said. "They are not that good at exploiting the latest vulnerabilities on a massive scale. If their attempts to exploit are caught, it's simply because using they're using some old payload that is already known by the different security solutions."

Cold comfort. We still depend on our endpoint security suites to play an important role in protecting our systems and information. And we depend on antimalware testing and certification to help make our purchasing decisions. But testing is all over the map. IT media sites and publications may each have their own procedures, which may or may not even be consistent from one review to the next. Certification organizations like ICSA and VB, among others, conduct repeated tests and strive for consistent methodology.

There's no easy answer to the testing dilemma, given the rapidly evolving threats and the products designed to counter them. The newly formed Anti-Malware Testing Standards Organization (AMTSO) offered a set of guidelines, "The Fundamental Principles of Testing," released in November, to establish some common ground for fair and safe testing. The group is primarily vendor driven, but includes several certification labs, such as ICSA Labs, Virus Bulletin, AV-Comparatives.org and AV-Test.org.

AMTSO issued nine principles of testing:

  • Testing must not endanger the public.
  • Testing must be unbiased.
  • Testing should be reasonably open and transparent.
  • The effectiveness and performance of antimalware products must be measured in a balanced way.
  • Testers must take reasonable care to validate whether test samples or test cases have been accurately classified as malicious, innocent or invalid.
  • Testing methodology must be consistent with the testing purpose.
  • The conclusions of a test must be based on the test results.
  • Test results should be statistically valid.
  • Vendors, testers and publishers must have an active contact point for testing related correspondence.

On their face, these are common sense for-good-and-against-evil statements. However, they provide a framework for antimalware tests, which might differ in their particulars but offer some assurance that they were conducted responsibly. The guidelines go into some detail about how the testers...tested. For example, "testing should be reasonably open and transparent" calls for disclosure of the test methodology: selection of test samples, updating products, configuration settings, environment (OS version, apps running, etc.), how responses were measured and so on.
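
To make that concrete, here is a minimal sketch of the kind of methodology record the transparency principle asks testers to publish. Nothing about it is prescribed by AMTSO; the field names are hypothetical, and the sample values simply echo details from the tests described in this column.

```python
from dataclasses import dataclass

@dataclass
class TestDisclosure:
    """Hypothetical record of the methodology details a transparent
    antimalware test would publish alongside its results."""
    sample_selection: str   # how samples were gathered and classified
    product_updates: str    # whether and when signatures/engines were updated
    configuration: str      # default settings vs. tuned settings
    environment: str        # OS version, patch level, other apps running
    measurement: str        # what counted as a detection and how it was logged

disclosure = TestDisclosure(
    sample_selection="Fresh Trojan samples gathered in the month before testing",
    product_updates="All products updated immediately before each run",
    configuration="Vendor default settings",
    environment="Windows XP SP2, deliberately unpatched, vulnerable apps installed",
    measurement="An on-access alert or blocked execution counted as a detection",
)
```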

AMTSO issued a companion document, "Best Practices for Dynamic Testing," because this type of testing is both critical and, compared to static tests, very difficult to reproduce. Among other things, it recommends sufficient samples and repeated testing over time with different sample sets.

    "Dynamic testing says let's open machines up to the real-world Internet and see it on the fly," said Andrew Hayter, antimalcode program manager for ICSA Labs. "It is extremely difficult, very, very, very time-consuming and virtually impossible to reproduce."

AMTSO did not and probably will not get into the business of recommending specific testing protocols or criteria. Testers will continue to thrash that out in a changing threat landscape.

"Setting a particular protocol would be a very difficult thing to do," said Hayter. "You would have to be able to dynamically adjust whatever you are doing -- your testing methodology -- to react to the type of malware that is changing or evolving."

Further, testing all the moving parts of products that use multiple tools and techniques -- that is to say, almost all of them -- and trying to draw overall conclusions and recommendations is, to say the least, problematic.

"To write a product that does the detection is difficult enough; to come up with a test that's valid, reproducible and accurate is a much different animal," said ICSA's Hayter. "Testing some of this malware in products that combine multiple types of technology is an extremely difficult, extremely time-consuming process."

InGuardians' Skoudis presents three options:

  • Test all in one; just turn everything on and record the results. The problem is that different vendors have different features, and it's hard to tell whether you've triggered a given feature or not.
  • Test with the defaults turned on. However, the vendor would say you should have gone beyond the defaults. "Then why did the vendor choose the default as it is?" Skoudis said.
  • Test piece parts -- signatures, behavior-based detection, firewall, HIPS. The question here, ICSA's Hayter said, is: "How can we develop a result that is representative of the suites, and by testing individual components, how do you add it all up to say that in total the solution is certified?" (One rough sketch of that adding-up problem follows this list.)
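
There is no agreed-upon answer to Hayter's adding-up question, but one hypothetical approach is a weighted roll-up of per-component scores into a single suite number. Neither the components nor the weights below come from any certification body; the weights in particular are exactly the kind of judgment call that makes the exercise contentious.

```python
# Hypothetical roll-up of per-component test scores into one suite score.
# Component names, scores and weights are illustrative only.
component_scores = {          # fraction of test cases each piece handled
    "signatures": 0.98,
    "behavior_based": 0.80,
    "host_firewall": 0.70,
    "hips_exploit_blocking": 0.60,
}
weights = {                   # how much each piece should count -- a judgment call
    "signatures": 0.4,
    "behavior_based": 0.3,
    "host_firewall": 0.1,
    "hips_exploit_blocking": 0.2,
}

suite_score = sum(component_scores[c] * weights[c] for c in component_scores)
print(f"Weighted suite score: {suite_score:.1%}")   # 82.2% with these numbers
```

Whether an 82% roll-up means the suite deserves certification is precisely the question Hayter is raising.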

    "Products have all grown so complicated, it's hard to do a comprehensive test of them in a reasonable amount of time with a reasonable testing budget," said Skoudis. "And, it will only get worse with more time as these products add more and more features."

