Answer

Picking the best enterprise antivirus product: Does AV research count?

Our antivirus deal is almost up and we're evaluating new products and services. Not surprisingly, we have found that each vendor says it has the best research team. Are there metrics we can request from the vendors so that we can determine which research team is, in fact, most successful at identifying malware? Or is that even necessary? Has anti-malware reached the point where, similar to weather forecasting, they are really all the same and working off the same data?


Ask the expert!

Have questions about enterprise information security threats for expert Nick Lewis? Send them via email today! (All questions are anonymous).

It sounds like your organization is doing its due diligence in critically evaluating anti-malware products to determine the best fit. That's a good thing. Given how frequently anti-malware protections change, every enterprise choosing the best enterprise antivirus product for its needs should conduct a thorough anti-malware comparison and select the most cost-effective product that provides the necessary protection. Many vendors have extended their traditional antivirus products into endpoint protection suites. A number of organizations evaluate the effectiveness of anti-malware protection, including Virus Bulletin, AV-Comparatives, AV-Test, West Coast Labs and Gartner Inc.

The best AV research team might help drive product innovation and advanced research, but research should be only one part of an antivirus evaluation. In the anti-malware research community, vendors share samples, and each vendor evaluates them to determine how to detect, block or remediate the malware, but little about that process is publicly documented. And while detecting malware first does give a vendor an edge in producing protections, being first doesn't help a customer whose endpoints never receive the update.

There are numerous metrics that can be used to evaluate the effectiveness of different anti-malware research teams by looking at the vendors' products, including the number of false positives, detection versus blocking, and the average time from customer malware submission to detection deployed to the endpoint. The number of false positives reveals possible weaknesses in the quality assurance process. Detection versus blocking indicates the depth of the malware analysis: whether the company can create a detection but cannot fully determine what is needed to clean a system. The average time from customer submission to deployed detection matters most when an enterprise is hit by custom malware; the faster the vendor turns that submission into a detection pushed to the enterprise's endpoints, the sooner those endpoints are protected. A comparison along these lines is sketched below.
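To make the comparison concrete, here is a minimal sketch of how an enterprise might tabulate these three metrics across vendors. The vendor names, field names and figures are hypothetical; in practice the inputs would come from your own testing or from independent labs such as those mentioned above.

<pre>
# Hypothetical sketch: scoring vendor research effectiveness on the three
# metrics discussed above. All names and numbers are illustrative only.
from dataclasses import dataclass
from statistics import mean

@dataclass
class VendorResults:
    name: str
    samples_tested: int                       # benign + malicious files scanned
    false_positives: int                      # benign files wrongly flagged
    detected: int                             # malware samples detected (alerted)
    blocked: int                              # samples detected AND blocked/remediated
    submission_to_deploy_hours: list          # turnaround per customer submission

def score(v: VendorResults) -> dict:
    """Summarize one vendor's results as comparable ratios and averages."""
    return {
        "vendor": v.name,
        "false_positive_rate": v.false_positives / v.samples_tested,
        "block_rate_of_detections": v.blocked / v.detected if v.detected else 0.0,
        "mean_hours_to_deployed_detection": mean(v.submission_to_deploy_hours),
    }

if __name__ == "__main__":
    vendors = [
        VendorResults("Vendor A", 10_000, 12, 9_700, 9_400, [6.0, 8.5, 4.0]),
        VendorResults("Vendor B", 10_000, 3, 9_500, 8_800, [18.0, 30.0, 12.5]),
    ]
    for v in vendors:
        print(score(v))
</pre>

The point of normalizing to rates and averages is that raw counts from different vendors rarely cover the same sample sets; ratios make the false-positive, remediation and turnaround comparisons roughly apples-to-apples.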

This was first published in August 2012
