
Shadowserver’s new anti-virus test suite – how good is it?

September 7, 2011

“Our little gnomes in the backroom,” says the excellent Shadowserver in an announcement headed ‘New AV Test Suite’, “have been working feverishly for the last several months to put the finishing touches on our new Anti-Virus backend test systems.”

Malware testing, as we know, is a tricky business. AMTSO, the Anti-Malware Testing Standards Organization, has expended much energy and expertise in developing detailed methodologies designed to ensure fair, unbiased and accurate anti-virus tests. But do we get this from Shadowserver? Do we get a new AV comparison source that we can realistically consult for accurate, unbiased information on the different AV products available to us? Let's see.

Shadowserver starts off with a fair comment.

No single vendor detects 100%, nor can they ever. To expect complete protection will always be science-fiction.

That being said, it goes on…

…you can see the different statistics of the different vendors in our charts.

Here are a couple of examples.

Avira

Shadowserver's results for Avira

Panda

Shadowserver's results for Panda

The one thing that really leaps out here is that Panda apparently misses (shown in green) far more of the test samples than Avira. This is counterintuitive. Panda is a commercial product backed by one of the world’s leading security companies. Avira, which I personally trust sufficiently to use on my XP netbook, is a free product. Shadowserver provides a partial answer:

The longest running issue has been our inability to use Windows based AV applications. We can now handle that, however it is still not what you might buy for home or commercial use.  We are utilizing a special command-line-interface version from each of the vendors that we are using.  This is not something you can purchase or utilize normally.  These are all special version but most of them do use the same engines and signatures that the commercial products use.


Luis Corrons, technical director, PandaLabs

This is important. Luis Corrons, technical director at PandaLabs, elaborated:

What ShadowServer does is not an antivirus test. As they say, they do not even use commercial products, but special versions. Furthermore, it is static analysis of files they capture. It is a statistic. But the data cannot be used to say “product x detects more than product y” or “product x detects this percentage” as they are not using any of the other security layers used in real products (behavioural analysis/blocking, firewall, URL filtering, etc). The most you can say with this system is product x was able to detect y percent of files using their signatures and heuristics (the oldest antivirus technologies).

Corrons' point is worth dwelling on. The AV companies long ago recognised that the original signature-database approach to malware cannot match the speed at which new signatures are required for polymorphic virus families. So they have supplemented their signature detection with more advanced and sophisticated methodologies.
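To see why a purely static, signature-style check struggles with polymorphic families, consider this minimal sketch (all sample data and names here are hypothetical, for illustration only): an exact-match hash lookup detects a known sample, but any variant that differs by even one byte hashes differently and slips through. This is precisely the gap that heuristics, behavioural analysis and cloud lookups are meant to close.

```python
import hashlib
from typing import Optional

# Hypothetical signature database: known-bad SHA-256 hashes
# mapped to malware family names.
SIGNATURES = {
    hashlib.sha256(b"malicious payload v1").hexdigest(): "Trojan.Example.A",
}

def static_scan(sample: bytes) -> Optional[str]:
    """Return the family name if the sample's hash matches a known signature."""
    return SIGNATURES.get(hashlib.sha256(sample).hexdigest())

# The original sample is detected by its exact hash...
print(static_scan(b"malicious payload v1"))   # Trojan.Example.A

# ...but a "polymorphic" variant -- same behaviour, one byte changed --
# produces a different hash and evades the static check entirely.
print(static_scan(b"malicious payload v2"))   # None
```

Real engines match on code patterns and heuristics rather than whole-file hashes, but the underlying limitation is the same: a static check can only recognise what it has already been taught to look for.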

In our case (Panda) ShadowServer is using an engine which is a few years old (at least 5) and of course is not using the cloud, so I can guarantee that our results are going to be awful. We have been asking SS for years to use a new version, but they were not supporting Windows. Now that they are supporting it, they forgot to mention it, but it’s not a problem as we’ll be sending them a new version with cloud connection. Anyway, even though in that way the results will be way better, or even if we are the number 1 vendor, that doesn’t mean anything, as it is only a static analysis of some files.

One solution would be for Shadowserver to work more closely with AMTSO. Shadowserver is not currently a member of AMTSO. I urge it to join. And I urge AMTSO to waive all membership fees so that this non-profit free service organization can do so. Both parties would benefit enormously. In the meantime, I asked David Harley, a director of AMTSO and research fellow at ESET, for his personal thoughts.


David Harley, senior research fellow at ESET; director at AMTSO

Shadowserver has never been discussed within AMTSO, that I remember… In the past they’ve shied away from suggesting that their statistics are suitable for direct comparison of vendor performance. One of the reasons they cited for that is that their testing has been focused on Linux/gateway versions, and you can’t assume that desktop versions will perform in the same way across a range of products. Including some Windows products will make a difference in that respect, but I can’t say how much, because I don’t know which versions they’re using. Where gateway products are used, it’s unlikely that the whole range of detection techniques are used that an end-point product uses. Detection is often dependent on execution context, certainly where detection depends on some form of dynamic analysis. A gateway product on an OS where the binary can’t execute may not detect what its desktop equivalent does, because the context is inappropriate. On the other hand, the gateway product’s heuristics may be more paranoid. Either way, there’s a possibility for statistical bias…

This isn’t a criticism of Shadowserver, which does some really useful work. I just don’t think I could recommend this as a realistic guide to comparative performance assessment…

Neither Luis nor David is known to shy away from the truth, whether about themselves or their products. But both seem fairly clear: Shadowserver is good; but this service is not yet ready. Shadowserver's AV test suite will not give a realistic view of different AV products' actual capabilities. Not yet. It needs more work. I'm certain that will happen. But for the time being at least, don't use Shadowserver's statistics to form an opinion on the relative merits of different AV products.

————————

UPDATE from Shadowserver

It is difficult to not compare one vendor to the next due to how we have the data structured on the pages. It would be impossible not to try and derive conclusions from those results. While that is the case, our goal is not to create a real comparison site for everyone to try and compete to see which AV vendor is better than the next…

That is not our purpose…

That being said, our purposes in doing AV testing is simple. We wanted to know what each malware was supposed to be for categorization purposes, and of course just to see what happened. We collect a lot of malware daily and trying to find ways of tying our data together is important.

Because we are volunteers and a non-profit we really enjoy sharing what we find no matter how odd. We even enjoy talking about when we screw something up or when we encounter something exciting. Everything here is for you our public to enjoy, discuss, and even criticize…

Shadowserver, 8 September.

Categories: All, Security Issues
  1. Ted Turdgeson
    September 8, 2011 at 8:46 pm

    I don’t believe Shadowserver ever stated that they had a service that was to be used for comparison nor had they planned on doing this as a service, so I think you unfairly made the statement “But both seem fairly clear: Shadowserver is good; but this service is not yet ready. Shadowserver’s AV test suite will not give a realistic view of different AV products’ actual capabilities. Not yet. It needs more work.”


    • September 8, 2011 at 9:57 pm

I absolutely stand by what I have said. And I would ask you a couple of questions. Why has Shadowserver published this data if not as a 'service'? It is a service. What is it for if not to allow comparison? No one vendor's results are at all meaningful unless viewed in relation to, that is in comparison with, another's. Would visitors do anything other than compare the different results?

      Having said that, then obviously this ‘service’ should not be compared to the many well-funded commercial testing companies. But wouldn’t it be wonderful if it could? A completely free, completely independent, completely unbiased test of all the major suppliers. That’s what I would love to see – but it’s a long way off.

      Finally, I would point to the update put out by Shadowserver this evening – the gist of which I append to my post.


  2. September 7, 2011 at 6:08 pm

    That was an excellent article, Kevin.

    It’s probably worth distinguishing between companies and products, though. Panda and Avira are companies, not products. Both in fact provide commercial and free anti-virus products. For example, Panda has a free command line scanner, while Avira AntiVir Premium costs money.

    But that’s a small point. The fact is that static file tests of products that have other layers of protection do not represent a real life situation.

    Cheers,
    Simon


  3. September 7, 2011 at 5:15 pm

    i’m sure luis knows this and simply misspoke, but signatures and heuristics are not the oldest anti-virus technology. the first anti-virus was a behaviour-based program called flushot.


