“Our little gnomes in the backroom,” says the excellent Shadowserver in an announcement headed ‘New AV Test Suite’, “have been working feverishly for the last several months to put the finishing touches on our new Anti-Virus backend test systems.”
Malware testing, as we know, is a tricky business. AMTSO, the Anti-malware Testing Standards Organization, has expended much energy and expertise in developing detailed methodologies designed to ensure fair, unbiased and accurate anti-virus tests. But do we get this from Shadowserver? Do we get a new AV comparison source that we can realistically access for accurate unbiased information on the different AV products available to us? Let’s see.
Shadowserver starts off with a fair comment.
No single vendor detects 100%, nor can they ever. To expect complete protection will always be science-fiction.
That being said, it goes on…
…you can see the different statistics of the different vendors in our charts.
Here are a couple of examples.
The one thing that really leaps out here is that Panda apparently misses (shown in green) far more of the test samples than Avira. This is counterintuitive. Panda is a commercial product backed by one of the world’s leading security companies. Avira, which I personally trust sufficiently to use on my XP netbook, is a free product. Shadowserver provides a partial answer:
The longest running issue has been our inability to use Windows based AV applications. We can now handle that, however it is still not what you might buy for home or commercial use. We are utilizing a special command-line-interface version from each of the vendors that we are using. This is not something you can purchase or utilize normally. These are all special versions, but most of them do use the same engines and signatures that the commercial products use.
This is important. Luis Corrons, technical director at PandaLabs, elaborated:
What ShadowServer does is not an antivirus test. As they say, they do not even use commercial products, but special versions. Furthermore, it is static analysis of files they capture. It is a statistic. But the data cannot be used to say “product x detects more than product y” or “product x detects this percentage” as they are not using any of the other security layers used in real products (behavioural analysis/blocking, firewall, URL filtering, etc). The most you can say with this system is product x was able to detect y percent of files using their signatures and heuristics (the oldest antivirus technologies).
This is a key point. The AV companies have long recognised that the original signature-database approach to malware cannot match the speed with which new signatures are required for polymorphic virus families. So they have supplemented their signature detection with more advanced and sophisticated methodologies.
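To see why pure signature matching struggles against polymorphic malware, here is a minimal illustrative sketch (not any vendor's actual implementation): a signature store holding hashes of known-bad files, where even a one-byte variant of a known sample evades detection. The sample contents and the `SIGNATURES` store are entirely hypothetical.

```python
import hashlib

# Hypothetical signature store: SHA-256 hashes of known-bad samples.
SIGNATURES = {
    hashlib.sha256(b"malicious payload v1").hexdigest(),
}

def signature_match(sample: bytes) -> bool:
    """Classic signature check: an exact hash lookup against known samples."""
    return hashlib.sha256(sample).hexdigest() in SIGNATURES

# The original sample is caught by its signature...
assert signature_match(b"malicious payload v1")

# ...but a trivially mutated variant slips past the very same signature,
# which is why polymorphic families outpace signature databases and why
# vendors layer on heuristics, behavioural analysis and cloud lookups.
assert not signature_match(b"malicious payload v2")
```

Real engines use far more sophisticated pattern and heuristic matching than a whole-file hash, but the underlying limitation Luis describes is the same: a static check can only recognise what it already has a description for.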
In our case (Panda) ShadowServer is using an engine which is a few years old (at least 5) and of course is not using the cloud, so I can guarantee that our results are going to be awful. We have been asking SS for years to use a new version, but they were not supporting Windows. Now that they are supporting it, they forgot to mention it, but it’s not a problem as we’ll be sending them a new version with cloud connection. Anyway, even though in that way the results will be way better, or even if we are the number 1 vendor, that doesn’t mean anything, as it is only a static analysis of some files.
One solution would be for Shadowserver to work more closely with AMTSO. Shadowserver is not currently a member of AMTSO. I urge it to join. And I urge AMTSO to waive all membership fees so that this non-profit free service organization can do so. Both parties would benefit enormously. In the meantime, I asked David Harley, a director of AMTSO and research fellow at ESET, for his personal thoughts.
Shadowserver has never been discussed within AMTSO, that I remember… In the past they’ve shied away from suggesting that their statistics are suitable for direct comparison of vendor performance. One of the reasons they cited for that is that their testing has been focused on Linux/gateway versions, and you can’t assume that desktop versions will perform in the same way across a range of products. Including some Windows products will make a difference in that respect, but I can’t say how much, because I don’t know which versions they’re using. Where gateway products are used, it’s unlikely that the whole range of detection techniques are used that an end-point product uses. Detection is often dependent on execution context, certainly where detection depends on some form of dynamic analysis. A gateway product on an OS where the binary can’t execute may not detect what its desktop equivalent does, because the context is inappropriate. On the other hand, the gateway product’s heuristics may be more paranoid. Either way, there’s a possibility for statistical bias…
This isn’t a criticism of Shadowserver, which does some really useful work. I just don’t think I could recommend this as a realistic guide to comparative performance assessment…
Neither Luis nor David is known to shy away from the truth, whether about themselves or their products. But both seem fairly clear: Shadowserver is good, but this service is not yet ready. Shadowserver’s AV test suite will not give a realistic view of different AV products’ actual capabilities. Not yet. It needs more work. I’m certain that will happen. But for the time being at least, don’t use Shadowserver’s statistics to form an opinion on the relative merits of different AV products.
UPDATE from Shadowserver
It is difficult to not compare one vendor to the next due to how we have the data structured on the pages. It would be impossible not to try and derive conclusions from those results. While that is the case, our goal is not to create a real comparison site for everyone to try and compete to see which AV vendor is better than the next…

That is not our purpose…

That being said, our purposes in doing AV testing is simple. We wanted to know what each malware was supposed to be for categorization purposes, and of course just to see what happened. We collect a lot of malware daily and trying to find ways of tying our data together is important.
Because we are volunteers and a non-profit we really enjoy sharing what we find
no matter how odd. We even enjoy talking about when we screw something up or
when we encounter something exciting. Everything here is for you our public to
enjoy, discuss, and even criticize…
Shadowserver, 8 September.
Last year I voiced two main concerns about AMTSO, the anti-malware testing standards organisation. One was collusion in the false marketing impression given by claims of 100% test success against malware in the Wild(list). I won’t repeat my concerns here (see instead the original articles AMTSO: a serious attempt to clean up anti-malware testing; or just a great big con? and Anti Malware Testing Standards Organization: a dissenting view). Sadly, there has never really been any acknowledgement that this is a valid concern, never mind any action on it.
The second concern is that AMTSO is effectively a closed shop: it is largely by the industry, for the industry; and for that reason alone it cannot be trusted. This caused considerable heat, with some members of AMTSO feeling that I was saying that they personally could not be trusted. Others, however, accepted that it was a valid issue.
Well, I am now delighted that AMTSO has made serious attempts to address the problem. Last October it announced a new low-cost subscription fee in an attempt to get more people involved:
While AMTSO recognizes that strict requirements for full membership are necessary to ensure it achieves its objectives, it also understands that the fees put it out of reach for many interested individuals that may have a valuable contribution to improving the objectivity, quality and relevance of testing methodologies. Hopefully, the new low cost subscription model will widen the reach of the organisation and enable more people to have a say in the future of anti-malware testing.
Philipp Wolf, of AMTSO member Avira
This new subscription currently stands at €25 per annum. I don’t know how many subscribers it has attracted, but I doubt that it is many. “They will also have the right to attend meetings, though not as voting members.” Why should I pay money to have no ultimate say in things?
Today, however, AMTSO has launched an open (and free!) “forum where anyone may post and join in testing-related discussions.” Users are still unable to vote on AMTSO issues, but that’s fair enough. Discussions, like justice, should be seen to be done. Provided that AMTSO moderators do not censor this discussion forum (other than the usual legal requirements), it will “provide a discussion point where anyone with a question or an opinion on the testing of anti-malware software can make their voice heard.”
For that, AMTSO deserves to be commended.
I must say that I have known and respected David Harley for many years. I still respect him enormously. Indeed, I respect the anti-virus industry in general. So, having said that, a quick response to David’s article. In it, he says:
That brings us to Kevin Townsend, who could never be described as AMTSO’s best friend…
…But then I realized that he might have been misled by this statement on the AMTSO home page…
I would just like to say that I don’t believe that I have been misled. I know that AMTSO is open to the people I would like to see within it. But the fact is they are not in it. So my point is simply that AMTSO needs to go and get them. Just saying that “AMTSO membership is open to any corporation, institution or unaffiliated individual interested in participating in this organization” is not in itself sufficient if they don’t join. However, as soon as AMTSO has sufficient representation from within the AV user community, it will gain the credibility it deserves.