
Posts Tagged ‘false positives’

’Tis the comparative season

April 25th, 2011

There have been a few comparative tests published of late. In case you’ve missed any of them, here’s a quick rundown of the most significant ones.

First on the list are the Q1 2011 quarterly results of the Full Product Test (FPT) by AV-Test.org. These FPTs are performed monthly and are very in-depth, covering pretty much all aspects of modern security software and testing from the user’s perspective by replicating infection vectors and the user experience. The areas tested include real-time blocking of malicious websites, detection of relevant and active malware samples (zoo malware, plus wildlist malware according to AV-Test.org’s own criteria rather than the limited WildList.org list), false positive testing, performance testing, disinfection testing, detection and disinfection of active rootkits, behaviour-based dynamic detection, dynamic false positives, and packing and archive support. Overall it is one of the most comprehensive regular tests out there. It’s such a tough test that 5 of the 22 vendors tested did not reach the minimum score needed for certification. Panda Internet Security came out with very good scores and achieved certification. The report on Panda Internet Security can be found here (PDF) and the complete results for all vendors here.

Next in line are a couple of tests by AV-Comparatives.org. The first is the traditional On-Demand test from February 2011, which also measures false positives and the performance of the on-demand scanner. In this test Panda Antivirus Pro ranked #4 in malware detection. We still had 18 false positives which, even though they are of low prevalence according to AV-Comparatives.org, prevented us from achieving the Advanced+ certification. We’re doing a lot of work to improve in this area. Panda Antivirus Pro also ranked #2 in the scanning-speed performance test. The full report can be downloaded from the AV-Comparatives.org website here.

The second recently published AV-Comparatives.org test is the Whole-Product Test. Similar to the AV-Test.org Full Product Test, it aims to capture the user experience by replicating the infection vector. Unlike the AV-Test.org FPT, this one focuses only on malicious websites and behaviour-based dynamic detection. Panda Internet Security scored very well, with a 98.8% protection index. More information can be found at the AV-Comparatives.org site here.

If you’re interested in these types of AV tests, make sure to vote for your favourite AV testing outfit in our open poll here. So far AV-Comparatives and AV-Test are leading the pack.

AV-Comparatives Performance Test 2010

August 23rd, 2010

’Tis the comparative season ;)

Right after the AV-Test certification and the Chinese PCSL Full-Protection test, the new AV-Comparatives Performance Test has just been published.

Once again, Panda Internet Security gets an excellent score, obtaining the #1 rank as best performer and winning the Advanced+ award.
[Image: AV-Comparatives Advanced+ award, August 2010]

For more details be sure to visit AV-Comparatives and download the full report from http://www.av-comparatives.org/comparativesreviews/performance-tests, although I can give you a preview of the full results here:
[Image: full results table from the AV-Comparatives August 2010 performance test]

Automated False Positives

June 2nd, 2010

I’ve covered the impact that automated detection systems have on false positives in the past. Hispasec, the makers of VirusTotal, also discussed this issue in their blog post aptly named Antivirus Rumorology. More recently, Kaspersky conducted an experiment during a press conference, showing a group of journalists how these false positives roll over from one vendor’s engine to the next. Of course, being journalists, they only took home the message “AV companies copy each other, and mostly us”, as shown in the articles covering the event. Even though the objective of the experiment was put under scrutiny, the fact remains that this is an industry-wide problem; no single vendor is immune to its effects, not even Kaspersky, as we will see.
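
To make the rollover mechanism concrete, here is a minimal sketch of the kind of consensus-driven auto-classifier that produces it. Every name and threshold is made up for illustration; no vendor’s real pipeline is this simple. The point is that the verdict rests on what other engines say rather than on analysis of the sample itself, so one vendor’s mistake can become everyone’s signature:

    from typing import Dict, Optional

    MIN_DETECTIONS = 1  # an aggressive lab effectively runs with a threshold this low

    def should_add_signature(scan_results: Dict[str, Optional[str]]) -> bool:
        """scan_results maps engine name -> detection name (None means clean)."""
        detections = [name for name in scan_results.values() if name]
        # No analysis of the sample itself: the decision rests purely on what
        # other engines say, so a false positive propagates unchecked.
        return len(detections) >= MIN_DETECTIONS

    # March 2010 snapshot of PandaCloudTestFile.exe, per the listing below.
    results = {
        "Panda": "Trj/CI.A",
        "Symantec": "Suspicious.Insight",
        "Kaspersky": None,
    }

    if should_add_signature(results):
        print("auto-generated signature published")  # and the snowball starts rolling

With the March snapshot above, this toy pipeline already publishes a signature of its own, which is essentially the pattern the detection listings that follow illustrate.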

As some regular readers of this blog will probably remember, in March 2010 we published a “PandaCloudTestFile.exe” binary to test the connectivity of Panda products with their cloud-scanning component, Collective Intelligence. “PandaCloudTestFile.exe” is a completely harmless file that simply tells the Panda product to query the cloud. Our cloud-scanning servers have been manually configured to detect this file as malicious, with the sole objective of showing end users that the cloud-scanning component of their product is working correctly.
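
For readers curious how such a connectivity test file can work, here is a tiny sketch assuming a simple hash-lookup protocol. The real Collective Intelligence protocol is not documented here, so the function names and the digest below are placeholders, not the real thing:

    import hashlib

    # Server side: a manually configured verdict keyed on the test file's hash
    # (the digest below is a placeholder, not the real file's hash).
    MANUAL_VERDICTS = {"placeholder-sha256-of-test-file": "Trj/CI.A"}

    def cloud_lookup(sha256_hex: str) -> str:
        # Returning a detection for the harmless test file proves the client
        # actually reached the cloud; a purely local scan would call it clean.
        return MANUAL_VERDICTS.get(sha256_hex, "clean")

    # Client side: hash the file and ask the cloud for a verdict.
    def scan_file(path: str) -> str:
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return cloud_lookup(digest)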

Initially this file was only detected by Panda as Trj/CI.A (a Collective Intelligence detection) and by Symantec’s Insight, which merely noted that it is not a very common file (whether low prevalence alone should make a file “suspicious” is itself grounds for debate, but that’s a topic for a future post).

Panda 10.0.2.2 2010.03.10 Trj/CI.A
Symantec 20091.2.0.41 2010.03.11 Suspicious.Insight

A few days later came the first problematic detection, this time from Kaspersky, which detected “PandaCloudTestFile.exe” with a signature, specifically naming it a Bredolab backdoor. I call this detection problematic because it is clearly neither a suspicious-file detection nor a reputation signature. It is also clearly incorrect, as the file is not related to Bredolab in any way. We will soon see why this particular signature matters.

Kaspersky 7.0.0.125 2010.03.20 Backdoor.Win32.Bredolab.djl

Over the next few days other AV scanners started detecting it as well, in many cases with the exact same Bredolab name.

McAfee+Artemis 5930 2010.03.24 Artemis!E01A57998BC1
Fortinet 4.0.14.0 2010.03.26 W32/Bredolab.DJL!tr.bdr
TheHacker 6.5.2.0.245 2010.03.26 Backdoor/Bredolab.dmb
Antiy-AVL 2.0.3.7 2010.03.31 Backdoor/Win32.Bredolab.gen
Jiangmin 13.0.900 2010.03.31 Backdoor/Bredolab.bmr
VBA32 3.12.12.4 2010.03.31 Backdoor.Win32.Bredolab.dmb

In the month that followed (April 2010) a batch of new engines started detecting it, mostly under the Bredolab name we are now familiar with, although some new names started appearing as well (Backdoor.Generic, Monder, Trojan.Generic, etc.).

a-squared 4.5.0.50 2010.04.05 Trojan.Win32.Bredolab!IK
AhnLab-V3 2010.04.30.00 2010.04.30 Backdoor/Win32.Bredolab
AVG 9.0.0.787 2010.04.30 BackDoor.Generic12.BHAD
Ikarus T3.1.1.80.0 2010.04.05 Trojan.Win32.Bredolab
CAT-QuickHeal 10.00 2010.04.12 Backdoor.Bredolab.djl
TrendMicro 9.120.0.1004 2010.04.03 TROJ_MONDER.AET
Sunbelt 6203 2010.04.21 Trojan.Win32.Generic!BT
VBA32 3.12.12.4 2010.04.02 Backdoor.Win32.Bredolab.dmb
VirusBuster 5.0.27.0 2010.04.17 Backdoor.Bredolab.BLU

And to top it all off, during May 2010 the following engines started detecting “PandaCloudTestFile.exe” as well. Here we can even see a “suspicious” detection, probably the only one of them all that makes any sense.

Authentium 5.2.0.5 2010.05.15 W32/Backdoor2.GXIM
F-Prot 4.5.1.85 2010.05.15 W32/Backdoor2.GXIM
McAfee 5.400.0.1158 2010.05.05 Bredolab!j
McAfee-GW-Edition 2010.1 2010.05.05 Bredolab!j
Norman 6.04.12 2010.05.13 W32/Suspicious_Gen3.CUGF
PCTools 7.0.3.5 2010.05.14 Backdoor.Bredolab
TrendMicro-HouseCall 9.120.0.1004 2010.05.05 TROJ_MONDER.AET
ViRobot 2010.5.4.2303 2010.05.05 Backdoor.Win32.Bredolab.40960.K

It is worth noting that consumer products include other technologies, such as whitelisting and digital-certificate checks, which could prevent the file from being flagged on the consumer endpoint. Still, the existence of a signature for such a file is a good indicator that it will probably be detected there.

So why am I writing about all this? First of all, to emphasize the point I have tried to make in the past: automated detection systems have to be maintained, monitored, tuned and improved so that more in-depth analysis is done through them, rather than relying so much on “rumorology”.

Secondly, to show that this is an industry-wide problem, born of having to deal with tens of thousands of new malware variants per day, and that no vendor is immune to it. What matters at the end of the day is that these automated systems are supervised and improved constantly to avoid false positives.
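
To give that supervision some concrete shape, here is one hypothetical set of guardrails an automated signature pipeline could run before publishing a verdict. The checks, field names and thresholds are illustrative assumptions, not any vendor’s actual process (ours included):

    from dataclasses import dataclass

    @dataclass
    class Sample:
        sha256: str
        prevalence: int            # machines the file has been seen on
        signed_by_trusted_ca: bool
        whitelisted: bool          # present in a known-goodware collection
        engines_detecting: int     # multi-scanner consensus, a hint at most

    def triage(s: Sample) -> str:
        if s.whitelisted or s.signed_by_trusted_ca:
            return "suppress"       # never auto-blacklist known-good files
        if s.prevalence > 10_000:
            return "human_review"   # a false positive here hits too many users
        if s.engines_detecting >= 5:
            return "deep_analysis"  # consensus triggers analysis, not a signature
        return "monitor"

    print(triage(Sample("placeholder-hash", prevalence=3, signed_by_trusted_ca=False,
                        whitelisted=False, engines_detecting=9)))  # -> deep_analysis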

I can certainly understand why vendors point to their signatures being “rolled over” into other AV engines, but those same vendors should also take care not to become the source of these “false positive rumors” in the first place.

UPDATE June 3rd, 2010: Reading Larry’s post over at SecurityWatch, it seems Kaspersky has reacted quickly and removed their signature for the PandaCloudTestFile.exe file. Thanks, Larry & Kaspersky!

AV Comparative Against Chinese Malware

May 10th, 2010

As many of you already know, a large portion of today’s malware is created and/or distributed from China. With that in mind, the Chinese independent AV testing lab PC Security Labs has published a comparative study of AV detection of Chinese malware. The comparative can be downloaded from here in PDF format.

Panda Internet Security 2010 has done fairly well in this test, ranking first in both detection and overall score:
[Image: PCSL Chinese malware test results, May 2010]

What I like best about PCSL tests is that, unlike other tests out there, PCSL takes a unified look at the products tested. Not only does it look at static and dynamic (behavioural) detection, but also at static and dynamic false positives, combining everything into a single, unified, global score per product. Other tests only look at these different technologies separately from each other.
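
As a toy illustration of that idea (PCSL’s actual formula is not reproduced in this post, so the 50/50 detection split and the per-FP penalty below are invented), a unified score could look something like this:

    def unified_score(static_det: float, dynamic_det: float,
                      static_fp: int, dynamic_fp: int,
                      fp_penalty: float = 0.5) -> float:
        """Detection rates in [0, 100]; each false positive costs fp_penalty points."""
        detection = 0.5 * static_det + 0.5 * dynamic_det
        return max(0.0, detection - fp_penalty * (static_fp + dynamic_fp))

    print(unified_score(99.2, 97.5, 2, 1))  # -> 96.85

Folding false positives into the same number as detection is what keeps a trigger-happy engine from topping the chart.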

As some of you may remember, we started taking part in PCSL’s main AV tests in November 2008, and so far we’ve achieved an Excellent score in every one of them.

More info at the PC Security Labs website or in the main published report at http://article.pchome.net/content-1116841.html (Chinese only).