Archive

Posts Tagged ‘test’

Q2 2011 Test Results of Security Suites

July 20th, 2011

Recently both AV-Test.org and AV-Comparatives.org have announced the results of their dynamic real-world or whole-product tests. Basically, these AV tests try to replicate the user experience by introducing malware to the test machine in pretty much the same way a regular user would encounter it and get infected. We are very proud of the results of Panda Internet Security 2011/2012, which show consistent top-quality detection and protection, ahead of better-known security vendors such as Symantec, Avast, AVG, ESET, Trend Micro, Microsoft, Webroot, etc.

AV-TEST REAL-WORLD TEST – Q2 2011 RESULTS

In the real-world test results for Q2 2011, Panda was one of only four vendors to achieve a score higher than 15 points.

AV-COMPARATIVES WHOLE-PRODUCT TEST – JUNE 2011

In the June 2011 test, Panda Internet Security shared first place in “blocking” rate without requiring any user interaction, along with two other vendors.

’Tis the comparative season

April 25th, 2011

There have been a few comparative tests published of late. In case you’ve missed any of them, here’s a quick rundown of the most significant ones.

First on the list are the Q1 2011 quarterly results of the Full Product Test (FPT) by AV-Test.org. These FPTs are performed on a monthly basis and are very in-depth, covering pretty much all aspects of modern security software and testing from a user’s perspective by replicating infection vectors and the user experience. The areas tested include real-time blocking of malicious websites, detection of relevant and active malware samples (zoo malware and wildlist malware according to AV-Test.org’s own wildlist criteria, not the limited WildList.org list), false positive testing, performance testing, disinfection testing, detection and disinfection of active rootkits, behaviour-based dynamic detection, dynamic false positives, and packing and archive support. Overall it is one of the most comprehensive regular tests out there. It’s such a tough test that 5 of the 22 vendors tested did not obtain the minimum score to achieve certification. Panda Internet Security came out with very good scores and achieved certification. The report on Panda Internet Security can be found here (PDF) and the complete results for all vendors here.

Next in line are a couple of tests by AV-Comparatives.org. The first one is the traditional On-Demand test from February 2011, which also tests false positives and the performance of the on-demand scanner. In this test Panda Antivirus Pro achieved the #4 rank in malware detection. We still had 18 false positives which, although of low prevalence according to AV-Comparatives.org, prevented us from achieving the Advanced+ certification. We’re doing a lot of work to improve in this area. Panda Antivirus Pro also achieved the #2 rank in the performance test for scanning speeds. The full report can be downloaded from the AV-Comparatives.org website here.

The second test by AV-Comparatives.org that has been published recently is the Whole-Product Test. Similar to the AV-Test.org Full Product Test, it tries to replicate the user experience, including the infection vector. Unlike the AV-Test.org FPT, this one focuses only on malicious websites and behaviour-based dynamic detection. Panda Internet Security scored very well, with a 98.8% protection index. More information can be found at the AV-Comparatives.org site here.

If you’re interested in these types of AV tests, make sure to vote for your favourite AV testing outfit in our open poll here. So far both AV-Comparatives and AV-Test are leading the pack.

AV-Test.org 2010 Test Results

January 31st, 2011

The independent AV testing organization AV-Test.org recently released the latest results of its monthly “Full Product Tests”. The Full Product Tests are a comprehensive look at anti-malware products’ ability to protect end users in real-life situations. They cover three main areas of each product: Protection, Repair and Usability. Under each area there are multiple sub-tests, such as signature detection, behavioural or dynamic detection, etc. The detailed results are available at www.av-test.org/certifications.

In order to gain certification a product has to achieve a minimum score of 12. The results are very revealing, with many products reaching neither the minimum score nor the certification. We are happy to announce that Panda Internet Security has achieved certification in all three quarters in which AV-Test.org has conducted these tests.

On a related note, AV-Test.org recently surpassed the 50 million unique malicious sample mark. This is in line with what our Collective Intelligence servers have analyzed and processed automatically: up to 146 million files (both good and bad).

AV-Comparatives Performance Test 2010

August 23rd, 2010

‘Tis the comparative season ;)

Right after the AV-Test Certification and the Chinese PCSL Full-Protection test, the new AV-Comparatives Performance Test has just been published.

Once again, Panda Internet Security gets an excellent score, obtaining the #1 rank as best performer and winning the Advanced+ award.

For more details be sure to visit AV-Comparatives and download the full report from http://www.av-comparatives.org/comparativesreviews/performance-tests, although I can give you a preview of the full results here:
[Image: full results table of the AV-Comparatives Performance Test, August 2010]

PC Security Labs July 2010 Test Results

August 23rd, 2010

 
Right after the good news from the AV-Test Certification results come the newest test results from the Chinese independent lab PC Security Labs (www.pcsecuritylabs.net).

In this test Panda Internet Security achieved both a “5 Star” rating and a special award for “Top Detection” among all tested solutions.


The full test report can be downloaded from www.pcsecuritylabs.net or directly from our server here.

AV-Test Q2-2010 Full Product Test Results

August 17th, 2010

Finally, the full report of the comprehensive Full-Product Test from the German independent antivirus tester AV-Test.org is out.

Panda Internet Security has received excellent scores in all categories, achieving the top rank along with two other vendors. According to Andreas Marx, CEO of AV-Test.org, “Panda Internet Security was one of only three products which was able to receive the highest scores during this exhaustive test which was performed over a period of 12 weeks”.


The Full-Product Test is a very extensive test which looks at many different aspects of a security solution:

  • Real-World Testing – protection against 0-day and web/email malware
  • Dynamic (Behaviour) Detection Testing – blocking of malware on execution
  • Detection of Large Malware Collection – testbed from last 3 months’ malware
  • Detection of Widespread Malware – based on WildList criteria
  • Repair and removal of widespread malware
  • Removal of malicious components and remediation of system modifications
  • Detection of hidden active rootkits
  • Removal of hidden active rootkits
  • Average slow-down of the computer
  • False positives during static on-demand scanning
  • False positives during dynamic on-access scanning


The complete report can be downloaded from the AV-Test.org website or from our server here.

Some additional comments from AV-Test.org regarding Panda Internet Security:

Panda Internet Security showed impressively high results for the static and dynamic detection of new malware.

The detection and removal of actively running stealth malware such as rootkits was no problem for Panda Internet Security, but it was for many other reviewed products.

We tested not only the protection against known and unknown malware, but also the removal of critters which had previously infected the system, and Panda Internet Security received 5.5 out of 6.0 possible points in these two categories, the highest scores achieved by a program during this exhaustive review.

Not only was the protection against and removal of new malware very high, but at the same time Panda Internet Security had little impact on the system from the usability point of view.

Automated False Positives

June 2nd, 2010

I’ve covered the impact that automated detection systems have on false positives in the past. Hispasec, the makers of VirusTotal, also talked about this issue in their blog post aptly named Antivirus Rumorology. More recently Kaspersky conducted an experiment during a press conference and showed a bunch of journalists how these false positives roll over from one vendor’s engine to the next. Of course, being journalists, they only took home the message “AV copies each other and mostly us”, as shown in the articles published covering the event. Even though the objective of the experiment was put under scrutiny, the fact remains that this is an industry-wide problem and no single vendor is immune to its effects, not even Kaspersky, as we will see.

As some of the regular readers of this blog will probably remember, in March 2010 we published a “PandaCloudTestFile.exe” binary to test the connectivity of Panda products with their cloud-scanning component, Collective Intelligence. This “PandaCloudTestFile.exe” is a completely harmless file that only tells the Panda product to query the cloud. Our cloud-scanning servers have been manually configured to detect this file as malicious, with the sole objective of showing end users that the cloud-scanning component of their product is working correctly.

Initially this file was only detected by Panda as Trj/CI.A (a Collective Intelligence detection) and by Symantec’s Insight (noting that this is not a very common file; whether reputation alone is grounds to flag a file as “suspicious” is itself debatable, but that is perhaps a topic for a future post).

Panda 10.0.2.2 2010.03.10 Trj/CI.A
Symantec 20091.2.0.41 2010.03.11 Suspicious.Insight

A few days later came the first problematic detection, this time from Kaspersky, who detected “PandaCloudTestFile.exe” with a signature, specifically calling it a Bredolab backdoor. I call this detection problematic as it is clearly neither a suspicious detection nor a reputation signature. It is also clearly an incorrect detection, as the file is not related in any way to Bredolab. Soon we will see why this Kaspersky signature is problematic.

Kaspersky 7.0.0.125 2010.03.20 Backdoor.Win32.Bredolab.djl

Over the next few days some other AV scanners started detecting it as well, in many cases with the exact same Bredolab name.

McAfee+Artemis 5930 2010.03.24 Artemis!E01A57998BC1
Fortinet 4.0.14.0 2010.03.26 W32/Bredolab.DJL!tr.bdr
TheHacker 6.5.2.0.245 2010.03.26 Backdoor/Bredolab.dmb
Antiy-AVL 2.0.3.7 2010.03.31 Backdoor/Win32.Bredolab.gen
Jiangmin 13.0.900 2010.03.31 Backdoor/Bredolab.bmr
VBA32 3.12.12.4 2010.03.31 Backdoor.Win32.Bredolab.dmb

In the month that followed (April 2010) a bunch of new engines started detecting it, mostly under the Bredolab name we are now familiar with, although some new names started appearing as well (Backdoor.generic, Monder, Trojan.Generic, etc.).

a-squared 4.5.0.50 2010.04.05 Trojan.Win32.Bredolab!IK
AhnLab-V3 2010.04.30.00 2010.04.30 Backdoor/Win32.Bredolab
AVG 9.0.0.787 2010.04.30 BackDoor.Generic12.BHAD
Ikarus T3.1.1.80.0 2010.04.05 Trojan.Win32.Bredolab
CAT-QuickHeal 10.00 2010.04.12 Backdoor.Bredolab.djl
TrendMicro 9.120.0.1004 2010.04.03 TROJ_MONDER.AET
Sunbelt 6203 2010.04.21 Trojan.Win32.Generic!BT
VBA32 3.12.12.4 2010.04.02 Backdoor.Win32.Bredolab.dmb
VirusBuster 5.0.27.0 2010.04.17 Backdoor.Bredolab.BLU

And to top it all off, during May 2010 the following engines started detecting “PandaCloudTestFile.exe” as well. Here we can even see a “suspicious” detection, probably the only one of them all that makes any sense.

Authentium 5.2.0.5 2010.05.15 W32/Backdoor2.GXIM
F-Prot 4.5.1.85 2010.05.15 W32/Backdoor2.GXIM
McAfee 5.400.0.1158 2010.05.05 Bredolab!j
McAfee-GW-Edition 2010.1 2010.05.05 Bredolab!j
Norman 6.04.12 2010.05.13 W32/Suspicious_Gen3.CUGF
PCTools 7.0.3.5 2010.05.14 Backdoor.Bredolab
TrendMicro-HouseCall 9.120.0.1004 2010.05.05 TROJ_MONDER.AET
ViRobot 2010.5.4.2303 2010.05.05 Backdoor.Win32.Bredolab.40960.K
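
This kind of propagation is easy to monitor for yourself. Below is a minimal sketch of how one might poll VirusTotal for a file’s per-engine verdicts, assuming a valid key for the VirusTotal public API (the v2 file-report endpoint is shown); the hash is a placeholder, not the test file’s actual hash. Run periodically, diffing the output shows exactly when each engine picks up (or drops) a detection.

```python
# Minimal sketch: query VirusTotal for a file's per-engine detections.
# Assumptions: a valid VirusTotal API key; public API v2 file/report endpoint.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"              # assumption: your own VT API key
FILE_HASH = "<md5-of-file-to-track>"  # placeholder, not the real test file hash

params = urllib.parse.urlencode({"apikey": API_KEY, "resource": FILE_HASH})
url = "https://www.virustotal.com/vtapi/v2/file/report?" + params

with urllib.request.urlopen(url) as resp:
    report = json.load(resp)

# response_code == 1 means VirusTotal knows the file
if report.get("response_code") == 1:
    for engine, scan in sorted(report["scans"].items()):
        if scan["detected"]:
            # engine name, signature database date, detection name
            print(f"{engine:<24} {scan['update']}  {scan['result']}")
```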

It is worth noting that consumer products include other technologies, such as whitelisting and digital certificate checks, which could prevent the file from being flagged on the consumer endpoint. Still, the existence of a signature for such a file is a good indicator that it will probably be detected there.
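
As a rough illustration of that decision flow (invented logic for illustration only, not any vendor’s actual pipeline), an endpoint product might consult its whitelist and the file’s digital signature before acting on a signature hit:

```python
# Toy sketch of the endpoint decision flow described above; the ordering
# and checks are assumptions, not any vendor's documented behaviour.
def should_flag(file_hash: str, signed_by_trusted_ca: bool,
                whitelist: set, av_signature_hit: bool) -> bool:
    if file_hash in whitelist:    # known-good file: never flag it
        return False
    if signed_by_trusted_ca:      # valid digital certificate: trust it
        return False
    return av_signature_hit       # otherwise the AV signature decides
```

This is why a bogus signature may never fire on a well-known clean file while still sitting in the database, waiting for a less common one.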

So why am I writing about all this? First of all, to emphasize the point I have tried to make in the past: automated systems have to be maintained, monitored, tuned and improved so that more in-depth analysis is done through them, rather than relying so much on “rumorology”.

Secondly, to show that this is an industry-wide problem that results from having to deal with tens of thousands of new malware variants per day, and that no vendor is immune to it. What matters at the end of the day is that the automated systems are supervised and improved constantly to avoid false positives.

I can certainly understand why vendors point to their signatures being “rolled over” to other AV engines, but these same vendors should also take care not to become the source of these “false positive rumors” in the first place.
 

UPDATE June 3rd, 2010: Reading Larry’s post over at securitywatch, it seems Kaspersky has reacted quickly and has removed their signature for the PandaCloudTestFile.exe file. Thanks Larry & Kaspersky!

AV Comparative Against Chinese Malware

May 10th, 2010

As many of you already know, a large portion of today’s malware is created and/or distributed from China. With that in mind, the Chinese independent AV testing lab PC Security Labs has published a comparative study of AV detection of Chinese malware. The comparative can be downloaded from here in PDF format.

Panda Internet Security 2010 has done fairly well in this test, ranking first in both detection and overall score:
[Image: PCSL Chinese malware test results, May 2010]

The thing I like best about PCSL tests is that, unlike other tests out there, PCSL takes a unified look at the products tested. Not only does it look at static and dynamic (behavioural) detection, but also at static and dynamic false positives, combining everything into a single, unified, global score per product. Other tests only look at these different technologies separately from each other.
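
As a purely hypothetical illustration of such a unified score (the weights below are invented and are not PCSL’s actual formula), combining detection rates and false-positive rates into one number could look like this:

```python
# Hypothetical unified score in the spirit described above.
# All inputs are rates in [0, 1]; the 50/50 weights are assumptions.
def unified_score(static_det: float, dynamic_det: float,
                  static_fp: float, dynamic_fp: float) -> float:
    detection = 0.5 * static_det + 0.5 * dynamic_det
    fp_penalty = 0.5 * static_fp + 0.5 * dynamic_fp
    # high detection raises the score; false positives discount it
    return round(100 * detection * (1 - fp_penalty), 1)

# e.g. 99% static / 97% dynamic detection with 1% static FPs:
# unified_score(0.99, 0.97, 0.01, 0.0) -> 97.5
```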

As some of you may remember, we started taking part in PCSL’s main AV tests in November 2008, and so far we’ve achieved an Excellent score in all the tests.

More info @ the PC Security Labs website or in the main published report at http://article.pchome.net/content-1116841.html (Chinese only).

Panda @ AV-Comparatives

January 26th, 2010

After some years we have decided to participate again in the AV-Comparatives.org tests.

The main driver for this decision has been the evolution of the methodologies employed by AV-Comparatives. We are happy to see that the cloud-scanning components of products are also tested; this is of course important for testing Panda products, as they incorporate not only signature-based cloud-scanning but also cloud-heuristics.

We will participate in all the main tests of AV-Comparatives (On-Demand, Retrospective, False Positive, Malware Removal, etc.) as well as the new Whole-Product Test, a very promising test that replicates the user experience.

However, AV-Comparatives’ Retrospective Test (which consists of freezing a two-week-old signature set and testing it against newer malware to see how good the heuristic engine is) still does not use the cloud-heuristics present in Panda products. Even though this methodology will penalize Panda’s products to some degree, we believe it is important to be present in the rest of the tests performed by AV-Comparatives.
