Recently both AV-Test.org and AV-Comparatives.org announced the results of their dynamic real-world or whole-product tests. These tests try to replicate the user experience by introducing malware to the test machine in much the same way a regular user would encounter it and get infected. We are very proud of the results of Panda Internet Security 2011/2012, which show consistency in providing top-quality detection and protection, ahead of better-known security vendors such as Symantec, Avast, AVG, ESET, Trend Micro, Microsoft, Webroot, etc.
AV-TEST REAL-WORLD TEST – Q2 2011 RESULTS
In the Q2 2011 results of this real-world test, Panda was one of only 4 vendors to achieve a score higher than 15 points.
AV-COMPARATIVES WHOLE-PRODUCT TEST – JUNE 2011
In the June 2011 test Panda Internet Security achieved first place in “blocking” rate, without requiring any user interaction, along with two other vendors.
There have been a few comparative tests published of late. In case you’ve missed any of them, here’s a quick rundown of the most significant ones.
First on the list are the Q1 2011 quarterly results of the Full Product Test (FPT) by AV-Test.org. These FPTs are performed on a monthly basis and are very in-depth, covering pretty much all aspects of modern security software and testing from a user’s perspective by replicating infection vectors and user experience. The areas tested include real-time blocking of malicious websites, detection of relevant and active malware samples (zoo malware and wildlist malware according to AV-Test.org’s own wildlist criteria, not the limited WildList.org list), false positive testing, performance testing, disinfection testing, detection and disinfection of active rootkits, behaviour-based dynamic detection, dynamic false positives, and packing and archive support. Overall it is one of the most comprehensive regular tests out there. It is such a tough test that 5 of the 22 vendors tested did not obtain the minimum score to achieve certification. Panda Internet Security came out with very good scores and achieved certification. The report on Panda Internet Security can be found here (PDF) and the complete results for all vendors here.
Next in line are a couple of tests by AV-Comparatives.org. The first is the traditional On-Demand test from February 2011, which also measures false positives and the performance of the on-demand scanner. In this test Panda Antivirus Pro ranked #4 in malware detection. We still had 18 false positives which, even though they are of low prevalence according to AV-Comparatives.org, prevented us from achieving the Advanced+ certification. We’re doing a lot of work to improve in this area. Panda Antivirus Pro also ranked #2 in the scanning-speed performance test. The full report can be downloaded from the AV-Comparatives.org website here.
The second recently published test by AV-Comparatives.org is the Whole-Product Test. Similar to the AV-Test.org Full Product Test, it tries to replicate the user experience and the infection vector. Unlike the AV-Test.org FPT, however, it focuses only on malicious websites and behaviour-based dynamic detection. Panda Internet Security scored very well, with a 98.8% protection index. More information can be found at the AV-Comparatives.org site here.
If you’re interested in these types of AV tests, make sure to vote on your favourite AV testing outfit in our open poll here. So far both AV-Comparatives and AV-Test are leading the pack.
The independent AV testing organization AV-Test.org recently released the latest results of its monthly “Full Product Tests”. The Full Product Tests are a comprehensive look at anti-malware products’ ability to protect end users in real-life situations. They cover three main areas of each product: Protection, Repair and Usability. Under each area there are multiple sub-tests, such as signature detection, behavioural or dynamic detection, etc. The detailed results are available at www.av-test.org/certifications.
In order to gain certification a product has to achieve a minimum score of 12. The results are very revealing, with many products failing to reach the minimum score and thus missing certification. We are happy to announce that in all three quarters in which AV-Test.org has conducted these tests, Panda Internet Security has achieved certification.
On a related note, AV-Test.org recently surpassed the 50 million unique malicious sample mark. This is in line with what our Collective Intelligence servers have analyzed and processed automatically: up to 146 million files (both good and bad).
‘Tis the comparative season
Right after the AV-Test Certification and the Chinese PCSL Full-Protection test, the new AV-Comparatives Performance Test has just been published.
Once again, Panda Internet Security gets an excellent score, obtaining the #1 rank as best performer and winning the Advanced+ award.
For more details be sure to visit AV-Comparatives and download the full report from http://www.av-comparatives.org/comparativesreviews/performance-tests, although I can give you a preview of the full results here:
Right after the good news from the AV-Test Certification results come the latest test results from Chinese independent lab PC Security Labs (www.pcsecuritylabs.net).
In this test Panda Internet Security achieved both a “5 Star” rating and a special “Top Detection” award among all tested solutions.
The full test report can be downloaded from www.pcsecuritylabs.net or directly from our server here.
Finally, the full report of the comprehensive Full-Product Test from German independent antivirus tester AV-Test.org is out.
Panda Internet Security has received excellent scores in all categories, achieving the top rank along with two other vendors. According to Andreas Marx, CEO of AV-Test.org, “Panda Internet Security was one of only three products which was able to receive the highest scores during this exhaustive test, which was performed over a period of 12 weeks”.
The Full-Product Test is a very extensive test which looks at many different aspects of a security solution:
- Real-World Testing – protection against 0-day and web/email malware
- Dynamic (Behaviour) Detection Testing – blocking of malware on execution
- Detection of Large Malware Collection – testbed from last 3 months’ malware
- Detection of Widespread Malware – based on WildList criteria
- Repair and removal of widespread malware
- Removal of malicious components and remediation of system modifications
- Detection of hidden active rootkits
- Removal of hidden active rootkits
- Average slow-down of the computer
- False positives during static on-demand scanning
- False positives during dynamic on-access scanning
The complete report can be downloaded from the AV-Test.org website or from our server here.
Some additional comments from AV-Test.org regarding Panda Internet Security:
Panda Internet Security showed impressively high results for the static and dynamic detection of new malware.
The detection and removal of actively running stealth malware such as rootkits was no problem for Panda Internet Security, but it was for many other reviewed products.
We tested not only the protection against known and unknown malware, but also the removal of critters which had previously infected the system, and Panda Internet Security received 5.5 out of 6.0 possible points in these two categories, the highest scores achieved by a program during this exhaustive review.
Not only was the protection against and removal of new malware very high, but at the same time Panda Internet Security had a lower impact on the system from the usability point of view.
As many of you already know, a large portion of today’s malware is created and/or distributed from China. With that in mind, Chinese independent AV testing lab PC Security Labs has published a comparative study of AV detection of Chinese malware. The comparative can be downloaded from here in PDF format.
Panda Internet Security 2010 did very well in this test, ranking first in both detection and overall score:
The thing I like best about PCSL tests is that, unlike other tests out there, PCSL takes a unified look at the products tested. Not only does it look at static and dynamic (behavioural) detection, but also at static and dynamic false positives, combining everything into a single, unified, global score per product. Other tests only look at each of these technologies separately.
As some of you may remember, we started taking part in PCSL’s main AV tests in November 2008 and so far we’ve achieved an Excellent score in every test.
More info at the PC Security Labs website or in the main published report at http://article.pchome.net/content-1116841.html (Chinese only).
Similar to the EICAR file, we have created a small “Cloud Test File” which can be used by testers and users to verify if their Panda product can successfully connect to the Collective Intelligence cloud-scanning servers.
The file PandaCloudTestFile.exe should be detected:
- During HTTP download
Download PandaCloudTestFile.exe. Its MD5 hash is E01A57998BC116134EE96B6D5DD88A13. Alternatively, you can download a password-protected RAR file with the EXE inside. The password is “panda”.
DISCLAIMER: This file is *not malicious*. If it is detected it simply means your Panda product can correctly connect to Collective Intelligence.
NOTE TO OTHER AV VENDORS: Please do not add detection for this file.
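If you manage to download the file (i.e. your product did not block it in transit), you can check that the copy is intact by comparing its MD5 against the published hash. A minimal Python sketch (the local file path is an assumption, not part of the test file itself):

```python
import hashlib

# Published MD5 of PandaCloudTestFile.exe (from the post above)
EXPECTED_MD5 = "E01A57998BC116134EE96B6D5DD88A13".lower()

def file_md5(path, chunk_size=65536):
    """Compute the MD5 hex digest of a file, reading it in chunks."""
    md5 = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            md5.update(chunk)
    return md5.hexdigest()

def verify_download(path, expected=EXPECTED_MD5):
    """Return True if the file at `path` matches the published hash."""
    return file_md5(path) == expected.lower()
```

Note that if your Panda product intercepts the HTTP download, the file may never reach disk at all, which is exactly the behaviour the test file is meant to demonstrate.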
After some years we have decided to participate again in the AV-Comparatives.org tests.
The main driver for this decision has been the evolution of the methodologies employed by AV-Comparatives. We are happy to see that the cloud-scanning components of products are also tested, which is of course important for testing Panda products as they incorporate not only signature-based cloud-scanning but also cloud-heuristics.
We will participate in all the main tests of AV-Comparatives (On-Demand, Retrospective, False Positive, Malware Removal, etc.) as well as the new Whole-Product Test, a very promising test that replicates the user experience.
However, AV-Comparatives’ Retrospective Test (which consists of freezing a two-week-old signature set and testing it against new malware to see how good the heuristic engine is) still does not use the cloud-heuristics present in Panda products. Even though this methodology will penalize Panda’s products to some degree, we believe it is important to be present in the rest of the tests performed by AV-Comparatives.