Bitdefender wins Anti-Virus Test while Microsoft AV Failed Certification

Interesting Anti-Virus news from the AV-Test labs this week. Every two months, AV-Test puts Anti-Virus programs through their paces and scores them on how well they detect, protect against and remove viruses. One of our favorite Anti-Virus programs, Bitdefender, won the contest again this month, while surprisingly Microsoft’s Security Essentials failed to be certified.

Bitdefender won by earning 17 out of 18 points:

[AV-Test results chart: Bitdefender]

It earned a perfect score in both the Protection and Repair categories.

Microsoft, which missed the certification cut-off of 11 points, earned 10.5 points:

[AV-Test results chart: Microsoft]

Microsoft scored only 1.5 in Protection and 3.5 in Repair.

So how well did your favorite Anti-Virus make out?

Check out the full report.

3 Tips for Effective Vulnerability Assessments

Every business has different needs, but they also have many things in common. Today, almost every business that reaches a certain size has an IT infrastructure, and with that shared infrastructure come similar needs.

One such shared need is ensuring that you maintain a secure business network infrastructure.

There are many things an organization can do to keep its network secure, ranging from patch management to firewalls. However, one tactic that is often overlooked is performing a periodic vulnerability assessment.

Regular vulnerability assessments are essential because threats to your network security continually change and evolve, and your security should be able to match this. A user’s PC or network access point might be secure today, but it could become completely vulnerable tomorrow simply because some malicious attacker might have discovered a previously unknown attack vector.

A vulnerability assessment doesn’t come without its own associated costs. You need to strike a balance between security and inconvenience for your end users. Also, it is important that your vulnerability assessments are conducted correctly, as an error could result in the very problems that you are trying to avoid.

With that in mind, we have prepared some tips to ensure your vulnerability assessments are effective, helping you to keep your business network secure.

1) Select a proper schedule for your vulnerability assessment:

Vulnerability assessments cover many different tasks: machines are scanned for missing software patches, software is checked for correct configuration, and the network is examined for unauthorized changes. You do not want to find that new users have been created, new shares opened, or new PCs or hardware connected to your network without your knowledge.

However, all these checks affect your network performance, making it essential that they are run at times when they least impact productivity. At the same time, they should not be run so infrequently that they leave a large window of opportunity for any attacker to exploit.

Ideally your vulnerability assessments should be run daily and outside of normal business hours. This schedule should be carefully tailored to meet your specific business needs.
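
To give a concrete sketch of what that schedule might look like, on a Linux scan server you could drive the assessment from cron. Here nightly-vuln-scan.sh and scanuser are hypothetical stand-ins for whatever scanner wrapper script and service account you actually use:

# /etc/cron.d/vuln-scan - run the assessment daily at 2:30 AM, outside business hours
# (nightly-vuln-scan.sh and scanuser are stand-ins for your own script and scan account)
30 2 * * * scanuser /usr/local/bin/nightly-vuln-scan.sh >> /var/log/vuln-scan.log 2>&1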

2) Do your testing before implementing any changes:

A vulnerability assessment is designed to find deficiencies in your network, be they missing patches or an incorrect configuration. When this occurs, your vulnerability assessment software will offer you a number of options to remedy the situation, or provide you with information on how you might tackle the vulnerability that has been found.

It is important to understand that every network is different. Every computer has different software installed, and is comprised of different hardware. Software patches will alter the core of the software you run and this can lead to potential problems. Likewise, any changes you make to secure your network can also result in issues due to the unique nature of your system.

This is why it is always recommended to have a test environment that mirrors your live network as closely as possible. Any changes can be implemented on this test network first, before live deployment. This way you avoid rolling out changes that actually work to the detriment of your network’s operation.

3) Disaster recovery plans are a must:

A bad practice often seen in vulnerability assessment and remediation plans is to only think about how to solve an issue once we actually come face-to-face with the problem itself.

By doing this you can cause unnecessary downtime as you grapple with unexpected scenarios. A better way to deal with such undesirable events is to plan ahead and create disaster recovery plans for the most common eventualities. These should cover a failed patch deployment that results in system instability, the measures to take when an intrusion is detected, and the course of action to follow when you encounter a virus infection.

Vulnerability assessment is an important component in maintaining business network security. However, like so many other tasks, it needs to be approached in the right manner. Utilizing the three simple tips above can save you a lot of time in the future and ensure you and your network steer clear of some insidious pitfalls.

This guest post was provided by Emmanuel Carabott on behalf of GFI Software Ltd. GFI is a leading software developer that provides a single source for businesses to address their network security, content security and messaging needs. Learn more about what to look out for when choosing a vulnerability scanner:

http://www.gfi.com/network-security-vulnerability-scanner

All product and company names herein may be trademarks of their respective owners.

Kinectasploit v2 – A Matrix Like Kinect Interface to Security Tools

I was visiting our friend Vivek’s site over at Security Tube and found this stunning Defcon 20 Kinectasploit v2 demonstration by Jeff Bryner. I have never seen Kinectasploit before, so this was quite a treat.

Kinectasploit is a Blender 3D first person shooter game environment that looks like the Matrix construct room. But instead of racks of weapons, you have access to 20 security programs. And with a combination of movement and hand gestures you choose a target network system, and then run nmap, Nessus, Ettercap or any of the other included security programs.

Jeff runs P0wnlabs, an interesting-looking online lab where you can learn and try out your mad hacking skillz.

If you want to play with Kinectasploit, Jeff provides a Github page with all necessary software.

This really has to be seen to be believed. Great job Jeff!

An Eleven Character Linux Denial of Service Attack & How to Defend Against It

Sometimes it is the oddest, most harmless-looking things that cause problems. I can’t think of anything more innocuous looking than the following Linux shell command:
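
:(){ :|:& };: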

But DO NOT run this on a Linux system, or chances are that you will perform a Denial of Service attack on your own machine! You may have to hard reset your system to get it back and you COULD LOSE DATA!

This is not new; I have seen it floating around before, and it looked interesting. It was referenced in a 2007 post that said it didn’t work anymore because most modern OSes are configured to protect against it. So of course I just HAD to try it.

I booted up my Ubuntu 12.04 system, opened a command shell, entered the command and…

It locked dead!

Okay just what is this command???

FORK BOMB PROCESS ATTACK

Meet the “Fork Bomb”. Basically, all it does is instruct Linux to open new processes, over and over again, an almost infinite number of times. Your RAM and CPU usage rises until the system no longer responds to input.
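
To see how it works, here is the same command spread out with comments. The trick is that Bash happily accepts “:” as a function name, which is what makes the one-liner so cryptic (this multi-line version is just as dangerous, so do not run it either):

:()       # define a function named ':'
{         # open the function body...
:|:&      # ...which calls ':' piped into a second ':', with the pair sent to the background
};        # close the function definition
:         # then call ':' once to set off the chain reaction

Each invocation spawns two more copies of itself, so the process count doubles until the system runs out of resources.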

Let’s see what it does to an Ubuntu 12.04 system.

Here is an Ubuntu 12.04 System Monitor screenshot of a system before I ran the Fork Bomb:

The CPU and Memory usage are steady.

Now once the Fork Bomb is started:

Notice the significant increase in CPU and RAM usage. It even doubled the CPU usage on the virtual host, taking it from 8% to 17% while the attack was running.

I lost all control of the Ubuntu system. Even the keyboard lights were unresponsive. Supposedly some operating systems will recover if left alone long enough. But I waited a while and I never got control back.

(Okay, for all those out there claiming that it was just a virtual machine, I tried it on a stand-alone Ubuntu 12.04 system with the same results. Okay, there was a quarter-second pause before I lost control of the machine!)

DEFENDING AGAINST THE ATTACK

This is very easy to defend against. All you need to do is set limits on the number of processes that a user can open. These can be set per user, per group or globally, and you can set them in one of two ways.

You can use the ulimit command for an instant change that only lasts until the user logs off, or make the change permanent by editing the /etc/security/limits.conf file.

To use the ulimit command, simply type “ulimit -u” followed by the number of processes that you want users to be allowed to run. Note that ulimit is a shell builtin, not a standalone program, so it is run directly in the shell (no sudo needed) and the change applies to that shell session and its children. So to set the limit to 512, just type:

ulimit -u 512
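
If you want to confirm the change took, ulimit run with no number prints the current limit:

ulimit -u    # run with no number; it should now report 512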

Does this work? Absolutely. After running ulimit, the fork bomb is effectively throttled:

As you can see from the screenshot above, there was very little increase in RAM usage and the CPU usage was much more tolerable. And more importantly, I had full control of the system.

You can also change the /etc/security/limits.conf file to make the change permanent. Full instructions can be found on AskUbuntu.com, but basically just add the following line to the config file:

*    hard    nproc    512

The “*” means apply the change to everyone, “hard” means it is a hard limit, and “nproc 512” caps the number of processes at 512.
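
The same file also supports per-user and per-group limits; for example (someuser and somegroup are just stand-in names):

someuser      hard    nproc    512
@somegroup    hard    nproc    1024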

You need to adjust the number of processes to whatever setting works best for your system; 512 seemed to work great on mine. Don’t set the number too low, or you may create some “denial of service” type issues of your own, lol.

Oh, and for all the Mac fanboys out there, this command didn’t seem to have any effect when run on a newer Mac. Okay, my friend ran it and it ate up 24 GB of RAM, but seeing as he had 64 GB of RAM in the system, it just laughed the attack off.

Even running it on a Mac with 24 GB of RAM had no discernible effect, other than a screen full of “bash: fork: Resource temporarily unavailable” error messages. It looks like Macs have process limits enabled by default. (Thanks Command_Prompt and Bill!)
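
If you are curious what those default caps look like on a Mac, you can check them from a terminal; kern.maxprocperuid is the per-user process ceiling:

sysctl kern.maxproc kern.maxprocperuid    # system-wide and per-user process caps
ulimit -u                                 # the limit for the current shell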

This should be obvious, but for the record, you should never run this command on systems that you do not own… Or put it in someone’s startup script.

But knowing how to limit a user’s ability to run processes is very important, and setting those limits on Linux systems, where it is not done by default, could curtail some problems before they surface.