Distributed Denial of Service (DDoS) Attacks Explained
Simply put, in a denial of service attack, the attacker sends repeated messages to a target website with such frequency that the website cannot keep up and slows to a crawl, in effect taking it offline.
This works great against small websites, but larger websites are hosted on several computers and use round-robin DNS resolution, so that multiple machines appear as one site. These can handle a lot more traffic, so a different tactic is needed.
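To see why flooding one address isn't enough, here is a minimal sketch of the round-robin idea, using a hypothetical pool of server IPs (the addresses and pool are made up for illustration):

```python
from itertools import cycle

# Hypothetical pool of machines that all answer for one hostname.
SERVER_POOL = ["192.0.2.10", "192.0.2.11", "192.0.2.12"]

def round_robin(pool):
    """Hand out addresses in rotation, the way round-robin DNS
    rotates A records: each new lookup lands on the next machine."""
    return cycle(pool)

resolver = round_robin(SERVER_POOL)
first_six = [next(resolver) for _ in range(6)]
print(first_six)
# Six "lookups" spread evenly: each machine is hit only twice.
```

An attacker hammering the name only ever reaches one machine per connection, so the load gets spread across the whole pool.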
Attackers will usually use zombie machines that they have infected with a virus (also called 'bots') to work together to attack a single site. Sometimes hundreds or even thousands of systems are used in this manner. (Keep your system and anti-virus updated! :) ) The website is hit with so many requests that it bogs down to the point where it can no longer respond. This is called a Distributed Denial of Service attack.
Most of the "hacktivists" involved in the Wikileaks DDoS attacks are using these DDoS attacks to shut down each other's websites. The hacktivists are receiving a lot of flak from the computer security "experts" for using these old-style attacks (which kinda doesn't make sense, because they do seem to be working).
For you see, there is a newer, much more efficient method of denial of service attack called "Layer 7 DoS". In this type of attack, instead of flooding a server with thousands of message packets, the webserver application itself is attacked. Partial requests are opened with the server but never finished, leaving the server in a waiting state. It only takes a handful of these requests to bog down a server and take it offline.
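The trick behind these partial requests comes down to how HTTP is framed: a server won't process a request until it sees the blank line that ends the headers. A minimal sketch of that framing rule (no live server involved, just the byte-level check a server performs):

```python
# A complete HTTP request ends its header section with a blank
# line (\r\n\r\n). The server reads until it sees that marker.
complete = b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n"

# A Layer 7 (Slowloris-style) attacker sends valid-looking headers
# but deliberately never sends the terminating blank line, so the
# server keeps the connection open, tying up a worker while it waits.
partial = b"GET / HTTP/1.1\r\nHost: example.com\r\nX-a: 1\r\n"

def headers_finished(data: bytes) -> bool:
    """The check a web server makes before processing a request."""
    return b"\r\n\r\n" in data

print(headers_finished(complete))  # True  -> server can respond
print(headers_finished(partial))   # False -> server keeps waiting
```

Because each held-open connection occupies a server worker, a small number of these trickle-fed requests can exhaust the pool, with none of the bandwidth a packet flood requires.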
In a Layer 7 denial of service attack, a single attacker could take almost any single website down at will. These attacks literally act like an on/off switch. The Jester used such a program he created, called "Xerxes", to take Wikileaks offline on the first day of the latest release. I have seen a different Layer 7 DoS program run, and it is brutally effective.
The scary part is that these attacks have existed for quite a while now, and because they exploit a basic function of webservers, neither Apache nor Microsoft has moved to fix them. That is the official word, though; truly fixing the issue would probably require major rewrites, and they are not willing to do that at this point. You will probably see these issues addressed in the next releases of Apache and IIS.