Botnets evolve toward a decentralized command structure
April 25, 2011
Law enforcement agencies are closing in on botnets, whose weak spot is their centralized command structure. Attackers are therefore increasingly adopting peer-to-peer (P2P) architectures in their designs. P2P, however, is vulnerable to pollution techniques. Researchers at Los Alamos National Laboratory have investigated what pollution-resilient P2P botnets might look like, in an attempt to stay ahead of the game.
A botnet, or robot network, consists of thousands, sometimes millions, of computers infected with a robot program. This piece of code allows the attacker, also known as the botherder, to remotely control the infected computers and make them perform tasks. Because the zombie computers appear to behave normally to their users, the enslavement is seldom noticed. Botnets are used for Distributed Denial of Service (DDoS) attacks, spamming and information theft: harvesting passwords or credit card numbers that can be sold on the black market.
Law enforcement agencies are closing in on botnets. Only two weeks ago the US Justice Department asked the courts to allow the nonprofit Internet Systems Consortium (ISC) to take over millions of computers infected with the Coreflood bot in order to install a robot program of their own. ISC's code has the sole purpose of kill-switching the Coreflood bot, Wired reported.
Coreflood's weak spot is its centralized structure. Although the network consists of millions of computers, only a few command-and-control servers communicate with all the zombies. Swapping those servers for ones controlled by ISC allowed ISC to harvest the zombies' IP addresses and send the kill-switch command to all of them.
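To see why that matters, consider a purely hypothetical sketch in Python. The hostnames, message format and the kill command below are invented for illustration and are not Coreflood's actual protocol; the point is simply that every bot phones home to the same short list of addresses, so whoever answers at those addresses sees every infected machine and decides what it does next.

# Purely hypothetical sketch of a centralized botnet's single point of failure.
# Hostnames, message format and the "kill" command are invented for illustration;
# they are not Coreflood's real protocol.

CNC_SERVERS = ["cnc1.example.net", "cnc2.example.net"]  # the few C&C servers

def log_infected_host(ip):
    print("infected host seen:", ip)           # the defender harvests zombie IPs

def sinkhole_server(client_ip):
    """What a defender-controlled replacement C&C server could answer."""
    log_infected_host(client_ip)
    return {"command": "kill"}                  # order the bot to disable itself

def send_request(server, my_ip):
    # Stand-in for the bot's real network call; in this sketch every C&C
    # hostname already resolves to the defender's sinkhole.
    return sinkhole_server(my_ip)

def bot_check_in(my_ip):
    # The bot cannot tell whether the botherder or a defender is answering:
    # it simply obeys whatever the C&C address returns.
    for server in CNC_SERVERS:
        reply = send_request(server, my_ip)
        if reply:
            return reply["command"]

print(bot_check_in("203.0.113.42"))             # -> kill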
In answer to law enforcement's improved techniques for exterminating centralized botnets, botherders are developing decentralized P2P architectures for their botnets. In a P2P network all nodes are equal and information is distributed from bot to bot. Because there is no single point of command, there is no single point of failure: taking out a bot does not severely affect the network, since its siblings simply jump to the next bot in line.
However, P2P has proven to have security flaws of its own. Ironically, the equal status of all nodes is both P2P's strength and its vulnerability. Since no single authority is recognized, any bot can initiate the distribution of information. Botnet hunters can capture a bot and manipulate it into seeding information that disrupts the network. These pollution techniques have already proven effective.
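Both the strength and the weakness show up in a toy flooding protocol, invented here for illustration rather than taken from any real botnet: commands spread from neighbor to neighbor, and because no origin is ever verified, a single captured peer can spread a disruptive message just as easily as the botherder can.

# Toy flooding protocol (invented for illustration, not any real botnet's):
# every peer forwards each new command to its neighbors, and nothing
# checks where a command originally came from.

import random

class Peer:
    def __init__(self, name):
        self.name = name
        self.neighbors = []     # a few randomly chosen other peers
        self.seen = set()       # command IDs already processed

    def receive(self, cmd_id, payload):
        if cmd_id in self.seen:              # ignore duplicates
            return
        self.seen.add(cmd_id)
        # ... a real bot would act on the payload here ...
        for n in self.neighbors:             # then pass it on, bot to bot
            n.receive(cmd_id, payload)

# Build a small random overlay in which no node is special.
random.seed(1)
peers = [Peer(f"bot{i}") for i in range(50)]
for p in peers:
    p.neighbors = random.sample([q for q in peers if q is not p], 3)

# The botherder injects a command through an arbitrary peer...
peers[17].receive("cmd-001", "send spam")
print(sum("cmd-001" in p.seen for p in peers), "of 50 bots got the real command")

# ...but a captured peer can inject a polluting message the same way,
# because the protocol recognizes no single authority.
peers[3].receive("cmd-002", "stop and uninstall")
print(sum("cmd-002" in p.seen for p in peers), "of 50 bots got the polluted one")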
Researchers at Los Alamos National Laboratory, however, worry that botherders will soon find ways to make their networks pollution-resistant. In an attempt to stay ahead of the game, researchers Guanhua Yan, Duc Ha and Stephan Eidenbenz wrote a paper in which they design a hypothetical P2P botnet that is resilient to pollution techniques, naming it AntBot: an anti-pollution peer-to-peer botnet. AntBot is based on a tree-like command structure in which only a few bots at the top are authorized to initiate the distribution of information. The far more numerous low-level bots can only pass information on, which greatly reduces the chance of pollution. To keep the high-level bots from being captured, the hierarchy is reconfigured every day. Because the bots holding authoritative status in the chain of command change daily, botnet hunters have only 24 hours to pinpoint the command-and-control layer and apply their pollution techniques.
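A deliberately simplified sketch of the idea, not the construction from the paper, looks like this: the hierarchy is derived each day from a secret only the botherder knows, and only the bot currently at the top may originate commands; everything below can merely relay them.

# Deliberately simplified illustration of the idea, NOT the construction
# from the AntBot paper: the command hierarchy is recomputed daily from a
# secret only the botherder knows, and only the day's root may originate
# commands; all other bots can merely relay them.

import hashlib, random, datetime

BOTS = [f"bot{i:02d}" for i in range(15)]
SECRET = "known-only-to-the-botherder"         # hypothetical shared secret

def hierarchy_for(day: datetime.date):
    """Deterministically reshuffle the bots from (secret, date)."""
    digest = hashlib.sha256(f"{SECRET}|{day}".encode()).digest()
    seed = int.from_bytes(digest[:8], "big")
    order = BOTS[:]
    random.Random(seed).shuffle(order)
    return order[0], order[1:4], order[4:]      # root, mid-level relays, leaves

def originate(sender, command, root):
    # Low-level bots refuse to ORIGINATE anything: a captured leaf cannot
    # seed polluting commands, it can only forward what its parent sent.
    if sender != root:
        raise PermissionError(f"{sender} is not authorized to issue commands")
    return command

today = datetime.date.today()
root, relays, leaves = hierarchy_for(today)
print("today's root:", root, "| relays:", relays)
print(originate(root, "update payload", root))               # accepted
# originate(leaves[0], "uninstall", root)                     # -> PermissionError

# Tomorrow a different bot sits at the top, so defenders who pinpoint
# today's root have at most 24 hours to exploit that knowledge.
print("tomorrow's root:", hierarchy_for(today + datetime.timedelta(days=1))[0])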