GCN : February 2013
"To some extent, they are becoming less effective," Gaffan said, because of the bad guys' ability to rapidly switch between addresses to deliver attacks. But a blacklist can still be useful as long as it is maintained properly. This means having the intelligence sources to add to the blacklist quickly so it's up to date. And just as important is removing old addresses and domains when they are no longer being used maliciously.

"There is more to it than just adding IP addresses," Gaffan said. If old addresses are not removed, they can create what amounts to a self-inflicted DOS when legitimate traffic is blocked.

The simplest type of DOS attack to deal with is a network attack with its flood of requests, and the simplest way to deal with it is to just absorb the traffic. It's just a matter of capacity. "We mitigate Layer 3 [network] and Layer 4 [transport] attacks at the edge, the same way we handle large flash crowds," said Trentley.

That was the technique used in the July 4 wave of DDOS attacks in 2009, targeting government, news and financial sites primarily in South Korea and the United States. Despite the apparent organization behind them, the attacks produced only about 20 megabits of data per second, which did not cause major disruptions. "Fortunately, they hit us where we were strongest," Trentley said.

Akamai and other organizations have had to adapt to respond to more subtle application layer attacks that target back-end resources. But the principle of identifying and dropping the traffic as soon and as far from the targeted servers as possible still applies. Akamai identifies and drops incomplete messages at the edge.

Deciding what DOS defenses to maintain in-house and what, if any, to outsource to a third party requires balancing the value of the services being protected with the capacity to defend them.
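The blacklist hygiene Gaffan describes, adding entries quickly and expiring stale ones so legitimate traffic is not blocked, can be sketched as a blacklist with a per-entry time-to-live. This is an illustrative sketch only; the class name and the TTL value are assumptions, not any vendor's actual implementation.

```python
import time


class ExpiringBlacklist:
    """IP blacklist whose entries age out unless refreshed.

    Illustrative sketch of the maintenance discipline described in the
    article: add addresses quickly, and expire them so a reassigned
    address does not cause a self-inflicted denial of service.
    """

    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds
        self._entries = {}  # ip -> expiry timestamp

    def add(self, ip, now=None):
        """Add or refresh an address; its block expires after the TTL."""
        now = time.time() if now is None else now
        self._entries[ip] = now + self.ttl

    def is_blocked(self, ip, now=None):
        """Check an address, lazily removing it if its entry has expired."""
        now = time.time() if now is None else now
        expiry = self._entries.get(ip)
        if expiry is None:
            return False
        if now >= expiry:
            # Entry has aged out: drop it so legitimate traffic from a
            # reassigned address is no longer blocked.
            del self._entries[ip]
            return False
        return True
```

In practice the TTL would be tuned per intelligence source, and expiry would be driven by fresh threat data rather than a fixed clock, but the principle is the same: removal is as much a part of maintenance as addition.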
On-premises, in-line equipment to monitor, detect and respond to attacks can react quickly and reduce downtime. But do you have the resources to devote to those defenses continuously during the periods when you are not under attack?

Using ISPs, cloud service providers and security companies for early warning and response can be cost-effective. But if you do not want third parties monitoring your traffic, they will have to rely on you to notify them of problems, which can delay response and result in additional downtime.

Striking the right balance will require a thorough understanding of your environment, mission and resources, as well as the capabilities of vendors and service providers, and then combining adequate training and resources in-house with the right third-party agreement.

"There is no way you are going to mitigate these attacks from a fixed infrastructure. You can't." --- Fran Trentley, senior service line director for Akamai Technologies' public sector business

In August 2012, an eight-hour DNS DDOS attack against AT&T took down the company's site and disrupted access to customers on AT&T's network. An attack against Internet domain name registrar GoDaddy in November took down domains using its DNS service.

The U.S. government has been a leader in advancing DNS security, and the Office of Management and Budget mandated the deployment of the DNS Security Extensions on .gov domains by 2010. As of Jan. 22, 76 percent of tested government domains had DNSSEC operational, according to the National Institute of Standards and Technology. This is not complete compliance, but it is far ahead of the 1 percent of tested private-sector domains that are using DNSSEC.

Unfortunately, even wider use of DNSSEC would not solve the latest threat, Herberger said. "It never was designed to prevent denial of service attacks," but rather to protect the integrity of responses to DNS queries threatened by cache poisoning attacks.
"The potential problem with cache poisoning hasn t materialized." The change in threat requires a change in defenses, imposing rules on traditionally open, unquestioning DNS servers. The number and type of queries accepted could be defined and limited, and servers could challenge queries to determine if they exhibit malicious behavior. The challenge could consist of something as simple as dropping a first query and waiting to see if it is repeated and in what time. The timing of a second query could help indicate if it is being generated by a bot. This could be done with a minimum of overhead and delay, Herberger said, and could be cheaper and more e ective than overprovisioning. --- William Jackson