GCN : November 2013
said. "I'm guessing we've seen 50 percent to 60 percent overall energy savings." DOE's Hanford data center is reporting an average Power Usage Effectiveness (PUE) rating of 1.8, with 1.0 being a perfect score.

"We've found that in most of the things we do out here, the green benefits come out of doing business the right way," said Ben Ellison, DOE Hanford's chief information officer. "We took it on face value that anything we did to be more efficient from an IT perspective and a cost perspective would make us more efficient from an energy perspective, too."

Now that it has built this cost-effective private cloud infrastructure, the Hanford site is offering IT services to other agencies. "We are now hosting other federal agencies in our data center because we have room and the capacity to support other workloads," Eckman said. "That will allow others to close down inefficient data centers they are operating."

STATE OF CONSOLIDATION

In Utah, the state government cut $4 million out of its annual IT budget as a result of consolidating 35 data centers and machine rooms into a primary and a failover site to support its 23,000 employees. The consolidation effort involved virtualizing servers, deploying modern storage systems and reducing IT staff by 20 percent.

During this 18-month consolidation effort, Utah decreased the number of physical servers it operates from 1,864 to 591 -- a reduction of more than two-thirds. Utah standardized on HP servers running VMware and Hitachi storage systems.

After the consolidation, Utah upgraded its primary data center to an evaporative water-chilled air conditioning system and deployed hot-and-cold aisle containment. This investment immediately slashed the data center's monthly power bill from $25,000 to $16,500.

"The air conditioning units in the data center consume about 75 percent to 85 percent of the power in the data center," says Russell Smith, data center manager.
"The capital outlay on the cooling system was $300,000, with $200,000 from [American Recovery and Reinvestment Act] funds. Our ROI was just over a year."

Utah deployed the Niagara Framework, an environmental monitoring system, so that it can measure the energy efficiency of its data center and keep tweaking settings to improve it. Currently, the Utah data center's PUE is 1.3, an impressive feat.

"If you look at the modeling we did at the start of the consolidation project, we expected 68 percent energy savings," says Dave Fletcher, Utah's chief technology officer. "I assume we achieved at least that much given the number of servers we reduced and the fact that we closed inefficient machine rooms and moved to a more centrally cooled environment. We estimate we saved the environment about 15 million pounds of carbon dioxide emissions a year."

Next on the agenda for Utah is to upgrade to an evaporative air conditioning system at its backup data center to cut costs there. Utah also is testing desktop virtualization with about 1,200 state employees.

Now that it has an efficient and cost-effective data center -- and 60 percent excess floor space -- Utah plans to market private cloud infrastructure services to county and city governments. "We have a university hospital which has quite a few racks here," Fletcher said. "With our new infrastructure, we can rapidly spin up new services."

CREATIVE FINANCE

With examples like Utah and DOE Hanford in mind, federal agencies are considering creative ways to finance the data center upgrades needed to achieve maximum returns from consolidation. One option is Energy Savings Performance Contracts (ESPCs), which allow agencies to accomplish energy savings projects without upfront capital outlays. Instead, the contractor is paid back out of future energy savings.
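Smith's payback figure can be checked with simple arithmetic from the numbers in the article. A minimal sketch; treating the ARRA contribution as offsetting the state's own outlay is an assumption about how the ROI was figured, not something the article states:

```python
# Back-of-the-envelope payback check for Utah's cooling upgrade.
# Dollar figures are from the article; netting out the ARRA funds
# is an assumption about how the ROI was calculated.
CAPITAL_OUTLAY = 300_000   # total cooling-system cost ($)
ARRA_FUNDS = 200_000       # American Recovery and Reinvestment Act share ($)
OLD_MONTHLY_BILL = 25_000  # data center power bill before the upgrade ($)
NEW_MONTHLY_BILL = 16_500  # power bill after the upgrade ($)

monthly_savings = OLD_MONTHLY_BILL - NEW_MONTHLY_BILL  # $8,500 per month
net_outlay = CAPITAL_OUTLAY - ARRA_FUNDS               # $100,000 borne by the state

payback_months = net_outlay / monthly_savings
print(f"Monthly savings: ${monthly_savings:,}")
print(f"Payback on net outlay: {payback_months:.1f} months")
```

On these assumptions the payback lands at roughly a year, consistent with the "just over a year" ROI Smith cites.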
ESPCs have "been around for a while, but it's a fairly new concept in the data center and IT environment," said Jay Owen, vice president of Schneider Electric IT Federal Solutions. Owen said Schneider Electric upgraded the air conditioning systems at a Veterans Affairs Department data center with ESPC funds.

"Most data centers in the federal market are fairly old, and the heating, cooling and power infrastructure was designed for much lower density equipment. If you bring in all of these virtualized IT systems, it's not going to work with the current infrastructure, or, if it does work, it will be very inefficient," he added.

To solve this problem, Schneider and other vendors are offering high-density pods that can be deployed like building blocks in an existing data center to gradually improve energy efficiency and drive down electric bills. The pods have localized cooling in the equipment racks, which is more efficient than floor-based air conditioning. Military and intelligence agencies are among the early adopters of pods.

"Here is a systematic approach to let you migrate older data centers into a consolidated site over time," Owen said. "You are asking for a much smaller chunk of money as opposed to gutting an entire facility and retrofitting it up front. With the pods, you can go from a PUE of about 3.0, which is average for a legacy data center, to the 1.5 to 1.6 range. It's not as good as if you were building a state-of-the-art data center, but it's pretty close." •

CASHING IN ON CONSOLIDATION

Federal Data Center Consolidation Initiative
Established: February 26, 2010
Estimated Number of Federal Data Centers: 6,835 (July 2013)
Data Center Closures: 484 (April 2013)
Key Drivers: Eliminate redundant hardware, lower operational costs, reduce energy consumption, free up floor space and trim IT headcount.
Current Goal: Close 40 percent of non-core data centers by the end of 2014
Estimated Savings: $2.5 billion by the end of 2015 (OMB)
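Owen's pod figures can be sanity-checked against the definition of PUE: total facility power divided by IT equipment power. At a fixed IT load, dropping from a PUE of 3.0 to 1.5 halves the facility's total draw. A minimal sketch; the 500 kW IT load is an illustrative assumption, not a figure from the article:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power,
# so facility power = IT load * PUE. The 500 kW IT load below is a made-up
# example used only to illustrate the ratio Owen describes.
def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility draw implied by a given IT load and PUE."""
    return it_load_kw * pue

IT_LOAD_KW = 500.0
legacy = facility_power_kw(IT_LOAD_KW, 3.0)  # typical legacy data center
podded = facility_power_kw(IT_LOAD_KW, 1.5)  # low end of the pod range

reduction = 1 - podded / legacy
print(f"Legacy: {legacy:.0f} kW; with pods: {podded:.0f} kW "
      f"({reduction:.0%} less facility power for the same IT load)")
```

The same IT equipment does the same work in both cases; the savings come entirely from the cooling and power-distribution overhead that the PUE ratio captures.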