GCN : March 2015
BRIEFING

Agencies have had a tough time figuring out how to make charts responsive. The Department of Health and Human Services found an answer – and then it shared it.

"What HHS did was they created some code – about four or five lines of code – that makes charts mobile-friendly," Parcell said. "The Defense Finance and Accounting Service actually took this code and implemented it on their website in about two hours, and they said it saved them a bunch of time and a bunch of headaches because 5 percent of their website is charts and graphs."
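The article doesn't reproduce the HHS snippet, but a fix of that size is usually just a couple of style rules that let fixed-width charts shrink with the screen. A minimal sketch, assuming charts are images tagged with a "chart" class – the selector and rules are illustrative stand-ins, not the actual HHS code:

    // Illustrative only -- not the actual HHS code. Assumes charts are <img>
    // elements tagged with a "chart" class; the rules let them scale down
    // with the viewport instead of overflowing small screens.
    document.querySelectorAll<HTMLImageElement>("img.chart").forEach((img) => {
      img.style.maxWidth = "100%"; // never wider than the containing column
      img.style.height = "auto";   // preserve the chart's aspect ratio
    });

The same effect can be had with two lines of plain CSS, which squares with the "four or five lines" Parcell describes.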
Another problem is that mobile devices come in many models and sizes, but some agencies are limited to testing on only one or two. That makes testing responsive design difficult, Parcell said. To help with that, the DigitalGov team started a Federal CrowdSource Mobile Testing program, through which feds can volunteer to evaluate an application on a variety of devices.

Looking ahead, the government is well positioned to be more proactive with mobility, Roberts said. "We've got a lot of security coming in. We're not facing a midterm election, the budget situation is relatively stable, we're not facing a presidential election, so I do think we're going to have a pretty successful year when it comes to mobile," he said.

Although agencies fell short of the digital strategy's goals in many ways, it's still a useful document, Parcell said. "I think the good thing about deadlines is they put you on course to actually meet a goal, but you also have to be rethinking those goals along the way," he said. •

DOE pilots big data infrastructure projects
BY GCN STAFF

Over the past few months, researchers at the Department of Energy have been exploring new approaches for collecting, moving, sharing and analyzing massive scientific datasets. Researchers at Lawrence Berkeley National Laboratory recently led four science data pilot projects to show what could be gained when facilities and tools were specifically linked to carry out specialized research, and to show the potential of a highly focused science data infrastructure, according to DOE.

"As each new generation of instruments and supercomputers comes on line, we have to make sure that our scientists have the capabilities to get the science out of that data and [that] these projects illustrate the future directions," said Steven Binkley, director of DOE's Office of Advanced Scientific Computing Research.

All of the projects researched new, more efficient technologies, with the goal of reusing as many existing tools as possible and developing new software only as necessary, making it easier for scientists to examine their data in real time.

The first project demonstrated the ability to use a central scientific computing facility – the National Energy Research Scientific Computing Center (NERSC) – to serve data from several experimental facilities in multiple formats using DOE's ultrafast ESnet. The teams built tools to transfer the data from each site to NERSC and to automatically or semi-automatically analyze and visualize the information.

The second project illustrated the concept known as a "super facility," which integrates multiple, complementary user facilities into a virtual facility offering fundamentally greater capability. It demonstrated for the first time that researchers will soon be able to analyze their samples during preliminary or "beamtime" tests and adjust their experiments for maximum scientific results.

In the third project, the teams built a "data pipeline" for moving and processing observational data from the Dark Energy Survey. Using Docker, software that automates the deployment of applications inside containers, they built self-contained packages that included all the necessary applications for analysis. The containers could then be pushed out to supercomputers at the national labs and fired up on the various systems, pulling the data they needed for processing. The results were then pushed back to NCSA over ESnet. (A schematic sketch of this pattern appears after the article.)

The fourth project, the virtual data facility, was a multi-lab effort to create a proof of concept for some of the common challenges encountered across domains, including authentication, data replication and a framework for building user interfaces. Data endpoints were set up at the Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge and Pacific Northwest labs, and the service demonstrated datasets being replicated automatically from one site to the other four.

Science teams across the DOE laboratory system are increasingly dependent on the ability to efficiently capture and integrate large volumes of data that often require computational and data services across multiple facilities. These projects demonstrate the scientific potential of big data infrastructure. •
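The Dark Energy Survey pipeline itself isn't published here, but the container pattern the third project describes can be sketched. In the Node/TypeScript sketch below, the image name, mount points and hosts are hypothetical stand-ins, and rsync stands in for the teams' ESnet transfer tooling – a sketch of the pattern under those assumptions, not the labs' implementation:

    // Hedged sketch of project 3's pattern: run a self-contained analysis
    // container against locally staged survey data, then ship the results
    // back. Image name, paths and hosts are hypothetical.
    import { execFileSync } from "node:child_process";

    function runStage(dataDir: string, resultsDir: string): void {
      // The container bundles every application the analysis needs, so the
      // only site-specific pieces are the mounted input and output directories.
      execFileSync(
        "docker",
        [
          "run", "--rm",
          "-v", `${dataDir}:/input:ro`,  // survey data pulled to this system
          "-v", `${resultsDir}:/output`, // writable directory for results
          "des-analysis:latest",         // hypothetical image with the analysis stack
          "analyze", "/input", "/output",
        ],
        { stdio: "inherit" },
      );

      // Push the products back to the central site; rsync is a generic
      // stand-in for the ESnet-tuned transfer step the article mentions.
      execFileSync(
        "rsync",
        ["-a", `${resultsDir}/`, "user@dtn.ncsa.example:/des/results/"],
        { stdio: "inherit" },
      );
    }

    runStage("/scratch/des/night-0421", "/scratch/des/results-0421"); // hypothetical paths

Because everything the analysis needs travels inside the image, the same command runs unchanged on each lab's systems – which is the point of the container approach.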