…ing data," said Brand Niemann, a former senior enterprise architect and data scientist at the Environmental Protection Agency who now heads up the Federal Big Data Working Group, an interest group of federal and non-federal big data experts.

The fact is, many agencies may already have people with such expertise on staff but don't recognize it. It's a matter of identifying the statisticians who are already working with data and giving them more of a mandate and an outlet to mine the agency's data, Niemann said. Get it right, and the results can be transformative.

ACCURACY COUNTS

Any analysis of big data has limited usefulness if the information in the dataset is not accurate to begin with. Until only recently, VHA's Fihn said, he had been skeptical that data analytics could reach the levels of accuracy required for clinical use across the VHA. One reason is that, until just a few years ago, the only data available came from health insurance claims.

"In terms of predictive accuracy we use what we call a C statistic," he said. "A wholly accurate predictive model has a C level of 1.0, and the least accurate has a level of zero. Using (health insurance) claims data, the most accurate level we could get was around 0.65, which is not much better than flipping a coin."

Between 2010 and 2011, however, the VHA brought online a corporate data warehouse that combined clinical data from some 126 different versions of VISTA (Veterans Health Information Systems and Technology Architecture), the electronic health record the agency had been using since the late 1990s.

With that, and greater availability of data on patient medications and vital signs, predictive models are regularly reaching C levels of 0.85 and are pushing 0.9, Fihn said. It was a "quantum jump" in the usefulness of predictive analytics, and VHA medical staff feel they can now predict with confidence who the high-risk patients are. And even though predictions are still being published using claims data alone, he said, "for our considerations, we now reject those below C levels of 0.85, and we are actually moving to push things as close as we can to 0.9."
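The C statistic Fihn describes is the concordance statistic, which for a binary outcome is equivalent to the area under the ROC curve: the probability that a model assigns a higher risk score to a randomly chosen patient who experienced the outcome than to one who did not. Chance-level discrimination is therefore 0.5, which is why a C of 0.65 is "not much better than flipping a coin." A minimal sketch of the calculation, using invented patient data purely for illustration:

```python
# Concordance (C) statistic for a binary outcome: the fraction of
# positive/negative patient pairs the model ranks correctly.
# C = 1.0 is perfect discrimination; C = 0.5 is a coin flip.

def c_statistic(risks, outcomes):
    """risks: predicted risk scores; outcomes: 1 = event occurred, 0 = no event."""
    positives = [r for r, y in zip(risks, outcomes) if y == 1]
    negatives = [r for r, y in zip(risks, outcomes) if y == 0]
    if not positives or not negatives:
        raise ValueError("need at least one positive and one negative outcome")
    concordant = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                concordant += 1.0   # pair ranked correctly
            elif p == n:
                concordant += 0.5   # ties count as half
    return concordant / (len(positives) * len(negatives))

# Hypothetical example: four patients' predicted risks and observed outcomes.
risks = [0.9, 0.7, 0.4, 0.2]
outcomes = [1, 0, 1, 0]
print(f"C = {c_statistic(risks, outcomes):.2f}")  # prints C = 0.75
```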
HHS doesn't have any global metrics or milestones it wants to reach for big data, Sivak said, though there are specific goals for individual programs. In fact, NIH may have the most expansive set of goals, with BD2K just part of a larger portfolio of activities NIH is promoting, including cross-agency and international collaboration on big data initiatives and policies.

It's all a marker of just how quickly minds have changed about big data, Sivak believes. "Back in the day," nobody would have given any thought to making datasets public or making them widely available within HHS. But over the past five years the value of doing so has been conclusively demonstrated, he said, "and as a result, the default setting within HHS has changed from closed to open." •

How a computing powerhouse delivers health care insights

"We were in a unique position with our leadership computing resources and data science expertise, and we saw an opportunity to use health data to discover data-driven insights for better health care quality, integrity and policy."
SREENIVAS SUKUMAR, OAK RIDGE NATIONAL LAB

Georgia Tourassi, director of ORNL's Health Data Sciences Institute (HDSI), said ORNL's approach is novel in health care. Big data computing capabilities in facilities such as ORNL "are critical to health care delivery," Tourassi said. "It's a paradigm shift in an environment that has always been reactive."

HDSI is reaching out to partners who have different types of data and diverse needs for data analysis, such as genomics, electronic health records and health-sensor data. The projects will help collect, store, integrate and analyze data in support of next-generation personalized medicine, the researchers said. For example, ORNL is building the capability for clinical experts to "semantically reason" with medical records and to associate health data types, such as claims and clinical records, while simulating the outcomes of different clinical interventions.

"We know for certain that health data will be getting bigger and more complex as the practice of medicine expands and progresses," said Tourassi. "By being involved and leveraging the investment, we can anticipate and prepare for the next bottleneck."