GCN : July 2013

3 STEPS FOR MAKING THE OPEN DATA POLICY A REALITY

REALITY CHECK | BY MICHAEL DACONTA
IN EARLY MAY, the Obama administration released an ambitious triple threat against the status quo, designed to change the course of information management in the federal government. Specifically, the objective is to change the "default" behavior in the development of IT systems to create "open and machine-readable" data (a.k.a. standards-based and structured) as opposed to "closed and human-readable" data (a.k.a. proprietary and unstructured). The three-pronged attack includes a vision statement via an executive order, an open data policy document from the Office of Management and Budget, and technical support via an online repository of tools and guidance called Project Open Data.

On a personal level, it is exciting to see this initiative as a solid down payment toward fulfilling the promise of the Federal Enterprise Architecture Data Reference Model (DRM), which I stewarded during my stint in federal service. Let's examine three areas agencies must pay special attention to:

1. "Know what you know," or mastering catalogs through metadata. At the center of the policy are two separate but related catalogs: an enterprise data inventory and a public data listing. To create the enterprise data inventory, the policy states that "agencies should use the Data Reference Model from the Federal Enterprise Architecture." The policy goes on to specify that data assets should be described using a set of common core and extensible metadata.

It is amazing to see the mainstream media frequently refer to "metadata" in its reporting on the National Security Agency and its request for Verizon phone metadata. The power and utility of metadata has become clear in that case (link analysis) and in this case (data asset discovery).

Agencies designing metadata catalogs should take care in three areas: First, design the metadata to enable discovery and to determine appropriate usage for data assets.
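As a concrete illustration, the two catalogs and their common-core metadata can be sketched in a few lines. The field names below are loosely modeled on the Project Open Data metadata guidance (title, description, keyword, modified, publisher, accessLevel); the identifiers, datasets and office names are hypothetical, not taken from any real inventory.

```python
import json

# Illustrative enterprise data inventory: each data asset is described
# with common-core metadata fields. All values here are hypothetical.
enterprise_inventory = [
    {
        "identifier": "ex-001",
        "title": "Permit Applications 2013",
        "description": "Monthly counts of permit applications by region.",
        "keyword": ["permits", "regions"],
        "modified": "2013-05-01",
        "publisher": "Office of Example Programs",
        "accessLevel": "public",
    },
    {
        "identifier": "ex-002",
        "title": "Internal Case Files",
        "description": "Case-level records restricted to agency staff.",
        "keyword": ["cases"],
        "modified": "2013-04-15",
        "publisher": "Office of Example Programs",
        "accessLevel": "non-public",
    },
]

def public_data_listing(inventory):
    """Derive the public data listing from the enterprise data inventory,
    keeping each entry's identifier as an explicit link back to the
    inventory record it came from."""
    return [entry for entry in inventory if entry["accessLevel"] == "public"]

# Only the public asset survives the derivation.
print(json.dumps([e["identifier"] for e in public_data_listing(enterprise_inventory)]))
# prints ["ex-001"]
```

The point of the sketch is the relationship, not the fields: the public listing is computed from the inventory rather than maintained by hand, so the two catalogs cannot drift apart.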
Second, the public data listing should be derived from the enterprise data inventory and explicitly linked to those entries. Lastly, the enterprise data inventory should be complete, linked to the agency's FISMA system inventory, and include field-level (attribute-level) descriptions.

2. "Stovepipes must die," or creating a data services layer. The open data policy states that agencies should "build information systems to support interoperability and information accessibility," but it is vague on how to architect systems in this manner. The answer is to follow the Model-View-Controller (MVC) pattern in systems design. Software engineers have been leveraging the MVC pattern for more than three decades in the design of widgets, applications and even large systems because it works. It works by specifying a clean separation between the user interface, the business logic and the data. "Clean separation" is the key point, and the way to achieve it for the "model" portion of a system is to develop a Web services layer between the data storage and the applications.

3. "Putting the 'I' back in CIO," or enterprise information management gets real. By far the riskiest part of this assault on the status quo is agency heads giving CIOs the authority to implement open data correctly. I have led enterprise data management offices (EDMOs) that had the backing of senior management and others that did not. An EDMO that lacks access to the data in the business units is bound to spin its wheels, waste money and accomplish little.

Agency heads must view this new policy as an opportunity to centralize control of their data, which perfectly aligns with their objectives for transparency, information sharing, big data and the migration to cloud computing. If implemented correctly, these actions represent a sea change in which information sharing with the public and other government agencies becomes a routine byproduct and not a special case.
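The clean separation recommended in step 2 can be sketched as follows. This is a minimal in-process illustration, not a production design: the class and function names are hypothetical, and in a real system the service function would be exposed over HTTP as a Web services layer rather than called directly.

```python
import json

# Model: owns data access. Nothing here knows about presentation.
class DatasetModel:
    def __init__(self):
        # Stand-in for real data storage; contents are illustrative.
        self._store = {"ex-001": {"title": "Permit Applications 2013"}}

    def get(self, asset_id):
        return self._store.get(asset_id)

# Data services layer: a stable, machine-readable (JSON) contract that
# any application -- a web UI, a batch job, another agency -- can consume
# without touching the storage directly.
def dataset_service(model, asset_id):
    record = model.get(asset_id)
    if record is None:
        return json.dumps({"error": "not found"})
    return json.dumps(record)

# View: renders whatever the service returns; it never touches storage.
def render_text(service_response):
    data = json.loads(service_response)
    return data.get("title", data.get("error"))

# Controller: wires a request to the model and hands the result to the view.
model = DatasetModel()
print(render_text(dataset_service(model, "ex-001")))
# prints Permit Applications 2013
```

Because applications depend only on the JSON contract, the storage behind `DatasetModel` can be replaced (or centralized, or moved to the cloud) without rewriting the consumers, which is exactly the interoperability the policy asks for.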
That difficult and elusive goal is finally within reach if federal agencies muster the courage and resources to change the "default" behavior to the "right" behavior. •

---

Michael Daconta (md-firstname.lastname@example.org) is vice president of advanced technology at InCadence Strategic Solutions and the former metadata program manager for the Homeland Security Department. He is currently working on the second edition of his book, "Information as Product: How to Deliver the Right Information to the Right Person at the Right Time."