GCN : June 2013
The need for information is innate. The ability to manipulate, analyze and understand information is what makes it worthwhile. That's where Big Data analytics comes in. These resources take in massive amounts of data and analyze it in ways that were previously impossible in order to reveal patterns and connections. That sounds relatively simple, but Big Data's implications are quite serious.

Clearly, Big Data often requires organizations to upgrade their compute, power and storage systems. That is important, but it is not enough. As many agencies are beginning to realize, the most important sources of Big Data are on the network. Networks extend well beyond the traditional workplace, as an increasing number of employees connect to enterprise assets through mobile devices, virtual desktops and cloud computing.

Additionally, government Big Data initiatives often involve pulling content from near real-time data sources, in various data forms, which only adds to the strain, notes the TechAmerica Foundation's report, "Demystifying Big Data: A Practical Guide to Transforming the Business of Government." That includes continuous data streams from satellites, radio frequency identification transponders and various other data sensors.

The good news, according to the report, is that most government agencies are now in the process of consolidating and standardizing their IT infrastructure. This creates an ideal opportunity to prepare their networks to address Big Data-related requirements.

BIG DATA DEFINED

Big Data refers to data sets so large that traditional tools and techniques cannot capture, store, analyze and share the data in a timely and cost-effective manner. These systems involve vast amounts of data being produced at extremely fast rates and collected frequently.
They also include what is called "unstructured" data -- that is, images, videos, email, documents, text and other formats that do not fit the classic columns and rows of traditional relational database management systems. And the data is coming from a large variety of sources, with more sources being added all the time (see sidebar).

BIG DATA: THE NETWORK CHALLENGE

Agencies undertaking Big Data initiatives need to ensure that their network infrastructures are up to the task.

FOUR V'S OF BIG DATA

These four attributes distinguish Big Data from other types of information.

Volume -- The amount of data generated.
Velocity -- The speed at which the information is generated and collected.
Variety -- The vastly different types of data generated.
Value -- The worth of extrapolating heretofore unknown patterns and categorizations.

SPONSORED CONTENT

GROWTH OF DATA

In this new world of the Internet of Things, the small stuff has big effects.

Number of devices              Theoretical connections (.001 percent)
1 device                       unknown or nil
100                            0
1,000                          5
1 million                      4,999,995
1,000,000,000                  4,999,999,995,000
Today: 10,000,000,000          499,999,999,950,000
By 2020: 51,597,803,520        13,311,666,640,184,600
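The connection counts in the "Growth of Data" graphic follow the handshake formula: among n devices there are n(n-1)/2 possible pairwise links, and the graphic keeps .001 percent of them (one in 100,000). The sketch below reproduces those figures; the function name is illustrative, not from the source, and integer arithmetic is used so the large counts stay exact.

```python
def theoretical_connections(devices: int) -> int:
    """Pairwise links among `devices` nodes, scaled to .001 percent.

    Handshake formula n*(n-1)/2 counts every possible device pair;
    keeping .001 percent of the pairs means dividing by 100,000.
    Integer math with round-half-up avoids float precision loss.
    """
    pairs = devices * (devices - 1) // 2
    return (pairs + 50_000) // 100_000  # round pairs/100000 to nearest whole


for n in (100, 1_000, 1_000_000, 1_000_000_000, 10_000_000_000):
    print(f"{n:>14,} devices -> {theoretical_connections(n):,} connections")
```

Running this reproduces the middle rows of the graphic (for example, 1,000 devices yield 499,500 pairs, and .001 percent of that rounds to 5 connections).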