GCN : October 2014
By the time they were ready to design and code the application, they had immersed themselves in mathematical calculations and load configurations. "We kept in close contact with our customers to keep on top of the requirements and took whatever calculations we could from the old version, but there was a lot that couldn't be ported because of the operating system. There was a big learning curve," Woods said.

Woods and the other two developers then started developing code in earnest, using tools that were either free or already on hand in the office, including Apple's Xcode freeware, existing Fortify Security Scanner licenses, a built-in Clang scanner and a detached functional testing office. It took the team six months from the time they started learning app development until the first version was released. The application, with more than 19,000 lines of code, was the first iOS application of its kind for the Air Force.

Since boom operators started using the iOS KC-10 Load Management System, time has been saved and accuracy improved. The new system, running on iPads, reduced a 46-minute calculation to five minutes. It also saved 41 minutes per flight, which translates to 4,189 hours per year. The app has also decreased the error margin from 7-10 minor errors per flight to less than one, reducing annual errors by about 49,000.

News of the app's success spread fast. The team has already received preliminary requests to develop a similar app for the Coast Guard C-130H, KC-135, KC-46 and HC-140A. The project has also triggered development of additional iOS-based apps for pain management, power management and a user testing management system for qualifying aircrew members. The software team also sent the KC-10 Load Management System code to the 578th Software Maintenance Squadron when it undertook a similar development effort for the C-5 aircraft.
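The savings figures above are internally consistent, as a quick back-of-the-envelope check shows. Note that the annual flight count is not stated in the article; it is inferred here from the stated per-flight and per-year savings, so treat it as an assumption:

```python
# Sanity check of the article's stated savings figures.
minutes_saved_per_flight = 41      # stated: 41 minutes saved per flight
hours_saved_per_year = 4_189       # stated: 4,189 hours saved per year

# Implied flights per year (an inference, not a figure from the article)
implied_flights = round(hours_saved_per_year * 60 / minutes_saved_per_flight)
print(implied_flights)             # about 6,130 flights per year

# At roughly 8 minor errors avoided per flight (midpoint of the stated
# 7-10 range), the article's ~49,000 annual error reduction follows.
print(implied_flights * 8)         # about 49,040
```

The midpoint of the 7-10 error range reproduces the article's ~49,000 figure almost exactly, which suggests that is how the number was derived.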
They were able to use the iOS template's Wi-Fi printing functionality, saving 225 hours and $30,000. — Karen D. Schwartz

A cache of IT savings in the cloud
The Navy worked out the protocols to move military systems to the commercial cloud — and reap big savings

PROJECT AT A GLANCE
PROJECT: Leveraging Commercial Cloud Computing Services
OFFICE: Space and Naval Warfare Systems Command (SPAWAR), Commercial Service Integration Team
TECHNOLOGY USED: Commercial infrastructure as a service (IaaS) offering
TIME TO IMPLEMENTATION: 3 years
BEFORE: The Navy ran public-facing websites in its own facilities, incurring hardware costs in the process.
AFTER: After creating a secure cloud environment, the Navy was able to migrate low-impact information systems to the more cost-effective commercial cloud.

The Space and Naval Warfare Systems Command (SPAWAR) wanted its IT systems to have the advantages of commercial cloud computing. The key benefit it sought was to save costs by moving "low-impact" information systems — including websites, small networks and data centers — to the cloud. But first it had to clear a major hurdle: achieving military-grade security in a commercial setting. The command embarked on a three-year project to do just that.

Collaborating with the Defense Department and the civilian and intelligence communities, SPAWAR was able to secure the necessary approvals, certifications and accreditations to develop security best practices for infrastructure as a service (IaaS). "Addressing the gaps that prevented DOD from using commercial cloud services was a non-trivial effort," said Tommy Groves, SPAWAR Systems Center Atlantic spokesman. "Looking at policy, process, protocols, architecture and acquisition through this lens really represented a fundamental shift in how we think about operational IT."

In 2013, the first Secretary of the Navy portal hosted in a commercial cloud became operational, with Amazon Web Services (AWS) serving as the hosting provider. At the time, the Department of the Navy described the project as foundational. "This effort established a valuable baseline and a 'way ahead' for the DON to achieve efficiencies using commercial services for a more cost-effective approach," according to the CIO's office.

The Navy's goal was to host systems at a lower cost than could be achieved in a government-owned and -operated facility. Any opportunity to reduce cost has become critical at a time when the Navy has been directed to drastically shrink its IT budget. The ability to run websites in the commercial cloud has helped the Navy trim its capital investments in hardware, in effect turning fixed data center costs into variable ones, according to AWS. The savings have continued even after the initial cloud migration. For example, AWS' move from M1 to M3 instances, and its associated cost reductions, has