GCN : January and February 2016
HPC and big data” because Java and other languages will be working alongside Spark and Hadoop, he added.

Bridges also has nodes for databases, web services and management, as well as three types of storage and memory filesystems. All the components are connected by Intel’s Omni-Path Fabric, a new solution that provides 100 gigabits/sec of line speed, low latency and scalability.

The Pittsburgh Supercomputing Center created Bridges with a two-tier topology composed of “compute islands,” which provide bisection bandwidth communication performance for applications spanning as many as 42 nodes, and “storage islands,” which use Intel’s Omni-Path Fabric to implement multiple paths and provide optimal bandwidth to the shared parallel Pylon filesystem, which has 10 petabytes of storage and 180 gigabits/sec of bandwidth, according to the Bridges website.

“The point is to have each compute node have multiple paths to storage to avoid congestion and also to give people the maximum performance at the minimum cost,” Nystrom said.

Ninety percent of Bridges is available via NSF’s Extreme Science and Engineering Discovery Environment (XSEDE), a virtual system that scientists use to share computing resources, data and expertise. Nystrom said access is free to U.S.-based researchers who are conducting open scientific research. To apply, they must submit either a startup project request, which the XSEDE Resource Allocation Committee reviews regularly with the expectation that a more robust request will follow, or a proposal for projects that are past the startup phase. The committee reviews those submissions quarterly.

The other 10 percent of Bridges’ resources are available via a discretionary pool, which Nystrom said he expects industrial affiliates of the Pittsburgh Supercomputing Center to use.
SUPERCOMPUTING | BY STEPHANIE KANOWITZ

Bridges was on track to go live in January, but some researchers could access it sooner, Nystrom said. “I expect that once we start getting all the Bridges hardware online in the next few months, we’ll see people making really good use of it because of the early-user period,” he added.

The center received $9.65 million in NSF funding in November 2014 to build Bridges. The impetus came from researchers in fields outside those that typically rely on HPC, such as social sciences, because they recognized the benefits of applying big-data tools and technologies to their work, Mannel said.

Growing interest in big data might have created the opportunity for Bridges, but Nystrom said he believes the system’s ease of use will be its key selling point.

“It presents the users with an interface they’re comfortable with,” he said. “They have their vocabulary, it knows what sort of data they work with, it has their styles of displaying data. It lets them interact with a back-end HPC resource very transparently.”

In the end, “all they will know is they’re able to get very complicated bits of science done,” he added. “I think that’s where the real power of this is.”

case study | SECURITY

Secret and unclassified resources on one PC

The FBI has come up with a way to enable users to securely access different environments without switching computers.

FBI officials are drawing on the idea that less is more in their move to replace the bureau’s unclassified infrastructure with a virtual network that will give users access to resources of different classification levels at a single workstation.

The Justice Department awarded an $8 million contract to Raytheon Websense (now Forcepoint) to make that happen in accordance with the FBI’s Enclave Consolidation Initiative. With ECI, the FBI is seeking to move its unclassified distributed infrastructure to the data center and have users access the new virtual network via the more secure agency network.

Besides saving money on hardware, cooling, power and systems administration, the change will result in better security and efficiency in terms of IT and the workforce.

“Right now, [users] have multiple workstations on their desktops with different classification levels,” said Ward Ponn, consulting engineer and chief architect at Forcepoint.

That means users can’t see their unclassified Microsoft Outlook email and their Outlook schedule from the secret-