
U Texas Austin Wrangles Next-Gen Petascale Supercomputer

The University of Texas at Austin recently introduced its newest supercomputer. Stampede2, as it's called, is intended to serve as the flagship at the Texas Advanced Computing Center (TACC) and will be available to tens of thousands of researchers across the country.

When it reaches full production this fall, the supercomputer will be able to sustain 18 petaflops of processing power (18 quadrillion mathematical operations per second), more than twice the overall performance of the current Stampede system in terms of compute capability, storage capacity and network bandwidth. At the same time, Stampede2 will consume only half as much power and occupy just half the physical space of its predecessor. Cooling is provided by a chilled-water system that is more cost- and energy-efficient than a standard air-conditioning approach.
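As a back-of-the-envelope check, "peta" denotes 10^15, so 18 petaflops works out to 1.8 × 10^16 operations per second. A quick sketch of the arithmetic:

```python
# Back-of-the-envelope check: 18 petaflops expressed in operations per second.
PETA = 10**15  # SI prefix "peta"

sustained_flops = 18 * PETA  # 18 quadrillion operations per second
print(f"{sustained_flops:.2e} ops/sec")  # 1.80e+16
```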

The entire infrastructure will be run by a team of experts from TACC, U Texas Austin, Clemson University in South Carolina, Cornell in New York, the University of Colorado at Boulder, Indiana U and Ohio State. Vendors involved in the project include Dell, Intel and Seagate Technology.

The first phase of the rollout features 4,200 Knights Landing (KNL) nodes, the second generation of processors based on Intel's Many Integrated Core architecture. Unlike the legacy Knights Corner (KNC) chips, Stampede2's KNL isn't a coprocessor; each 68-core KNL is a stand-alone, self-booting node. Accordingly, the system's accounting approach will be based on "node-hours": a service unit representing a single compute node used for one hour, rather than a core-hour.
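The difference between the two accounting schemes is easy to see in a small sketch. The helper below is illustrative only (not a TACC API); it assumes charges scale linearly with nodes and wall-clock time, as the node-hour definition above implies.

```python
# Hypothetical helper illustrating node-hour accounting; the function name
# and billing model are assumptions for illustration, not a TACC interface.
def node_hours(nodes: int, wall_hours: float) -> float:
    """Service units charged = nodes x wall-clock hours, regardless of
    how many of each node's cores the job actually uses."""
    return nodes * wall_hours

# A 10-node job running 2.5 hours is charged 25 node-hours. Under
# core-hour accounting on 68-core KNL nodes, the same job would have
# been billed 10 * 68 * 2.5 = 1,700 core-hours.
print(node_hours(10, 2.5))  # 25.0
```

Charging per node rather than per core reflects the fact that a self-booting KNL node is allocated to one job as a whole, whatever its core usage.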

The second phase of rollout, scheduled for later this summer, will add 1,736 Intel Xeon Skylake nodes.

Importantly, Stampede2 maintains a Linux-based software environment like the one run on its predecessor to facilitate the migration of a large user base to the new system. The hardware and software combination has been designed to support traditional large-scale simulation users, users performing data-intensive computations, and emerging classes of users who are new or non-traditional to high performance computing.

Among the test projects already run on the supercomputer was one performed by teams at Stephen Hawking's cosmology research laboratory at the University of Cambridge, which leveraged the system to compare previously performed simulations with gravitational wave data observed by the NSF-funded Laser Interferometer Gravitational-Wave Observatory run by MIT and the California Institute of Technology. U Texas Austin researchers have used Stampede2 for tumor identification from magnetic resonance imaging (MRI) data. And researchers doing earthquake prediction work for the Southern California region at the University of California, San Diego reported that they had achieved a fivefold performance improvement over previous computations.

Stampede, the predecessor of Stampede2, will continue being used until the new supercomputer is fully operational. That system began running in 2013. Since then it has processed more than 8 million successful jobs and delivered over 3 billion core hours of computation, according to TACC.

The latest initiative was funded with a $30 million grant from the National Science Foundation. It's expected to serve the scientific community through 2021.

"Building on the success of the initial Stampede system, the Stampede team has partnered with other institutions as well as industry to bring the latest in forward-looking computing technologies combined with deep computational and data science expertise to take on some of the most challenging science and engineering frontiers," said Irene Qualters, director of NSF's Office of Advanced Cyberinfrastructure, in a prepared statement.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
