Stampede2 Supercomputer Launches at Texas Advanced Computing Center

The University of Texas at Austin now runs the most powerful supercomputer at any higher education institution in the country. Housed at the Texas Advanced Computing Center (TACC), Stampede2 allows researchers nationwide, across disciplines, to answer questions that require high-performance computing power.

“Researchers will be able to use a wide range of applications, from large-scale simulations and data analysis using thousands of processors simultaneously, to smaller computations or interacting with Stampede2 through web-based community platforms,” according to a university statement.  

The new supercomputer will be integrated into TACC’s ecosystem of 15-plus advanced computing systems, “providing access to long-term storage, scientific visualization, machine learning and cloud computing capabilities,” the UT Austin statement explained. “In addition to its massive scale, the new system will be among the first to employ the most advanced computer processor, memory, networking and storage technology from its industry partners Dell EMC, Intel and Seagate.”

Furthermore, partner institutions Clemson University, Cornell University, Indiana University, Ohio State University and the University of Colorado will continue advising TACC on cybersecurity matters.

Stampede2 is ranked the 12th most powerful supercomputer in the world, according to a recent TOP500 list. TACC's goal is to reach processing power equivalent to about 100,000 desktop computers, so the next step for the system involves adding "hardware and processors, giving it a peak performance of 18 petaflops, or 18 quadrillion mathematical operations per second," the statement explained.

TACC previously operated the system's predecessor, Stampede1, from 2013 to 2017, during which time it ran a total of 8 million compute jobs for researchers. Stampede2 aims to "double the peak performance, memory, storage capacity and bandwidth of its predecessor," while taking up half as much physical space and consuming half as much power.

The National Science Foundation — which supported Stampede1 — awarded $30 million for the project. The NSF-supported Extreme Science and Engineering Discovery Environment will allocate time on Stampede2 to researchers based on a competitive peer-review process.


About the Author

Sri Ravipati is Web producer for THE Journal and Campus Technology. She can be reached at [email protected].
