U Arkansas Acquires SDSC's Trestles Supercomputer

The University of Arkansas is the proud new owner of a supercomputer featuring 256 servers, 16.4 terabytes of memory, a processing speed of 79 teraFLOPs and 8,192 processing cores.

The supercomputer, named Trestles, is being moved from the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, to the Arkansas High Performance Computing (HPC) Center. The University of Arkansas' existing supercomputer, named Razor, has a processing speed of 77 teraFLOPs and 4,328 processing cores. The addition of Trestles will more than double the Arkansas HPC Center's computational capacity and allow it to run three times as many jobs for researchers.

SDSC and UC San Diego originally deployed Trestles in 2010 with a $2.8 million grant from the National Science Foundation (NSF), and the supercomputer was "recognized as the leading science gateway platform in the NSF's eXtreme Digital (XD) Network, a collaborative set of compute and storage resources in the United States that scientists can use for advanced computational and data-enabled research," according to information from the University of Arkansas and SDSC.

Jeff Pummill is one of the interim co-directors of the Arkansas HPC Center. Last year, through his connections with the NSF's XSEDE (eXtreme Science and Engineering Discovery Environment) program, Pummill learned that SDSC planned to replace Trestles with its new petascale Comet supercomputer, and the two organizations made arrangements to transfer ownership of Trestles to Arkansas.

In preparation for the installation of Trestles at the Arkansas HPC Center, the University of Arkansas will decommission the Star of Arkansas supercomputer, which was activated in 2007 and was once the most powerful computer in the state.

About the Author

Leila Meyer is a technology writer based in British Columbia. She can be reached at [email protected].
