U Michigan Designs Data-Centric Supercomputer

Researchers at the University of Michigan (U-M), in collaboration with IBM, have designed a high-performance computing (HPC) cluster with the goal of advancing predictive modeling in computational science.

The HPC cluster, named ConFlux, is hosted at U-M's Center for Data-Driven Computational Physics and enables "large scale data-driven modeling of multiscale physical systems," according to the center's site. This type of modeling is especially challenging because it requires HPC applications running on external clusters to connect with large data sets at run time.

"The recent acceleration in computational power and measurement resolution has made possible the availability of extreme scale simulations and data sets," said Karthik Duraisamy, director of U-M's Center for Data-Driven Computational Physics, in a prepared statement. "ConFlux allows us to bring together large scale scientific computing and machine learning for the first time to accomplish research that was previously impossible."

Some of the large-scale, data-driven research projects that will use ConFlux include:

  • A collaboration with NASA to use cognitive techniques to simulate turbulence around aircraft and rocket engines;
  • A project for the National Institutes of Health that combines noninvasive imaging with a physical model of blood flow to help doctors estimate artery stiffness;
  • Studying how clouds interact with atmospheric circulation in order to better understand climate science;
  • Research into the origins of the universe and stellar evolution; and
  • Predictions of the behavior of biologically inspired materials.

ConFlux was funded by a $2.4 million grant from the National Science Foundation and an additional $1.04 million from the University of Michigan.

IBM is providing servers and software solutions for ConFlux. Several members of the OpenPOWER Foundation, an open, collaborative technical community based on IBM's Power architecture, also contributed to its development.

About the Author

Leila Meyer is a technology writer based in British Columbia. She can be reached at [email protected].
