A university in Madrid has become the latest institution to join a program that introduces students to technologies used in data center and high-performance computing environments.
The San Diego Supercomputer Center has built a Linux cluster around Raspberry Pi devices as part of an effort to help kids and adults learn about parallel computing.
The National Cheng Kung University Supercomputing Research Center in Taiwan has developed a switchless cluster supercomputer, called CK-Star.
The HPC Advisory Council and the International Supercomputing Conference have selected teams for the HPCAC-ISC 2014 Student Cluster Competition, which will take place in Leipzig, Germany in June 2014.
Intel has rolled out its new Intel HPC Distribution for Apache Hadoop software, an enterprise-grade solution designed for storing and processing large data sets.
The San Diego Supercomputer Center at the University of California, San Diego is deploying Comet, a virtualized petascale supercomputer designed to address the needs of "the 99 percent."
Forget about gigabit performance. Purdue University needs 40 gigabit/second interconnectivity for its newest computer cluster, Conte.
The San Diego Supercomputer Center at the University of California, San Diego will deploy Comet, a machine capable of nearly two petaflops, or two quadrillion operations per second.
The lower end of the high-performance computing segment saw a double-digit surge in the second quarter of 2013, helping to propel HPC as a whole to a 34.7 percent increase in systems delivered over the same period a year earlier.
Rutgers University will implement a scale-out storage solution to support its growing high-performance computing needs.