Intel has rolled out its new HPC Distribution for Apache Hadoop, enterprise-grade software designed for storing and processing large data sets.
The San Diego Supercomputer Center at the University of California, San Diego is deploying Comet, a virtualized petascale supercomputer to address the needs of "the 99 percent."
Forget about gigabit performance. Purdue University needs 40 gigabit-per-second interconnectivity for its newest computer cluster, Conte.
The San Diego Supercomputer Center at the University of California, San Diego will deploy Comet, a machine capable of two quadrillion operations per second, or nearly two petaflops.
The lower end of the high-performance computing segment saw a double-digit surge in the second quarter of 2013, helping to propel HPC as a whole to a 34.7 percent increase in systems delivered over the same period a year ago.
Rutgers University will implement a scale-out storage solution to support its growing high-performance computing needs.
The University of Florida has implemented the state's fastest supercomputer with a 2.88 petabyte high-performance storage appliance.
Bandwidth availability and high-performance computing are on the rise nationally on college campuses, according to data from the National Science Foundation's latest Survey of Science and Engineering Research Facilities.
A computer science professor from the University of Tennessee in Knoxville has just earned a million-dollar grant to explore the next generation of high-performance computing.
Adapteva has completed its first Parallella parallel-processing board designed for Linux supercomputing.