Intel Intros HPC Distribution for Apache Hadoop for Storing and Processing Big Data
Intel has rolled out its new Intel HPC Distribution for Apache Hadoop software, an enterprise-grade solution designed for storing and processing large data sets.
The HPC (high-performance computing) distribution combines the Intel Distribution for Apache Hadoop software with the Intel Enterprise Edition of Lustre software. According to the company, the Intel Distribution for Apache Hadoop "provides distributed processing and data management for enterprise applications that analyze massive amounts of diverse data," while the Enterprise Edition of Lustre is designed "to make performance-based storage solutions easier to deploy and manage."
Lustre, which Intel describes as the most widely deployed file system for HPC, handles high-performance storage, while Apache Hadoop breaks big data into manageable, distributed datasets for easier analysis. By combining the two, the Intel HPC Distribution for Apache Hadoop software lets organizations run their MapReduce applications directly on Lustre-backed storage, improving task performance, storage scalability, and storage management. Hadoop jobs can read Lustre files in place, without first copying them into the Hadoop environment.
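Because Lustre presents itself as an ordinary POSIX file system mounted on every compute node, one common way to run Hadoop jobs directly against shared storage is to point Hadoop's default file system at the local mount rather than HDFS. Intel's distribution ships its own Lustre adapter whose settings are not detailed here, so the fragment below is only a generic illustration; the property values and the `/mnt/lustre` mount point are assumptions, not Intel's documented configuration.

```xml
<!-- core-site.xml: generic sketch of pointing Hadoop at a POSIX-mounted
     shared file system such as Lustre. The mount point /mnt/lustre is a
     hypothetical example, not a value from Intel's distribution. -->
<configuration>
  <property>
    <!-- Use the node-local POSIX view instead of HDFS; job inputs and
         outputs are then given as absolute paths on the Lustre mount,
         e.g. /mnt/lustre/data/input -->
    <name>fs.defaultFS</name>
    <value>file:///</value>
  </property>
  <property>
    <!-- With shared storage, intermediate data can also live on Lustre -->
    <name>hadoop.tmp.dir</name>
    <value>/mnt/lustre/hadoop-tmp</value>
  </property>
</configuration>
```

In a setup like this, every node sees the same files through the shared mount, which is why no copy into a separate Hadoop storage layer is needed.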
Intel is also collaborating with its partners in the HPC community to deliver customized HPC products "to meet the diverse needs of customers," according to the company. Intel hopes these customized HPC products will help organizations achieve scientific, industrial, and academic breakthroughs.
Further information about the Intel HPC Distribution for Apache Hadoop software can be found at intel.com.
Leila Meyer is a technology writer based in British Columbia. She can be reached at firstname.lastname@example.org.