Last month, Intel and Hewlett-Packard announced they will collaborate on building high-performance computing (HPC) systems to handle the big data workloads of enterprises of all sizes. Currently these supercomputers exist in niches like academia, government, science and research labs.
Standard technologies can’t keep pace with the enterprise’s need to process data, advance innovation and compete, according to Bill Mannel. He’s the head of Hewlett-Packard’s HPC and Big Data business unit, and a 25-year veteran of supercomputer maker SGI.
To better explore the commercial demand for the really expensive, mind-blowing ability to perform quadrillions of calculations per second, he'll oversee a new HP/Intel Center of Excellence in Houston, staffed by HP and Intel engineers who will build systems for specific customers. The two companies opened a center of excellence in Grenoble, France, two years ago to tap into the European market. Both centers will facilitate go-to-market collaboration in planning, developing, deploying and managing HPC solutions.
Sound familiar? Between 2008 and 2009, IBM opened high-performance computing centers of excellence in Montpellier (France), Amsterdam and Dublin to develop systems for managing and forecasting environmental challenges, like the ones the Dutch face with water. That European presence recently helped IBM secure a lucrative contract with the U.K. government.
While IBM's efforts have been devoted to specific clients, the typical approach to building supercomputers, HP's approach will be more ambitious. More American. Beyond tailored solutions, HP will be developing a go-to-market strategy for off-the-shelf HPC configurations. Houston will also share its insights with the likes of Dell and Lenovo to help bring HPC to the mainstream. Intel, which already supplies chips for nearly 95 percent of all high-performance machines, says more than 50 technology companies will be drawn into the center, and we'll start seeing new systems later this year.
The first HPC solution to come out of the center, called the HPC Solutions Framework, will integrate HP's Apollo servers with Intel's Xeon x86 microprocessors. They chose to crash the market with a brand name that I've already forgotten.
So it begins. As the amount of data that enterprises have at their disposal grows, and the need to capitalize on big data analysis becomes a business necessity, the industry is making HPC affordable and accessible.
The difficulty I see is bringing the big data analytics and high-performance computing worlds together architecturally. It's not just about greater computing capability. Greater networking and storage — the whole stack — will be needed to support it.
But they'll figure it out, and everyone will win. The competitive pressure that HP will place on supercomputer makers Cray, SGI, Dell and Lenovo will result in innovative new technology. The retailers and automakers that adopt HPC solutions will offer new products and services. The healthcare organizations that can benefit from better analysis and forecasting will save more lives.
Still, high-performance computing is going to be for an elite group of companies. Most of us can find ways to improve the performance of our traditional enterprise setups and our ability to use them.
What about you? If you could perform quadrillions of calculations per second, could you translate that data into business insights?