HP officials see some signs of recovery in the life science computing sector amid a broader trend of across-the-board growth for high-performance technical computing.
At a "chalk talk" session in New York last Friday, Ty Rabe, director of the R&D group in HP's high-performance technical computing division, said that the market for life science computing appears to be on the mend following the boom-and-bust cycle it witnessed in the early part of the decade.
HP still places a "big focus" on life sciences, along with computer-aided engineering, oil and gas, and the financial sector, Rabe said.
Steve Langdon, an HP fellow in the high-performance computing division's technology solutions group, said that he's seeing "a wave of renewals" from life science customers the company worked with five years ago, although he declined to provide specific customer names.
While these renewals aren't growth "in the sense that they came out of nowhere," he said, they are reassuring nonetheless, because "things went a bit quiet [in the life science market] around 2004."
This week, HP said that market research firm IDC had named it the revenue leader in the HPC market for the third consecutive year. According to IDC's report, "Worldwide Technical Server 4Q05 Vendor Shares," HP had more than 31 percent of the $9 billion HPC market in 2005.
According to the IDC report, HP holds the No. 1 position in several market segments, including workgroup systems, departmental systems, enterprise systems, and technical clusters — a segment that IDC said has been growing at more than 70 percent for the past four years.
Rabe said that HP intends to retain its competitive edge even as the HPC market migrates from proprietary RISC/Unix systems to Linux clusters built with commodity chips. "One might think that in a world of industry-standard hardware, everyone delivers the same platform," he said, "but the variations depend on how well the software works, and how long it takes to get the cluster up and running. These are hard problems to solve."
Langdon said that while many early adopters for cluster computing opted to roll their own, he's "seeing the pendulum swing back from do it yourself towards the integration and the pre-testing that we're offering."
Furthermore, Rabe said that the company's use of industry-standard components should give it an edge over HPC rival IBM. "I've always admired IBM's marketing," he said, "but I believe that in order to pay for that, they are focused on proprietary systems built around Power, and promising new technologies like Blue Gene."
However, he said he doubts that IBM can keep up with the "tremendous advances" in chip technology, including the rapid push toward dual-core and quad-core CPUs that promise to dramatically improve the price/performance of industry-standard systems. Building commodity-based clusters "isn't as sexy as building Blue Gene," Rabe said, "but it's where the money is."
— Bernadette Toner ([email protected])