High-Density Chips


By Meredith W. Salisbury


In early October, a number of computer vendors issued long-awaited announcements that they were now offering systems and software based on dual-core chips. From HP and IBM to Intel and AMD, there’s no shortage of new products springing up — and no shortage of hype around this latest hardware advance. On its website, Intel says it expects that by the end of 2006, up to 85 percent of new desktop, mobile, and server processor shipments will incorporate these dual-core processors.

So just what are these things, and why should you care? Essentially, a dual-core processor is just what it sounds like: two cores, or processing engines, on one chip. The chip itself isn’t necessarily faster, but because the cores communicate with an ultra-fast protocol, you can in theory get more work out of them than you could with a standard single-core chip.
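To the operating system, each core on a dual-core chip simply looks like another CPU. A minimal sketch of how a script might check how many logical processors it can schedule work onto (using only the Python standard library; the "4" in the comment assumes a dual-processor, dual-core box):

```python
import os

# On a dual-processor, dual-core server this would typically report 4:
# the OS treats each core as a schedulable CPU in its own right.
cores = os.cpu_count()
print(f"Logical CPUs available: {cores}")
```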

Dual-core processors are popping up all around, and as usual, the technophile bioinformatics crowd wants first crack at them. Two life sciences consulting firms — Scalable Informatics and the BioTeam — say they have brought dual-core systems in-house to test them for customers.

Joseph Landman, CEO of Scalable Informatics, says he was skeptical at first “that the dual-core scenario would work with Blast.” He convinced some folks at AMD to give him a system for benchmark testing and found, to his surprise, that a system with four cores in a single box gave a basically linear speedup of about four. The performance of the four cores on dual-core processors was comparable to that of four single-core processors in a four-processor system, he says, noting that you’d likely pay more for the four-processor, single-core version.
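A "linear speedup of about four" is just the standard speedup and efficiency arithmetic. A short sketch with purely illustrative timings (not Landman's actual benchmark numbers):

```python
def speedup(t_serial, t_parallel):
    """Classic speedup: serial wall-clock time divided by parallel wall-clock time."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_cores):
    """Speedup per core; 1.0 means perfectly linear scaling."""
    return speedup(t_serial, t_parallel) / n_cores

# Hypothetical timings for a Blast-style run on 1 core vs. 4 cores.
t1, t4 = 400.0, 104.0   # seconds (illustrative only)
print(f"speedup: {speedup(t1, t4):.2f}x, efficiency: {efficiency(t1, t4, 4):.0%}")
# A speedup near 4x on 4 cores (efficiency near 100%) is what
# "basically linear" scaling means here.
```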

Chris Dagdigian at the BioTeam says his group had similar results. “We have run many life science applications on dual-core systems and for many cases the dual-processor, dual-core system” — that is, two chips with a total of four CPUs — “benchmarked and performed as if the test system was actually a quad-CPU system.”

Dagdigian points out that because of this equivalent performance, no one expects people to ditch their existing systems and rush to replace them with dual-core technology. But when people buy new machines or plan upgrades to their existing clusters, it’s likely they’ll go with dual-core processors.

Part of the reason for that stems from concerns about heat output for compute clusters. “For many years now the performance roadmap of some of the big chip makers basically revolved around boosting their chips to faster and faster speeds,” says Dagdigian. “The downside was that each new chip with higher clockspeed required more electricity and produced much more heat that had to be dissipated.” He notes that because clockspeed has been reduced in dual-core processors, the per-core performance of that kind of chip could actually be less than that of a traditional CPU.

Landman tested the dual-core system with Blast and HMMer, finding in both cases that the dual-core processor’s performance virtually matched that of a machine with an equal number of single-core CPUs. The dual cores share memory on the chip, though, so he says that this technology will do best for applications that “have threaded code [and] run well on a shared-memory processor machine.” Memory-hungry programs, such as protein folding simulation software, are likely to perform better on a single-core system, he says.
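The workloads that scale well here are the embarrassingly parallel ones, where independent tasks (such as separate query sequences in a Blast-style search) can be farmed out one per core. A minimal sketch using one worker process per core; `score` is a hypothetical stand-in for a real per-sequence search, not any actual Blast API:

```python
from multiprocessing import Pool

def score(seq):
    # Hypothetical stand-in for a per-sequence task such as a Blast search.
    # Each query is independent, so the work splits cleanly across cores.
    return sum(ord(base) for base in seq)

if __name__ == "__main__":
    queries = ["ACGT", "GGCC", "TTAA", "CAGT"]
    # One worker per core; on a dual-processor, dual-core box that is 4.
    with Pool(processes=4) as pool:
        results = pool.map(score, queries)
    print(results)
```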

“The basic ‘win’ is that the dual-core system from a performance standpoint acts like a much more expensive quad-CPU server, but with a far lower price and presumably a smaller form factor and lower environmental requirements,” Dagdigian says.

But the great unknown at this point, and a potential stumbling block for computational biologists, is how software vendors will treat dual-core machines. Vendors may require users to pay for software licenses on a per-core basis, doubling the software cost and wiping out any cost savings from the dual cores. Users will have to find vendors who license on a per-machine or per-chip basis to avoid paying more for software. “I think a lot of vendors have come to the realization that basically hitting that license [per-core] is probably not going to be a win with their customer base,” Landman says.
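The licensing question reduces to simple arithmetic: the same dual-processor, dual-core server carries very different software costs depending on the counting basis. A sketch with an illustrative fee (no real vendor's pricing is implied):

```python
def license_cost(per_unit_fee, chips, cores_per_chip, basis):
    """Total software cost for one server under a given licensing basis."""
    if basis == "per-core":
        return per_unit_fee * chips * cores_per_chip
    if basis == "per-chip":
        return per_unit_fee * chips
    if basis == "per-machine":
        return per_unit_fee
    raise ValueError(f"unknown basis: {basis}")

# A dual-processor, dual-core server with a hypothetical $5,000 unit fee:
for basis in ("per-core", "per-chip", "per-machine"):
    print(basis, license_cost(5000, chips=2, cores_per_chip=2, basis=basis))
# Per-core licensing doubles the bill relative to per-chip, which is
# exactly the cost penalty the article describes.
```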
