Matthew Dublin is a senior writer at Genome Technology.
A team of researchers from Rice University is challenging a notion that has motivated the computing industry for more than 50 years: that accuracy is the ideal.
It might seem counterintuitive to design computer hardware that deliberately delivers less-than-accurate results. Yet allowing a small amount of inexact computation can save users time and money.
This type of "inexact design" can sharply decrease power consumption and boost processing speed by managing the probability of errors and limiting which calculations are allowed to produce them. These less-than-perfect processors are built by "pruning," or trimming away, rarely used sections of digital circuits, and by applying what the team calls "confined voltage scaling," which trades certain areas of performance to further cut power usage.
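The team's pruning happens in silicon, so it can't be reproduced in software, but a minimal sketch can convey the trade-off. The Python snippet below uses bit truncation, a common stand-in for pruned low-order adder logic (the function names and the 4-bit cutoff are illustrative assumptions, not the Rice design): dropping the circuitry that handles the least significant bits saves work while introducing only a tiny relative error.

```python
def exact_add(a: int, b: int) -> int:
    """Ordinary full-precision addition."""
    return a + b

def pruned_add(a: int, b: int, cut: int = 4) -> int:
    """Approximate addition that ignores the lowest `cut` bits,
    mimicking a circuit whose rarely significant low-order logic
    has been pruned away to save power and area.
    (Illustrative sketch, not the Rice team's actual hardware.)"""
    mask = ~((1 << cut) - 1)        # zero out the bottom `cut` bits
    return (a & mask) + (b & mask)  # add only the surviving high-order bits

if __name__ == "__main__":
    a, b = 51_234, 48_771
    exact, approx = exact_add(a, b), pruned_add(a, b)
    rel_err = abs(exact - approx) / exact * 100
    print(f"exact={exact} approx={approx} relative error={rel_err:.3f}%")
```

On inputs like these, the pruned version skips a quarter of the bit positions yet lands within a few thousandths of a percent of the true sum, which is the flavor of bargain the hardware version strikes.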
"In the latest tests, we showed that pruning could cut energy demands 3.5 times with chips that deviated from the correct value by an average of 0.25 percent," says Avinash Lingamneni, a Rice graduate student who is a co-developer on the project. "When we factored in size and speed gains, these chips were 7.5 times more efficient than regular chips. Chips that got wrong answers with a larger deviation of about 8 percent were up to 15 times more efficient."
Inexact computer chips like this prototype are about 15 times more efficient than today's microchips.
While skepticism is understandable, the Rice team is certainly turning heads in the processor design community; their research has already earned best-paper honors at this week's ACM International Conference on Computing Frontiers in Cagliari, Italy.
The obvious question: What good is a processor that makes mistakes?
According to the team, certain application areas can tolerate a significant amount of error. The example they cite is an image rendered with the "inexact" processor: its relative errors of up to 0.54 percent were virtually indiscernible to the human eye. So there might be room for these imperfect processors in big-data visualization or molecular modeling, though they probably won't catch on in most areas of bioinformatics, where mistakes simply cannot be tolerated.
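For readers curious how a figure like "relative errors up to 0.54 percent" might be measured, here is a minimal sketch, assuming a per-pixel comparison in Python with NumPy (the article doesn't describe the team's actual metric, and the images here are synthetic):

```python
import numpy as np

def max_relative_error(exact: np.ndarray, inexact: np.ndarray) -> float:
    """Worst-case per-pixel relative error, in percent, between an
    exactly rendered image and its inexact counterpart. Pixels that
    are zero in the exact image are skipped to avoid dividing by zero."""
    exact = exact.astype(np.float64)
    inexact = inexact.astype(np.float64)
    nonzero = exact != 0
    rel = np.abs(exact[nonzero] - inexact[nonzero]) / exact[nonzero]
    return float(rel.max() * 100)

# Hypothetical example: an 8-bit grayscale render perturbed by at most
# one gray level, standing in for the output of an inexact processor.
rng = np.random.default_rng(0)
img = rng.integers(100, 256, size=(64, 64))          # "exact" render
noisy = img + rng.integers(-1, 2, size=img.shape)    # "inexact" render
print(f"max relative error: {max_relative_error(img, noisy):.2f}%")
```

Errors of this magnitude shift a pixel by a single gray level or so, which is well below what the eye can pick out in a natural image.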
For now, instead of pitching these processors for general-purpose use, the researchers say their design may be best suited to application-specific roles, such as embedded microchips in devices.