Earlier this year I ordered a kitchen island online. Of course, when all 200 pounds of it arrived on my doorstep, I was left to take approximately 3,742 pieces and assemble them into something I could chop vegetables on. Predictably, the instructions weren’t too helpful, but between the vague directions, slightly more useful diagrams, and sheer force of will, the thing came together very nicely.
I realize you don’t subscribe to this magazine because you desperately want to know about my kitchen. But I like the analogy. Each little piece of what would become my island had, on inspection, a clear function in the overall design. Even without the manual, I probably could’ve managed just by looking at each part and figuring out how and where it connected to everything else.
Not so with the human genome, or any other genome, for that matter. After slaving away at high-throughput sequencing for the past decade, scientists are faced with the unhappy realization that they still can’t say what the vast majority of genes actually do. And unlike a modular spice rack, you can’t pick up a gene, turn it over a few times, and then (eureka!) determine where it fits in the broader picture.
Many of you have entered the field of functional genomics, using an array of technologies to try to elucidate the function of genes and other elements of the genome. In our cover story, we take a look at some of the leading efforts in the field, with an eye toward the spectrum of tools being brought to bear on this problem. It’s really impressive to see how ideas from synthetic biology, next-gen sequencing, proteomics, and more have all been successfully adapted to this particular challenge.
In other articles, we look at emerging technology categories. One feature checks in on how RNA interference is being used in the hunt for pathways. Thanks to gene silencing, writes Jeanene Swanson, pathway analysis has become a genome-wide, high-throughput force in its own right. She also sums up some of the latest research in the field with a publication roundup.
Another story will get you up to speed on data issues in metabolomics. It’s still a nascent discipline, but scientists have learned from the trials of other fields and are working to incorporate data standards and quality controls into their experiments as early as possible. Ciara Curtin brings you that story, along with a look at the major technologies used in the field and where scientists see its promise in coming years.
And don’t miss this month’s Brute Force, where Matt Dublin sizes up the Cell processor. A year ago it was hyped as a major breakthrough in multi-core computing, but, as Matt found, getting bioinformatics applications to run on the platform has proven such a challenge that the Cell hasn’t made much of a mark in the community yet. Still, the early adopters he spoke with say that if programmers can get the code running properly, the Cell could make a significant difference in life science computing.