As metabolomics takes its place alongside genomics, transcriptomics, proteomics, and other large-scale experimental platforms, instrument vendors and their customers are seeking new ways to handle its ballooning data yields. In one example, Thermo Fisher Scientific is partnering with Genedata to optimize metabolomics-specific informatics workflows for its mass spectrometers.
Thermo is no stranger to bioinformatics. The company markets its own suite of analysis tools for proteomics, which it recently updated. However, Donna Wilson, Thermo's strategic marketing specialist in metabolism and metabolomics, says that the informatics requirements for metabolomics are very different from those for proteomics, even though both fields need to analyze mass spec data.
"Metabolomics is trying to understand how small-molecule biomarkers can indicate disease [or] exposure," Wilson says. These metabolites "are closer to the phenotype of an organism than proteins or genes."
In practice, the experimental workflow for metabolomics, which is "small molecule-oriented," differs from that of proteomics, which is "large molecule-oriented," she says. Data processing and interpretation differ as well, requiring different tools and appropriate statistical analysis, even though the two fields are often trying to answer the same questions.
One metabolomics project involving Thermo and Genedata is developing methods that quantitatively and computationally assess metabolic profiles generated on Thermo's LTQ-FT mass spec. Led by Dean Jones, who directs the Clinical Biomarkers Laboratory at Emory University, the project is currently assessing informatics tools for its metabolomics pipeline, and Jones and his colleagues are comparing results generated with Genedata's Expressionist and Refiner MS software against those from various open-source tools, including the XCMS package for R.
— Vivien Marx
The Swedish Neuroscience Institute is collaborating with the Seattle-based Institute for Systems Biology on research geared toward diseases of the brain and nervous system. The two institutes plan to work together to build a brain tumor tissue bank and an associated genomic database derived from samples removed during surgery.
The National Cancer Institute has granted GeneGo a $140,537 Phase I Small Business Innovation Research grant for the development of a platform for understanding the influence of nutrients on carcinogenesis and cancer prevention. The new platform will include a manually curated database on nutrition, an 'omics data repository, an advanced search function, and statistical modeling tools.
IO Informatics will supply Sentient components and services to the Centre of Excellence for the Prevention of Organ Failure.
Computer Modeling of Oligonucleotide Reaction Rates
Grantee: William Kennelly, DNA Software
Began: Sep. 1, 2008; Ends: Feb. 28, 2009
With this funding, Kennelly aims to build a mathematical model that determines the rate at which DNA strands react with DNA or RNA to form a duplex; the model can then be incorporated into existing software for designing and simulating oligonucleotide reactions. The software will then predict all possible structures in a reaction, such as monomers, self-dimers, and heterodimers, along with their concentrations over time.
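To give a flavor of the kind of kinetics such software computes, here is a minimal sketch of a two-state hybridization model. This is not DNA Software's actual model: the A + B ⇌ AB scheme, the rate constants, and the `simulate_duplex` function are all illustrative assumptions; a real treatment would enumerate every competing species (hairpins, self-dimers, heterodimers) and solve the coupled rate equations.

```python
# Toy two-state hybridization model: strand A + strand B <-> duplex AB.
# Not DNA Software's model; k_on and k_off are illustrative values.
def simulate_duplex(a0, b0, k_on, k_off, dt=1e-4, steps=1_000_000):
    """Euler integration of d[AB]/dt = k_on[A][B] - k_off[AB].

    Concentrations in M, k_on in /M/s, k_off in /s, dt in s.
    """
    ab = 0.0
    for _ in range(steps):
        a, b = a0 - ab, b0 - ab               # free-strand concentrations
        ab += (k_on * a * b - k_off * ab) * dt
    return ab

# 1 uM of each strand; Kd = k_off/k_on = 1 nM, so at equilibrium
# most of the material ends up as duplex.
duplex = simulate_duplex(a0=1e-6, b0=1e-6, k_on=1e6, k_off=1e-3)
```

With these assumed constants the simulation converges to roughly 97 percent duplex, the equilibrium fraction implied by the 1 nM dissociation constant.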
Software Tools for Next-Generation Sequencer Data
Grantee: Gabor Marth, Boston College
Began: Sep. 1, 2008; Ends: Jun. 30, 2012
This grant will cover the development of a suite of tools to support next-generation resequencing with a specific emphasis on base-calling programs that improve upon the native software supplied by the instrument vendors. Marth and his colleagues aim to develop a read alignment program that can map billions of reads to large, complex genome sequences. They also plan to construct a graphical assembly viewer program.