Mass Spec Symposium Shines Light on Quantification Methods and Approaches

SAN FRANCISCO – Even as proteomics moves increasingly toward quantification, not enough researchers are familiar with the specific approaches and techniques for quantifying proteins, according to one of the organizers of the 8th International Symposium on Mass Spectrometry in the Health and Life Sciences held here last month.
 
The symposium, which began in 1984, before proteomics even existed as a field of research, and is now held biennially, focuses on the newest technologies and methods for mass spec-based protein research.
 
This year’s edition, which drew about 250 attendees, focused on quantitative proteomics. As the field expands beyond shotgun identification, quantitative proteomics has drawn growing attention as a way to improve studies and deepen knowledge of the proteome.
 
Last year, during a proteomics conference in Siena, Italy, Ruedi Aebersold, a professor of molecular systems biology at the Swiss Federal Institute of Technology in Zurich and the University of Zurich, sounded a clarion call to the research community to do more quantitative work [See PM 09/07/06].
 
And in a nod to the growing importance of quantitative proteomics, earlier this year Thermo Fisher Scientific shifted the focus of its Biomarker Research Initiatives in Mass Spectrometry in Cambridge, Mass., from mass spec-based protein biomarker discovery to biomarker validation [See PM 04/12/07].
 
Despite the movement toward a quantitative workflow, however, knowledge about the approach in the research community remains low, said A.L. Burlingame, an organizer of the symposium here and a professor of chemistry and pharmaceutical chemistry at the University of California, San Francisco.
 
“People don’t understand how to do it right,” he told ProteoMonitor. “It’s poorly understood.” While some of the methods that were discussed during the five-day symposium have been used in other research areas for some time, in protein research they are comparatively novel, he said.
 
For example, “The notion of isotope dilution has been around and used extensively in studies of drug metabolism by mass spectrometry for decades,” he said. “And the principles of isotope dilution have been around forever, so … it’s just using the same principles that are known to work from the beginning of time, so to speak, in the protein field.”
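To make the principle concrete, here is a minimal sketch of the isotope-dilution arithmetic Burlingame alluded to: a known amount of a stable-isotope-labeled standard is spiked into the sample, and the endogenous amount follows from the ratio of the light (endogenous) to heavy (standard) peak areas. The function name and the example numbers are illustrative, not taken from any talk at the symposium.

```python
def isotope_dilution_amount(light_area: float,
                            heavy_area: float,
                            heavy_spiked_fmol: float) -> float:
    """Estimate the endogenous analyte amount by isotope dilution.

    Assumes the light (endogenous) and heavy (spiked standard) species
    co-elute and ionize identically, so their peak-area ratio equals
    their molar ratio.
    """
    if heavy_area <= 0:
        raise ValueError("heavy standard peak area must be positive")
    return heavy_spiked_fmol * (light_area / heavy_area)

# Example: 50 fmol of heavy peptide spiked in; the light peak area is
# 1.8x the heavy peak area, implying ~90 fmol of endogenous peptide.
print(isotope_dilution_amount(light_area=3.6e6, heavy_area=2.0e6,
                              heavy_spiked_fmol=50.0))  # -> 90.0
```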
 
And in that respect, the symposium was a venue to see how some of the leading proteomics researchers are using current technologies in their work. Matthias Mann from the Max Planck Institute of Biochemistry spoke about work he and his colleagues did using SILAC with LTQ Orbitrap mass spectrometry and bioinformatics to quantify a large proportion of the proteome.
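As a rough illustration of the SILAC readout, the sketch below shows how per-peptide heavy-to-light intensity ratios are commonly summarized into protein-level fold changes, for instance by taking the median. This is a simplified stand-in, not the Mann lab's actual pipeline; the data and names are hypothetical.

```python
from collections import defaultdict
from statistics import median

def silac_protein_ratios(peptides):
    """Summarize SILAC quantification at the protein level.

    `peptides` is an iterable of (protein, light_intensity, heavy_intensity)
    tuples; returns {protein: median heavy/light ratio}.
    """
    ratios = defaultdict(list)
    for protein, light, heavy in peptides:
        if light > 0:  # skip pairs with no detectable light signal
            ratios[protein].append(heavy / light)
    return {p: median(r) for p, r in ratios.items()}

# Example: two peptide pairs from protein P1, one from P2.
obs = [("P1", 1.0e6, 2.1e6), ("P1", 5.0e5, 9.5e5), ("P2", 2.0e6, 1.0e6)]
print(silac_protein_ratios(obs))  # {'P1': 2.0, 'P2': 0.5}
```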
 
Donald Hunt spoke about his work using electron transfer dissociation, a comparatively new method in proteomics, to identify proteins and characterize their post-translational modifications. And Josh Coon presented data from work he and colleagues have done using ETD and high-resolution mass analysis to discover and quantify post-translational modifications that signal human embryonic stem cells to exit the pluripotent state.
 
By the end of the symposium, Burlingame said, “I think what [the attendees] benefit from is they learn a lot, they get exposed to a lot. It’s an easy way of seeing what the field is really doing and some people change direction because of it. But most people use the information to modify what they’re doing.”
 
Making the Case for Quantitation
 
The symposium kicked off with a presentation by Aebersold that amounted to Quantitative Proteomics 101, highlighting the advantages of using a quantitative strategy in protein-related research. One reason for quantification is that it’s necessary to detect changes in the proteome after it has been perturbed.
 


In addition, quantitation is useful for differentiating signal from noise in the analysis of protein complexes; it allows proteins to be classified based on common behavior; and it allows for comparative analysis of clinical samples, a field often called translational proteomics, which Aebersold called “an extremely complicated” but “very large” area of research.

 
“[Most] proteomic studies would benefit from having a quantitative dimension to them,” he told the audience.
 
Aebersold then outlined the most common quantitative methods being employed – isotope labeling; ion current; spectral counting; and targeted proteomics by multiple reaction monitoring – and their strengths and weaknesses.
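To illustrate just one of these options, the sketch below computes a common spectral-counting summary, the normalized spectral abundance factor (NSAF), in which a protein's spectral count is divided by its length and then normalized across the run. The proteins, counts, and lengths here are hypothetical examples.

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor (NSAF) per protein.

    NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j), where SpC_i is the
    number of spectra matched to protein i and L_i its length in residues.
    """
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

# Hypothetical counts from a single LC-MS/MS run.
counts = {"ALBU": 120, "TRFE": 45, "APOA1": 30}
lengths = {"ALBU": 609, "TRFE": 698, "APOA1": 267}
print(nsaf(counts, lengths))  # relative abundance estimates summing to 1
```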
 
At the end of his talk, he concluded that there is no one strategy that is best and the choice should be made based on the problem being encountered and its context.
 
In ensuing presentations about their work, speakers offered similar messages. Pedro Cutillas from the Ludwig Institute for Cancer Research in London said that label-free proteomics is attractive because it can offer a universal means of quantification and can be applied to primary tissues in clinical samples. In his own work, he has found label-free LC-MS to be more accurate than SDS-PAGE techniques. But, he added, the method has its own set of issues, such as the need for an extracted ion chromatogram for each identified peptide; a sometimes unmanageable number of identified peptides; and the need for bioinformatics tools to automate the analysis.
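The extracted-ion-chromatogram requirement Cutillas mentioned can be sketched as follows: for each identified peptide, pull the intensity at its m/z (within a tolerance) from every MS1 scan and integrate the resulting trace over retention time. This is a simplified stand-in for what production software does; the tolerance, data, and function names are illustrative.

```python
def extract_xic(scans, target_mz, ppm_tol=10.0):
    """Build an extracted ion chromatogram (XIC) for one peptide m/z.

    `scans` is a list of (retention_time, [(mz, intensity), ...]) MS1 scans.
    Returns [(rt, summed_intensity)] using peaks within `ppm_tol` of target_mz.
    """
    tol = target_mz * ppm_tol / 1e6  # absolute m/z window from ppm tolerance
    return [(rt, sum(i for mz, i in peaks if abs(mz - target_mz) <= tol))
            for rt, peaks in scans]

def xic_area(xic):
    """Integrate an XIC over retention time by the trapezoidal rule."""
    return sum((t2 - t1) * (i1 + i2) / 2
               for (t1, i1), (t2, i2) in zip(xic, xic[1:]))

# Toy data: three MS1 scans; the 524.265 m/z trace rises and falls.
scans = [(10.0, [(524.265, 1e4), (600.3, 5e3)]),
         (10.1, [(524.266, 8e4)]),
         (10.2, [(524.264, 2e4)])]
xic = extract_xic(scans, target_mz=524.265)
print(xic, xic_area(xic))
```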
 
In his presentation, Eric Deutsch, a researcher at the Institute for Systems Biology, said his lab has found that spectral library searching provides higher sensitivity and a greater ability to detect and identify lower signal-to-noise spectra at better false discovery rates. Under certain conditions it can identify more peptides than sequence searching, he added.
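At its core, spectral library searching scores an observed spectrum against previously identified library spectra, classically with a normalized dot product of binned peak intensities, as in the minimal sketch below. The bin width, peaks, and scores are illustrative assumptions, not details from Deutsch's presentation.

```python
import math

def binned(spectrum, bin_width=1.0):
    """Bin a peak list [(mz, intensity), ...] into a sparse {bin: intensity} dict."""
    out = {}
    for mz, inten in spectrum:
        b = int(mz / bin_width)
        out[b] = out.get(b, 0.0) + inten
    return out

def dot_score(query, library, bin_width=1.0):
    """Normalized dot product between a query and a library spectrum.

    Returns a value in [0, 1]; higher means a better match. This is the
    classic similarity measure behind spectral library searching.
    """
    q, l = binned(query, bin_width), binned(library, bin_width)
    num = sum(q[b] * l[b] for b in q.keys() & l.keys())
    denom = (math.sqrt(sum(v * v for v in q.values()))
             * math.sqrt(sum(v * v for v in l.values())))
    return num / denom if denom else 0.0

# A noisy query spectrum scored against a clean library entry.
query = [(147.1, 300.0), (276.2, 800.0), (389.3, 150.0), (512.3, 60.0)]
library = [(147.1, 350.0), (276.2, 900.0), (389.3, 200.0)]
print(round(dot_score(query, library), 3))  # -> 0.997, a strong match
```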
 
Others pointed out, however, that a label-free strategy is better suited to large-scale absolute quantification, and that quantifying small datasets through spectral counting can be less accurate than other methods.
