
Standards, Reference Material Development Highlight Proteomics Research in 2009

This story originally ran on Jan. 7.

By Tony Fong

There was no transformative breakthrough in proteomics research in 2009, but the year saw progress in several areas that have so far prevented the discipline from becoming more mainstream and moving into the clinic.

Meantime, a handful of projects were started last year that may yield important data and methods for future research.

While the story in commercial proteomics in 2009 was defined by a small handful of acquisitions and the recovery of Vermillion [See PM 01/04/10], the story on the research side was composed of less obvious, but still significant, accomplishments. These could set the stage for improved scientific data moving forward, drive more widespread adoption of proteomics technologies and methods, and ultimately lend true meaning to the term "clinical proteomics."

Among the most significant research efforts this past year were projects spearheaded separately by the Human Proteome Organization and the National Cancer Institute aimed at improving experimental reproducibility and developing standards.

For the past several years, HUPO and the NCI's Clinical Proteomic Technology Assessment for Cancer initiative have been developing reference materials, and this year saw the first fruits of their labor. In May, HUPO released the results of a test-sample study purporting to show that mass spectrometry technology is both robust and reproducible [See PM 05/21/09 and 05/28/09].

The study stemmed from a project initiated by HUPO three years earlier to test the inter-laboratory reproducibility of mass specs and to develop a protein-standard mixture for the research community. While the authors of the study said that mass specs are capable of generating reproducible data, the study also highlighted the complexity of proteomics and the fact that missteps can be easily made along the research pipeline that can compromise experimental results.

Indeed, among the 27 laboratories that participated in the study — which included researchers from both academia and industry — just seven found all 20 proteins in the test sample, and only one identified all 22 tryptic peptides with a mass of 1,250 Daltons.

Invitrogen had originally planned to commercialize the 20-protein mixture as a reference standard, but during the fall HUPO officials disclosed that the company, which is part of Life Technologies, had changed its mind because HUPO's requirement that the mixture be at least 95 percent pure would make it prohibitively expensive to manufacture [See PM 10/09/09].

HUPO has since been negotiating with another vendor to commercialize the mixture, but so far, no deal has been reached.

NCI-CPTAC Standards and References

Meanwhile, the NCI's CPTAC network of five research teams also reported progress on projects ongoing since 2006, when CPTAC was created.

In October, the researchers published three studies detailing work they had conducted on the use of LC-MS platforms for proteomics. One described a yeast reference standard, a development that Daniel Liebler of Vanderbilt University, a team leader for CPTAC, said offers the research community a model system that mirrors the kind of sample typically analyzed in proteomics experiments [See PM 11/06/09].

"We wanted to model the performance or to study the performance of discovery platforms in a much more complex system that would be more like what you would deal with when you're dealing with tissues or biofluids from humans," Liebler said. "There's just much more complexity, [and a] much broader range of concentrations of proteins because it's from a living organism."

Another study published the same month detailed more than 40 metrics developed by Paul Rudnick and colleagues at the National Institute of Standards and Technology to monitor the performance of LC-MS systems. The metrics, which were boiled down from more than 100 that the NIST scientists initially identified, fall into six classes: chromatography, dynamic sampling, ion source, MS1 signal, MS2 signal, and peptide identification.

In addition to being a guide for troubleshooting LC-MS platforms, the metrics serve as a comprehensive quality-control profiler for such systems, something Liebler said proteomics had lacked before.
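To illustrate how such metrics might be organized and checked in software, the following is a minimal Python sketch; the metric names, values, and acceptable ranges are invented for illustration and are not the published NIST definitions.

```python
# Minimal, illustrative sketch of grouping LC-MS QC metrics into the six
# classes described above. Metric names, values, and acceptable ranges are
# invented for illustration; they are not the published NIST metrics.
from dataclasses import dataclass

@dataclass
class Metric:
    name: str
    value: float
    low: float   # lower bound of the acceptable range
    high: float  # upper bound of the acceptable range

    def ok(self) -> bool:
        return self.low <= self.value <= self.high

# One hypothetical run, keyed by the six metric classes named in the article.
run_metrics = {
    "chromatography": [Metric("median_peak_width_s", 22.0, 10.0, 30.0)],
    "dynamic sampling": [Metric("ms1_scans_per_peak", 9.0, 6.0, 20.0)],
    "ion source": [Metric("esi_instability_pct", 4.0, 0.0, 10.0)],
    "MS1 signal": [Metric("median_ms1_tic", 2.1e8, 1.0e8, 1.0e10)],
    "MS2 signal": [Metric("median_ms2_tic", 5.5e5, 1.0e5, 1.0e7)],
    "peptide identification": [Metric("peptide_ids", 10500.0, 8000.0, 1.0e9)],
}

def qc_report(metrics: dict) -> None:
    """Print a pass/fail line for each metric, grouped by class."""
    for cls, ms in metrics.items():
        for m in ms:
            status = "PASS" if m.ok() else "FAIL"
            print(f"{cls:<24} {m.name:<24} {m.value:>12.4g}  {status}")

qc_report(run_metrics)
```

The design mirrors the article's point: a fixed set of per-class metrics with acceptable ranges turns a one-off troubleshooting exercise into a repeatable QC profile that can be compared across runs and labs.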

In November, he and his colleagues at CPTAC published an article on the repeatability and reproducibility of LC-MS systems. Like HUPO, the CPTAC teams said that although the technology can consistently identify "inventories of proteins even in complex biological samples," what that means in terms of the machines' ability to differentiate diseased from healthy tissues and specimens remains unclear and will be studied further [See PM 11/20/09].

During the summer, CPTAC researchers also reported on the inter-laboratory reproducibility of a biomarker verification method that only recently has gained traction in proteomics. Eight research teams participated in three studies to test the method — multiple-reaction monitoring coupled with stable-isotope dilution mass spectrometry, or SID-MRM-MS — and found that in addition to being reproducible, the method demonstrated sensitivity at low-microgram-per-milliliter protein concentrations in unfractionated plasma [See PM 07/09/09].
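The arithmetic underlying the method is simple enough to sketch: a stable-isotope-labeled "heavy" peptide is spiked in at a known concentration, and the endogenous level is read off the light-to-heavy peak-area ratio. The following minimal Python sketch shows that calculation with illustrative numbers; it is not CPTAC's actual analysis code.

```python
# Minimal sketch of SID-MRM-MS quantitation: a heavy-labeled peptide is
# spiked in at a known concentration, and the endogenous ("light") peptide
# concentration is inferred from the light-to-heavy peak-area ratio. This
# assumes the two forms behave identically in the LC-MS system; all numbers
# below are illustrative.

def sid_mrm_concentration(light_area: float,
                          heavy_area: float,
                          heavy_spike_conc: float) -> float:
    """Estimate the endogenous concentration (same units as the spike)."""
    if heavy_area <= 0:
        raise ValueError("heavy-standard peak area must be positive")
    return (light_area / heavy_area) * heavy_spike_conc

# Example: heavy standard spiked at 5 ug/mL; the peak areas below yield an
# estimated endogenous concentration of 2 ug/mL.
print(sid_mrm_concentration(light_area=3.2e5, heavy_area=8.0e5,
                            heavy_spike_conc=5.0))
```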

Steven Carr, the principal investigator of a CPTAC team, said the research was not intended to put SID-MRM-MS into the clinic, but he and his colleagues also wrote in a study describing their work that the method has the potential to replace certain clinical immunoassays "especially in cases where interferences are known to exist or multiplex measurements are needed."

MRM in the Spotlight

In fact, 2009 was a year in which developing MRM workflows became a high priority for the research community. After more than a decade of emphasis on biomarker identification, attention has been shifting toward verification with an eye toward translating discovery into clinical use.

Individual labs and researchers have been developing MRM-based methods for several years, but 2009 saw more concentrated efforts to test such workflows. In addition to the CPTAC work on SID-MRM-MS, two projects announced late in 2009 could eventually catalyze greater use of MRM methods.

One, a collaboration between the Institute for Systems Biology and Agilent Technologies, seeks to build an MRM map for quantitative proteomics. The goal is to have, by the end of the two-year project, at least four peptides for each of the estimated 20,000 to 25,000 protein-coding genes in the human proteome. The map would also include verified, rapid, and accurate MRM-based mass spec assays for identifying and quantifying any protein in the human proteome across a multitude of samples [See PM 10/23/09].

"What we're really doing is dramatically reducing the amount of time, energy, and cost of getting involved in setting these assays up in the first place," Ken Miller, director of LC-MS marketing for Agilent, told ProteoMonitor.

Related to that effort is an initiative to test the feasibility of measuring all proteins in humans. The project would test how difficult it would be to build mass spec-based assays using an MRM strategy that would cover the entire human proteome [See PM 10/30/09]. The larger, ambitious effort to map the human proteome is called the Human Proteome Detection and Quantitation initiative, or hPDQ [See PM 01/22/09].

"So this is the precursor stage where we're focused entirely on the MRM, defining what the best proteotypic peptides are, what the parameters are, and doing exploratory studies and cell lysates to see how well the assays behave," Amanda Paulovich from the Fred Hutchinson Cancer Research Center and a co-PI on the project said. "And then in a subsequent phase beyond this two-year early phase, we would go on to generate antibodies to improve sensitivity and throughput."

Doubts Linger, but Traction Grows

Despite the increased attention to standards and the steady current of studies claiming new insights into the proteomes of any number of organisms, proteomics still found itself operating in the shadows of genomics in 2009, much as it has for the past decade, and those in the field often had to reassure themselves and each other of the relevance of their accomplishments.

Much of the criticism directed at proteomics continued to be focused on what it has and hasn't achieved and what it can — and can't — bring to patient care.

At the start of the year, the National Heart, Lung, and Blood Institute announced a new proteomics program — the institute's first major initiative in four years specifically targeting the space — with the goal of using new technologies to answer a clinical question and to try to understand disease mechanisms.

Pothur Srinivas, a program director at NHLBI, told ProteoMonitor that the new program was created to address continuing questions about whether proteomics technologies have applications in the clinic, and, if so, how they can be used.

"The technology has developed but [it's] still not there yet," he said, "and we need to use these tools to answer a clinical question, so this is an effort at trying to do that."

Patrick Brown, a professor of biochemistry at Stanford University who has done extensive research on DNA microarrays, similarly questioned the clinical application of current proteomics technology.

At US HUPO's annual meeting in February, he told the audience that as long as mass specs remain so expensive and challenging to use, proteomics would never achieve what genomics has. He added that the power of the instruments may have been, and may continue to be, oversold.

And while genomics has focused on genetic variations and their possible meanings, proteomics has largely ignored such differences between individuals, contributing to its limited clinical use, Brown said [See PM 02/26/09].

However, even if proteomics has yet to fully prove its utility, signs were apparent throughout the year that the field is gaining traction in the wider scientific community.

For example, two landmark longitudinal studies this past year incorporated proteomics into their research for the first time. In March, the Framingham Heart Study entered into an agreement with BG Medicine to discover biomarkers, including protein biomarkers, associated with heart disease, three years after FHS first put out a solicitation for a cooperative research and development agreement [See PM 03/19/09 and 12/07/06].

According to Daniel Levy, director of FHS and the Center for Population Studies at NHLBI, moving from the genomics research that had been conducted at FHS to proteomics was a natural progression, as proteomics technology "is ready to allow us to leverage the many scientific resources that we've been creating here."

Australia's Busselton Health Study also moved into the proteomics space as its collaborators, Proteomics International and the Fremantle Hospital Diabetes Research Group, began searching for biomarkers associated with obesity-related diabetes [See PM 08/13/09].

Another indication that proteomics may be gaining the respect of other research fields is that in 2010 it became a permanent division of the American Association for Clinical Chemistry after five years as a provisional division [See PM 11/06/09].

Saeed Jortani, chairman of the AACC proteomics division, said that in addition to being a nod to the field and its growing importance, increased activities within the organization directed at proteomics could help bridge the gap between biomarker discovery and the translation of those discoveries to patient healthcare.

"As the years went by, we saw that that type of research [encountered] some obstacles," Jortani told ProteoMonitor. "And so now we see that there's a major role for us in that so much effort has been put into biomarker discovery [and] it goes without saying that we have to be there at the table."

Aside from classic biomarker research, proteomics also continued to be used for less traditional purposes, including using mass spectrometry to gauge the tenderness of beef, detecting the presence of glycine on comets, and developing a mass spec-based method for detecting doping in racehorses.

And researchers who in 2007 said they had sequenced proteins in a 68-million-year-old Tyrannosaurus rex fossil announced this spring that they had sequenced proteins from an 80-million-year-old hadrosaur fossil [See PM 04/30/09]. In the process, they also silenced critics who had questioned the quality of the science on the T. rex sample and added weight to the possibility that proteomics may be a useful tool for studies of prehistoric life [See PM 05/07/09].
