
NCI CPTC Meeting Highlights Clinical Proteomic Development, But Underscores Unmet Needs


WASHINGTON, DC — Though proteomic life-science technologies and applications continue to mature, they still have a way to go before they can fulfill expectations projected a decade ago.

That was the message coming from the third annual meeting of the National Cancer Institute's Clinical Proteomics Technologies for Cancer initiative, held here this week.

The meeting was a wide-ranging affair, with talks extending from new and in-development technologies, to the regulatory process governing diagnostic tests, to work being conducted under the Clinical Proteomic Technology Assessment for Cancer program, a component of CPTC specifically directed at technology assessment and optimization.

The other two CPTC components are Advanced Proteomic Platforms and Computational Sciences, which supports new technology development for quantitative proteomics, and the Proteomic Reagents and Resources Core, which serves as a central public source for reagents, data, standards of practice, and other information resulting from CPTC efforts.

CPTC was launched three years ago to enable the development of technology and standards that may spur biomarker discovery in proteomics with a particular focus on cancer-related research.

When proteomics first appeared on the clinical landscape, the expectation was that it would lead to new biomarker-based tests that would enable early screening for cancer and other diseases.

But the reality was that "the promise has not really turned out what we thought it would be," CPTC Director Henry Rodriguez said Monday, the opening day of the three-day meeting. That prompted the creation within NCI of CPTC, a five-year, $104 million initiative.

Speakers during the conference acknowledged that proteomics is still walking uphill. In his presentation, Paul Tempst, director of targeted proteomics in the Molecular Biology program at Memorial Sloan-Kettering Cancer Center in New York, drew attention to a 2004 article in The New York Times about OvaCheck, Correlogic's protein-based diagnostic for ovarian cancer. The blood-based test, the article said, could detect the disease in its early stages more accurately than anything else on the market.

Trouble was, the test ran into problems with the US Food and Drug Administration, and OvaCheck has yet to reach the commercial market.

According to Tempst, the excitement over OvaCheck, which turned out to be premature, became a "black eye" for proteomics from which the field is still recovering.

Also during the meeting, Leigh Anderson, founder and president of the Plasma Proteome Institute, calculated that to date the FDA has cleared 109 proteins for use as biomarker-based tests, while an additional 96 are being used as home-brew tests. Together, these 205 proteins constitute approximately 1 percent of the total human proteome.

"So, we're not doing as badly as I had imagined," he said, but cautioned that over the past 15 years the number of proteins that have been approved for clinical tests has remained unchanged at 1.5 per year — which he called "amazingly low."
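Anderson's figures are simple to reproduce. The sketch below checks the arithmetic, assuming the roughly 21,500-protein count for the human proteome that was cited elsewhere at the meeting:

```python
# Back-of-the-envelope check of Anderson's figures.
# Assumption: ~21,500 proteins in the human proteome (the count cited
# later at the meeting in connection with the hPDQ project).
fda_cleared = 109       # proteins cleared by the FDA as biomarker-based tests
home_brew = 96          # additional proteins used in home-brew tests
total_proteins = 21_500

in_clinical_use = fda_cleared + home_brew            # 205 proteins
fraction_pct = 100 * in_clinical_use / total_proteins

print(in_clinical_use)          # 205
print(round(fraction_pct, 1))   # 1.0 -- roughly 1 percent of the proteome
```

The same arithmetic underlies his approval-rate point: 205 proteins spread over more than a century of clinical chemistry, with roughly 1.5 new protein tests per year over the past 15 years.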

Clinical proteomics "has had no impact at all. I think we have a long way to go," he said.

Criticism of proteomics also came from a constituency that has rarely been heard from in the field: patients. Elda Railey, co-founder of the Research Advocacy Network, a Plano, Texas-based non-profit patient-advocacy group, noted that among the nearly 50 talks given over the course of the three-day meeting, the word "patient" appeared in the titles of only three.

"Where is our focus?" she said.

She added that while proteomics research has provided a lot of information, almost none of it has had clinical relevance, which has helped erode public confidence in the future of the discipline.

"I'm here to help remind you to find answers" to cancer and other diseases, Railey said.

Research Overload

To be sure, the relative dearth of usable clinical discoveries does not reflect a shortage of efforts to overcome the many obstacles keeping the field from progressing into the clinic. And this week's meeting put a spotlight on the many research projects underway to surmount those barriers.


At the Biodesign Institute at Arizona State University, for example, researchers are developing synthetic antibodies for use as protein affinity reagents, while a team at the National Institute of Standards and Technology is working on a peptide mixture and a yeast-based biological reference material for use in shotgun proteomics.

Internationally, researchers in Seoul, South Korea, have launched a 10-year effort called the Functional Proteomics Center aimed at discovering novel biomarkers and therapeutic targets, while in Toronto the Ontario Cancer Biomarker Network is working to optimize multiple-reaction monitoring, in combination with informatics and statistical analysis, in order to learn how certain drugs affect known biomarkers.

Another technology highlighted at the meeting was a method called stable isotope standards and capture by anti-peptide antibodies, or SISCAPA, originally developed in 2004 by PPI’s Anderson and colleagues to rapidly assess large numbers of particle-bound antibodies.

Anti-peptide antibodies are used to enrich peptides. Here, as with ELISAs, a first antibody enriches the peptide of interest, but instead of a secondary antibody for detection, SISCAPA uses mass spectrometry, which enables the technology to combine the enhanced sensitivity of an immunoassay with the specificity of mass spec.

According to Anderson, SISCAPA, when used in concert with an MRM workflow, can be used to verify candidate protein biomarkers. It has also been shown to be applicable across the entire known dynamic range of plasma.

However, MRM-SISCAPA may not be applicable to the clinic because no current LC-MS/MS platform is approved for clinical use, and despite widespread use of the instruments in clinical labs, "a true clinical [mass spectrometer] has yet to appear," Anderson said.

That may change: Earlier this year a global consortium of researchers proposed a project that would use SISCAPA or a similar method called immuno-MALDI to develop anti-peptide antibodies to measure proteins with high sensitivity and specificity [See PM 01/22/09].

Last month the project, called Human Proteome Detection and Quantitation, or hPDQ, received a $4.8 million NIH grant under the American Recovery and Reinvestment Act of 2009.

In his talk, Anderson said that to achieve the goals of hPDQ at least four items would need to be created: a comprehensive database of proteotypic peptides for each of the 21,500 human proteins; at least two synthetic proteotypic peptides per protein, labeled with stable isotopes and available in quantitated aliquots; anti-peptide antibodies specific for the same two proteotypic peptides and/or target proteins, capable of binding the peptides with appropriate dissociation constants; and robust instrument platforms.

hPDQ, which would be a way of mapping out the human proteome, would cost less than $50 million, Anderson estimated.

Also during the CPTC meeting, Richard Smith, director of proteomics research at the Biological Sciences Division at Pacific Northwest National Laboratory, described ongoing work at his lab to create a nanoLC-ion mobility spectrometry-TOF mass spec platform to enable the detection and quantitation of lower-abundance peptides and proteins in biological fluids such as plasma.

According to Smith, the work over the past year has shown "significant" improvement in proteome coverage over existing platforms. Responding to a question from the audience, he added that a two-year timeline to commercialize his platform would be "realistic."

Various working groups of CPTAC also provided updates on their work, including Steven Hall from the University of California, San Francisco's Mass Spectrometry Core Facility, who presented work he and his collaborators performed validating the interlaboratory reproducibility of MRM coupled with isotope dilution mass spectrometry as a protein biomarker-verification method [See PM 07/09/09].

And Zivana Tezak from the FDA's Center for Devices and Radiological Health/Office of In Vitro Diagnostics provided an update on the NCI-FDA Interagency Oncology Task Force on Molecular Diagnostics, which about a year and a half ago decided to focus on proteomics "because there are so many questions floating around" the field, she said.

Two "mock" 510(k) submissions were developed and submitted in the spring, with the goal of educating the proteomics community about the regulatory process. The authors plan to share their submissions and comments, Tezak said, and in addition to providing an educational tool for researchers, the FDA believes it can learn from the process and use what it learns to better guide researchers through the application process.
