NEW YORK – While proteomics researchers and tools vendors have for years predicted that scientists in related areas — perhaps most notably genomics — would come to embrace the approach, the limitations and complexity of proteomic experiments have slowed its spread.
Recently, though, new technologies and continuing improvements to older methods have helped proteomics gain some traction with researchers outside the field. This effort remains, however, very much a work in progress.
Brian Hoffman, director of protein sciences at the Jackson Laboratory (JAX), has observed this shift firsthand. He noted that while JAX has traditionally been known as a genomics and transcriptomics-focused institution, the lab's faculty are "really realizing that protein data and proteomics is actually kind of the next step in a lot of the fields of study."
This is reflected in faculty demand for his lab's proteomic services, which Hoffman said has grown roughly tenfold over the last three years.
Genomics researchers and others outside the proteomics space moving to add proteomic data to their experiments "is absolutely a trend that we are seeing right now," said Shashanka Muppaneni, managing director and partner at Boston Consulting Group, where he leads the firm's advisory work in life science tools and diagnostics.
"You are seeing a significantly greater uptake among genomics users [who believe] that proteomic data and, subsequently, multiomic data, is of significantly greater power than genomic data only," he said.
Daniela Hristova-Neeley, a partner at consulting firm Health Advances who covers the life science tools space, likewise said that increasing uptake of proteomics by non-proteomics researchers is a trend she and her colleagues have observed, though she added that it has been a "slow-developing" one.
"The reality is that genomics is great, but it doesn't necessarily tell you how cells will function and what will actually impact cell behavior," she said. Likewise, transcriptomics, while one step closer to protein-level data, also provides only limited insight into the ultimate translation and functionality of proteins, she added.
"At some point, [researchers have to] not only look at nucleic acids as potential markers for disease or health," said Chris Mason, professor of genomics, physiology, and biophysics at Weill Cornell Medicine.
Mason's lab has wide-ranging interests but, he noted, has historically been "more of a genomics lab." Recently, he has moved more substantially into proteomics, an effort he said had been driven in part by advances in the space.
"It's been a slow burn over the past 10 years, but I think in the past three to four years it has really accelerated," he said, noting that over the last few years some proteomics technologies have seen gains of an order of magnitude or more in throughput and depth of coverage.
While a decade ago discovery proteomics experiments could typically measure on the order of several hundred plasma proteins, today a variety of platforms and technological approaches can measure more than 5,000 proteins in plasma, and with throughput that makes large population-scale studies feasible.
"There has been, I think, pent-up demand from people in the transcriptomics and genomics space who are used to having the capacity at least to see all the genes that are active in a sample," Mason said.
Currently, researchers looking to measure many thousands of proteins in plasma have three main options: the SomaScan platform from SomaLogic (now part of Standard BioTools), which uses aptamer-based affinity reagents and can measure roughly 11,000 proteins; Olink's Explore HT platform, which uses the company's PEA immunoassay technology and can measure more than 5,300 proteins; and mass spectrometry-based approaches, most notably Seer's Proteograph enrichment platform, which, when coupled to top-of-the-line instruments like Thermo Fisher Scientific's Orbitrap Astral, can measure around 6,000 to 7,000 proteins per sample. Meanwhile, the TrueDiscovery platform from Swiss proteomics firm Biognosys measures 4,200 proteins in plasma or serum, while a plasma proteome enrichment technique developed by researchers at the University of Washington has enabled experiments measuring 5,000 to 6,000 proteins in plasma.
Mason said that his lab has explored SomaLogic's and Olink's platforms. He has also been working with Seer since 2021 and plans in coming months to publish research by his lab using the company's technology to study the effects of spaceflight on the plasma proteome. Mason also recently purchased a Platinum instrument from protein sequencing firm Quantum-Si, though that technology is aimed at more targeted applications.
"I generally try to be a connoisseur of the technologies and try everything at least once," he said. "It's like being at a buffet."
Not all labs are so adventurous (or well funded), however, and while proteomics does appear to be making inroads with nontraditional users, challenges remain.
This is particularly true with regard to mass spec, which requires a significant capital expenditure (well over $1 million for the most advanced systems) as well as Ph.D.-level personnel with expertise in the technology.
Mason noted that his lab does not run its proteomic samples in-house but sends them out for analysis, though he and his colleagues have been discussing the possibility of purchasing an instrument.
"We don't have enough samples yet to justify the additional equipment purchase, but we've been talking about it," he said.
"Mass specs have made significant leaps both in terms of consistency and usability, and moreover the informatics offerings that have cropped up in the market have become magnitudes better than what they used to be," said Muppaneni. "But it is still a new piece of capital [equipment] that a genomic lab has to install, which makes it a bit tougher."
He said that this makes core lab and send-out partnerships important for proteomics firms targeting genomic labs and other nontraditional customers.
Such users are a key part of the market Seer hopes to address with its Proteograph system, which uses nanoparticles to enrich plasma samples for proteomic analysis. In June, the company launched its Seer Technology Access Center, which employs its XT Assay Kit in combination with the Orbitrap Astral mass spectrometer to provide proteomic services to customers without access to mass spectrometry. The company has also established what it calls its Centers of Excellence (COE) network, including firms such as Allumiqs, Discovery Life Sciences, Panome Bio, Sanford Burnham Prebys, Soulbrain Holdings, and Evotec, that will run Seer customer samples.
Seer sees the Technology Access Center as a necessary step as it aims to build the evidence base for its technology, but it does not want services to be a major part of its business, said President and CFO David Horn, noting that a distributed model in which it sells instruments and consumables is more scalable and higher margin.
The COE network, on the other hand, fits into this distributed model. The COEs "are going to need instruments and kits from us, right?" he said. "If the COE runs the sample and I sell the instrument and kit to them, that is fine. They are just another customer in that sense. It is still a distributed business model."
Horn said he does believe some non-mass spec labs "will bring in mass spec over time as [the instruments] get better and hopefully easier to operate."
Vikram Bajaj, managing director at Foresite Capital and formerly CSO at Grail and at Verily (formerly Google Life Sciences), said, however, that he expects non-proteomics labs to continue to send out their samples given the option.
"We've had big mass spec operations going back to Verily … and just from firsthand experience, I can tell you that if I were doing something that didn't require [being done in-house], you wouldn't want these giant instruments," he said.
Adding to the challenge, Bajaj said, is that mass spec-based proteomics "doesn't have the support cycle and user community that sequencing does."
He also noted that even large-scale sequencing projects are often still done by vendors or large central core facilities.
Particularly as researchers move toward more multiomic analyses, "it is not necessarily realistic to have all these technologies in their lab," said Hristova-Neeley.
Affinity-based proteomics approaches may be better suited to in-house use by non-proteomics labs, particularly those with next-generation sequencing experience and infrastructure. Olink's Explore HT platform currently uses NGS to read out its measurements, while sequencing giant Illumina is developing an NGS-based version of Standard BioTools' SomaScan assay called Illumina Protein Prep. Illumina said it is planning an early-access release this year and a full commercial launch of the product, which will feature an 11,500-plex panel, in early 2025.
Olink's success in recent years in shifting from a services-based to a kit-based business indicates an interest among its customers in bringing the assay in-house. In Q1 2021, services accounted for 71 percent of the company's total revenue and kits for 21 percent; by Q3 2023, services accounted for 38 percent and kits for 54 percent.
Muppaneni noted that the ability to run affinity-based proteomic assays on NGS allows genomics labs to move into proteomics using already installed pieces of capital equipment, which, he said, "is a little easier than adding a mass spec."
Both approaches have their pros and cons, Muppaneni added. While affinity platforms, and SomaScan in particular, have traditionally provided more protein measurements than mass spec-based approaches, they can detect only known proteins targeted by their affinity reagents, whereas mass spec provides comparatively unbiased measurements and insights into protein variants that affinity-based methods do not offer.
In part, choice of technology "depends on what your discovery objectives are," he said.
Foresite's Bajaj predicted that for large population-scale discovery initiatives like the UK Biobank project where researchers have in recent years been adding proteomic data to existing genomic data, affinity-based platforms will continue to dominate in the near term.
"If you are saying, I'm going to take human blood and do discovery at a cohort level, there [mass spec] is not yet competitive with affinity-based approaches," he said. "And for affinity-based approaches that have a sequencing readout, the costs are going to continue to fall, and it is going to become even more competitive."
There are indications, though, that mass spec is beginning to make inroads into this space. Last year, following the release of several new technologies that substantially improved the throughput and depth of coverage of mass spec-based plasma proteomics experiments, Maik Pietzner, a bioinformatician at the MRC Epidemiology Unit at the University of Cambridge School of Clinical Medicine, whose research involves large-scale proteogenomic experiments, said that using mass spec for such efforts "seems now to become feasible."
He added that he and his colleagues are currently working to expand their proteogenomic research to include mass spec.
Meanwhile, Christopher Whelan, director of neuroscience data science at the Janssen Pharmaceutical Companies of Johnson & Johnson and chair of the UK Biobank Pharma Proteomics Project (PPP), has said that the project, which is among the largest ongoing proteogenomic population studies, is in the process of implementing mass spec-based proteomics.
This month, researchers from Weill Cornell Medicine-Qatar and Seer used the company's Proteograph system for a proteogenomic study of 325 previously genotyped blood samples.
Yet, despite the signs of progress, movement of proteomics into non-proteomics labs remains a slow process. During Seer's presentation at the JP Morgan Healthcare Conference in January, Chair and CEO Omid Farokhzad said the company remained "relatively conservative" in terms of its outlook for 2024, while Horn noted that until the body of evidence supporting the platform "really grows … we're going to continue to be relatively modest in terms of growth."
And while JAX's Hoffman has seen a boom in demand for proteomic services from that facility's researchers, his experience isn't universal across proteomics cores.
For instance, Brett Phinney, director of the proteomics core facility at the University of California, Davis, said that on his campus, researchers interested in protein data still overwhelmingly opt for RNA-seq.
"I wish I could get the word out that proteomics in the last five years has really, really made big strides," he said. "It's difficult."