
Cedars-Sinai Aims to Bring Clinical Rigor to Discovery Plasma Proteomics


NEW YORK – Researchers at Los Angeles' Cedars-Sinai Medical Center have developed a series of high-throughput plasma proteomic workflows that the center is offering out of its Precision Biomarker Laboratory.

According to Jennifer Van Eyk, director of the PBL, the lab has run discovery experiments on thousands of plasma samples looking for potential biomarkers for various diseases and conditions and currently has the capacity to run tens of thousands of samples per year.

In a recent study published in Clinical Chemistry, Van Eyk and her colleagues presented the workflows they use for neat plasma, depleted plasma, and dried blood spots, along with analytical performance data for each. That data details the coefficients of variation for the peptides and proteins measured by the methods, as well as the linearity and the lower limits of detection and quantification determined for all of the peptides and proteins measured in each sample type.

The goal was to bring the level of standardization and characterization typical of targeted clinical mass spectrometry workflows to discovery proteomics, Van Eyk said. "What we were trying to do is make a discovery assay that would be acceptable to clinical chemists — to have the same stringency."

The researchers developed both high-throughput and mid-throughput workflows for all three sample types, with the high-throughput method taking 25 minutes per sample and the mid-throughput method taking 72 minutes per sample. The high-throughput workflow measured 74 percent of peptides in neat plasma, 93 percent of peptides in depleted plasma, and 87 percent of peptides in dried blood with inter-day CVs of under 30 percent. For the mid-throughput workflow it was 67 percent, 90 percent, and 78 percent for neat plasma, depleted plasma, and dried blood, respectively.

The researchers established reproducibility by running five replicates each day for three days. They defined four categories: total proteins observed, meaning they were seen in any run on any day; proteins reliably observed, meaning they were seen in at least three runs on every day; the reproducible proteome, meaning proteins that were measured in at least three runs every day with a multiday CV of below 30 percent; and the quantifiable proteome, meaning proteins that were measured in at least three runs every day with a multiday CV of below 30 percent and with good linearity.
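The tiered scheme above amounts to a filtering rule applied per protein. The sketch below illustrates the logic under simplifying assumptions — the function names, data layout, and the idea of assigning each protein to its most stringent category are illustrative, not taken from the paper, though the 30 percent multiday CV cutoff and the three-runs-per-day requirement are:

```python
import statistics

CV_THRESHOLD = 30.0  # percent; the paper's multiday CV cutoff

def percent_cv(values):
    """Coefficient of variation as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

def categorize(runs_per_day, has_good_linearity):
    """Assign a protein to the most stringent category it satisfies.

    runs_per_day: one list per day, holding the abundance from each
    replicate run in which the protein was detected (hypothetical layout).
    """
    observed = any(len(day) > 0 for day in runs_per_day)
    if not observed:
        return "not observed"
    reliably = all(len(day) >= 3 for day in runs_per_day)
    if not reliably:
        return "observed"
    all_values = [v for day in runs_per_day for v in day]
    if percent_cv(all_values) >= CV_THRESHOLD:
        return "reliably observed"
    if not has_good_linearity:
        return "reproducible"
    return "quantifiable"
```

A protein detected in every run with a tight multiday CV but poor linearity would land in the reproducible proteome; only tight CV plus good linearity reaches the quantifiable tier.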

Using the high-throughput workflow, the number of proteins observed ranged from 240 in neat plasma to 681 in depleted plasma, while the quantifiable proteome ranged from 98 proteins in neat plasma to 121 in depleted plasma. Van Eyk and her team also found that combining the measurements from neat and depleted plasma from the same sample provided more protein identifications and quantifications than analyzing either sample type alone, yielding 754 observed proteins and 144 quantified. She said the workflows were able to quantify proteins across four to five orders of dynamic range.

While those numbers are well below the proteins identified and quantified in other plasma proteomics experiments, Van Eyk noted that this is a function of the stringent requirements they established for reproducibility and linearity.

"I can tell you that I can see, say, 1,000 proteins easily in healthy [individuals], but when you go to really say what is quantifiable, and then what is reproducibly quantifiable," the numbers go down, she said. She added that analysis of plasma from healthy individuals typically results in a lower number of proteins identified and quantified than samples from patients with a condition or disease.

"Things like stroke or MI, you can double your number of proteins when you have big events like that," she said.

Van Eyk said that the lab currently has four Thermo Fisher Orbitrap Exploris 480 instruments running more or less around the clock on plasma samples. One of its commonly used assays — among those presented in the Clinical Chemistry paper — is the combined neat-depleted plasma analysis, which provides a deeper analysis than looking at either sample type alone.

That requires running two experiments for each sample, but with the high-throughput method that is still just 42 minutes, Van Eyk noted, "and if you split them between two instruments, you can do 60 samples a day."
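As a rough back-of-envelope check on those figures — assuming continuous, back-to-back operation, and assuming the neat and depleted injections can run in parallel once split across two instruments (with the slower of the two runs, taken here to be the 25-minute high-throughput run, setting the pace) — the arithmetic works out as follows. The parallel-operation assumption is mine, not stated in the article:

```python
HIGH_THROUGHPUT_RUN_MIN = 25   # minutes per injection, high-throughput method
COMBINED_PER_SAMPLE_MIN = 42   # neat + depleted per sample, per the article
MINUTES_PER_DAY = 24 * 60

# One instrument running the combined workflow back to back:
one_instrument = MINUTES_PER_DAY // COMBINED_PER_SAMPLE_MIN

# Neat and depleted injections split across two instruments; throughput
# is then limited by the slower run (assumed ~25 minutes here):
two_instruments = MINUTES_PER_DAY // HIGH_THROUGHPUT_RUN_MIN
```

Under these assumptions a single instrument handles about 34 combined analyses per day, and splitting across two instruments gives roughly 57, broadly consistent with the approximately 60 samples per day Van Eyk cites.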

Achieving good quantitation in these experiments did require the researchers to do careful linearity experiments to understand what proportion of a given peptide or protein was located in the neat versus depleted fraction, she said.

"Sometimes you only have like 5 percent of a protein in one of the two fractions, so you can quantify off of one [fraction], but other times they are split, so you have to quantify off both," she said. "But you have to know that. And the only way that works is you have to do these linearity studies so that you know how to combine each protein and peptide."
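The decision Van Eyk describes — quantify from a single fraction when a protein partitions almost entirely into it, otherwise combine both measurements — can be sketched as below. The function, the 95 percent cutoff, and the scaling by the linearity-derived share are all illustrative assumptions, not the lab's actual procedure:

```python
SINGLE_FRACTION_CUTOFF = 0.95  # illustrative threshold, not from the paper

def combined_quant(neat_signal, depleted_signal, neat_share):
    """Estimate total abundance from the neat and depleted fractions.

    neat_share: proportion of the protein partitioning into the neat
    fraction, as established beforehand by linearity experiments.
    """
    if neat_share >= SINGLE_FRACTION_CUTOFF:
        # Nearly all of the protein sits in the neat fraction:
        # quantify from that fraction alone, scaled by its share.
        return neat_signal / neat_share
    if neat_share <= 1.0 - SINGLE_FRACTION_CUTOFF:
        # Nearly all of it sits in the depleted fraction instead.
        return depleted_signal / (1.0 - neat_share)
    # The protein is genuinely split: combine both measurements.
    return neat_signal + depleted_signal
```

The point of the prior linearity studies, in this framing, is to supply `neat_share` per protein and peptide so the right branch is taken.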

Van Eyk mentioned that proteomics firm Seer, which uses nanoparticles to enrich proteins for plasma proteomic analysis, faces a similar challenge in that researchers will need linearity data for the response of proteins to each of the nanoparticles used in the company's system in order to do good quantification.

"You need to understand which proteins you should quant where, and which proteins are split [between different nanoparticles]," she said. "If you have a protein that is split between six [nanoparticles], maybe you don't want to quantify that protein. If you have proteins that are only captured by one [nanoparticle], then those are gold."

"We would love to use [Seer]," she said. "But what I need to see, and what I think the field would like to see is, for a number of disease states, [data] showing which proteins are present and at which ratio on which of their [nanoparticles]."

The company has presented some data addressing this question as part of a lung cancer study it published in Nature Communications last year.

Meanwhile, Van Eyk said she and her colleagues continue to tweak their workflows while also using them for large-scale studies across a range of conditions. The lab is currently doing plasma proteomic discovery work on thousands of samples for the Bill & Melinda Gates Foundation, she said.

"We have a lot of systems in place that we have developed over the last few years, and the last year and a half has been a really big push to do really large numbers [of samples]," she said. "Our hope is that by adopting these methodologies that when we get into tens and tens of thousands of people that we are going to be able to understand a lot more. But to do that I think you have to have for proteins really good quantitation over a long period of time, which is why for us reproducibility is so important."