UW, Thermo Fisher Researchers Develop Improved Approach for Multiplexed Targeted Mass Spec

NEW YORK – Researchers from the University of Washington and Thermo Fisher Scientific have developed a method that could enable more highly multiplexed targeted proteomic experiments.

Detailed in a paper published this month in Analytical Chemistry, the approach allows researchers to more accurately schedule liquid chromatography retention time windows, which boosts the number of targets that can be multiplexed in a single experiment and improves assay sensitivity, said Michael MacCoss, professor of genome sciences at the University of Washington and senior author on the study. 

In targeted proteomics experiments, researchers use a target analyte's LC retention time — when it comes off the column and goes into the mass spec — to schedule the instrument to measure that target at that time point. Such scheduling improves assay performance by focusing the mass spec's analysis on the points in time when the target analytes are expected to be present.

However, analyte retention times shift over the lifetime of an LC column, which means that researchers must use relatively wide scheduling windows to account for these shifts. The wider the scheduling windows used, the fewer analytes an experiment is able to target.
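
The arithmetic behind that tradeoff is straightforward. The Python sketch below illustrates it with purely hypothetical numbers, not figures from the paper: a fixed instrument cycle time caps how many scheduling windows can be open at once, so narrower windows translate directly into a larger total assay.

```python
# Hypothetical numbers illustrating the scheduling tradeoff; none of these
# values are taken from the study itself.

GRADIENT_MIN = 60.0   # length of the LC gradient, minutes
CYCLE_S = 2.0         # time budget per acquisition cycle, seconds
DWELL_S = 0.02        # time spent on each targeted scan, seconds

# The instrument can only keep this many scheduling windows open at once
# while still revisiting every target once per cycle.
concurrent_capacity = int(CYCLE_S / DWELL_S)  # 100 targets at a time

for window_min in (4.0, 1.0):
    # With targets spread evenly across the gradient, a window spanning
    # window_min / GRADIENT_MIN of the run holds that fraction of all targets.
    max_assay_size = int(concurrent_capacity * GRADIENT_MIN / window_min)
    print(f"{window_min:g} min windows -> up to {max_assay_size} targets")
```

In this toy model, shrinking the windows from four minutes to one quadruples the number of peptides the run can accommodate.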

"Scheduling is generally one of the biggest challenges that there is in targeted proteomics," MacCoss said. "It's the thing that most people struggle with."

He noted that it was a particular challenge for nanoflow chromatography, where "even a slight shift in retention time often means that you don't sample a [target] peptide exactly right."

Researchers have developed a number of methods for addressing this problem, but they come with their own disadvantages. For instance, one common approach is to use internal standards for each of the target peptides. When the mass spec detects the internal standard, it triggers a targeted measurement of the endogenous target peptide. As the authors noted, however, this means an internal standard is required for each target, which increases the cost of the assay and makes it less suitable for highly multiplexed experiments.
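
In outline, that triggering logic looks something like the following sketch. The mass tolerance, intensity threshold, and heavy-label mass here are illustrative assumptions, not values from any particular assay.

```python
# Illustrative sketch of internal-standard-triggered acquisition. A heavy
# isotope-labeled standard is spiked in for every target; detecting it in a
# survey scan triggers a targeted scan of the endogenous (light) peptide.

HEAVY_LABEL_DA = 8.0142  # mass added by a 13C6,15N2-lysine label, Da

def triggered_targets(survey_peaks, standards, tol_mz=0.01, min_intensity=1e4):
    """survey_peaks: list of (mz, intensity) from the survey scan.
    standards: {heavy_mz: (peptide_name, charge)}.
    Returns (peptide_name, light_mz, charge) tuples to schedule this cycle."""
    hits = []
    for mz, intensity in survey_peaks:
        if intensity < min_intensity:
            continue
        for heavy_mz, (name, z) in standards.items():
            if abs(mz - heavy_mz) <= tol_mz:
                # The endogenous form sits one label mass (per charge) lower
                hits.append((name, heavy_mz - HEAVY_LABEL_DA / z, z))
    return hits
```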

MacCoss and his colleagues tackled the issue by using an initial data-independent acquisition experiment to establish the baseline retention times for the analytes in the sample. In subsequent experiments, they ran an initial DIA cycle followed by a second cycle in which they measured their peptide targets. By comparing the initial DIA cycle to the baseline DIA experiment, they were able to account for any shifts in retention time in real time and adjust their targeted measurements accordingly.
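
The paper's implementation runs on the instrument itself, but the core idea can be sketched in a few lines: reduce each incoming DIA cycle to a spectrum vector, find the best-matching cycle in the baseline run, and shift every scheduled window by the observed offset. Everything below, from the binning scheme to the function names, is an illustrative assumption rather than the authors' code.

```python
import numpy as np

def cycle_vector(peaks, n_bins=2000, mz_lo=400.0, mz_hi=1400.0):
    """Bin (mz, intensity) peaks into a fixed-length, unit-norm vector."""
    vec = np.zeros(n_bins)
    for mz, inten in peaks:
        if mz_lo <= mz < mz_hi:
            vec[int((mz - mz_lo) / (mz_hi - mz_lo) * n_bins)] += inten
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def match_baseline(current_peaks, baseline):
    """baseline: list of (retention_time, unit_vector) built from the
    reference DIA run. Returns the baseline RT most similar to this cycle."""
    cur = cycle_vector(current_peaks)
    scores = [float(np.dot(cur, vec)) for _, vec in baseline]
    return baseline[int(np.argmax(scores))][0]

def shift_schedule(schedule, current_rt, matched_rt):
    """Shift every (target, start, stop) window by the observed RT offset."""
    offset = current_rt - matched_rt
    return [(t, start + offset, stop + offset) for t, start, stop in schedule]
```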

By accounting for shifts in LC retention times in this way, they were able to use narrower acquisition windows and therefore multiplex many more peptide measurements in a single experiment. In an analysis of HeLa cell digest, the researchers were able to quantify 1,489 peptides during a 56-minute run. MacCoss said that it might be possible to get that figure up to around 6,000 peptides targeted in an hour-long run.

MacCoss said his lab had looked into developing a similar approach around a decade ago but had run up against the speed and performance limitations of the mass spec instruments available at the time. In the Analytical Chemistry study, the researchers used a Thermo Fisher Fusion Lumos system.

Another issue, MacCoss said, was getting the data from the baseline DIA runs onto the processor embedded in the mass spec, which is necessary to enable the real-time retention time alignments. Doing so required access to that processor, which he said his Thermo Fisher co-authors were able to provide.

"You're basically trying to score your current run to all of the spectra that were collected in a previous run in real-time and without it delaying the acquisition [of target peptide spectra]," he said, explaining why it was necessary to have this done by the onboard processor.

This required a strategy for compressing the data from the DIA run so that it could fit in the embedded processor's limited memory.

"One of the problems with a lot of these embedded computers is that they don't have a lot of memory and they don't have a lot of processing power," MacCoss said. "The ability to do a lot of processing in real time is somewhat limited, so there was definitely a trick to getting the compression to work so that the [DIA data] could be handled and scored in real time."

He said that Ping Yip, senior scientist at Thermo Fisher and co-author on the study, led the way in devising the compression approach they used.

MacCoss said he could envision the approach providing a more sensitive and reproducible alternative for experiments like the analysis of large numbers of plasma samples, an application where DIA mass spec has established itself.

"People have been measuring by [DIA] 400 or 500 proteins pretty regularly in unfractionated plasma," he said. "Now you can imagine doing almost all of that in a targeted experiment without missing anything."

He added that the gains in sensitivity and specificity provided by a targeted assay compared to DIA might allow researchers to reach somewhat deeper into a sample.

MacCoss also said that the approach could potentially help alleviate the challenges with nanoflow chromatography that have led many researchers to opt for higher flow LC at the expense of sensitivity.

"People have been pushing to move to higher flow rate chromatography mostly because getting nanoflow chromatography to be reproducible is hard, and if you want the same retention times, it takes a little bit of work to get there," he said. "Here you could potentially stick with the nanoflow chromatography set up and the fact that there is a slight shift in the retention time doesn't alter the scheduling because you are aligning it in real time."

Philip Remes, a research scientist at Thermo Fisher and first author on the paper, said he couldn't provide any information on whether or when the company might make the approach available on its mass spec instruments, noting that the researchers have not yet "proven that the method is robust under the wide variety of experimental conditions and samples that would be needed for the technique to be commercialized."

He noted, however, that results from the study along with other preliminary results were promising and that "in principle the method could be used on any of our instruments and analyzers."