NEW YORK (GenomeWeb) – With data-independent acquisition (DIA) quantitative proteomic workflows becoming faster, more comprehensive, and more reproducible, some researchers have suggested that isobaric tagging techniques like tandem mass tag (TMT) labeling may see a reduced role.
However, recent advances in mass spec instrumentation and methods, as well as new TMT reagents, are likely to boost the technique's utility. Additionally, growing interest in mass spec experiments using extremely small amounts of sample, including single-cell proteomic work, provides a potential new research area into which isobaric labeling could expand.
At last week's American Society for Mass Spectrometry annual meeting in Atlanta, Thermo Fisher Scientific introduced a new TMT workflow featuring a real-time search functionality that essentially eliminates the precursor interference issue that has long been a challenge for TMT-based analyses, said Ken Miller, the company's vice president of marketing, life sciences mass spectrometry.
Developed by Harvard University professor Steven Gygi, the method allows researchers to quantify proteomic samples at a depth of 8,000 to 10,000 proteins with high reproducibility across runs in under two hours, Gygi said, noting that forthcoming TMT reagents that enable higher multiplex experiments could bring the experiment time to under one hour per sample.
TMT labeling is a variety of isobaric labeling. TMT reagents were developed by Proteome Sciences and are sold by Thermo Fisher Scientific. A competing isobaric labeling reagent, iTRAQ, is sold by Sciex.
Isobaric labeling uses stable isotope tags attached to peptides of interest to enable relative or absolute quantitation of proteins via tandem mass spectrometry. Digested peptides are labeled with tags that fragment during MS2 scans to release reporter ions whose intensities correspond to the amount of peptide present in each sample. The approach is commonly used to multiplex samples, allowing researchers to run up to 10 samples in a single mass spec experiment, which improves throughput and reduces variation.
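By way of illustration, the minimal sketch below, written in Python with entirely made-up reporter-ion intensities and a simple summed-signal normalization, shows how relative quantitation of one peptide across a 10-plex falls out of its reporter signals:

```python
# Illustrative only: reporter-ion intensities for one peptide in a 10-plex
# isobaric labeling experiment (values are invented; real workflows extract
# them from each peptide's MS2 or MS3 spectra).
reporter_intensities = {
    "126": 8.2e5, "127N": 7.9e5, "127C": 8.4e5, "128N": 1.6e6, "128C": 1.7e6,
    "129N": 8.1e5, "129C": 8.0e5, "130N": 1.5e6, "130C": 8.3e5, "131": 8.2e5,
}

total = sum(reporter_intensities.values())

# Relative abundance of this peptide in each of the 10 multiplexed samples.
for channel, intensity in reporter_intensities.items():
    print(f"channel {channel}: {intensity / total:.1%} of summed reporter signal")
```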
One of the major downsides of isobaric tagging is precursor interference, which can significantly impact the accuracy and precision of the quantitative information generated in these experiments. In an isobaric tagging experiment, the mass spec isolates the target ion and fragments it, generating isobaric tag reporter ions whose intensities correspond to the proportions of that peptide in the different tagged samples.
However, the isolation windows used to target a given precursor ion are typically wide enough that other non-target ions can slip through. Because these ions have also been labeled with isobaric tags, they also contribute to the reporter signal for the target peptide, which decreases the accuracy with which the actual target is measured.
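The effect on the numbers, often described as ratio compression, can be seen in a purely hypothetical example in which a target peptide is truly four times more abundant in one sample than in another while a co-isolated background peptide contributes equal signal to both:

```python
# Hypothetical illustration of precursor interference ("ratio compression").
# All values are invented. The target peptide is truly 4x more abundant in
# sample A than in sample B, but a co-isolated peptide labeled with the same
# tags contributes equal reporter signal to both channels.
target_a, target_b = 4000.0, 1000.0
interference_a, interference_b = 2000.0, 2000.0

true_ratio = target_a / target_b
measured_ratio = (target_a + interference_a) / (target_b + interference_b)

print(f"true A/B ratio:     {true_ratio:.1f}")      # 4.0
print(f"measured A/B ratio: {measured_ratio:.1f}")  # 2.0, compressed toward 1:1
```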
One technique researchers have used to address this problem is to do quantitation at the MS3 level, which adds another round of ion isolation and fragmentation. However, this approach, which was pioneered by Gygi's lab, leads to significantly longer run times, reducing throughput.
"It really impacts what you're able to do," Gygi said. "So much so that some groups were willing to take the hit on quantification [accuracy] just to go back to doing the MS2 quantification."
The nature of a shotgun mass spec experiment is such that roughly half the MS2 spectra collected in a given experiment will not prove a good match for a peptide and so won't ultimately be used.
"That's not such a bad thing, because MS2 [scans] don't take up much [instrument] time, so people don't really care," Gygi said.
MS3 scans, on the other hand, are more time consuming, and, given that around half of MS2 scans aren't ultimately useful, the accompanying MS3 scans represent a significant waste of instrument time.
The real-time search technique Gygi and his colleagues came up with addresses this problem by matching MS2 spectra against a peptide database before moving on to MS3 fragmentation. Only peptides whose MS2 spectra are a sufficiently good match are passed on for MS3 analysis, which cuts down on the number of wasted MS3 scans and reduces the instrument time required.
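In outline, the gating decision looks something like the sketch below; the spectrum representation, scoring function, and threshold are placeholders rather than the actual Gygi lab or Thermo Fisher implementation, which runs a fast database search inside the instrument control software:

```python
# Conceptual sketch of real-time-search gating of MS3 scans. The matching
# tolerance, scoring, and threshold are placeholders, not the RTS-MS3 code.
MATCH_TOLERANCE = 0.02   # m/z tolerance for calling a fragment peak "matched"
SCORE_THRESHOLD = 0.5    # hypothetical minimum fraction of matched fragments

def match_score(ms2_peaks, theoretical_peaks):
    """Fraction of a candidate peptide's theoretical fragments seen in the MS2 scan."""
    matched = sum(
        1 for mz in theoretical_peaks
        if any(abs(mz - observed) <= MATCH_TOLERANCE for observed in ms2_peaks)
    )
    return matched / max(len(theoretical_peaks), 1)

def should_trigger_ms3(ms2_peaks, peptide_database):
    """Spend instrument time on an MS3 scan only if the MS2 spectrum matches a peptide well."""
    best = max(
        (match_score(ms2_peaks, fragments) for fragments in peptide_database.values()),
        default=0.0,
    )
    return best >= SCORE_THRESHOLD
```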
In a study published this year in the Journal of Proteome Research, Gygi and his co-authors found that using the real-time search method, which they named RTS-MS3, they were able to cut their experiment time in half while slightly improving their quantitative accuracy.
The researchers developed the approach on Thermo Fisher's Orbitrap Fusion Lumos instrument, and the company has incorporated the software enabling the technique into its new flagship Orbitrap Eclipse Tribrid instrument, which it introduced at last week's ASMS meeting.
In recent years, many proteomics researchers have traded depth of coverage for throughput as they aim to run the hundreds to thousands of samples required to, for instance, discover and validate disease biomarkers.
Gygi's RTS-MS3 work suggests this tradeoff may not be necessary. In the JPR study, he and his colleagues identified 8,915 proteins in samples split into 12 fractions and analyzed over 18 hours of instrument time. While the overall experiment time is not especially short, the 10-plex TMT multiplexing means it amounts to less than two hours per sample.
And new TMT reagents should drive that per-sample run time even lower, Gygi said, noting that Proteome Sciences had recently developed 16-plex TMT reagents that he had a chance to try.
"They worked exactly the same as the [10-plex]," he said. "We got similar numbers [of quantified proteins] and the real-time search worked just as well. So now you take your 18 hours for 16 samples and you're at [just over an hour] for each sample with 9,000 proteins quantified."
Gygi added that based on their structure, the 16-plex TMT reagents could be expanded to a 21-plex without significant difficulty, which would further reduce run times. And, he noted, the work he cited was done on a Fusion Lumos, which has lower performance specifications than the newly released Eclipse.
"Maybe you're looking at half an hour [per sample] for 9,000 proteins," he said.
By way of comparison, at last week's ASMS meeting, Bruker presented data collected from a two-hour DIA run on its timsTOF Pro instrument that quantified 7,565 proteins from HeLa cell digest. Thermo Fisher presented data from an experiment in which researchers using the company's new Orbitrap Exploris 480 instrument quantified 3,000 proteins in a five-minute DIA run.
Asked which of the company's customers continue to prefer TMT amid the growing uptake of DIA, Thermo Fisher's Miller said that "there are a lot of customers out there who are interested in doing a deep dive into the proteome and very accurately measuring quantitative values."
"I know there are DIA users who do that, as well, but it is my experience that typically our TMT customers seem to be going deeper," he added. "In terms of markets, we have a lot of biopharma customers who do deep-dive proteomics looking for disease targets and mechanisms, and then academic research still has a lot of customers out there using TMT."
Small samples
TMT is also poised to play a role in the ongoing move within proteomics research toward analysis of extremely small samples, down to single cells.
In 2016, a team led by researchers from TMT developer Proteome Sciences published a paper in the Journal of Alzheimer's Disease that used what the company has termed its TMT Calibrator method to quantify phosphorylated tau peptides in cerebrospinal fluid.
The TMT Calibrator method allows for quantitation of very low abundance peptides by combining TMT-labeled peptides from the sample of interest with those from another sample source (such as tissue) in which the target proteins are produced in higher abundance. By spiking in this supplementary material at a higher concentration than the sample of interest, researchers ensure that even analytes present at low abundance in the sample of interest are present at high abundance in the combined sample, making them more likely to be selected for fragmentation and detected by the mass spec.
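A toy calculation with invented numbers shows the basic idea behind this and the other carrier-based methods described below: the high-abundance carrier channel supplies most of the precursor signal that gets the peptide selected for fragmentation, while the reporter ion from the low-input channel still reads out that channel's own contribution:

```python
# Toy illustration of carrier/booster-style isobaric labeling. All quantities
# are invented and the units are arbitrary.
low_input_channel = 1.0      # peptide amount contributed by the scarce sample
carrier_channel = 200.0      # same peptide contributed by the abundant carrier
selection_threshold = 50.0   # hypothetical minimum precursor amount for MS/MS

total_precursor = low_input_channel + carrier_channel

if total_precursor >= selection_threshold:
    # The combined precursor is abundant enough to be selected and fragmented,
    # yet each channel's reporter ion still reflects its own contribution.
    print(f"precursor selected (combined amount = {total_precursor:.0f})")
    print(f"low-input channel share of reporter signal: "
          f"{low_input_channel / total_precursor:.2%}")
else:
    print("precursor too scarce to be selected without the carrier")
```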
Since then, several groups have employed variations of this approach to enable analysis of very small samples, and the company has seen increased demand among clients for its TMT Calibrator services, said Proteome Sciences CSO Ian Pike. Overall sales and royalties from TMT reagents were up 10 percent in 2018, according to the company's full-year financial report.
In 2017, researchers at Northeastern and Harvard Universities detailed a TMT-based method called Single Cell ProtEomics by Mass Spectrometry (SCoPE-MS) that likewise used several hundred "carrier cells" to boost the peptide signals from the single cell that was the target of the analysis.
In March of this year, a team led by researchers from the Pacific Northwest National Laboratory published a study in Analytical Chemistry detailing what they called their Boosting to Amplify Signal with Isobaric Labeling (BASIL) strategy, which similarly combines TMT labeling with a carrier sample to enable quantification of extremely low abundance targets.
Wei-Jun Qian, a team leader in integrative omics at PNNL and senior author on the study, said he and his colleagues explored the approach as a means of obtaining better phosphoproteomic data from small samples. Qian's main research interest is type 1 diabetes, and he and his team had struggled over the years with phosphoproteomic analysis of human pancreatic islet cells, which he said are very challenging to obtain in significant quantities.
"We were going down as low as possible [using various enrichment techniques], and we could get good sensitivity, but we didn't have good reproducibility," he said.
Applying the BASIL approach, the researchers were able to perform phosphoproteomic analysis on samples consisting of around 100,000 cells, a roughly 30-fold improvement over their previous capabilities.
Qian said he believed that with further refinement, including application of the RTS-MS3 method, the BASIL approach could enable phosphoproteomic analysis of samples as small as 100 cells.
"I think this is a really enabling technology," he said.