
New, More Accurate Isobaric-Tagging Methods Could Improve Quantitative Proteomics

By Adam Bonislawski

Two papers published this month in the online edition of Nature Methods offer potential solutions to the precursor-interference issues that have hindered the use of isobaric tagging in proteomics research.

The studies – one led by Harvard researcher Steven Gygi and the other by University of Wisconsin-Madison researcher Joshua Coon – show ways to improve the accuracy of isobaric tagging, which the scientists said they believe could enable higher-throughput and more highly multiplexed quantitative proteomics experiments.

Isobaric labeling uses stable isotope tags attached to peptides of interest to enable relative or absolute quantitation of proteins via tandem mass spectrometry. Digested peptides are labeled with tags that fragment during MS2 to produce signals corresponding to the amount of peptide present in a sample.
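For illustration, the readout amounts to comparing reporter-ion intensities across channels. The short sketch below is not drawn from either paper; the channel labels and intensity values are hypothetical.

```python
# Hypothetical reporter-ion intensities from one MS2 scan of a TMT six-plex
# experiment; channel labels and values are illustrative only.
reporter_intensities = {
    "126": 1.0e5,
    "127": 1.1e5,
    "128": 0.9e5,
    "129": 2.0e5,
    "130": 2.1e5,
    "131": 1.9e5,
}

# Relative quantitation: express each channel as a ratio to a chosen
# reference channel (here, channel 126).
reference = reporter_intensities["126"]
for channel, intensity in reporter_intensities.items():
    print(f"channel {channel}: {intensity / reference:.2f}x reference")
```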

These reagents are sold as tandem mass tags, or TMT, by Thermo Fisher Scientific through a licensing agreement with Proteome Sciences, and as isobaric tags for relative and absolute quantitation, or iTRAQ, by AB Sciex.

The TMT method enables researchers to multiplex up to six samples, while the iTRAQ method can multiplex eight. Also, unlike metabolic labeling methods like SILAC, both techniques can work with human tissue.

However, precursor-interference problems can lower the accuracy and precision of the quantitative data obtained through isobaric tagging, a drawback that has hampered its adoption.

In isobaric tagging experiments, "you isolate a target, fragment it with MS2, there's a reporter portion that falls off [the tag], and the proportions of those [reporter] signals tell you the proportions of the various peptides in the samples that you tagged," Coon told ProteoMonitor this week.

The problem, he noted, is that "the resolution with which we isolate a precursor ion is usually somewhere between one and three m/z units, so there are often other peptide ions that fall within that window."

These ions have also been labeled with isobaric tags, and therefore they also contribute to the reporter signal for the target peptide, he said.

"So the signal that you're measuring for the peptide you've identified isn't only for that [peptide], but it's for all these contaminating [peptides], too."

Because the majority of these contaminating peptides are showing no change in expression, their presence will tend to mask any actual changes in expression in the target protein, "so the dynamic range that you are measuring will be compressed," Coon noted, "and often that compression can be very severe."

For instance, in their study, the UW-Madison researchers found that 10-fold differences in yeast protein expression were reported as 4.4-fold differences when measured using TMT tagging. Similarly, the Harvard team found a 3.2-fold median compression of expected 10:1 ratios in a measurement of yeast peptides using TMT tags.
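The compression is straightforward to see arithmetically. The sketch below uses an assumed, simplified interference model and illustrative numbers, not figures from either study.

```python
def observed_ratio(true_ratio, interference_fraction):
    """Reporter ratio measured when a fraction of the isolated signal comes
    from co-isolated peptides present at 1:1 across the two channels.

    Assumed, simplified model of precursor interference, for illustration.
    """
    f = interference_fraction
    channel_a = (1 - f) * true_ratio + f * 1.0  # target plus 1:1 background
    channel_b = (1 - f) * 1.0 + f * 1.0
    return channel_a / channel_b

# A true 10:1 difference with ~60 percent contaminating signal is reported
# as roughly 4.6:1, in the same neighborhood as the compressed ratios above.
print(observed_ratio(10.0, 0.6))
```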

Researchers have been aware of these interference issues for several years, but the extent of the problem was nonetheless surprising, Gygi told ProteoMonitor via email.

"The magnitude of this problem was remarkable," he said. "I believed going into the experiment that there would be two populations of peptides identified, with one showing interference and the other showing no interference depending on how common it was to co-isolate peaks. What we found was not a bimodal distribution as I had predicted, but that almost all quantified evens showed ratio distortion due to co-isolated and co-fragmented peaks."

"Pretty much any large-scale comparison using iTRAQ or TMT will be adversely affected by interference," Gygi added.

To resolve this issue, the two teams took different approaches built on the same underlying idea: purifying the target ions further before reading out the reporter signal.

The Coon team used gas-phase, proton-transfer ion-ion reactions, or PTR, to reduce the precursor ions' charge states, thus changing the ions' m/z and enabling the researchers to separate the targets from the contaminants.

"You put in an anion and the anion will remove charge from everything that we isolated, so if the target has three charges and the interferent has two, if I pull one charge off of each, then my target now goes from three to two charges and the interferent goes from two to one charge," Coon said.

"So they're both going to change in their [m/z] value," he added. "But if they were both at 500 m/z, the plus-three target goes to m/z 750, and the interferent goes to m/z 1,000. So now they're separated by 250 m/z, and now I can go back and isolate my target at the same 3 m/z resolution that I have and now it's pure."

Gygi's team attacked the problem by adding a third stage of mass spectrometry to the process, selecting a target ion produced in MS2 for a further round of isolation and fragmentation to produce an MS3 spectrum. Like Coon's approach, this technique essentially adds an extra layer of purification before analyzing the target ions.
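Schematically, the idea can be caricatured with the toy simulation below. It is a sketch of the general MS3 concept, not the Gygi lab's actual acquisition method, and all intensities and data structures are invented.

```python
# Toy illustration of why reading reporters at MS3 removes interference.
# Each MS2 fragment is labeled with the reporters it would release if
# fragmented again; the model is deliberately crude.
ms2_fragments = [
    # target-derived fragment: the peptide truly differs 10:1 between channels
    {"origin": "target", "intensity": 8.0e4, "reporters": {"126": 10.0, "127": 1.0}},
    # co-isolated contaminant fragment: unchanging, 1:1 between channels
    {"origin": "contaminant", "intensity": 5.0e4, "reporters": {"126": 5.0, "127": 5.0}},
]

# MS2-level quantitation sums reporter signal from everything in the
# isolation window, compressing the true 10:1 ratio.
ms2_126 = sum(f["reporters"]["126"] for f in ms2_fragments)
ms2_127 = sum(f["reporters"]["127"] for f in ms2_fragments)
print("MS2 ratio:", ms2_126 / ms2_127)  # 2.5, not 10

# MS3: re-isolate the most intense MS2 fragment (usually target-derived)
# and fragment it again, so only its reporters are measured.
chosen = max(ms2_fragments, key=lambda f: f["intensity"])
print("MS3 ratio:", chosen["reporters"]["126"] / chosen["reporters"]["127"])  # 10.0
```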

Both methods significantly reduced the interference problem. With the addition of the PTR step, the Coon lab's measured peptide ratio shifted from 4.4:1 to 8.5:1 – much closer to the true value of 10:1. Using the MS3 method, the Gygi lab measured a ratio of 11.7:1.

Both teams used Thermo Scientific LTQ Orbitrap Velos instruments and Thermo Scientific TMT tags for their work, although the methods should be applicable to ion-trap instruments generally.

Coon's lab collaborates with Thermo Fisher on technology development and, he said, the company "is well aware of what we're doing and [is] interested."

Thermo Fisher was not involved in the Harvard team's work, Gygi said, but noted they "are now working with Thermo Fisher and Proteome Sciences to improve and extend the MS3 method."

According to Ian Pike, chief operating officer of Proteome Sciences, his firm has been working for several years on an MS3-based approach similar to that described by the Gygi lab. The company, Pike told ProteoMonitor, "filed some IP around [the] use of MS3 methods to improve quantitative precision several years ago."

Through its licensing agreement with Thermo Fisher, Proteome Sciences receives royalties on sales of the TMT reagents. In the first half of 2011, its royalties on the reagents increased by more than 50 percent year over year. Resolution of the interference problem could further expand use of the tags, Pike suggested, particularly for large-scale, highly multiplexed proteomics-discovery experiments.

"It's fair to say that historically the majority of proteomics-discovery experiments were pretty poorly powered," he said. "So over the years we have seen the numbers [of samples] increase, and isobaric tagging has really been a key behind that. I think with this enhanced quantitative precision [provided by the PTR and MS3 methods], the argument for using isobaric tagging is completely compelling."

According to Coon, "one of the benefits of isobaric tagging is that you can compare a lot of samples at once – that's huge, the multiplexing. I think these types of approaches are going to make isobaric tagging more effective, and that's going to mean people are going to be more willing to use them, and in the end it means that you're going to be able to compare lots of samples at the same time, so we're going to have high-throughput, multiplexed, quantitative proteomics."

Pike noted that Proteome Sciences plans in the near future to release TMT-tagging kits that enable "at least 18-plex" experiments, a significant jump from the reagents' current six-plex limit.

"With these methods, [researchers] will attain the same levels of quantitative precision for 19 [samples] as for eight," he said.

There are trade-offs to the methods, however. Both techniques reduce scan speed, leading to fewer peptide IDs – 21 percent fewer in the case of PTR and 8 percent fewer in the case of MS3. Gygi noted, though, that "while dataset size is a vital consideration in proteomics … the alternative is to collect data with greatly distorted ratios."

"Even though we get 20 percent fewer IDs, most all of them have quantitatively useful information, and if you didn't do the [PTR] step, more than half of them would be garbage," Coon said. "So the total number goes down a little bit, but the amount of useful data goes up by a lot."

He added that his team was working on optimizing the technique to avoid the loss in IDs. "I think over the next year we'll be able to get rid of that 20 percent."


Have topics you'd like to see covered in ProteoMonitor? Contact the editor at abonislawski [at] genomeweb [.] com.