NEW YORK (GenomeWeb News) – Researchers from the National Cancer Institute's Clinical Proteomic Tumor Analysis Consortium have completed a study examining the use of multiple-reaction monitoring mass spec for protein quantitation.
Detailed in a paper published last week in Molecular & Cellular Proteomics, the effort examined the assay development steps required for successful implementation of MRM-MS assays, as well as advances in hardware and software that have contributed to improvements in such assays.
Using eight different LC-MS setups across 11 different laboratories, the study quantified more than 100 peptides corresponding to a total of 34 proteins, 27 of which have been linked to cancer.
According to the researchers, the effort demonstrated that "with appropriate attention to experimental design, analytical validation, and suitable quality control methods," highly multiplexed MRM-MS assays can "be implemented by multiple laboratories to provide sensitive, specific, reproducible, and quantitative measurements of proteins and peptides of clinical and biological interest in complex biological matrices."
The study consisted of three phases. In the first, lists of MRM-MS transitions were generated and then tested on 14 different triple quadrupole mass specs (from vendors AB Sciex, Thermo Fisher Scientific, Waters, and Agilent) to select transitions for each peptide best suited to each specific platform. Ultimately, the MRM assays used the three most abundant and interference-free transitions for each peptide measured.
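For illustration, the minimal sketch below shows how such a selection step might be scripted: keeping the three most abundant transitions per peptide that show no co-eluting interference. The data layout, field names, and interference flag are assumptions made for the example, not details from the paper.

```python
# Illustrative sketch: keep the three most abundant, interference-free transitions
# per peptide. The input format and "has_interference" flag are assumed, not taken
# from the CPTAC study.
from collections import defaultdict

def select_transitions(candidates, n_keep=3):
    """candidates: dicts with keys 'peptide', 'transition', 'area', 'has_interference'."""
    by_peptide = defaultdict(list)
    for c in candidates:
        if not c["has_interference"]:          # discard transitions with co-eluting interference
            by_peptide[c["peptide"]].append(c)
    return {
        # sort by abundance (largest peak area first) and keep the top n_keep transitions
        peptide: [t["transition"] for t in sorted(trans, key=lambda c: c["area"], reverse=True)[:n_keep]]
        for peptide, trans in by_peptide.items()
    }
```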
In the second phase of the study, the researchers generated response curves in depleted human plasma. The plasma was digested using Lys-C and trypsin, spiked with 125 synthetic isotopically labeled peptides, and used to generate nine-point calibration curves. This phase allowed the researchers to determine the variability associated with different instruments.
In the third phase, the researchers generated another set of nine-point response curves by spiking 27 unlabeled target proteins and six unlabeled, previously characterized proteins into depleted, undigested plasma. This allowed them to assess variation due to sample preparation and the digestion process.
The researchers also used isotopically labeled proteins as internal standards in phases II and III to help account for losses in peptide recovery due to incomplete digestion and the desalting process. As other groups have observed, the use of full-length labeled proteins significantly improved assay accuracy and reproducibility, the CPTAC authors noted.
Ultimately, in phase II the researchers observed intralaboratory CVs ranging from 13 percent to 39 percent, with a median of 15 percent, across the 13 sites performing the assays. The interlaboratory CV was 31 percent when using the full-length labeled protein standards. These levels of variability represent an improvement over previous CPTAC efforts, the authors noted, citing more rigorous standard operating procedures, the use of pre-packed columns and column heaters, and careful monitoring of instrument performance as keys to this improvement.
As expected, the phase III assays, which captured variability due to sample preparation and digestion as well as instrumentation, had higher CVs than the phase II assays: 58 percent compared to 15 percent.
The researchers also achieved significant improvements in assay sensitivity compared to previous CPTAC work, reaching limits of quantitation at the peptide level that were three- to five-fold better than in earlier studies while increasing assay multiplexing more than 10-fold. At the protein level, the improvement was more than 20-fold for the seven proteins measured in both studies.
While improvements in mass spec and LC technology contributed to this higher sensitivity, the authors said these were not as significant contributors as they had initially expected. Rather, the primary difference was the use of immunoaffinity depletion to decrease plasma sample complexity.
"While use of the newest technology has the potential to improve sensitivity," they wrote, "signal to background biological noise remained the principal limitation on assay sensitivity."
"The two methods that have been clearly demonstrated to decrease biological noise while retaining high analyte signal are fraction MRM [depletion followed by fractionation] and SISCAPA," they added.