When it comes to the cutting edge of analytical instrumentation for clinical proteomics, triple-quadrupole mass spectrometry has occupied center stage in recent years. New triple-quadrupole platforms offer ever-higher levels of sensitivity, dynamic range, and throughput. But thanks to engineering improvements, matrix-assisted laser desorption/ionization mass spectrometry, or MALDI-MS, is slowly emerging into the limelight.
Of course, this is not news to mass spec veterans like Arizona State University's Randy Nelson. "You've been able to use mass spec to diagnose people for so long that it's been forgotten. Now it's being re-found by people who bought the latest miracle machine — even an old beat-up mass spec is a good one if it turns around data that's useful downstream," Nelson says. "Everyone in proteomics is waiting for the magic box to show up that you pop open like a Christmas present and this thing will automatically cough out the secrets of the universe. This is almost the mentality of some people."
About 20 years ago, Nelson published some of the first papers on MALDI, demonstrating that the technology could be used to accurately quantify proteins. More recently, Nelson experimented with a new MALDI time-of-flight instrument from Virgin Instruments using a mass spectrometry immunoassay called MSIA, which he developed in the mid-'90s while at Intrinsic Bioprobes. Virgin's MALDI-TOF-TOF analyzed proteins isolated by MSIA at a rate of roughly five minutes per 96-well plate, Nelson says.
While Nelson's team has begun to tool up sub-picogram-per-milliliter quantitative selected reaction monitoring MSIAs for clinical proteins, it continues to use MALDI-TOF routinely on clinical samples.
"MALDI has always been the fastest game in town, so MALDI-TOF-TOF in particular is very fast and sensitive, with a very low limit of detection that provides a moderate dynamic range from the limit of detection on up to about 100 and sometimes 1,000," Nelson says. "So we've always used MALDI in a situation where it can eat clinical samples as fast as we can make them."
In any event, he adds, "the key to high accuracy quantification is to produce standardized assays, in our case MSIAs, en masse in order to feed the machines. Essentially, the platforms, whether MALDI-TOF or ESI-SRM, will only work as well as the specifications embedded in the assays."
Some argue, however, that MALDI-TOF protein quantification is susceptible to high coefficients of variation. Leigh Anderson, CEO of SISCAPA Assay Technologies, and his colleagues have shown that coupling MALDI with their Stable Isotope Standards and Capture by Anti-Peptide Antibodies, or SISCAPA, technology — which is typically paired with multiple reaction monitoring mass spectrometry — increases the workflow's throughput by eliminating the need for chromatography. While MALDI-TOF cannot match the dynamic range of triple-quadrupole machines, with the addition of SISCAPA it has been shown to produce pure peptide analytes and sensitivity on par with certain assays run on triple-quadrupole mass spectrometers.
In January, a team led by Anderson published a paper in the Journal of Proteome Research describing a process that combines MALDI-TOF MS with SISCAPA. They found that MALDI-TOF MS could give conventional nano-LC-MS a run for its money when it comes to quantification. Their method provided precise quantitation of high-to-medium abundance peptide biomarkers over a 100-fold dynamic range, which suggested to Anderson and his team that this technique could facilitate verification studies of protein biomarkers on a scale that is currently not practical using nano-LC-MS.
"There are two main ways to do quantitative analysis on a really simple mixture of peptides: The conventional way is by MRM on a triple-quad mass spectrometer, but we were interested in the possibility of doing it really simply and doing it really fast and sensitively with MALDI," Anderson says. "It turns out that MALDI — unexpectedly to some people, but not everybody — is phenomenally precise within certain parameters. We're getting coefficients of variation on measurements of the ratio between the internal standard and the analyte peptides of less than 1 percent, which is really very good precision."
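For readers unfamiliar with the metric Anderson cites, the coefficient of variation is simply the standard deviation of replicate measurements divided by their mean. A minimal sketch of the calculation — using made-up peak intensities for illustration, not values from Anderson's study:

```python
# Coefficient of variation (CV) for replicate analyte/internal-standard
# intensity ratios, the precision metric Anderson describes.
# All intensity values below are illustrative, not real data.
from statistics import mean, stdev

analyte = [10100, 10050, 9980, 10020, 10090]             # analyte peak intensities
internal_standard = [20150, 20060, 19900, 20000, 20210]  # spiked isotope-labeled standard

# Ratioing each analyte reading to its co-measured internal standard
# cancels shot-to-shot variability, which is what makes MALDI precise.
ratios = [a / s for a, s in zip(analyte, internal_standard)]
cv_percent = stdev(ratios) / mean(ratios) * 100

print(f"mean ratio = {mean(ratios):.4f}, CV = {cv_percent:.2f}%")
```

A sub-1-percent CV, as in this toy case, corresponds to the level of precision Anderson reports for the standard-to-analyte ratio.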
Anderson and his team plan to develop SISCAPA assays capable of covering all proteins known to be clinically useful — currently around 205. Triple-quadrupole instruments, which have been used for decades to run drug assays, also promise to affect technologies like SISCAPA, Anderson says. Companies such as Agilent have continued to offer increased sensitivity at higher flow rates on their triple-quadrupole systems, which not only makes LC separations more robust but has also ramped up productivity by reducing the time per sample by a factor of 10 or more. "That kind of practical improvement has had a profound impact on the clinical prospects for moving things like SISCAPA assays, and our long-term technical objective now is to figure out ways of removing chromatography with its robustness and throughput limitations from the equation entirely," Anderson adds.
The ability to extract data from large clinical sample sets is another focus for Anderson, who says he is far less concerned with the newest discovery platforms for proteomics than with the development of directed assay technologies capable of being run on large sample sets. "I'm much less interested in the general discovery platforms than I used to be because there has been too much of it — there are so many publications of so many candidates, and none of these candidates have ever been tested in large sample sets, which means that nobody actually knows anything about their true clinical value," Anderson says. "The only thing that's going to allow us to find out what is real and what is not are these directed assay technologies. If you can't run on 1,000 or 2,000 sets of samples, you will never know if a protein has clinical significance."
In 2001, Christoph Borchers, director of the University of Victoria-Genome British Columbia Protein Centre, developed a peptide immunocapture MALDI method called iMALDI. Bruker Daltonics' MALDI-TOF Biotyper platforms are already used in the clinic to identify microbes, and Borchers and his collaborators are now working to broaden that platform's clinical applications — for example, measuring a biomarker that diagnoses hypertension with a panel reactive antibody assay developed using iMALDI.
"If you want to go into the clinic, and you have a simple enough mixture of target analytes, MALDI is the way to go because it's much faster," Borchers says. "Now we want to implement this iMALDI technology on the Bruker Biotyper — over 300 of these instruments have been sold in the EU for clinical use — to implement new diagnostic methods on that platform. iMALDI is nothing new, but it's just a great tool — very robust and quantitative, and much faster and more user friendly than LC-MS; any person on the street could use it."
But triple-quadrupole mass specs have not lorded over the clinical proteomics space without reason. The technology has continued to evolve in the form of platforms like Agilent's 6490 triple-quadrupole mass analyzer, which provides high-accuracy, high-reproducibility, and robust tandem mass spectrometry.
"Speed and sensitivity have gotten better, and then also you get all the bonuses of knowing the mass very well, which gives you more certainty and identification. And for quantitative analysis, it's nice to have good resolution," says the University of Wisconsin, Madison's Joshua Coon. "The trend is definitely happening and continues to happen. I would expect that we will see continued speed and improvements in these analyzers to deliver high accuracy and high resolution, as well as interesting ways to combine them with other analyzers like ion traps and quadrupole mass filters."
Coon also expects that more and more hybrid mass specs will come along to provide investigators with the ability to do any type of peptide fragmentation. Ion trap-Orbitrap hybrids have been around for roughly six years. Recently, Thermo Scientific released a quadrupole-Orbitrap LC-MS/MS hybrid that provides very high-resolution isolation. "I think hybrids of these accurate mass analyzers and other things are still going to be developing, but the biggest, most obvious change is that the field has really shifted towards doing all of its mass analysis with high-resolution, high-accuracy machines," Coon says. "Five years ago, however, people who used ion trap hybrids like Orbitrap or Fourier transform ion cyclotron resonance mass spectrometry hybrids ... would do MS1 scanning at high resolution and accuracy, and then do MS2s with the ion trap, which was faster and more sensitive."
Hybrid systems are also becoming smarter and increasingly user-friendly. Both Thermo Scientific and Waters offer a type of data-dependent decision-tree approach that Coon's lab helped develop. "You could imagine more intelligent versions of that type of thing, and as hybrid systems continue to evolve they will have more dissociation techniques and more advanced control over how they're used so that the user doesn't have to figure it out ahead of time — the instrument can figure it out for you," Coon adds.
Instruments offering high-resolution, mass-accurate fragments — like Thermo's coupling of higher-energy collisional dissociation fragmentation with Orbitrap mass analysis — provide mass accuracy that's better than the data produced by linear ion trap instruments and other older machines, which have been the standard for basic protein identification. This means that a lot of the current proteomics analysis software will need to be revisited to make the best possible use of datasets containing this higher level of accuracy. "As somebody that has specialized in this, I know well that different search engines can identify wholly different subsets of spectra from a dataset, which makes me think that we're not really done getting the most value out of the data that we could be," says Mark Chance, director of the Center for Proteomics and Bioinformatics at Case Western Reserve University.
A common misconception Chance sees as problematic is that proteome informatics is generally regarded as a settled problem by investigators not deeply involved in the field. "There's going to be this tension forming between the people who feel that making identification algorithms in a cookie-cutter fashion is the first priority and people who think that there's still additional wheat to be gleaned from that field. So, that's a challenge," he says.
"I would think we are doing very well with the instruments we have available for discovery proteomics, including the LTQ Orbitrap instruments and the TripleTOF 5600 platform from AB Sciex. But, of course, we always want more powerful instrumentation — there's never enough sensitivity and never enough throughput," says Ruedi Aebersold, a principal investigator at the Institute of Molecular Systems Biology in Zurich. "However, I don't think that is where the bottleneck is — the bottleneck for proteomics currently is in what we do with the data."
Between the realms of discovery proteomics and targeted proteomics, there is an emerging intermediate. Known as data-independent acquisition, the approach aims to improve the consistency of peptide identification in addition to improving protein sequence coverage in complex samples.
"Rather than fretting over the latest and highest-throughput proteomics platform, the real focus should be on using all this proteomics data effectively and efficiently to generate new biological knowledge," Aebersold says.
In January, he and his colleagues described in Molecular and Cellular Proteomics a targeting analysis strategy that queries sample sets for proteins of interest using available data in fragment ion spectral libraries to mine complete fragment ion maps created by a data-independent acquisition method.
Aebersold says the impetus for developing a data-independent approach came from what he sees as one of the primary challenges for proteomics — the ability to accurately and consistently detect and quantify large fractions of a proteome across numerous samples.
"These machines have always been great, they started out as good, now they're mind-boggling," says Arizona State's Nelson. "Now is the time to tighten up our belts and start to define what we're using them for clinically."