Sophisticated Separation

The origins of chromatography can be traced back to the beginning of the 20th century when Russian scientist Mikhail Tsvet used columns of calcium carbonate to separate plant pigments for his research on chlorophyll. Later in the 1940s and 1950s, partition chromatography was developed, which in turn led to various other separation methods — including high-performance liquid chromatography, a technology that has undergone a steady evolution for the last 25 years and which, when coupled with mass spectrometry, helped make proteomics and metabolomics research possible.

"The ability to do higher pressures allows you to go to smaller particle size and I think that's allowed people to take advantage of the faster scanning mass spectrometry in order to dig deeper into proteomes," says John Yates, a professor at the Scripps Research Institute who specializes in mass spectrometry and proteomics. "While people at the bleeding edge — such as myself — have been making their own columns, and splitting flows on HPLC pumps, and doing multidimensional chromatography, what has been happening is that over the last 10 years commercial technology has been steadily coming along." This has allowed researchers who are not experts in proteomics technology to use advanced tools that are at the extremes in terms of size, pressure, and flow. "It's becoming easier, and chip technology is more of a democratization technology where it allows people who are less sophisticated with chromatography to do things like small diameter columns or capillary columns," Yates adds.

In the early days of proteomics, investigators had to use two-dimensional electrophoresis or gel electrophoresis to separate samples, and matrix-assisted laser desorption ionization mass spectrometry to detect them. As proteomics began to gain popularity in the mid-1990s — when researchers became increasingly interested in understanding the proteome through structural and functional analysis and profiling peptides and proteins — HPLC technology was put to the test. Continually improved fractionation, separation, and detection methods followed. Specifically, the demonstrable need for increased sensitivity and selectivity led to the development of columns with smaller and smaller internal diameters as well as miniaturized columns for the chip format, more sophisticated stationary phases, and better detection technology, such as Orbitrap, TOF-TOF, and Fourier transform mass spectrometry. Over the years, commercial vendors like Agilent Technologies and Waters, as well as academic HPLC tinkerers, have devised higher-throughput platforms with larger analytical ranges and higher sensitivity via ultra-performance liquid chromatography and ultra-high-pressure liquid chromatography.

Nano-level

Proteomics has required the development of multidimensional HPLC as well as lab-on-a-chip, or nano-HPLC, technology in order to achieve lower flow rates, increased sensitivity, and smaller sample requirements. Nano-flow systems, usually defined as LC platforms with on-chip microfluidic columns of rectangular cross-section, or as systems with flow rates on the scale of nanoliters per minute, are perhaps the biggest proteomics-driven revolution in chromatography.
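To see why these systems operate at nanoliter-per-minute rates, note that at a fixed linear velocity the volumetric flow scales with the column's cross-sectional area, and the same injected mass reaches the ion source correspondingly less diluted. The following sketch applies that standard scaling; the reference flow and column diameters are assumed, typical values rather than figures from the article.

```python
# Minimal sketch of standard column-scaling arithmetic (assumed values):
# at equal linear velocity, F2 = F1 * (d2/d1)**2, so narrow-bore columns
# land in the nL/min range and concentrate the analyte at the detector.

def scale_flow(flow_ul_min, dia_from_mm, dia_to_mm):
    """Volumetric flow needed to keep the same linear velocity on a narrower column."""
    return flow_ul_min * (dia_to_mm / dia_from_mm) ** 2

ref_flow = 200.0  # uL/min on a conventional 2.1 mm i.d. column (assumed)
for dia_mm in (1.0, 0.3, 0.075):  # microbore, capillary, and 75-um nano columns
    flow = scale_flow(ref_flow, 2.1, dia_mm)
    gain = (2.1 / dia_mm) ** 2    # approximate concentration-sensitivity gain
    print(f"{dia_mm} mm i.d.: {flow * 1000:,.0f} nL/min, ~{gain:,.0f}x concentration gain")
```

A 75-micron capillary comes out around 250 nL/min with a concentration gain of several hundred-fold over the 2.1-mm column, consistent with the flow regime the article describes.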


"Nano-HPLC is very capable of providing consistent retention times over a series of runs, which is very important for the emerging targeted proteomics enthusiasm," says Daniel Martin, a principal investigator at the Institute for Systems Biology.

However, targeted proteomics at the nano-level still depends on knowing when a target peptide will appear in the mass spec — a tricky proposition. "Increasing the throughput or the number of things you can sample for is critically dependent on how reliably you can say a peptide is going to appear at a given time," Martin says. "The more reliable it is, the narrower the window of time that you are looking for a particular peptide, which means you can squeeze more peptides in at the same time and the throughput goes up. I've been telling my people that the most important thing about targeted mass spectrometry is actually the chromatography if you want to achieve high throughput." His group works on targeted proteomics using the triple-quadrupole LC-MS platform and the selected reaction monitoring approach. "That is absolutely critically dependent on having highly reproducible HPLC," Martin adds.
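Martin's throughput argument can be made concrete with a simple scheduling estimate: in scheduled SRM, only transitions whose retention-time window covers the current moment compete for instrument time, so narrowing the window proportionally raises how many transitions fit in a run at a fixed dwell time. The sketch below illustrates the arithmetic; the gradient length, cycle time, dwell time, and window widths are assumed values, and the function is my own illustration.

```python
# Rough scheduled-SRM capacity estimate (assumed numbers, uniform elution):
# each cycle can hold cycle_s / min_dwell_s transitions, and a transition
# occupies the schedule only for its retention-time window.

def max_transitions(gradient_min, window_min, cycle_s, min_dwell_s):
    """Upper bound on transitions per run, assuming peptides elute uniformly."""
    concurrent = cycle_s / min_dwell_s            # transitions measurable per cycle
    return int(concurrent * gradient_min / window_min)

gradient, cycle, dwell = 60.0, 2.0, 0.01          # 60-min run, 2-s cycle, 10-ms dwell
for window in (10.0, 4.0, 2.0):                   # RT window width in minutes
    n = max_transitions(gradient, window, cycle, dwell)
    print(f"+/-{window / 2:.0f} min window -> up to {n} transitions per run")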

Traditionally, experts have had to push the development of these technologies in order to achieve the kinds of peak capacities required for proteomics investigations without dramatically increasing analysis time. "We've always been trying to find ways to increase our peak capacity while at the same time trying not to increase our run times too much because proteomic analysis tends to be slow," says Michael MacCoss, an associate professor at the University of Washington. "They're not high throughput in most ways compared to genomics technologies. Our goal is [to] try to solve the problem of how you improve your peak capacity so that you can separate and distinguish multiple things better while not taking more time." In the past, labs achieved nano-HPLC by running traditional HPLC pumps and splitting the flow down to nano-scale rates. Now, MacCoss says, better commercial products are coming to market, including nano-HPLC systems that operate at ultra-high pressures.
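A common approximation puts gradient peak capacity at roughly one plus the gradient time divided by the average peak width, which makes the trade-off MacCoss describes explicit: capacity can be bought with longer runs or, at fixed run time, with sharper peaks. A minimal sketch, with assumed gradient times and peak widths:

```python
# Standard gradient peak-capacity approximation: n_c ~ 1 + t_g / w,
# where t_g is gradient time and w the average (4-sigma) peak width.
# All values below are illustrative assumptions.

def peak_capacity(gradient_min, peak_width_s):
    """Approximate peak capacity of a gradient separation."""
    return 1 + (gradient_min * 60.0) / peak_width_s

print(peak_capacity(60, 30))    # 60-min gradient, 30-s peaks  -> ~121
print(peak_capacity(240, 30))   # quadruple the run time       -> ~481
print(peak_capacity(60, 7.5))   # same run time, 4x sharper peaks -> ~481
```

The last two lines show the same capacity gain reached two ways, which is why sharper peaks from smaller particles and higher pressures matter so much for slow proteomic analyses.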

MacCoss, whose lab develops mass spectrometry technology for targeted proteomics, says he is also interested in answering a question that has yet to be fully investigated: just what exactly happens at the nano-level during separation? "One thing that no one is talking about is the question of whether or not the theoretical effects of column length and peak capacity hold true with nano-flow rates," he says. "A lot of things may change when you go to nanoliter-per-minute flow rates, so that is something that we want to look at as far as the theory behind it goes."


Christoph Seger, director of the mass spectrometry and chromatography division of the Institute of Medical and Chemical Laboratory Diagnostics at the University Hospital in Innsbruck, Austria, regularly assists researchers working on a range of biomedical research projects with various types of HPLC and LC-MS. "In both proteomics and metabolomics, HPLC technology has played a crucial role as a perfect discovery tool that makes a lot possible in terms of sensitivity. It's also quite unique in that it's very specific, so you can focus on a single molecule, and it only needs ionized molecules, so it can also pick up everything," Seger says. "This is especially crucial in metabolomics, where you want to see the whole metabolome, or in proteomics, where you want to see the whole proteome. It absolutely depends on this kind of technology, which is the only readout possibility beyond these 2D gels where you can detect the up-regulation and down-regulation."

Hyphens galore

Various hybrid approaches — characterized by their hyphenated nomenclature — have also been a boon to proteomics over the years, in particular the coupling of HPLC technology with mass spectrometry, which offers extremely sensitive and robust sample analysis. A primary application of LC-MS has been to sift through samples containing peptides whose masses overlap even on extremely high-resolution mass spec equipment; such samples are usually first separated using some form of HPLC. The need for better tools to facilitate shotgun proteomics — where researchers must quickly and accurately investigate complex proteome samples like cell lysates or multiprotein complexes like spliceosomes — has spurred the growth of multidimensional separation tools, such as multidimensional liquid chromatography.

"There has been steady progress with the growth of both mass spec platforms and LC — the LC part often develops independently from the MS part, so maybe you buy the LC from one company and the MS from another company and then hyphenate them," Seger says. "For example, I would go for a Dionex machine for the LC part and for the MS part I would maybe buy a Thermo Scientific machine, but if I do a different kind of application, I might buy an AB Sciex machine on the MS side — it all depends on what you want to do with it."

Seger's research is focused on exploring the potential and limitations of the technology to bring the power of proteomics into patient care. This year, he and his colleagues published a paper in the journal Clinical Chemistry that pointed out some of the shortcomings of LC-MS in the health care setting. Seger et al. suggested that while LC-MS/MS is in principle capable of highly accurate analyses of samples, that capability does not guarantee that every analysis yields accurate results. He points to interference from in-source transformation of conjugate metabolites, as well as matrix compounds sharing mass transitions with the target analyte, as possible causes of less-than-accurate LC-MS/MS analyses. The upshot is that while the technology is certainly robust, results should be taken with a grain of salt before being translated into the clinic.
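The transition-sharing interference Seger describes can be illustrated with a toy screen: flag any matrix compound whose precursor and product ions both fall within the quadrupole's tolerance of the target's mass transition. Everything in this sketch, including the compound names, m/z values, and tolerance, is invented for illustration.

```python
# Hypothetical interference check for an SRM assay: two compounds are
# indistinguishable if both their precursor and product m/z values fall
# within the instrument's selection window. All data below are invented.

TOLERANCE = 0.5  # unit-resolution quadrupole window in m/z (assumed)

def shares_transition(target, other, tol=TOLERANCE):
    """True if two (precursor, product) m/z pairs overlap within tolerance."""
    return abs(target[0] - other[0]) <= tol and abs(target[1] - other[1]) <= tol

target_name, target_mz = "analyte X", (609.3, 195.1)
matrix = [("metabolite A", (609.4, 195.0)),   # both ions overlap: would co-trigger
          ("metabolite B", (512.2, 195.1))]   # different precursor: no interference

for name, transition in matrix:
    if shares_transition(target_mz, transition):
        print(f"WARNING: {name} shares the mass transition of {target_name}")
```

Here only "metabolite A" fires the warning, mirroring the failure mode in which a matrix compound quietly inflates the signal attributed to the target analyte.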


Despite Seger's words of caution, LC-MS has continued to demonstrate its ability to make measurements of sufficient scope to enable the discovery of useful biomarkers. Richard Smith, chief scientist at Pacific Northwest National Laboratory's Environmental Molecular Sciences Laboratory, and his colleagues used LC-MS to study the response of the blood plasma proteome to severe burns. They compared the plasma protein concentrations of healthy control subjects with those of severe burn victims at various times following the injury. As they report in the Journal of Proteome Research, they found that the observed protein regulation corresponds with previously reported results for burn response, but they also identified roughly 50 proteins not previously associated with it. Smith and his colleagues say that by finding proteins involved in the response to severe burn injury, they can uncover targets for therapeutic intervention as well as possible biomarkers for certain patient outcomes, such as multiple organ failure.

New developments

Smith currently operates a number of platforms in his lab, including FTICR systems — mass spec platforms that use Fourier transform ion cyclotron resonance to determine the mass-to-charge ratio of ions — and Orbitrap systems, but he is not one to wait for vendors to roll out the next big thing in HPLC and LC-MS. "We have quite a large operation here, and part of it involves new technology development. We have been working [on] a new platform that uses ion mobility separations and time-of-flight mass spectrometry to provide greater dynamic range and data quality, and perhaps most crucially, speed to move along the overall analysis to further increase throughput," Smith says. "That is something which is incredibly important to us because a lot of what we can do in proteomics or metabolomics is limited by the throughput; we are concerned with how many measurement scans we make and if we can run replicates, for example. Having higher throughput means we can fractionate more and dig deeper into the proteome."

Smith's group internally refers to the new platform as the "next generation," and Smith says there is currently nothing commercially available that is comparable. "Although there are ion mobility spectrometry-MS systems, there are none that have been developed to achieve the sensitivity, measurement dynamic range, and throughput that our new platform has achieved as a result of three crucial advances: a much more sensitive ion source; fast, high-resolution multiplexed ion mobility separation; and integration with a new, broad dynamic range time-of-flight mass spectrometer developed by Agilent," Smith says. "My view is that it will help significantly advance all MS-based 'pan-omics' measurements, including proteomics, metabolomics, lipidomics, and glycomics, as the benefits in data quality, coverage, and throughput apply across the board. Ultimately, I would like to have smarter mass spectrometers that make decisions on the fly."


Smith adds that he would like to see a platform that could pick a specific peptide or protein ion and get additional information. "That is something that's happening and it is more or less in the minds of the commercial vendors," he says.

Smith isn't the only one who says that more technology development and commercialization are needed. Scripps' Yates says that much of the HPLC and LC market is not at the level that proteomics researchers need — it is aimed at the pharmaceutical industry's scale. "I think for more people to be able to do these things, there needs to be more commercialization of columns," Yates says. "To do state-of-the-art proteomics, you have to be operating at the nano-flow region. As the market for proteomics and mass spec has grown, there has been more commercial interest in making these technologies, and it's been clearly an evolution, not a revolution."
