June 2005: The Art of Sample Prep

When most people think about advances in technology that have been important for proteomics experiments, the first thing that comes to mind is the mass spectrometer. After all, the mass spectrometer is one of the most advanced pieces of analytical instrumentation available to scientists, it's expensive, and furthermore, it can have really technical-sounding names like MALDI-TOF/TOF.

But behind the scenes lies another element of proteomics that in most cases is equally important to scientists' ability to extract valuable information from an experiment: the protocols and technology required to separate or otherwise prepare the protein sample prior to analysis by mass spectrometry.

Take the pilot stage of the Plasma Proteome Project as an example. Organized by the Human Proteome Organization as a group effort to catalog the proteins found in a particular sample of human plasma, the experiment, performed by 31 labs and completed in late 2004, found that variation in protein separation approaches was a major source of the discrepancies in the number and type of proteins identified by the participating laboratories, according to scientists involved in administering the project.

"Once you start thinking about all these different fractionations and chromatography steps that can be used in sample prep, there are enough different alternatives so that it becomes very tricky to figure out the best combination of steps to do," says Leigh Anderson, who heads the Plasma Proteome Institute. "That means everybody is getting different sets of proteins from the sample, and that's what largely came out of that HUPO project. … We're unfortunately a long way from standardizing the platforms for discovery work."

But even if standard protein sample preparation protocols are still a long way off, it may still be instructive to take a look at recent developments in the science behind sample fractionation and separation techniques to see which types of methods seem to be most effective — or at least garnering the most attention. State-of-the-art technology in proteomics changes rapidly, but what's happening in the field now will certainly influence which direction the path toward standardization takes.

In short, there are three basic approaches to preparing a complex sample of proteins before attempting to identify or otherwise analyze it by mass spectrometry: one can pull out certain types of highly abundant proteins and separate the remaining components by 2D gel electrophoresis; one can string together a series of liquid chromatography steps designed to strip away abundant proteins and produce multiple time-resolved fractions of separated proteins; or one can fish out only those classes of proteins of particular interest to the problem under investigation. In all three categories, vendors and researchers have recently made strides in performance.

Gel electrophoresis

But some areas have seen more activity than others. In general, classical 2D gel electrophoresis, which involves separating whole proteins on the basis of isoelectric point and then by size, has reached a plateau in terms of advances in the underlying technology, says Steve Carr, a proteomics researcher who runs a lab at the Harvard/MIT Broad Institute. The technique, which first entered the proteomics scene in the mid-'70s, is still quite popular — especially in Europe, according to Anderson and Carr — but the expertise required to reproducibly perform the procedure has kept the technology concentrated in a relatively fixed number of labs.

Despite the somewhat slow rate of innovation in 2D gel technology, its ability to resolve different proteins is widely considered to be unsurpassed. In part because the technique can be so labor intensive, it can separate proteins present at only very low concentrations in a sample — a much lower range than liquid chromatography approaches can reach. In addition, 2D gels are known for their ability to separate different isoforms of the same protein — that is, proteins that vary only in their post-translational modifications. Says Anderson: "I don't think that chromatographic separations have ever achieved the resolution that 2D gels have, [but] they can be easier to automate obviously."
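
Both gel dimensions can be estimated directly from a protein's sequence, which makes it easy to predict roughly where a protein should land on a 2D gel. Below is a minimal sketch using Biopython (an assumed dependency; the two sequences are invented fragments):

    # Estimate 2D-gel coordinates from sequence: isoelectric point (the
    # first, IEF dimension) and molecular weight (the second, SDS-PAGE
    # dimension). Sequences are made-up fragments for illustration.
    from Bio.SeqUtils.ProtParam import ProteinAnalysis

    proteins = {
        "hypothetical_A": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
        "hypothetical_B": "MDSKGSSQKGSRLLLLLVVSNLLLCQGVVS",
    }

    for name, seq in proteins.items():
        analysis = ProteinAnalysis(seq)
        pi = analysis.isoelectric_point()   # x-axis: charge
        mw = analysis.molecular_weight()    # y-axis: size
        print(f"{name}: pI {pi:.2f}, MW {mw / 1000:.1f} kDa")

Post-translational modifications shift both coordinates in ways a sequence-only calculation misses, which is precisely the isoform information a real 2D gel resolves.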

Many of the recent innovations related to 2D gel electrophoresis have instead revolved around how to transfer proteins separated on gels to a platform amenable to MALDI mass spectrometry. The problem, briefly summarized, is that manually picking spots off a gel and arraying them on a non-standard MALDI target often introduces additional variation and sources of experimental error — not to mention that the process itself is quite tedious. Examples of improvements in this area include the commercially available spot-picking robots sold by companies like GE Healthcare and Genomic Solutions, and specially designed MALDI targets — like Bruker Biosystems' AnchorChip — that concentrate proteins in a spot on the array while simultaneously desalting the sample.

LC/MS

In recent years liquid chromatography approaches to separating proteins have taken firm root, especially as labs have tried to increase sample throughput by installing automated systems for preparing and injecting protein samples into mass spectrometers. On the mass spectrometry side of the equation, algorithms such as SEQUEST for deconvolving complex mass spectra and identifying the constituent proteins have made it easier and faster to extract accurate protein data from proteomics experiments, further encouraging high-throughput separations protocols based on liquid chromatography technology.
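
The core idea behind such database-search algorithms is easy to sketch, though what follows is a toy scoring scheme rather than SEQUEST's actual cross-correlation: generate the theoretical b- and y-fragment ions of a candidate peptide and ask how many observed peaks they explain. The peptide and spectrum below are invented.

    # Toy database-search scoring: count experimental peaks matching the
    # theoretical singly charged b- and y-ion m/z values of a candidate.
    MONO = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
            "V": 99.06841, "T": 101.04768, "I": 113.08406, "L": 113.08406,
            "N": 114.04293, "D": 115.02694, "K": 128.09496, "E": 129.04259,
            "R": 156.10111}
    PROTON, WATER = 1.00728, 18.01056

    def fragment_ions(peptide):
        """Singly charged b- and y-ion m/z values for a peptide string."""
        ions = []
        for i in range(1, len(peptide)):
            b = sum(MONO[aa] for aa in peptide[:i]) + PROTON
            y = sum(MONO[aa] for aa in peptide[i:]) + WATER + PROTON
            ions.extend([b, y])
        return ions

    def match_score(peaks, peptide, tol=0.5):
        """Fraction of theoretical ions explained by observed peaks."""
        theo = fragment_ions(peptide)
        hits = sum(any(abs(p - t) <= tol for p in peaks) for t in theo)
        return hits / len(theo)

    spectrum = [175.119, 304.162, 433.204, 530.257, 617.289]  # invented peaks
    print(f"score: {match_score(spectrum, 'PEPTIDER'):.2f}")

Real search engines additionally handle charge states, modifications, and noise, and score every candidate peptide in a sequence database rather than a single guess.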

Traditionally, the two basic components of any chromatographic separation scheme are an ion exchange column, which separates proteins on the basis of charge, and a reverse phase column, which separates on the basis of hydrophobicity. Recently, however, vendors have begun developing new types of columns designed specifically for extracting the most abundant, and typically least interesting, proteins in a biological sample. Agilent Technologies, for example, has devised an immunodepletion column, with polyclonal antibodies covalently bound to the packing medium, designed to remove the six most abundant proteins in a sample, says Jerome Bailey, Agilent's marketing manager for bioreagents.
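
A back-of-the-envelope calculation shows why removing just six proteins matters. The concentrations below are rough literature values, not figures from the article:

    # Approximate share of plasma protein mass removed by a top-six
    # immunodepletion column. Concentrations are rough literature values
    # in mg/mL, not measurements from this article.
    top_six = {"albumin": 40.0, "IgG": 12.0, "transferrin": 2.5,
               "IgA": 2.0, "haptoglobin": 1.5, "antitrypsin": 1.5}
    total_plasma_protein = 70.0  # mg/mL, approximate

    removed = sum(top_six.values())
    print(f"top-six share: {removed / total_plasma_protein:.0%}")
    print(f"mass left for everything else: {total_plasma_protein - removed:.1f} mg/mL")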

Carr, at the Broad Institute, is enthusiastic about Agilent's new immunodepletion column, as well as their new macroporous reverse phase HPLC column. "New phases are becoming available that seem to be useful for processing biofluids and for protein separation," he says. "More and better tools are needed, but we should be a bit cautious as to how important any one of these tools will be until carefully evaluated in labs doing lots of real experiments."

Agilent's other recent contribution to new chromatographic methods relies on microfluidics technology to perform reverse phase separations on a chip. The advantage of this approach, according to Bailey, is that it eliminates fittings between columns, reducing dead volume; the result is sharper bands and greater detection sensitivity. "It usually takes a very skilled scientist to identify where sample losses are occurring [using standard columns at low flow rates]," he says. "With the chip, the laser ablated channels eliminate these sources of sample loss, and you substantially increase the ability to automate sample prep. It's all about ease of use."

Waters, the other major provider of chromatography columns for proteomics applications, is also working to develop systems better able to handle small protein samples, as well as miniaturized devices designed for higher sensitivity separation. "With a small column and a sensitive mass spectrometer, you can now get down to protein concentrations at the attomolar concentration range," says Tom Wheat, principal scientist and manager of the life science laboratory at Waters. The most important advances at Waters, he adds, involve automating protein separation schemes and developing software for matching chromatography and mass spectrometry data to make comparing protein expression data between two samples more automated and more accurate.

Enriching/Extracting

While many vendors and tinker-happy researchers have focused on separations schemes that strip out high-abundance proteins in order to probe the depths of potentially important proteins present only in trace quantities, other scientists have taken the opposite tack: devising methods for pulling out proteins of interest from the mass of high-abundance proteins. And while one might assume that a researcher would have to know which proteins were of interest to extract them from a sample, recent developments show that this is not necessarily the case.

One such strategy has taken shape at the American Red Cross' Holland Laboratory in Rockville, Md. David Hammond, executive director of the department of plasma derivatives, has led an effort to apply principles of combinatorial chemistry and affinity chromatography to pulling out a wide variety of proteins from samples of human plasma, whole blood, and other complex mixtures. The idea, he says, is that by attaching a vast library of small molecule capture agents (ligands) to chromatography beads, one can bind almost all of the proteins in the starting material to the column under conditions in which the abundant proteins are diluted, while the trace proteins are concentrated.

After binding to the library, the ligand-bound proteins can be assayed for a desired biological or biochemical activity using a technology Hammond has called FIoNA, for Functional Identification of Novel Activities. The proteins dissociate from the ligands during the assay: new proteins, or new activities of known proteins, can be discovered without prior knowledge of the protein or the ligand to which it bound.

Alternatively, Hammond says, all of the proteins can be eluted in one step, as a sample preparation method for proteomic analysis, in a way that allows the eluted mixture to contain a percentage of almost all of the original proteins, but at a much narrower concentration range. This strategy, which Hammond has called the Protein Equalizer, is being co-developed with Ciphergen Biosystems. Because none of the proteins is specifically depleted, and because trace proteins are enriched relative to their original concentration, this strategy can identify proteins that are undetectable using more traditional approaches to removing high-abundance proteins. A description of the Equalizer has been accepted for publication in the journal Electrophoresis.

"But if you think about it for a minute, if the library is a million different beads, of which maybe one in a hundred thousand bind albumin, and one in a hundred thousand bind an interleukin, and you add saturating amounts of plasma serum to that library, under that circumstance you can bind 10 ng of albumin," Hammond says. "If you take the other extreme, and you've got an interleukin present at one pg/mL, you can probably bind all your interleukin, plus some more. … So what we do is to decrease the dynamic range of the source material by depleting the abundant and concentrating the trace."

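Hammond's arithmetic can be made concrete in a few lines. In the sketch below, each protein can occupy at most a fixed bead capacity, echoing the 10 ng albumin figure from his example; everything loaded beyond that washes away, while trace proteins bind nearly completely. The loading amounts are invented for illustration.

    # Dynamic-range compression by a capacity-limited ligand library:
    # retained amount = min(loaded amount, per-ligand bead capacity).
    CAPACITY_NG = 10.0  # per-ligand binding capacity, from Hammond's example

    def equalize(loaded_ng):
        return min(loaded_ng, CAPACITY_NG)

    loaded = {"albumin": 4.0e7, "transferrin": 2.5e6, "interleukin": 1.0e-3}  # ng
    for protein, amount in loaded.items():
        print(f"{protein:>12}: loaded {amount:9.1e} ng -> retained {equalize(amount):9.1e} ng")

    # Input range: 4e7 / 1e-3 = 4e10; output range: 10 / 1e-3 = 1e4.
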
Another recent addition to the strategies available for pulling out certain types of proteins from a sample has emerged from Eric Peters' lab at the Genomics Institute of the Novartis Research Foundation, and was published in the April issue of Nature Biotechnology. In an effort to develop a new method for simplifying protein mixtures, Peters, a group leader in protein profiling and mass spectrometry, devised a scheme for tagging functional classes of peptides with perfluorinated compounds and capturing the tagged molecules on a fluorous-functionalized solid phase.

While the new method — termed "fluorous proteomics" by Peters — is functionally quite similar to the fairly widely used biotin-streptavidin interaction as a method for pulling out specific types of proteins, Peters says there are several advantages to his approach. "Biotin-avidin is the gold standard, but what's usually not talked about are the problems," he says. "You can get non-specific binding, and it is often difficult to fully elute the captured protein from the resin. We wanted to come up with a 'chemical' approach, and it's potentially an extremely cheap way to go. It has all the benefits of a solid phase extraction approach — with cheap available resins and reagents — but with some of the high level of selectivity that people associate with avidin-biotin."
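
Schematically, the workflow reduces to tag, capture, wash, and elute. The sketch below assumes, purely for illustration, a tag chemistry directed at cysteine-containing peptides; the digest list is invented.

    # Schematic tag-and-capture enrichment: peptides carrying the taggable
    # residue are derivatized with a perfluorinated tag and retained on a
    # fluorous solid phase; untagged peptides wash through.
    def fluorous_capture(peptides, taggable_residue="C"):
        return [p for p in peptides if taggable_residue in p]

    digest = ["LVNEVTEFAK", "AEFAEVSCLR", "CTTESLINR", "GDLLECADDR", "QTALVELVK"]
    print(fluorous_capture(digest))  # only the cysteine-containing peptides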

The problem with these types of approaches, says Anderson of the Plasma Proteome Institute, is that when performing a proteomics discovery experiment, it's hard to know a priori whether the proteins involved in a pathway of interest share a similar functionality that could be used to help pull them out. "The fluorous proteomics approach is a cool idea," he says, "but it may be just a technical improvement over biotin."

Putting It All Together

Perhaps it's not altogether surprising that the most thorough approaches to prefractionating protein samples involve mixing and matching various chromatographic and gel-based separations schemes. Given the lack of any silver bullet for dealing with all the issues associated with stripping away high-abundance proteins — without losing potentially valuable sample material — many researchers have turned to devising their own unique concoction of sample prep protocols.

One of the researchers particularly well known for mixing and matching is Dave Speicher at the Wistar Institute in Philadelphia. Speicher's four-dimensional fractionation approach gained fame when it helped him identify the most proteins among the 18 labs that participated in a HUPO pilot study in 2003 and 2004 that looked at proteins in plasma. In addition, his group detected the majority of the proteins known to be present in these serum or plasma samples in the ng/ml range, and even detected a small number in the pg/ml range, according to results presented at the US HUPO meeting this March in Washington.

To achieve ng/ml protein identifications, Speicher and his research group first remove the six most abundant proteins using Agilent's immunodepletion column, and then separate proteins according to isoelectric point using a technology called MicroSol-isoelectric focusing, which Speicher's lab developed and which is now marketed as the Zoom IEF fractionator by Invitrogen. This technology uses small chambers separated by large-pore acrylamide discs with immobilines at specific pH values to separate cell extracts into well-resolved pools based on isoelectric point. For the third dimension, researchers run fractions out on a one-dimensional electrophoresis gel as an additional separation mode that is orthogonal to the previous two. Then, for the final dimension, researchers slice up the 1D gel, digest each slice with trypsin, and run the resulting digests on a nanocapillary reverse phase column coupled directly to a high-sensitivity linear ion trap mass spectrometer.
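
One way to appreciate the scale of such a scheme is to multiply out the fractions each dimension generates. The counts below are placeholders, not Speicher's actual numbers:

    # How four orthogonal separation dimensions multiply into LC-MS/MS runs.
    # Fraction counts per step are assumed for illustration.
    pipeline = [
        ("immunodepletion (top-six removal)", 1),   # one depleted pool
        ("MicroSol-IEF (pH chambers)", 5),
        ("1D SDS-PAGE (slices per IEF pool)", 10),
        ("nanoLC-MS/MS (runs per digest)", 1),
    ]

    runs = 1
    for step, fractions in pipeline:
        runs *= fractions
        print(f"{step}: x{fractions} -> {runs} fraction(s) so far")
    print(f"total LC-MS/MS runs: {runs}")

The multiplication buys depth of coverage, but every extra dimension also multiplies instrument time and adds transfer steps where sample can be lost.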

Although Speicher's intensive fractionation scheme seems to have paid off in the HUPO plasma proteome pilot project, it's not clear that other researchers are about to adopt his approach in the name of standardization. "It'll take a long time" to standardize protein fractionation steps, says Anderson. "People will keep looking for the highest-performing methods, but in terms of pulling out particular proteins, nothing comes close to immunoassays." Carr adds that mixing and matching separation schemes works only as well as the researcher's ability to eliminate losses and contamination during the transfer steps.

For his part, Carr says that the HUPO plasma proteome pilot project has spurred researchers in the field to start taking standardizing sample prep seriously. "What's happened is that partly out of [the desire to] learn, and partly out of frustration with the way the HUPO project was actually carried out, there have been some expert consortiums formed to specifically test what are the protocols that actually work, in terms of getting us down to a reasonable level of depth of coverage in a proteome in discovery mode," says Carr.

In addition to a mouse model consortium, at least two other sets of groups with expertise in separation science, mass spectrometry, and access to high quality samples have begun working together to test and share methodologies and data, Carr adds. "Those sorts of consortium activities, where there's shared learning, will actually drive the field forward."

SIDEBAR: An Antibody for Every Important Protein?

One of the major challenges to a proteomics experiment in discovery mode is that the concentrations at which proteins are present span such a wide range (a factor of roughly 10⁹) that devising a system with that kind of sensitivity is a serious technical obstacle. Immunoaffinity assays, such as those that rely on antibodies or other capture agents to pick up a desired protein, are the most sensitive assays available. But in a discovery experiment, how could one develop an antibody against a protein that has yet to be discovered?
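
In concrete terms, using approximate literature values rather than numbers from the sidebar: albumin sits at tens of mg per mL of plasma while cytokines can sit in the low pg/mL range, and their ratio, nine to ten orders of magnitude depending on the endpoints chosen, is the dynamic range a discovery experiment must cover.

    # Rough span between the most and least abundant plasma proteins.
    import math

    albumin_g_per_ml = 40e-3   # ~40 mg/mL, approximate
    cytokine_g_per_ml = 5e-12  # ~5 pg/mL, approximate

    span = albumin_g_per_ml / cytokine_g_per_ml
    print(f"dynamic range ~ 10^{math.log10(span):.0f}")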

Leigh Anderson of the Plasma Proteome Institute thinks the time may not be too far off when researchers will have at their disposal capture agents for every protein in the human body — or at least capture agents designed to pick up peptides representative of every protein. The idea is that scientists would express proteins for every gene in the human genome, and devise antibodies against the peptides derived from each protein that behave well in the mass spectrometer. In this way, the theory goes, a researcher could pull out and quantify the amount of every protein of potential significance in a sample.
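
The peptide-centric part of that idea is straightforward to sketch: digest each protein in silico with trypsin and keep the peptides most likely to behave well in the mass spectrometer. The length filter below is a crude stand-in for real selection criteria, and the sequence is an invented fragment.

    # In-silico tryptic digest (cleave after K or R, except before P) and a
    # simple length window as a proxy for MS-friendly signature peptides.
    import re

    def tryptic_digest(protein):
        return [p for p in re.split(r"(?<=[KR])(?!P)", protein) if p]

    def candidate_peptides(protein, min_len=6, max_len=25):
        return [p for p in tryptic_digest(protein) if min_len <= len(p) <= max_len]

    seq = "MKWVTFISLLFLFSSAYSRGVFRRDAHKSEVAHRFKDLGEENFK"  # invented fragment
    print(candidate_peptides(seq))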

Anderson says, "We're getting close to the day when 22,000 genes doesn't seem like a lot. At that point, you're entitled to begin thinking about whether or not we should start from the bottom up, and just design some kind of mass spec-based assay for each protein that exists, and go look for those specifically rather than always rediscover every protein de novo."

Of course, much of this is still conceptual. But Anderson says he and his collaborators are starting to think about developing capture agents against the proteins that researchers now know to be of clinical significance. In a recent review paper in the Journal of Physiology, Anderson lists 177 candidate biomarker proteins associated with cardiovascular disease and stroke that could represent a starting point for what he refers to as "directed proteomics."

— JSM

ProteoMonitor story: "At HUPO's Plasma Proteome Project, Standardization Takes Center Stage"
http://www.proteomonitor.com/articles/view-article.asp?Article=20041028185738

"Enrichment and analysis of peptide subsets using fluorous affinity tags and mass spectrometry," Nature Biotechnology 23, 463-468 (2005)
http://www.nature.com/nbt/journal/v23/n4/abs/nbt1076.html

"Microscale Solution Isoelectrofocusing: A Sample Prefractionation Method for Comprehensive Proteome Analysis," Methods Mol Biol 244, 361-375 (2004)
http://www.humanapress.com/ChapterDetail.pasp?isbn=1-59259-655-X&ccode=1-59259-655-X:361&returntoisbn

"Candidate-based proteomics in the search for biomarkers of cardiovascular disease," J Physiol 563, 23-60 (2005)
http://jp.physoc.org/cgi/content/abstract/563/1/23
