Why Proteomics Can’t Let Go of 2D Gels
By Aaron J. Sender
Two years ago, when Ruedi Aebersold, then at the University of Washington, first let his ICAT method for analyzing protein mixtures out of the bag, a communal sigh of relief echoed through the world of proteomics. Many hoped that this was a sign that the end was near for the much-despised protein analysis technique that has dominated the field for more than two decades — 2D gels.
ICAT has become the center of much hype, says Tom Neubert, who heads the New York University core mass-spec lab and has sworn off 2D gels for good. “And it’s easy to see why: 2D gels are so hard, they are such a pain.”
Neubert’s sentiment is widely shared by those trying to study proteins on a large scale. 2D gels are messy, difficult, slow, inconsistent, and leave the majority of proteins undetected. Even some devout proponents of 2D gels have given up proselytizing. “Everybody hates them,” says Large Scale Biology CSO Leigh Anderson, who has been using the technology since it first emerged on the scene in the mid-’70s. “I spent 10 or 15 years trying to convince people that 2D gels are a great thing, and I finally stopped doing that three or four years ago.”
Aebersold is the first to admit that neither ICAT (isotope-coded affinity tags), which uses isotopes to differentially label peptides that are then run through liquid chromatography-mass spec, nor any other current separation technology alone is ready to replace 2D gels. For example, while ICAT allows researchers to quantify proteins in a sample, it cannot differentiate among post-translationally modified proteins as 2D gels can.
“Is there going to be a continued role for 2D gels? Clearly no other technique currently has the resolving power of two-dimensional gels,” says Aebersold, “nor is it apparent how it can happen in the near future.”
So, the separation technology everybody loves to hate continues to be the one they can’t let go of.
IDENTITY CRISIS
Why? Simply put, the problem with proteomics is that it’s not genomics. DNA is a simple molecule. It efficiently stores information with a quaternary code. Figure out how to tell the difference between each of the four unique bases and you’ve got yourself a genome sequence. On the other hand, there are hundreds of thousands of proteins to separate and identify, each with its own personality: Every protein has its own shape. Some are large, some small. Some love water, some hate it. Some are acidic, some basic.
Where in genomics PCR plus automated fluorescence sequencer equaled a complete view of the human genetic code, there is no simple formula for unveiling a proteome. “There’s not going to be one proteomics platform where you take a sample, put it in, and out comes all the data you want,” says Aebersold. “There are going to be individual silver bullets that address one particular question.” Much to the chagrin of many a lab tech, the 2D gel remains one of those.
The gel’s powerful ability to render proteins distinct comes from the consecutive use of two unrelated separations: first it pulls proteins apart in one dimension by their isoelectric point, the pH at which each protein carries no net charge; then it separates them in the second dimension by molecular weight. State-of-the-art gel technology can resolve roughly 100 distinct positions in each dimension, for a total resolution of about 10,000 spots. The technology has become robust enough to tease apart modified forms of the same protein. And the results of each experiment are delivered visually, as what appear to be blots of watery ink on a Jello-mold-like slab, instead of through tables and numbers.
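For readers who like to see the idea in code, here is a minimal sketch of why two orthogonal separations multiply resolving power; the grid size, pH and mass ranges, and function name are all assumptions for illustration, not part of any real gel-analysis software.

```python
# Illustrative sketch: why two unrelated separations multiply resolving power.
# Proteins that share an isoelectric point can still be told apart by molecular
# weight, and vice versa, so ~100 resolvable positions per dimension gives on
# the order of 10,000 distinct spots. All ranges here are assumed values.

def spot_position(isoelectric_point, mol_weight_kda,
                  ph_min=3.0, ph_max=10.0, mw_min=10.0, mw_max=200.0, bins=100):
    """Map a protein to a (pI bin, MW bin) 'spot' on a hypothetical bins x bins grid."""
    pi_bin = min(bins - 1, int((isoelectric_point - ph_min) / (ph_max - ph_min) * bins))
    mw_bin = min(bins - 1, int((mol_weight_kda - mw_min) / (mw_max - mw_min) * bins))
    return pi_bin, mw_bin

# Two proteins with the same pI but different sizes land in different spots.
print(spot_position(6.5, 45.0))   # -> (50, 18)
print(spot_position(6.5, 120.0))  # -> (50, 57)
```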
But one look at the mess that is a 2D gel quickly reveals its drawbacks. As Lynx CEO Norrie Russell puts it, the gel with its seemingly random pattern of smudged spots “looks like somebody sneezed on it.” The wobbly gels sag, wiggle, and easily warp. The chances of any two gels being the same are slim, making reproducibility a problem. And the gels are picky. They are biased against the largest, smallest, most acidic, or most basic proteins. “The list of proteins excluded from analysis is very long,” says MDS Proteomics’ acting COO Ole Vorm. Plus, conventional staining techniques hide even many of the proteins that do make it onto the gel. “The rule of thumb is that you probably see 15 percent of the total proteins that you actually have in the gel,” says Vorm.
These limitations are enough to keep some emerging protein analysis projects from even considering the 2D gel technique. “That’s the only thing I’m going to say: No 2D gels,” says Claire Fraser, director of the Institute for Genomic Research, which plans to break ground soon on a functional genomics facility in Rockville, Md. “It just seems like going backwards to think about bringing something like a 2D-gel system online when we clearly know right now that it’s limited in terms of throughput.” The fact that there is no viable alternative to 2D gels, says Fraser, “is really very much the reason why we don’t have a proteomics facility at TIGR today.”
BIPOLARITY
It’s not for a lack of efforts to supersede them that 2D gels are still the workhorse of proteomics. Various teams of entrepreneurial scientists, looking to exploit the void, are hard at work developing new separation technologies. Lynx’s ProFiler, a microetched glass slide for liquid-phase 2D electrophoresis, for example, promises to reach the femtomole level when fully developed. Ciphergen says it has already delivered a protein chip-mass spec hybrid platform to the top 15 pharmaceutical companies. And a handful of companies and academic labs touting multidimensional liquid chromatography-mass spec systems, as well as Aebersold’s ICAT, are beginning to capture the imagination of protein chemists still shackled to gels.
Until recently, 2D gels and proteomics were one and the same. It was the only way to study any significant number of proteins at once. Even Aebersold wasn’t always anti-2D. As a postdoc at Caltech in the mid-’80s he used the gels to separate small amounts of proteins for amino acid sequencing by Lee Hood’s newly developed automated Edman degradation sequencer. “People got quite excited, because one could imagine now doing all these two-dimensional gels and then go and pick out spots and sequence them,” says Aebersold. By the early ’90s mass spec began replacing the Edman, and soon, instead of spending two weeks identifying a single protein, researchers were nailing sequences in hours.
But as labs worldwide began attempting to scour the proteome with the available technologies — 2D gels, liquid chromatography, and mass specs — Aebersold began to realize that their attempts were falling short. “When you went to meetings and read the papers, by and large the same proteins were always showing up in the list of identified proteins,” he says. “Significant numbers and classes of proteins didn’t show up at all.”
Aebersold’s solution was to get to the less abundant proteins by reducing the complexity of the sample. Instead of sending all the peptides of every protein through the mass spec, why not just select a signature peptide to represent each protein? And with the introduction of a stable isotope to measure the relative amount of each peptide selected, ICAT was born.
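The quantification idea is simple enough to sketch in a few lines. What follows is a hypothetical illustration of the light-versus-heavy ratio at the heart of isotope labeling, not ABI’s or Aebersold’s actual software; the function name and numbers are made up for the example.

```python
# Hypothetical sketch of the idea behind isotope-coded quantification, not
# ABI's actual ICAT software: the same peptide is tagged with a "light" reagent
# in one sample and a "heavy" reagent in the other, so the ratio of the two
# peak intensities in the mass spectrum estimates relative protein abundance.

def relative_abundance(light_intensity, heavy_intensity):
    """Return the light/heavy intensity ratio for one tagged peptide pair."""
    if heavy_intensity <= 0:
        raise ValueError("heavy-labeled peak not detected")
    return light_intensity / heavy_intensity

# Example: a light-labeled peak twice as intense as its heavy-labeled partner
# suggests the protein is about twice as abundant in the first sample.
print(relative_abundance(2.0e5, 1.0e5))  # -> 2.0
```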
Claiming lack of funds, the University of Washington refused to pay for a patent on Aebersold’s invention. “They said I could patent it if I would be able to find someone to pay for it, which incidentally could not be myself,” says Aebersold. Applied Biosystems jumped at the opportunity and began shipping the ICAT kit this March. Aebersold is now working with ABI on the next generation of the kit, linking the tags to beads to eliminate problems with nonspecific binding and modifying the chemistry to detect post-translationally modified proteins. “Many people saw right away that this could be a replacement for two-dimensional gels,” says Aebersold.
CODEPENDENT
Yet 600,000 protein chemists still use gels. Industrial-scale proteomics facilities such as Oxford GlycoSciences, GeneProt, Large Scale Proteomics, and Proteome Systems rely on them. And even MDS Proteomics, a vocal 2D-gel critic, uses the technology for some separations. The proteomics community is torn between two camps: those who think 2D gels will continue to improve at a faster pace than potential replacements and those who are ready to see them go. “It’s almost a theological debate,” says GeneProt CSO Keith Rose. “There are believers and non-believers.”
Amersham Biosciences, the leading supplier of 2D gels, of course, is a believer. It says the technology is not only not dying, but experiencing a renaissance. Continuous improvements, such as fluorescence detection to reveal more proteins, multiplexing, and better analysis software to improve reproducibility, will keep gels far ahead of emerging technologies for some time to come, says Joakim Rodin, development director for proteomics at Amersham’s Uppsala, Sweden, location. “It’s going to be a major platform for a number of years ahead,” he says.
Not surprisingly, Aebersold disagrees. “We’re exactly at the stage where there is an established technique that is incrementally being improved and eventually you say, ‘Removing all of these limitations one at a time is really tedious and slow. Let’s move to another approach,’” he says.
Some users are also coming to terms with the technology’s limitations and adapting their protocols to increase the gain-to-pain ratio of 2D gels. “What we’ve done is said, OK, the 2D gel is the right method to use,” says Proteome Systems’ executive vice president and head of array technology, Ben Herbert. “So rather than get upset that it has a number of problems, let’s just go about trying to solve those problems.”
To get at low-abundance proteins, Proteome Systems doesn’t just run the cell lysate on the gel. It performs many pre-separation steps. First, researchers split the sample into narrow pH ranges of one to two units each and run each pre-fractionated sample on its own gel. Membrane proteins are also isolated separately. “It’s really about the quality of the data, not whether you can automate it or not,” says Herbert.
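As a rough illustration of that pre-fractionation step, the sketch below groups proteins into narrow pI windows before each would be run on its own gel; the window width, pH range, and data layout are assumptions, not Proteome Systems’ published protocol.

```python
# Illustrative sketch of pre-fractionation: split a sample into narrow pI
# windows so each fraction can be run on its own gel and low-abundance
# proteins are not crowded out. Window width, pH range, and the data layout
# are assumptions for the example.

def fractionate_by_pi(proteins, window=1.5, ph_min=3.0, ph_max=10.0):
    """Group proteins into pI windows of `window` pH units each."""
    fractions = {}
    lower = ph_min
    while lower < ph_max:
        upper = min(lower + window, ph_max)
        fractions[(lower, upper)] = [p for p in proteins if lower <= p["pI"] < upper]
        lower = upper
    return fractions

sample = [{"name": "A", "pI": 4.2}, {"name": "B", "pI": 5.1}, {"name": "C", "pI": 8.7}]
for ph_range, fraction in fractionate_by_pi(sample).items():
    print(ph_range, [p["name"] for p in fraction])
```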
If you’re stuck in a lab handling gels day after day, you won’t be heartened to know that there’s no quick fix on the horizon. But look on the bright side, says Large Scale Biology’s Anderson: “More than half of the audience at the meetings I go to have to run 2D gels and almost none of them want to do it as a career. And I don’t blame them. The bottom line is that it’s unfortunate that it’s a difficult technique. But when you think about it, it’s really amazing that you can have any technique at all, given [proteins’] tremendous variety and physical properties.”
In Search of a Proteomics Project
One year and several meetings after its founding, the Human Proteome Organization is still trying to carve out a space for itself in the field of proteomics. With no shortage of HUPO critics, newly appointed president Samir Hanash has his work cut out for him.
The organization has been called “self-appointed” and “useless.” Academics accuse it of being too cozy with commercial conference organizations. And industry players see no compelling reason to join a public-private effort to characterize the proteome, if such an effort were even feasible.
“It’s clear that there’s a group of people who have come together to debate what the agenda is going to be in proteomics,” says Large Scale Biology CSO Leigh Anderson. “But most of us in commercial organizations and a lot of the people in academic labs have a pretty clear idea of what they’re doing already and they’re going to continue doing it.”
Ruedi Aebersold of the Institute for Systems Biology agrees. “I don’t see a good value in starting some kind of human proteome project,” he says. “People are already doing that and they would be fools to share their data.”
Yet if HUPO opts to settle on a specific cell type or tissue, it risks alienating both those not interested in that cell type and those already working in that area who will see the effort as a competitive threat.
What to do? Aebersold suggests HUPO stick to education and technology dissemination. Ben Herbert of Proteome Systems has a different idea. “What HUPO could do is act as a facilitator between private and public institutions to get companies access to vast banks of high-quality tissues and samples that are validated.”
Whatever HUPO decides to do, Hanash says, “We want to be inclusive and certainly not exclusive.”
— AJS