Primetime for Protein Arrays

As the first step in what has become a common two-step process — large-scale screening followed by winnowing the hits down to a select handful — protein microarrays, and the expanding methods for both creating and using them, offer a complementary approach to existing protein expression technologies. From cloning techniques to mass spectrometry, many of the technology advances in protein expression revolve around improved versions of established tools. "All [the] advances are taking those existing methods and multiplying them by a thousand," says Trey Ideker, a bioengineer at the University of California, San Diego, who uses protein microarrays to build large-scale protein interaction maps.

One of the biggest boons for protein microarrays has been advances in expression vector technology. For one, Invitrogen's Gateway recombinational cloning system, known for its flexibility in shuttling open reading frames between different expression vectors, has made life easier for protein biochemists. With this technique, Marc Vidal's lab created the first C. elegans ORFeome, as well as the first version of the human ORFeome in 2004. These genome-wide collections of protein expression vectors allow proteins to be expressed in high throughput in a number of different experimental organisms. And whether the large quantities of proteins are needed for drug discovery or for building protein microarrays, current ORFeomes have largely conquered the notorious lack of reproducibility that arises when proteins are cloned in different labs.

They haven't been around long, but protein microarrays are already at the forefront of systems biology research. Used mainly to probe protein-protein interactions across many thousands of molecules, the arrays generate data that researchers can use to study any number of networks. While high-throughput protein purification and spotting is still the norm, some researchers have begun expressing genes directly on the surface of the microarray glass slide. Using cell-free translation — building the array in real time, so to speak — helps researchers avoid the labor-intensive process of traditional arraying techniques, reduce storage time, and prevent potential loss of protein stability and activity.

Growing up

Mike Snyder's lab at Yale has become known as a major innovation center for protein microarray technology. Heng Zhu was a postdoc there when he helped create the first yeast protein microarray by individually expressing and purifying the nearly 6,000 proteins of the yeast proteome. Since then, the Human ORFeome Project has created an easily accessible library of the entire set of human open reading frames, which Zhu is using to create a human proteome-on-a-chip. Now an assistant professor at Johns Hopkins, he focuses his research on identifying networks and pathways. "We [don't] restrict ourselves to transcription factors. We can also add other types of proteins, such as chromatin-associated proteins, the co-activators and regulators of transcription factors, and RNA-binding proteins or nucleotide-binding proteins," Zhu says. "It is up to your imagination because this is … [an] unbiased approach."

Zhu's not the only one thinking ahead. Several years ago, Harvard's Josh LaBaer began delving into synthesizing proteins from DNA spotted on a glass slide.
In 2004, Niroshan Ramachandran, then a postdoc in LaBaer's lab, published proof-of-concept work showing that, with optimized chemistry, he could take this cell-free expression system and create a functional protein array using DNA, an in vitro translation system, and a universal capture tag embedded in each protein's sequence that would bind to an antibody on the slide. Creating this nucleic acid programmable protein array (NAPPA) in situ was easier and more reproducible, and the proteins ended up being of higher quality and stability, Ramachandran says. "It was extremely difficult and expensive to purify proteins, and the proteins that we made were often not of good enough quality," he says.

Scaling up is a big issue, though, and most arrays aren't worth the effort to produce if they don't cover a large number of proteins. "That's where most alternate technologies have struggled, is trying to figure out how you take a technology that works in a handful of genes and make it work at thousands of genes," Ramachandran says.

David Lubman has taken a slightly different approach, calling the arrays he's building "natural" protein arrays rather than the usual functional variety. Lubman, a professor of surgery and of surgical immunology at the University of Michigan Medical Center, has constructed a glyco-microarray that probes for changes to glycoproteins as a basis for early detection of cancer. He builds these by using lectin columns to extract N-linked glycoproteins from a sample mixture, which, after a series of validation steps, he spots down onto a coated glass microarray slide. Because the proteins are isolated from actual cells rather than expressed from cDNA, "the proteins that we spot on the array contain the structure that you would see in a disease cell," he says.

Applied arrays

Protein arrays have already shown promise as discovery tools, whether for finding biomarkers or small-molecule interactions. "I think microarrays will be great for discovery on large sets, in a way you can't do on a 2D gel because it's just too cumbersome and too slow," Lubman says.

One area of interest for UCSD's Ideker is using microarray data to create protein interaction maps, which can be used both to compare evolutionarily distinct species and to improve disease diagnosis. His main focus is creating blueprints of intracellular interactions that incorporate measurements from arrays along with techniques like chromatin immunoprecipitation. Because any single measurement is imperfect, maps that integrate data sets from multiple experiments are bound to yield a truer representation of the cell.

For diagnosing disease, this is especially important. "The whole idea of using these networks in human health is about to explode," Ideker says. Matching cancer-causing mutations previously thought to be independent of one another to the same pathway, for instance, can be very effective. "There are many paths to disease," Ideker says. "What you want to know is what are the common paths, and without prior knowledge of the pathways, it's hard to do that. The blueprint I'm talking about is exactly giving you that prior knowledge."

Another area where protein microarrays have seen widespread application is immune profiling — taking a serum sample and washing it over an array to see which antibodies a patient already has. "It's a good way now to develop a diagnostic, if you will, or to identify a vaccine," he says.
Profiling a serum sample would allow vaccines to be tailored to an individual's protein biochemistry, theoretically avoiding side effects. "It's much better if you can identify which proteins enlisted an immune response inside your body that makes you protected against that pathogen, as opposed to injecting the whole pathogen," he adds. Other areas of potential include profiling in autoimmune diseases for preventive therapies, and cancer immune profiling to find tumor antigens, which often are specific to individual patients.
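To make the immune-profiling readout a little more concrete, below is a minimal sketch, in Python, of how per-antigen signals from a serum-probed array might be summarized to flag candidate reactive antigens. The input file, column names, and z-score cutoff are illustrative assumptions, not part of any pipeline described by the researchers above.

```python
# Toy sketch: given per-antigen spot intensities from a serum-probed
# protein array plus matched control intensities, flag antigens that
# appear to be recognized by antibodies already present in the sample.
# File name, column names, and the cutoff are illustrative assumptions.
import numpy as np
import pandas as pd


def call_reactive_antigens(path="array_intensities.csv", z_cutoff=3.0):
    # Expected columns: antigen, serum_signal, control_signal
    df = pd.read_csv(path)

    # Work with the serum/control ratio so spot-to-spot printing
    # differences largely cancel out; log2 keeps the scale symmetric.
    df["log2_ratio"] = np.log2((df["serum_signal"] + 1.0) /
                               (df["control_signal"] + 1.0))

    # Robust z-score against the bulk of (presumably non-reactive) antigens.
    median = df["log2_ratio"].median()
    mad = (df["log2_ratio"] - median).abs().median() or 1e-9
    df["z_score"] = (df["log2_ratio"] - median) / (1.4826 * mad)

    # Antigens far above the background distribution are candidate targets
    # of pre-existing antibodies (e.g., potential tumor antigens).
    return df[df["z_score"] >= z_cutoff].sort_values("z_score", ascending=False)


if __name__ == "__main__":
    hits = call_reactive_antigens()
    print(hits[["antigen", "log2_ratio", "z_score"]].to_string(index=False))
```

In practice, this kind of profiling also relies on replicate spots, cross-array normalization, and multiple-testing correction; the sketch only captures the basic thresholding idea.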
Because the protocol for purifying large sets of proteins is difficult, Zhu has been researching alternatives. For the past three years, he and a postdoc in his lab have been working on an in vitro translation system that uses immobilized mRNA to synthesize and capture peptides simultaneously on the slide surface. Zhu hopes one day to replace the traditional purification-based method with this approach. "It will be much easier for labs without expertise in protein purification to get access to protein microarrays," he says.