Harvard Proteomics Chief Says More Protein Interaction Studies Needed for Field to Grow

SAN FRANCISCO – The director of Harvard Medical School’s Institute of Proteomics is calling for more research into how proteins interact and function, and for the creation of a resource of analyte-specific reagents for each protein, in order to move the study of proteomics forward.
 
At last week’s Beyond Genome conference, Joshua LaBaer told ProteoMonitor that while the task of just detecting proteins is still in its early days, it’s time to move past detection to deeper work into protein interactions.
 
Furthermore, he said, the proteomics community needs to start building a repository for antibodies for each protein, especially if the field is to overcome one of the most significant challenges it faces: the high dynamic range of serum.
 
Organized by Cambridge Healthtech Institute, the three-day Beyond Genome conference kicked off with a panel discussion about some of the advances and challenges facing systems biology since the Human Genome Project published its findings four years ago.
 
LaBaer, a panelist, told the audience that current understanding of protein interactions is at the “ball-and-stick” level: Researchers know that protein A binds protein B, but beyond that, not much else is known. What is needed, he said, is information about how quickly the two proteins bind and how quickly they come apart.
 
“Right now, we have a very crude model for protein interactions, and the model is based entirely upon whether you get a hit or non-hit in one of a couple of major assays,” he told ProteoMonitor afterward. Proteins, however, don’t interact on a simple “yes” or “no” basis, but with some affinity that can range from micromolar to sub-nanomolar levels.
 
Because they have on- and off-rates, the interactions are kinetic, he said. And “ultimately if we’re going … to build a computational model for the cell, we can’t just do it with ‘This interacts with that.’ We have to do it with ‘This binds to that with this on-rate and that off-rate,’ and the on-rate and the off-rate are affected by phosphorylation, whatever modifications changed those parameters.
 
“And we’re going to have to do that to get a better understanding of how things truly behave,” LaBaer said.
 
While some methodologies currently exist to measure kinetic interactions between proteins, they are low-throughput. The result is that researchers have a fairly clear understanding of how proteins interact in individual cases, but “in terms of really understanding global protein interactions, I think we’re pretty much in the infancy,” he said.
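The kinetic framing LaBaer describes can be made concrete with a hypothetical calculation: the equilibrium dissociation constant Kd is simply the ratio of off-rate to on-rate. A minimal Python sketch, using typical antibody-antigen rate values assumed for illustration (the specific numbers are not from the article):

```python
# Hypothetical illustration of the kinetics LaBaer describes:
# the equilibrium dissociation constant Kd = k_off / k_on.
# Rate values below are typical antibody-antigen figures, assumed
# for illustration only; they do not come from the article.

def dissociation_constant(k_on: float, k_off: float) -> float:
    """Return Kd in molar units.

    k_on  -- association rate constant, in 1/(M*s)
    k_off -- dissociation rate constant, in 1/s
    """
    return k_off / k_on

# A binder with a fast on-rate and a slow off-rate is a tight binder:
kd = dissociation_constant(k_on=1e5, k_off=1e-4)
print(f"Kd = {kd:.1e} M")  # nanomolar affinity
```

In this framing, tracking phosphorylation or other modifications, as LaBaer suggests, would amount to measuring how k_on and k_off shift with each modification state.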
 
‘Ain’t Going to Cut It’
 
If his sentiments seem familiar, it’s because other thought leaders in proteomics have also emphasized a need to go beyond a shotgun approach to the research. For example, during the 7th Siena conference on proteomics last summer, Ruedi Aebersold, professor of molecular systems biology at the Swiss Federal Institute of Technology (ETH) Zurich and the University of Zurich, told an audience that work in the field has stalled, and that moving it forward will require a new strategy: mapping out the whole proteomic space, using statistical analysis to validate the quality of the data, and taking a targeted approach to the research [See PM 09/07/06].
 
Indeed, LaBaer said that one of the things holding proteomics research back is a gear-head mentality that is overly focused on improving technology for mass spec detection. What the field needs, he said, is input from biologists and statisticians who have a better grasp of the biology involved and how to evaluate results.
 
The proteomics field has been made up largely of researchers with backgrounds in analytical chemistry, but that is changing as more scientists with biology training enter the field, he said. LaBaer himself is a medical doctor.
 
“There are some folks like John Yates [at the Scripps Research Institute], who have enough biology smarts that he just does nice studies, because he is a good biologist and he’s a good analytical chemist, so he just does great stuff,” he said. Among others he cited as having the proper biology background to push proteomics along are Aebersold and Matthias Mann at the Max Planck Institute of Biochemistry in Martinsried, Germany.
 
“But there are a lot of good analytical chemists out there who don’t necessarily have as much of [a biological background],” he said.
 
And while the quality of the data is also improving, he said he still sees papers drawing conclusions based on results from two patient samples and two normal samples. “A statistician will tell you right away that ain’t going to cut it,” he said.
 


According to LaBaer, proteomics is still setting itself up and trying to figure out what direction it needs to go. In the meantime, a vast amount of data has been amassed. What to do with it is not clear, though.
 
“That still remains to me one of the biggest challenges we face,” he said. “We get this scad of data, [but] how do you turn that into useful biological interpretation? How you go from data to knowledge is really the fundamental question.”
 
LaBaer also said that one of the major challenges facing proteomics is simply the biology, especially how to handle the wide dynamic range of protein concentrations. The high dynamic range in blood is a well-known obstacle in proteomics research, and commercial vendors have responded by continuing to develop more robust columns, plates, and cartridges.
 
But according to LaBaer, those technologies don’t reduce the dynamic range by enough orders of magnitude to get at the truly low-abundance proteins. Instead, he advocates greater resources for the development of antibodies for each protein.
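The scale of the problem can be sketched with back-of-the-envelope arithmetic. The concentrations below are typical literature values assumed for illustration, not figures from the article: abundant serum albumin sits at roughly tens of milligrams per milliliter, while low-abundance cytokines sit at picograms per milliliter.

```python
import math

# Back-of-the-envelope illustration of serum's dynamic range.
# Assumed typical values (not from the article):
#   serum albumin        ~ 40 mg/mL
#   low-abundance cytokine ~ 4 pg/mL
albumin_g_per_ml = 40e-3
cytokine_g_per_ml = 4e-12

# How many orders of magnitude separate the two?
orders = math.log10(albumin_g_per_ml / cytokine_g_per_ml)
print(f"abundance spans roughly {orders:.0f} orders of magnitude")
```

Under these assumed values the span is roughly ten orders of magnitude, which is why, as LaBaer notes, shaving off a few orders with depletion hardware still leaves the lowest-abundance proteins out of reach.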
 
“If you had a nanomolar binder, let’s say, to every protein, or preferably two or three, so you can do cross-wise experiments, then you could imagine realistically analyzing even something as wide ranging as serum because the antibodies wouldn’t care that something was present in 10⁻⁶ concentration because they could still pull it out,” he said. “A strong antibody can pull out a tiny amount of a compound or of an analyte. So I think generating good quality antibodies for every protein will make a huge difference.”
 
Efforts to create such antibodies are already underway. The Human Protein Atlas has more than 1,500 antibodies and 1.2 million images showing the expression and localization of human proteins. The National Cancer Institute also is developing a reagents resource.
 
However, LaBaer said that he believes that such initiatives have moved too slowly, mainly due to a lack of government funding, particularly in the US, for such an ambitious long-term task.
 
“It’s a challenging project,” he said. “It’s one that we should have the vision to start. We may not be able to [finish] it in 10 years, but we should start.”
 
Like others in the field, LaBaer said he is not surprised that proteomics has been progressing at only a glacial pace. The Human Genome Project took 14 years to complete, he said, and its scope was elementary compared with deciphering the human proteome, an effort that he anticipates will take decades.
 
The first stage, which proteomics is still in, is cataloguing the proteome. Afterward, work will shift to mapping out some of the interactions among the proteome, followed by a functional study of what proteins do, how they behave, and how the pathways connect, he said.
 

“Proteomics has an immeasurable number of chemistries; it’s not just one chemistry,” LaBaer said. “It’s thousands of chemistries, and it’s not just decoding a single linear sequence. It’s decoding all kinds of potential interactions and activities and functions.”