Genentech Scientists Offer Perspective on Proteomics' Evolution Into General Research Tool

NEW YORK (GenomeWeb) – Since its early days roughly a decade and a half ago, mass spec-based proteomics has slowly but steadily evolved from a technique strictly for specialists into something increasingly approaching a broadly used tool for biological research.

Instrument vendors and proteomics scientists continue efforts to expand the field in this direction through, for instance, initiatives like the Human Proteome Organization's working groups on developing more broadly accessible mass spec devices.

In a similar vein, Genentech scientists last week published a review in Molecular & Cellular Proteomics intended as a "field guide" of sorts for biologists interested in using multiplexed quantitative proteomics in their research.

Surveying a variety of labeling and label-free approaches to quantitative proteomic experiments, the paper, the authors noted, provides an overview of "quantitative methods for proteomic mass spectrometry and … their benefits and weaknesses from the perspective of the biologist aiming to generate meaningful data and address mechanistic questions."

"Clearly, the movement is underway where mass spectrometry is becoming a routine tool for characterizing proteins," Donald Kirkpatrick, an author on the paper and senior scientist, proteomics and biological resources at Genentech, told GenomeWeb.

"I see groups that I would generally categorize as biology groups, particularly ones that have significant amounts of resources at their disposal, beginning to adopt mass spectrometry directly under the umbrella of their own groups," he said.

At the same time, he added, the growing number of groups and firms offering proteomics services has made such work more accessible to researchers without the interest, expertise, or funds to take it on in house.

"There is such a growing list of groups that are capable of doing really high-quality outsourcing-type experimental work, that it really is becoming feasible to do that on a more limited budget," he said.

Indeed, Kirkpatrick said that while Genentech has considerable in-house proteomics expertise, it also takes advantage of outsourcing and collaborations, particularly when looking to implement newer technologies.

"As new technologies come on board it's those groups that have specialized in that new technique or technology and that become very good at it," he said. "And in order to really test it in the early days and test it in comparison to what the benchmark technologies are, you really need to have a group that is functioning and operating at the very best of what the technique can do."

That said, Kirkpatrick noted that in recent years vendors have focused heavily on designing their instruments to appeal to non-experts. This, he said, has been particularly noticeable on the software and instrument control side of things where, "even five years ago you would have had many dials and buttons and different controls where you would have [control] over some of the variables of how the instrument was functioning."

In more recent machines, "many of those things are done under the hood, and you have a much more user-friendly user interface."

On the one hand, Kirkpatrick said, this gives "the generalist biology groups the opportunity to do those [proteomic] workflows that have become standardized and to do them with less barrier to entry." The downside, he said, is that this can make it a little more difficult for expert users to access the innards of an instrument in order to develop methods beyond existing workflows.

In the MCP paper, Kirkpatrick and his co-author and Genentech colleague Corey Bakalarski focused largely on established quantitative workflows like SILAC labeling and isobaric tagging, as well as label-free quantitation, weighing the pros and cons of each for various sorts of experiments.

For example, while metabolic labeling methods like SILAC provide reliable and quantitatively accurate results, such techniques are poorly suited to large multiplexes, making it cumbersome, for instance, to compare samples across a large number of time points in a time-course experiment.

Isobaric tagging, meanwhile, offers higher levels of multiplexing, but these methods can suffer from precursor interference, wherein co-isolated tagged ions other than the targeted precursor contribute to the reporter ion signal, compressing ratios and lowering the accuracy and precision of the quantitative data generated in such experiments. However, as the MCP authors noted, methods for mitigating these issues have been introduced in recent years.
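
As a rough, back-of-the-envelope illustration of how this ratio compression plays out (an illustrative sketch, not a method from the paper; the signal fractions and ratios below are hypothetical), consider a peptide whose reporter signal is diluted by co-isolated, unregulated background:

```python
# Hypothetical illustration of ratio compression from precursor interference
# in an isobaric tagging experiment (not drawn from the MCP paper).

def observed_ratio(true_ratio, interference_fraction, background_ratio=1.0):
    """Reporter-ion ratio after the target peptide's signal is mixed with
    co-isolated background, assumed here to be unregulated (~1:1)."""
    target_share = 1.0 - interference_fraction
    bg_share = interference_fraction
    # Split each species' share of the reporter signal across two channels
    # according to its own ratio.
    target_a = target_share * true_ratio / (true_ratio + 1.0)
    target_b = target_share / (true_ratio + 1.0)
    bg_a = bg_share * background_ratio / (background_ratio + 1.0)
    bg_b = bg_share / (background_ratio + 1.0)
    return (target_a + bg_a) / (target_b + bg_b)

# A peptide truly up 10-fold, with 30 percent of the isolation window
# occupied by co-isolated background, reports only about 3.7-fold.
print(round(observed_ratio(true_ratio=10.0, interference_fraction=0.3), 1))
```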

The paper also discussed some less established technologies, most notably the NeuCode labeling method developed by the lab of University of Wisconsin-Madison researcher Josh Coon, which exploits the subtle mass differences between isotopes that arise from differences in nuclear binding energy (the energy needed to break a nucleus into its component nucleons) to label amino acids with tags that are nominally identical in mass but differ by a few thousandths of a dalton.

Coon and his colleagues have thus far used the approach to multiplex as many as 18 samples in a single mass spec experiment, and it could, in theory, allow for multiplexing of as many as 39 samples. The method potentially brings together some of the best aspects of isobaric labeling and SILAC, combining the multiplexing capability of the former with the MS1-level quantitation of the latter, which eliminates precursor interference problems.
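
To make the mass-defect idea concrete, here is a small back-of-the-envelope calculation (an illustrative sketch, not taken from the paper), using the pair of nominally +8 Da lysine isotopologues commonly cited in NeuCode work and standard isotope masses:

```python
# Illustrative sketch of the mass differences NeuCode exploits
# (not drawn from the MCP paper). Two "+8 Da" lysine labels built from
# different heavy isotopes share the same nominal mass but differ by
# roughly 36 mDa, because each isotope's exact mass reflects its
# nuclear binding energy.

# Mass shift (Da) per heavy-isotope substitution, from standard mass tables.
D_13C = 13.003355 - 12.000000   # 13C replacing 12C
D_15N = 15.000109 - 14.003074   # 15N replacing 14N
D_2H  = 2.014102 - 1.007825     # 2H (deuterium) replacing 1H

lys_13C6_15N2 = 6 * D_13C + 2 * D_15N   # ~ +8.014 Da
lys_2H8 = 8 * D_2H                      # ~ +8.050 Da

delta_mda = abs(lys_2H8 - lys_13C6_15N2) * 1000
print(f"NeuCode duplex spacing: {delta_mda:.1f} mDa")  # ~36 mDa
```

Peaks spaced that closely look identical at nominal resolution but can be separated on high-resolving-power instruments, which is what lets the labels encode multiple samples at the MS1 level.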

While the technique has to date been used mainly within Coon's lab, Kirkpatrick said he believes it is "on the verge" of moving into the wider community.

"We are quite interested in it here," he said. "We have done a bit of work in collaboration with the Coon lab using that technology. For doing studies where you need to be able to do metabolic labeling to accurately profile [multiple time points], NeuCode offers a unique opportunity because it extends multiplexing."

Kirkpatrick noted several other proteomic technologies that he and his Genentech colleagues are keeping an eye on, including top-down proteomics, where researchers such as Northwestern University's Neil Kelleher have recently demonstrated the ability to do quantitative comparisons of more than a thousand intact proteoforms in an experiment.

"We are not actively doing that, although we certainly are keeping our eye on work from [Kelleher] and many others," Kirkpatrick said. "There is quite a bit of interest there, but there is still a bit of hesitation just given the number of proteoforms that exist, as to whether or not we are going to be able to [look at them] in a way that answers questions that we are interested in."

In any event, he said, technology development on the top-down side has proved a boon for the intact protein work Genentech does, such as characterization of purified proteins and biologics.

"The tools that are being developed by labs like the Kelleher group for characterizing whole proteomes are bearing fruit for those of us in biotech in really opening the door for really carefully characterizing the proteins and molecules that we are generating," he said.

Kirkpatrick and his colleagues have not, thus far, taken up data-independent acquisition mass spec methods like Swath in a significant way, he said. "Clearly, there is a lot of interest around that, and I put it in that category of methods we are keeping a close eye on. We do a little bit of it, but it hasn't necessarily made its way mainstream within our own group."

"I think one needs to make decisions in order to focus on the technology side because it is very difficult to be good at all of the technologies," he said.