
Vanderbilt's David Friedman on Using Quantitative Proteomics Technologies

David Friedman
Associate director of Proteomics Laboratory
Vanderbilt University

At A Glance

Name: David Friedman

Position: Associate director of Proteomics Laboratory shared resource in the Mass Spectrometry Research Center at Vanderbilt University, since 2003. Interim director 2001-2003. Research assistant professor in Vanderbilt's department of biochemistry since 2003.

Background: Instructor, Department of Cellular & Structural Biology, University of Colorado Health Sciences Center, 2000.

Associate, Howard Hughes Medical Institute, University of Colorado, Boulder, 1997-2000.

Senior fellow, Department of Biochemistry, University of Washington, Seattle, 1994-1997.

PhD in genetics, University of Washington, Seattle, 1993.


David Friedman is scheduled to give a talk on quantitative proteomics at next month's Association of Biomolecular Resource Facilities meeting in Long Beach, Calif. ProteoMonitor spoke with Friedman to find out about his research and the quantitative proteomics technologies he uses.

How did you get into the proteomics field?

My degree is in genetics, so I started as a molecular biologist and geneticist. I did a postdoc in biochemistry and started working in protein phosphorylation, and that got me into mass spectrometry. This was in the mid- to late '90s. That's when proteomics was really becoming hot. That led me into a second postdoc where I actually learned all of the mass spectrometry. So I very quickly became a proteomics researcher.

What kind of research were you doing when you got into proteomics?

In my case I was more of a facility person. I was trying to develop technology and then to collaborate with my colleagues. So I was working on a diversity of biological questions.

What kind of equipment were you using, and what kind of technology were you trying to develop?

Well, when we came out to Vanderbilt University, that was to start a new proteomics laboratory. That was part of our existing mass spectrometry research center, which is directed by Richard Caprioli. So this was now in 2000, 2001 — they were starting a new proteomics laboratory where they wanted to offer shotgun proteomics and gel-based proteomics to the Vanderbilt community.

We started small. I basically was hired to set this up. We've grown tremendously since then. It became apparent that in order to maintain multiple high-end technologies, we needed multiple experts, not just one. So I share the associate directorship now with a colleague whose specialty is in shotgun proteomics. So then I sort of kept my focus on gel-based proteomics.

We also have Dan Liebler here from the University of Arizona, who is an expert in shotgun proteomics. So collectively now we maintain and run this facility. It's all within the mass spectrometry research center — Richard Caprioli's center.

Did you develop new techniques in gel-based proteomics?

Well, when we started proteomics here at Vanderbilt, it was just when the Difference Gel Electrophoresis technology, or DIGE, came out from Amersham Biosciences. So as part of the technology access program, we started with that technology. That's basically 2D gel proteomics with quantification built into it. So I would say we didn't develop that technology, but we certainly implemented it here at Vanderbilt, and have used it with great success.

How would you say things have improved or changed since you first implemented the DIGE technology?

I'd say it's a pretty robust technology. The improvements have been mainly in the software tools used to interpret the data. That's just the front end, and that's only one form of differential display proteomics that we can do — the front end being gel separations. You get quantification on intact proteins, and then you follow that up with mass spectrometry for identification of those proteins. The identification is part of what we do here as well.

Were you using the DIGE technology to study diseases?

We were using it for a wide variety of biological questions. We have published papers on human colorectal cancer, looking at normal versus tumor — that was a pilot study. We've done studies on basic science — Drosophila, mouse, a whole variety of things. We serve both the medical school as well as the arts and sciences campus. So we do a lot of basic research, as well as a lot of clinically related research.

We have multiple technologies, and they each have their unique strengths. They complement each other very well. From a differential display proteomics view, the DIGE technology is very powerful at looking at expression changes, in terms of both the global expression of a protein and the differential extent of post-translational modification, which you can detect on a gel because you're looking at the intact protein.

We use this technology to basically do the differential display on a global scale and target proteins of interest, which we then subsequently submit to a mass spectrometer to identify the protein.

Do you think that using DIGE to do quantitative proteomics is advantageous over other methods?

As much gel-based proteomics as we do here, we also do shotgun proteomics. They are both very powerful techniques, and they can both be made quantitative. The gel is made quantitative using the DIGE technology. The shotgun approach is made quantitative using stable isotope tags. And we do a fair amount of that as well. It really depends more on the biological question that you're asking.

The shotgun approach is really good at getting sensitive protein identifications, and you can get quantitative data out of that, but you're more looking at the overall expression of a protein. The only way to get information on post-translational modifications is to get mass spectral information on the peptides that contain modifications. Which of course is very doable in a targeted approach, but more and more challenging when you get to doing global comparisons.

On a gel, you can see these changes in post-translational modifications that alter a charge of a protein, or changes due to proteolysis without necessarily having mass spectral information on the modification.

But there are other tradeoffs in there as well, because you see different things with the shotgun approach than you see with the gel-based approach. Which is why we maintain both technologies here. And oftentimes our users will apply their questions to multiple technologies, not just one.

So, there is no real winner. They're very complementary, and it really depends on the nature of the question being asked. If people can, I recommend they use multiple technologies. You always get more information when you use complementary technologies.

Do you think absolute quantification is an increasingly important technology?

Again, it depends on the nature of the question being asked. The field is very broad. There are lots of good questions and good science being done. Some biological questions require absolute quantification, and there are techniques available to do that, mostly with selected reaction monitoring using mass spectrometry and labeled standards. And that's all possible, and it's all very powerful. It really depends on the question you're asking.

In your current work, what kind of questions are you asking in your lab?

Again, my lab is a resource facility for Vanderbilt. We develop and maintain these technologies and then apply them to lots of biological questions at Vanderbilt. So those questions range from clinical related questions to basic science questions — what's changing in disease, what's changing when you knock out this gene in Drosophila — those sorts of things. We don't have a major biological focus here. We do a lot of cancer-related work, both in vivo and in vitro.

What do you see as the future direction that proteomics is going in? It seems like people are going more into quantitation now. There aren't many people just doing cataloguing of proteins.

There's still a fair amount of cataloguing going on, but I agree, a lot of the research is in learning what's changing in a system rather than what's just present in a system. I see the technology going towards greater sensitivity and greater accuracy. Sensitivity mostly in the mass spectrometer, in addition to speed. But also on intact proteins — separations and resolution there as well — not just gels, but looking at multiple pre-fractionations prior to the mass spectrometry stage.

Do you think there are certain mass spectrometers that are better for doing quantitative analysis?

It depends again on what kind of question you're asking. If you're doing a shotgun-based analysis where you have a good sample of 50 proteins, and you want to know everything that's in there and get good information on post-translational modifications, you would do a shotgun approach and probably use an ion trap instrument to get really sensitive information very, very quickly.

If you have a gel-based experiment where you've done very careful quantitative studies using DIGE, and you have 50 proteins of interest that are changing in abundance out of thousands of proteins that you surveyed, and you want to know what those proteins are, we would take that to a MALDI TOF/TOF instrument, because for those already resolved proteins, we can get the mass spectral information much more quickly.

But we can take that same sample to an ion trap and still get high-quality, high-sensitivity identifications; it's just that the workflow is a little different. And we can also do an LC-MS/MS experiment on the TOF/TOF instrument. They're both very, very powerful. Sometimes it boils down to one instrument being more streamlined for one workflow than another. But they're both equally powerful.

Do you do a lot of pre-fractionation?

It's a double-edged sword here. To get lower in the proteome, you need to enrich for the proteins of interest. I think one reason that you see more sensitivity in the shotgun approach, for example, is because you're not necessarily looking at whole proteomes. You're looking at pre-fractionated proteomes.

But every time you do something like pre-fractionate a sample, you're introducing variability into your sample, so you need to make sure that you have the appropriate amount of repetition built into the experiment to account for that sample prep variation.

So yes, we do try to pre-fractionate when it's feasible, but when we do so, that builds into the complexity of the experiment. You don't just isolate nuclei one time and do a comparison, because how do you know that one prep was as efficient as the next?

So there are benefits to enriching, but there's also risk, and they need to be balanced. I'm really big on that — repetition and controlling for variation.

Do you see any new quantitative technologies out there that you're interested in, that you haven't implemented yet?

There's always new stuff out there. We're always keeping our eyes out for new improvements. The best technologies all put together still don't come close to seeing all of the proteome. And one certainly sees less than two or three or four combined.
