At A Glance
Name: Richard Smith
Position: Director of the NIH Research Resource for Integrative Proteomics at Pacific Northwest National Laboratory; adjunct faculty member, Department of Chemistry, Washington State University.
Background: PhD, physical chemistry, University of Utah, 1975.
BS, chemistry, Lowell Technological Institute, 1971.
You’ve been involved in a series of things related to mass spectrometry and separations over the years. What did you do when you first joined Pacific Northwest National Laboratory?
Well, I started a couple of things: working with mass spectrometry on environmental samples and trying to develop LC-MS methods to deal with complex biological samples. That’s probably been the central theme of most of my scientific career. Initially it was environmental work, but the focus quickly became biological samples and biological problems. The reason for the environmental emphasis was that, briefly in the mid-to-late ’70s, environmental work was very well funded.
What kind of environmental problems did you work on?
With the Department of Energy funding our program, there was a lot of work with synthetic fuels and the impact they would have on the environment. They were very complex organic mixtures: fuels made from coal liquefaction and various processing of coal, oil shale, and so on.
Mass spectrometry was one of the tools we used to analyze the fuels, and I was involved in trying to develop better analytical tools; LC-MS was one of them. A little later, that led me into supercritical fluid chromatography and its use with mass spectrometry.
What is supercritical fluid chromatography?
A supercritical fluid is a high-density gas: a gas under high enough pressure that it becomes dense enough to begin to act like a liquid. The advantage is that you can separate things you normally couldn’t analyze with gas chromatography, with much greater speed than liquid chromatography and much better effectiveness.
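To make the definition concrete: a fluid is supercritical when both its temperature and pressure exceed the critical point, and CO2 is the classic SFC mobile phase. A minimal sketch, using the standard critical constants for CO2 (the function and numbers below are illustrative, not from the interview):

```python
# Critical point of CO2 (well-known reference values).
CO2_CRITICAL_TEMP_C = 31.1        # critical temperature, degrees C
CO2_CRITICAL_PRESSURE_BAR = 73.8  # critical pressure, bar

def is_supercritical(temp_c: float, pressure_bar: float) -> bool:
    """True when CO2 at these conditions is a supercritical fluid,
    i.e. both temperature and pressure exceed the critical point."""
    return (temp_c > CO2_CRITICAL_TEMP_C
            and pressure_bar > CO2_CRITICAL_PRESSURE_BAR)

print(is_supercritical(40.0, 100.0))  # typical SFC conditions -> True
print(is_supercritical(25.0, 100.0))  # below critical temperature -> False
```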
Is that still used today?
Not widely; it turned out not to be the best solution to the problem. So I moved back into liquid-phase separations, initially with a focus on capillary electrophoresis. I spent a lot of time in the mid-’80s and early ’90s developing capillary electrophoresis separations and different approaches for coupling them with mass spectrometry. That worked out quite well, and it’s still used very effectively in a number of variations because of the high quality of the separations and the speed that can be obtained. Then, as interest in proteomics grew and the technology for liquid chromatography got better, that became the emphasis of our work. That was in the 1990s.
Is it true that in the last five years you’ve spent less time developing technologies and more time on applications?
In the last five to ten years, that’s been a big part. In proteomics, the applications are amazingly broad. We’re involved in major programs for both the Department of Energy and the National Institutes of Health. For the DOE, we’re studying microbial systems that have a lot of applications in energy production and global climate change. For the NIH, we have a number of projects. One big area is biomarker discovery: trying to identify biomarkers for various disease states so you can, for example, detect cancer at a much earlier stage, when it’s more curable, or better predict clinical outcome so you can begin more aggressive treatment for patients who would otherwise become very sick and die after a severe trauma, for example.
Biomarker discovery from blood plasma is one area where we’re involved in a major way. Another area of proteomics becoming very important to us is host-pathogen interactions: understanding how organisms such as Salmonella interact with the human host, identifying proteins that seem to play a key role in that interaction, and picking out those that are potentially good targets for new drug therapies.
For these studies, are you using any kind of techniques that are innovative or different?
Yes, I think so. A key part of this is using mass spectrometry that is very sensitive and that also gives extremely accurate mass measurements. One of the key problems in proteomics is that we’re looking at very complex mixtures whose components vary in abundance over many orders of magnitude. Because it’s a complex mixture, you have to be able to distinguish one component from another, so accurate mass measurements are very important. We have made some significant developments that improved mass measurement, and that has helped the area significantly. That’s using Fourier transform ion cyclotron resonance (FTICR) mass spectrometry.
Were you involved in developing the Fourier transform?
We didn’t develop the Fourier transform — that’s been around for 30 years at this point. What we’ve done is come up with methodologies to greatly improve the accuracy of mass measurement. In our case, that was done by developing a method to control the ion population in the FTICR analysis.
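For context, mass-measurement accuracy of the kind described here is conventionally quoted in parts per million (ppm) of the measured mass-to-charge ratio. A small illustrative calculation (the m/z values are hypothetical, chosen only to show the arithmetic):

```python
def mass_error_ppm(measured_mz: float, theoretical_mz: float) -> float:
    """Mass-measurement error in parts per million (ppm):
    (measured - theoretical) / theoretical * 1e6."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Hypothetical example: a species with theoretical m/z 1000.0000
# measured at 1000.0010 corresponds to a 1 ppm error -- roughly the
# accuracy regime where high-resolution FTICR instruments operate.
print(round(mass_error_ppm(1000.0010, 1000.0000), 3))  # -> 1.0
```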
What have you done in terms of applying liquid chromatography techniques to mass spectrometry?
There are two key things we’ve done. One is that we’ve gone to higher pressures, which give us better-quality separations, meaning more peaks in a given amount of time. Whenever you improve the separation quality, it allows you to characterize much more complex proteomes and see more of the complexity. So we’ve improved the separations by going to higher pressures.
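“More peaks in a given amount of time” is often quantified as peak capacity, a standard chromatography figure of merit: for a gradient separation it is roughly one plus the gradient time divided by the average peak width. A quick sketch with invented numbers:

```python
def peak_capacity(gradient_time_min: float, avg_peak_width_min: float) -> float:
    """Approximate gradient peak capacity: n ~ 1 + t_gradient / w_peak."""
    return 1 + gradient_time_min / avg_peak_width_min

# Illustrative only: halving the average peak width (e.g. via higher
# pressure and smaller particles) roughly doubles the number of
# resolvable components in the same 100-minute gradient.
print(peak_capacity(100, 0.5))   # -> 201.0
print(peak_capacity(100, 0.25))  # -> 401.0
```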
The other thing we’ve done is develop separations based on very-small-diameter capillaries. The attraction here, beyond the high separation quality (which you can still get), is the extremely high sensitivity. At very low flow rates, the electrospray ionization method we use to produce ions from the analyte at the interface with the mass spectrometer becomes very efficient; in fact, the ionization efficiency approaches 100 percent. This is really important because it allows very high sensitivity, so we can work with extremely small samples. We’ve done some work with very small cell populations (under 100 cells), and a few experiments with even a single cell, where we make measurements of a number of different proteins.
We’ve done work with collaborators looking at small liver microbiopsy samples: small plugs of liver taken after a human liver transplant. We get a few micrograms of sample to work with, and we’re able to do a detailed proteome characterization of that small sample. These are transplants done after a hepatitis C virus infection, and what we’re following is basically a time course after the person receives the transplanted liver, looking at how they progress.
What other projects are you working on?
We have a number of other projects. One thing I should mention is that we also have a center funded by the National Center for Research Resources, part of the NIH. That funds us to develop aspects of the technology (separations, mass spectrometry, and so on), along with some of the informatics tools for data analysis and for dissemination to our collaborators.
So we have about 10 different projects at present involving collaborations with NIH-funded researchers. I’ll just mention one of them: a project we’re doing with Desmond Smith at UCLA. What we’re interested in doing is developing detailed 3-D images of mouse brains — proteomic images. Desmond has developed a method called voxelation: basically, he takes a mouse brain and chops it up into maybe five or six hundred tiny cubes. What we’re doing is developing the methodology to do high-throughput proteomic analysis of each of those cubes. That will give us the information to put together a three-dimensional image with all the depth of a really detailed proteome analysis of the mouse brain.
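One way to picture the data behind such a 3-D proteomic image is a map from each cube’s grid coordinates to the protein abundances measured in that cube; a per-protein “image” is then just a lookup across voxels. A hypothetical sketch (the coordinates, protein names, and abundance values are invented for illustration):

```python
# Each voxel (a tiny cube of brain tissue) maps its (x, y, z) grid
# position to the protein abundances measured for that cube.
brain_image: dict[tuple[int, int, int], dict[str, float]] = {
    (0, 0, 0): {"ProteinA": 12.5, "ProteinB": 3.1},
    (1, 0, 0): {"ProteinA": 11.9, "ProteinB": 4.0},
}

def abundance_map(protein: str) -> dict[tuple[int, int, int], float]:
    """One 3-D 'image' for a single protein: voxel -> abundance."""
    return {voxel: levels[protein]
            for voxel, levels in brain_image.items()
            if protein in levels}

print(abundance_map("ProteinA"))
```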
One thing from the instrumental side that may help this quite a bit is the work going on with ion mobility separations in front of the mass spectrometer. This is work we’ve started here in the last year. The idea is to use ion mobility separation to get much greater speed, so we can do a proteome analysis in less time.
How does ion mobility separation work?
Ion mobility is like electrophoresis in a gas rather than in a liquid or a gel. The advantage of being in a gas is that it’s several orders of magnitude faster: the separation takes place in hundreds of milliseconds rather than minutes or hours. There are some challenges that come with that increase in speed, but the attraction is the very fast separations, which we can potentially use in conjunction with fast LC separations. So instead of an LC separation taking two hours, we might use an LC separation that takes five minutes, combine that with an ion mobility separation, and use high-speed mass spectrometry to give a three-dimensional analysis covering much of the proteome in maybe five minutes. That’s one of the things in the lab at the moment with very high potential.
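The speed argument follows from the basic drift-tube relation: an ion with mobility K in an electric field E moves at drift velocity v = K·E, so the time to traverse a tube of length L is t = L / (K·E). A rough sketch with representative, illustrative numbers (not measurements from this lab):

```python
def drift_time_ms(tube_length_m: float,
                  mobility_m2_per_vs: float,
                  field_v_per_m: float) -> float:
    """Drift time t = L / (K * E), returned in milliseconds."""
    return tube_length_m / (mobility_m2_per_vs * field_v_per_m) * 1e3

# Illustrative numbers: a 1 m drift tube, a reduced-pressure mobility
# K ~ 0.1 m^2/(V*s), and a field of ~1000 V/m give a drift time of
# tens of milliseconds -- orders of magnitude faster than an LC run.
print(drift_time_ms(1.0, 0.1, 1000.0))  # -> 10.0 (milliseconds)
```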