At A Glance
Name: David Myszka
Position: Research associate professor of biochemistry, University of Utah, since 1996.
Director of the Center for Biomolecular Interaction Analysis, Protein Interaction Laboratory, University of Utah, since 1996.
Background: NIH post-doc in macromolecular sciences, SmithKline Beecham Pharmaceuticals, 1991-95.
PhD in biochemistry, Ohio State University, 1991.
BS in biochemistry, Ohio State University, 1987.
How did you get involved with proteomics?
I did my graduate work creating new compounds and got interested in the properties of molecular recognition — like how an enzyme can recognize a substrate, and how, if you change that substrate, it will either bind differently or not bind at all. From there I went to SmithKline Beecham and started studying molecular recognition at the protein-protein level. A new technology was being developed that I started to apply: optical biosensors, or SPR-based biosensors. This technology allowed you to measure the interaction of any kind of molecule without labeling, and in real time, which meant you could get at the kinetics of a reaction. This was about 12 years ago. So [since then], we’ve mainly been working in that field of biosensors. Along the way, the whole interest in the proteome has come out, and it just so happens that this technology I’ve been working on is going to be really important for figuring out the interactions, at a detailed level, of the proteome.
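The real-time kinetics he refers to are conventionally modeled as a simple 1:1 (Langmuir) interaction, where the sensor response rises toward equilibrium during analyte injection and decays afterward. A minimal sketch of that model — all rate constants and concentrations below are illustrative, not values from the interview:

```python
import math

def association(t, C, ka, kd, Rmax):
    """Sensor response (RU) during analyte injection, 1:1 model."""
    kobs = ka * C + kd                    # observed rate constant (1/s)
    Req = (ka * C * Rmax) / kobs          # equilibrium response (RU)
    return Req * (1.0 - math.exp(-kobs * t))

def dissociation(t, R0, kd):
    """Sensor response decay after the injection ends."""
    return R0 * math.exp(-kd * t)

# Hypothetical parameters: ka = 1e5 1/(M*s), kd = 1e-3 1/s, 50 nM analyte
ka, kd, Rmax, C = 1e5, 1e-3, 100.0, 50e-9
KD = kd / ka                              # affinity from the two rate constants (M)
R60 = association(60.0, C, ka, kd, Rmax)  # response after 60 s of injection
```

The label-free, real-time readout is what makes both rate constants (ka, kd) recoverable, rather than only the equilibrium affinity KD.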
Tell me about the technology you’re working with now.
It’s a commercial platform. What we’ve done is develop the applications. We’ve validated it [to show] that it returns correct rate constants and things like that. The majority of the instruments come from Biacore. But now we’re seeing more and more of these kinds of platforms being developed because there’s a greater and greater need. So recently, Applied Biosystems released a similar kind of instrument based on an array format [the 8500 Affinity Chip Analyzer].
What do you think of the ABI SPR chip?
For about the past 10 to 12 years, we’ve been using instruments that can do four reactions at a time. Now that ABI has developed this sensor that can do 400 at a time, it creates a different mindset when you approach this kind of experiment. Any time you scale something by 100-fold, it really creates a different opportunity to apply it. What we’re seeing now is people starting to try to develop those assays that would require 400 surfaces at one time.
The technology works in terms of what it’s supposed to do. I think in some sense it’s a little bit ahead of its time, which is kind of good, because it’s nice to have technology that leads the field instead of following it. I think we’ll see that these hurdles in protein expression and immobilization [on a chip] on a large scale will all be solved. We’re working on these problems. Even starting with the protein, we’re working on technology to do microscale purification in a high-throughput way.
Tell me about the applications you’re working on for Biacore.
Currently in our group, one of the areas we’re focusing on heavily is membrane-associated receptors. This is an area that’s been neglected, because of the difficulty of dealing with membrane-associated proteins. But it’s such an important area from a pharmaceutical standpoint that we think it’s worth investing the effort to see if we can make them work. We’re looking at 12 different receptors right now, to see if we can help develop the technology to make it amenable to these types of systems.
This involves a lot of basic biochemistry, and even some molecular biology — first getting these proteins into an expression system that’s amenable to what we’re trying to do, putting special tags on the proteins that allow them to be coupled to the biosensor surface, [and working out] standard membrane biochemistry detergent solubilization conditions. This is an area where we think the sensor has never really been applied but could have a big impact: it allows you to screen different solubilization conditions.
So do you use SPR to follow the denaturing process?
Almost the opposite — [we use it] to try to prevent the denaturing process. So we took two extremes in conditions: one, detergents we knew would be very deleterious and denature the receptor, and the other, conditions that in the past have been used for other receptors to retain activity. We solubilized the receptor under these different conditions and we showed that it affects the activity, which is what you would expect. The advantage of the sensor is that we first capture these receptors onto the surface, and [then] the sensor itself can quantitate how much is there. Then we test it for activity. It’s different from radiolabeled ligand-binding assays, where it’s typically very difficult to tell how much there is, but it’s very good at telling you how active it is. But without knowing how much is there, you can’t get the total activity. The nice thing about the sensor is that it quantitates how much is there to start with and then how active it is.
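The quantitation Myszka describes has a standard back-of-the-envelope form in SPR work: the capture level and the analyte-to-receptor mass ratio set a theoretical maximum binding response, and the observed response against that maximum gives the fraction of active receptor. A sketch with made-up numbers (the 50 kDa receptor, 10 kDa ligand, and response values are hypothetical):

```python
def percent_active(r_captured, r_max_observed, mw_ligand, mw_analyte, stoich=1.0):
    """Fraction (%) of captured receptor that is competent to bind its ligand.

    The theoretical Rmax scales the capture level by the mass ratio of
    analyte to immobilized receptor, since SPR response tracks bound mass.
    """
    r_max_theoretical = r_captured * (mw_analyte / mw_ligand) * stoich
    return 100.0 * r_max_observed / r_max_theoretical

# Hypothetical: 2,000 RU of a 50 kDa receptor captured; a 10 kDa ligand
# saturates at an observed Rmax of 150 RU.
activity = percent_active(2000.0, 150.0, mw_ligand=50000.0, mw_analyte=10000.0)
```

With those numbers the theoretical maximum is 400 RU, so only about 37.5% of the captured receptor is binding-active — exactly the kind of per-condition readout a solubilization screen compares.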
We used [the sensor] to try to find an optimum solubilization condition. It’s not the same for every receptor, so this gives us a way to rapidly screen through. With the current Biacore technology, we can probably screen a plate of 96 different solubilization conditions in a day.
What is your ultimate goal in working with these membrane receptors?
The ultimate goal is to try to reconstitute a fully active membrane receptor, and ultimately to be able to follow the signal transduction process. The class of proteins we’re looking at are G-protein coupled receptors. They bind a ligand on one side, and then the ligand binding transduces a signal that’s felt on the inside of the cell and affects the binding of these G-proteins. Ultimately we want to be able to measure that tripartite interaction — ligand binding, as well as the effect of the ligand on the G-protein. This would be a great biophysical tool for research, but it would be very good for discovery or drug development if you could look in detail at how compounds are affecting the signal-transduction process.
What are you doing in microscale purification?
[I just met] with a company called PhyNexus. One of the directors of that company came out of HTS Biosystems, which is who ABI partnered with to develop the array. He saw there was going to be a need for purification of proteins to put them on the array, and he started up a small company to develop this kind of technology. So we’ve been working with them on a couple [of] approaches they have — one is a standard approach just using small beds of purification beads, maybe 25 to 30 µL of beads that they’ve engineered into special tips, to purify all the proteins. Recently they’ve been developing a capillary system for purification, where they immobilize a special capturing tag inside capillaries typically a meter or so long, which gives them a tremendous amount of surface area. You capture the protein on the inner wall of the capillary and then remove it with a fairly small volume of regeneration reagent to release the protein and recover it, so you can typically end up concentrating and purifying the protein by 100 to 1,000 fold. The sensor doesn’t use a lot of material, so you don’t want to take a lot of time purifying milligram quantities of material; you just want to purify what you need. So this technology looks like a good match for that kind of array technology. [We just received] a new instrument that can do 8 to 10 samples in parallel, to try to now ramp up the whole process.
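The geometry behind the capillary approach is simple enough to sketch. The inner-diameter, load, and elution volumes below are hypothetical, chosen only to show how a meter of capillary offers substantial capture surface in a tiny internal volume, and how eluting a large load into a small volume yields a concentration factor in the quoted 100-to-1,000-fold range:

```python
import math

def capillary_geometry(inner_diameter_m, length_m):
    """Inner wall area (m^2) and internal volume (uL) of an open capillary."""
    area_m2 = math.pi * inner_diameter_m * length_m
    volume_uL = math.pi * (inner_diameter_m / 2) ** 2 * length_m * 1e9
    return area_m2, volume_uL

def concentration_factor(sample_volume_uL, elution_volume_uL):
    """Fold-concentration, assuming complete capture and release."""
    return sample_volume_uL / elution_volume_uL

# Hypothetical: 100 um i.d., 1 m long; load 1 mL of lysate, elute in 5 uL.
area, vol = capillary_geometry(100e-6, 1.0)   # ~3.1 cm^2 of wall, ~7.9 uL volume
fold = concentration_factor(1000.0, 5.0)       # 200-fold concentration
```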
PhyNexus developed the technology and we helped develop the applications. It was the same thing with HTS and ABI — they came to us two years ago to really develop the applications. We take technology that other people are developing, and we bring the biology to it.
Another technology that we’re working on now is made by a company called Proterion. They’re working on a system called plasmon waveguide resonance. It’s similar to Biacore, but it’s designed to look more at structural changes of proteins, and one of the big applications here could be looking at membrane-associated receptors, because as you get binding, you get changes in conformation. We’ve been working with their instrument to try to help develop that whole approach. This is another example of people coming to us with new technology; we test the limits of it, validate it, [and] develop applications for it.
What big obstacles do we still need to overcome in functional proteomics?
We’re starting to develop these nice protein maps of networks, and what we believe is, there’s a missing element to these networks — the time scale of interaction of these different proteins. You’ll find in a network one node of protein that supposedly has interactions drawn to eight other proteins. So if we want to recreate that interaction, we’re going to need to understand the timescales of all these proteins. We think the sensors are going to be good tools to help us develop the timescales for these maps.
[Also], what we’re doing is trying to get the mindset out there that if these numbers are going to be useful, then you have to collect the data under certain conditions and analyze the data a certain way. Not everyone’s doing that currently — there’s no standard way to measure a protein-protein interaction with one of these technologies. So the NIH has this program announcement around functional proteomics [National Technology Centers for Networks and Pathways, see PM 10-3-03] and we’ve been encouraged to apply. I think the kind of sensor technology we use could fit into that, but it would require a lot of other supporting technologies. No one technology is going to solve the proteome.