Daniel Knapp on Heart Proteomics and Chromatography on a Chip

At A Glance

Name: Daniel Knapp

Age: 60

Position: Professor of pharmacology, Medical University of South Carolina, since 1972.

Principal Investigator of NHLBI Cardiovascular Proteomic Center, MUSC, since September 2002.

Director of MUSC Proteomics Center.

Background: Assistant professor of experimental medicine, University of Cincinnati, 1971.

Post-doc in pharmacology, University of Cincinnati, 1970-71.

Post-doc in chemistry at University of California, Berkeley, 1969-70.

PhD in organic chemistry, Indiana University, 1969.

BA in chemistry, University of Evansville, Evansville, Ind., 1965.

 

Why did you apply for the NHLBI grant?

We were already in the protein analysis business and looking to expand our involvement with that; we had just expanded a gel proteomics facility on campus with a significant investment; we had just started a new degree program in bioinformatics; and there’s a long history of strength in cardiovascular research here. So it was just good timing for bringing all those pieces together.

Tell me more about the center.

The NHLBI center contract has seven projects and five core units. Three of the projects are biological applications, and four are for technology development. For the three biological application projects we drew a project from each of the three NHLBI program project grants that we already had at the university.

The first was a program project focused on heart failure, led by Mike Zile — he had been studying diastolic heart failure for a long time at the gene level, and this was an opportunity to add a proteomics dimension to that. The second project was in cardiovascular development, led by Roger Markwald. The third was a project focused on the cardiovascular effects of diabetes; that one involves human studies looking at proteomics in muscle biopsies.

The one farthest along is in heart failure. They’re developing a new model of pressure overload hypertrophy and volume overload hypertrophy in the rat. We’ll be looking at changes in protein expression patterns in the cardiac muscle as that hypertrophy develops.

Initially with all these projects we’re starting out with traditional 2D gel-based approaches with mass spec analysis. But we are also beginning to look at chromatography-based approaches using ICAT methodology. As we develop new methodology in the technology development projects, we’ll be phasing it into the biological studies. In all of the NHLBI centers the theme is the development of new technologies in the context of applying them to cardiovascular research.
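As a rough illustration of the ICAT-style quantitation mentioned above, the sketch below compares heavy- and light-tagged versions of the same peptide by their mass spec signal intensities to estimate relative protein abundance between two samples. The peptide names and intensity values are hypothetical, not data from the center.

```python
# Minimal sketch of ICAT-style relative quantitation (illustrative only).
# Cysteine-containing peptides are tagged with a light reagent in one sample
# and a heavy reagent in the other; the light/heavy intensity ratio of each
# co-eluting pair estimates relative protein abundance between the samples.
from math import log2

# (peptide, light_intensity, heavy_intensity) -- made-up example values
icat_pairs = [
    ("myosin_pep1",   4.2e5, 2.1e5),
    ("actin_pep3",    8.0e5, 7.9e5),
    ("troponin_pep2", 1.5e5, 4.6e5),
]

for peptide, light, heavy in icat_pairs:
    ratio = light / heavy
    print(f"{peptide}: L/H = {ratio:.2f} (log2 = {log2(ratio):+.2f})")
```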

Tell me about the technology development that you’re doing.

We have four technology development projects going on. The first is John Arthur’s project to improve 2D gel-based methodologies. He’s looking at a variety of things to increase the capability of seeing more proteins with higher sensitivity. He’s currently working mostly on differential extraction methodologies, trying to fractionate protein mixtures up front in the initial extraction from the biological sample.

The second technology project is my project. What we’re trying to do is put a 2D chromatography system on a microchip that interfaces directly to the mass spec, initially by an electrospray interface, but we’re beginning to look at interfacing by MALDI as well. In the conventional 2D chromatography approach — Yates’ MudPIT approach — he uses a single tandem column. We’re trying to build a system with a single first-dimension column but an array of many second-dimension columns. It’s practical to do that in a microchip because microlithography-based fabrication methods make it easy to replicate functions in a microfluidic chip, which makes something practical that would not be practical in conventional hardware.

We’ve begun at the mass spec electrospray interface and have been working on microfabricating electrospray emitters. The next thing to add to that is the second-dimension separation, which is a reverse phase separation, and we’ve been working on making monolithic columns by using photo-initiated polymerization methodology that was pioneered by Frantisek Svec at Berkeley. We’re just beginning to look at new approaches to actually distributing fractions from the first-dimension separation into an array of second-dimension columns.
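To make the routing idea in this answer concrete, here is a minimal sketch under assumed numbers: fractions cut from a single first-dimension column are each parked on their own second-dimension column in the on-chip array, and only the columns of interest need to be eluted into the mass spec later. The column count, fraction labels, and choice of columns to run are hypothetical, not the actual device parameters.

```python
# Minimal sketch of the "one first-dimension column, array of second-dimension
# columns" idea (illustrative only; not the actual MUSC chip design).
N_COLUMNS = 16                                                # assumed array size
fractions = [f"fraction_{i:02d}" for i in range(N_COLUMNS)]   # made-up fraction labels

# Route each first-dimension fraction to one column in the second-dimension array.
routing = {col: frac for col, frac in enumerate(fractions)}

# Only selected columns occupy the mass spec; the rest can be archived for later.
columns_to_run = [0, 3, 7]                                    # arbitrary choice for the example
for col in sorted(routing):
    action = "elute to MS" if col in columns_to_run else "archive"
    print(f"column {col:2d} <- {routing[col]} ({action})")
```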

What’s the advantage of your method over other multi-dimensional chromatography methods?

One of the big challenges in proteomic analysis is to increase the separation space — if you can spread mixtures out over a larger space, you can see more things. So we think this is one thing that will allow us to increase the separation space. It will also allow us to do part of the process off-line from the mass spec, whereas a conventional serial approach typically ties up a mass spec for more than 24 hours. And it will allow you to look at part of the separation space and archive the rest of it if you’re not interested in looking at the whole thing.
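As a back-of-the-envelope illustration of "separation space": for an ideally orthogonal two-dimensional separation, the total peak capacity is roughly the product of the peak capacities of the two dimensions, which is why adding a dimension lets you resolve many more components. The numbers below are assumed, not measured values from the center.

```python
# Rough illustration of how a second, orthogonal dimension multiplies the
# separation space (peak capacities are assumed, not measured).
first_dim_peak_capacity = 20    # e.g. number of first-dimension fractions (assumed)
second_dim_peak_capacity = 100  # e.g. a reversed-phase gradient (assumed)

total_2d = first_dim_peak_capacity * second_dim_peak_capacity
print(f"1D alone  : ~{second_dim_peak_capacity} resolvable components")
print(f"2D (ideal): ~{total_2d} resolvable components")
```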

How can you apply this technology to the biology projects at the Center?

We’re already looking to use the ICAT methodology with conventional 2D chromatography approaches, and we would phase this in as the development reaches that point. Where we’re headed eventually with that is to build 2D separation systems for intact proteins. That will feed into our third technology development project, where [Kevin] Schey is working on methods for mass spec analysis of intact proteins. So we eventually hope, if we’re successful with that, to build microchip devices that would do the separation at the protein stage and feed into that project.

Lots of people have said that proteomics is moving away from peptide analysis and toward whole protein analysis — why?

It simplifies the process. When you start chopping up proteins, you lose information, so if you can separate and ultimately analyze at the intact protein level you would maintain more information in terms of the actual modified forms of the proteins. The other thing is that removing a stage simplifies the overall process.

Are you looking to commercialize your technology?

Certainly, particularly with the microfluidic devices. We have a couple [of] patent applications in now on some aspects of the electrospray emitter designs. Our intellectual property management organization will attempt to license those patents to an appropriate company to commercialize them and make them available to other places.

What are you working on outside of the Center?

We’ve also formed a university proteomics center that’s pursuing a variety of other things. The most recent success is that we were funded on a National Eye Institute vision core grant that includes a proteomics core. We [also] have a project that’s pending right now in the cancer area. On Sept. 15 we submitted a proposal for the new NIAID biodefense proteomics centers program. We also recently received state funding through a Centers of Research Excellence program to recruit new endowed-chair faculty who would work on next-generation proteomics technologies, beyond anything we’re trying to work on right now at the NHLBI center.

What is next-generation proteomics technology?

I’m not sure anyone knows what the next generation is. Some are betting on microarrays, but it’s not really clear yet how one approaches doing proteomics by microarrays. Other possibilities are totally new technologies. Another way to look at it is that some people refer to first-generation proteomics as basically cataloguing the expressed proteins in a system, second-generation as quantitative measurement of expression or differential expression of proteins, and third-generation as looking at protein-protein interactions. Certainly there’s a need for new technologies to look in a global way at protein-protein interactions, and that’s an area we’ll be looking at for new faculty recruitment.

Ultimately, we’re going to reach the point where we know what proteins we’re interested in and how they change in a biological system, and there will probably be more time-efficient, cost-effective ways to do that in a routine manner, to actually apply proteomics to biological studies. It would be something other than mass spec, and of course a lot of people think the answer to that is microarrays, but then the big question is, what do you put on a microarray to specifically capture proteins? The answer to that is not clear. Many people are betting on antibodies, but those have problems. Some people are working on aptamers and other binding molecules.

Do the NHLBI centers at different institutions work together?

There’s an NHLBI effort to have the 10 centers work together. That’s fostered by a two-day meeting of all 10 centers twice a year. We have the second of these meetings scheduled for October; it will focus primarily on bioinformatics issues such as data management and data format standardization. The NHLBI [recently announced] the award of the contract for the administrative coordinating center to MUSC. That is an add-on to our contract and will be coordinated by Margaret Schachte, who runs our research and development office. Among her responsibilities will be coordinating all the meetings and interactions among the centers. [Her group] will also establish a web page for the whole initiative. This brought our total contract to $18.4 million over seven years, which, according to the university provost, is the largest competitive research award ever in the state.
