Bill Marzluff Maps NIH Roadmap to UNC's Systems Biology Strategy

His colleagues will tell you that nobody knows more about the more than 1,000 funded research projects at the University of North Carolina, Chapel Hill, than Bill Marzluff, executive associate dean for research at the UNC School of Medicine and director of the school's program in molecular biology and biotechnology.

This week, Marzluff added more programs to track as the university received three National Institutes of Health Roadmap for Medical Research planning grants of approximately $1.7 million over three years to start multidisciplinary centers to investigate obesity; to investigate inflammation and imaging; and to develop computational techniques for understanding genomic data from clinical studies and population models.

BioCommerce Week visited with Marzluff this summer to talk about the university’s strategy for systems biology.

The NIH Roadmap grants are intended to foster a culture of interaction among academic research disciplines. Is that true here?

Chapel Hill has had that culture forever. We try to keep nurturing it. One of the things that is different about this place is the close interactions between the college and the medical school.

We have a very strong school of public health, so we do a lot of epidemiology, which obviously will involve a lot of high-throughput genotyping, which is what [epidemiology] is going to be very soon. And we have a very good pharmacy school, so we do a lot of pharmacokinetics and drug studies, and that is going to be high-throughput metabolomics very soon. So we are fortunate to have everybody together in this small town.

The advantage we have is that we have all these people from different areas who at least were willing to talk to each other, and have done so in the past — the hard-core computational scientists, the biologists, and the public health people, and a lot of the social-science people for the inflammation stuff, who deal with how you adapt to arthritis and all of these chronic diseases. We felt that because of our culture, we were in a particularly strong position to respond to these interdisciplinary grants.

To get one of those, you feel pretty good. To have two, you feel amazed.

What do the grants mean in terms of equipment purchases?

This is really to pay for bringing all of these people together, not about buying tons of mass specs. We have a $25 million gift that we used to build our proteomics facility and buy all of the mass specs. We have a TOF-TOF and a QSTAR from ABI, and a MALDI-TOF and an ESI machine from Bruker. I think we are going to buy a $2 million [Fourier transform ion cyclotron resonance mass spectrometer], the premier mass spec for proteins. Three have been sold so far. We use the one from Bruker a lot.

The unusual thing about the way the grant applications were worded is that they were very careful to spell out that they didn't want you to propose to do too much science, and that if you had a group of people already working on a scientific problem, they were unlikely to be interested in that. I think the attraction for the mathematicians, the high-end computer science guys, and the high-end statistics guys was twofold. One, these are legitimate problems for them, on the scale of what the guys who used to model the weather tackled. It is going to attract lots of money, and it doesn't hurt to get people interested in it. And two, it was the right caliber of problem, there was going to be significant support for it, and this research is going to turn out to be very publicly visible.

What kind of expenditures do you expect this to entail?

The computing money will be spent on building the computing infrastructure and getting the people together. Dan Reed, who ran the supercomputing center at Illinois, came here a few months ago. One of the reasons he came was that he wanted the opportunity to apply some of what he has been doing to biomedical problems. So I think he will be the one to build these platforms.

My guess is that the inflammation effort is much broader. Part of it is developing non-invasive imaging methods to monitor inflammation, so there will be equipment for that, so that they can monitor what is going on continuously and non-invasively, without putting holes in people. I think part of it will probably have a major component for animal modeling and genetics, and funding for some patient clinical trials. Because, ultimately, one of the reasons the NIH has done this is to find additional ways to promote translational research.

How do you define systems biology?

Well, it's not as bad as bioinformatics, where each person defines the field totally differently. I think systems biology is defined as integrating all the different systems, all the different inputs. There are a number of different scales of systems biology. There is the systems biology of the cell, where people are integrating all of the pathways within a single cell, and I think that will make some people really happy. For biomedical research, I think of systems biology as starting at the cellular level: the integration of different cells into organs and ultimately into the entire organism.

What about at the UNC level?

What people see now is another opportunity. We have learned an awful lot with these microarrays, with all the molecular biology, and all the genetics. We are starting to learn an awful lot about all the molecules and how they work within the cell. So I think we are in a position now to start to integrate that information over the whole organism. And that is something you couldn't do in a reasonable way before. That is really what it is going to be about. So, for example, we are making a heavy investment in recruiting an expert on the genetics of obesity. There is already a well-developed system on heart disease and the whole interaction with everything in the cardiovascular system. It's an area that we are very strong in from the fundamental molecular standpoint. I think that, like anything else, when you get to a point where you suddenly perceive the opportunity to make definitive conclusions about things, you will get people to zip into those areas to do that. I think this is starting to happen at the level that people care about: molecules.

So this is not about needing new sequencers?

We have about $1 million in sequencing equipment, $5 million in proteomics, and at least $1 million in array instrumentation. I think the challenge is to keep the current sequencers busy. I run the sequencing facility, so a lot of what we will be doing is what we call resequencing, that is, sequencing the same gene in 1,000 people. There will be an awful lot of that done. I think the sequencer is good enough; there is no demand to make it a factor of 10 better. But there is a demand for high-throughput genotyping, and it looks like Illumina may have met that at the moment. I think people still feel there is another order of magnitude to go, at least in price, before it is acceptable.

We spent about $3 million on these mass specs, and we have another $2 million, counting the money we have spent and the money we have raised through grants. We have one of these TOF-TOF instruments, which is a really high-throughput mass spectrometer. That thing really works as a workhorse, and the challenge is to handle all the data it can generate. We have spent a lot on the sequencing, about $1 million. Now we have to go into robotics, and that is really expensive, probably a couple million there if we set up to ultimately do real genotyping in house, but we are still debating that. Our feeling is that doing it yourself buys you two or three years of being ahead of the competition.

Who do you see as your competition?

Everybody — the people who wait for it to be outsourceable. We will have a small window where you can do things that nobody else can do. The biotech companies, the supply guys, the Invitrogens, the Clontechs, have become very proactive in trying to develop tools and kits. Our view is that by the time something is available in a kit, you have already passed the time of when you could have been ahead of the curve in using it. We are looking at what may become the next kits.

What we don't have yet is the mass specs to do the metabolomics, which is a different set of mass spec technology. The mass specs for proteins and peptides are set up, ideally, for high-molecular-weight material; anything under 500 molecular weight, they don't even detect. And almost all the compounds in metabolomics are 500 and less. We don't think metabolomics is going to get done by mass spec. Nanotechnology-type sensors are going to be developed instead.

What kind of time frame for that?

Five to 10 years. We have a very good nanotechnology group in physics. Putting effort into developing that technology is something worth doing. Capillary electrophoresis was invented here, but not patented here. Jeez. How much money do you think that would have brought in? Can you imagine? The capability for miniaturizing chemistry is really strong here. I think that is what is going to get done in metabolomics at some level.

What do you think academia’s role is in this emerging field?

What we can do well is identify and validate targets for drugs, instead of the one-gene-at-a-time discovery mode. It will mean taking all the information one gets on all of these genes and figuring out what the key node genes are. The pharma guys are not interested in putting a lot of money into studying a particular target until somebody has come up with really good evidence that it probably is one. Whereas we will study a gene because we think it is important, or neat, or whatever. I think that will still hold.

At the end of the road, are patients going to get better?

Yes, I think so. The genome work has opened up a lot more than people thought when they started it. A number of people thought it was the stupidest thing in the world to try to sequence the human genome, but you don't hear too much from those people now. So I think that, just as there was a golden age of physics in the '30s and '40s, and of chemistry in the '50s, we are in the golden age of biology right now, and it doesn't show any signs of slowing down.
