NIH's Hewitt on TARP, PepTalk, and Tissue Arrays

Stephen Hewitt
Tissue Array Research Program, National Institutes
of Health

Name: Stephen Hewitt

Title: Director, Tissue Array Research Program, National Institutes of Health

Professional Background: 2000 — Present, Clinical Investigator, Laboratory of Pathology, Center for Cancer Research, National Cancer Institute, National Institutes of Health; 1995 — 1996, Adjunct Research Instructor, the University of Texas Health Science Center, Houston Graduate School of Biomedical Sciences; 1993 — 1995, Pre-doctoral Fellow, NIH Training Grant in Molecular Genetics of Cancer, the University of Texas MD Anderson Cancer Center.

Education: 1996 — MD, the University of Texas Health Science Center, Houston Medical School; 1995 — DPhil, Genetics, the University of Texas Health Science Center, Houston Graduate School of Biomedical Sciences and the University of Texas MD Anderson Cancer Center; 1988 — BA, philosophy, the Johns Hopkins University.

As chief of the National Institutes of Health's Tissue Array Research Program, Stephen Hewitt has been involved in the world of tissue microarrays since the dawn of the technology. At TARP, Hewitt leads projects to move existing assays onto array platforms, creating cell-line microarrays and xenograft microarrays.

He is also an advocate for the potential use of tissue microarrays in translational medicine. During a presentation at Cambridge Healthtech Institute's PepTalk conference, held in San Diego last week, he argued that tissue arrays are protein arrays too, and offer the same promise as protein arrays to make clinical testing, in his words, "better, cheaper, and faster."

To learn more about TARP, tissue arrays, and the status of the protein array arena in general, BioArray News sat down with Hewitt following his presentation at PepTalk last week.

How many of these events have you been to?

This is my second PepTalk. I was out here last year and you can tell the difference; there's 25 percent more people and there's 50 percent more intensity. The meeting is much more intense this year and I can't say that the technology over the last year has changed that much — there hasn't been a major breakthrough over the last year. Rather, there's more adoption, people are more comfortable with the field, the quality of the data is just better because people have been working for another year. I think there has also been a shift. People are thinking, 'OK, we are going to go towards an array-based technology.' I think a lot of people were on the fence wondering if it was going to be mass spectrometry. I think mass-spec plays an enormous role, but I don't think it's the end-all of end-alls, and I think some of us are trying to think about how we can marry mass spec to the array platform.

How come we hear so often that protein arrays are 'nascent'? I had several people at Chips to Hits tell me that they were still waiting for protein arrays to grow up, but if you look around here it appears to be a mature technology.

This is the difference between the crowd at Chips to Hits — which is a very diagnostic-driven crowd that thinks that expression microarrays are going to be the diagnostic tools — and what you are seeing here, [which] is [that the] people who are [here are] looking for function. These people, especially the pharmaceutical companies, are really into this because it's allowing them to measure phosphoproteins, functions, and pathways, which transcriptomics couldn't do for them.

Also, to be quite honest, this technology is growing up. I mean, how big is Chips to Hits and how much more mature is that technology? Basic microarray technology is about 10 years old, tissue array technology is now six, and protein array technology at the functional level is about five. And at first it was all antibodies. Until Lance Liotta came out with a reverse phase array you didn't have anybody doing anything else. Investigators have made their peace that they don't have PCR for proteins. Now, we've grown up and said, "We don't have it, we are going to deal with it a different way." What you see here is more chemistry. These people are chemists — and that's the difference between molecular biologists of the transcriptome world and these guys here.

During your presentation you said several times that tissue arrays are protein arrays. Do you feel there's some resistance to that idea?

I think I said it to drive home the issue to people who are working on protein arrays that this is the path towards clinical value. If you are working with tissue on protein arrays, the path to the clinic is still going to be through formalin-fixed, paraffin-embedded tissue.

Because that's where all the samples are?

That's where all the samples are. And we don't have the capacity to take the American population and turn it on a dime and change the way we handle specimens. If you work in serum it's a very different beast. But if you are going to work in tissue, then you have to make it work within the realm you've got. That's why I am saying, "I've got a protein array; it's right here." I am not saying, "don't use the others," I am saying it's a layered, stacked approach. That's one of the great things about protein arrays: you can stack your approaches using your same tool, using your antibody, all the way up towards clinical utility.

Where does your organization come into play in this technology?

I am chief of the Tissue Array Research Program, or TARP, lab. Basically we consider in the lab that if it involves formalin-fixed, paraffin-embedded tissue, it falls under our purview as an experimental modality. So, whether [it's] expression microarrays, chip-based SNPs, protein arrays, tissue microarrays, antibody arrays — it's all in our field.

You also mentioned cell-line microarrays and xenograft arrays…

Yes, and in general we are putting those into paraffin-embedded platforms, not always formalin-fixed, and the reason we are doing that is it's the quickest way to transfer those discoveries back to the clinic. There was a bottleneck. We had great research and great discoveries, but we couldn't get them to the clinic. And so the sooner we put it in a clinically relevant environment, the sooner it's off to the races.

Who is interested in this technology?

Everybody. We interact with the Cancer Therapy Evaluation Program at the NCI in drug development, we interact with basic scientists at the NCI, we work with the translational medicine groups, I work with groups that are NCI-sponsored, and I work with epidemiologists. We interact with industry all the time. Industry buys our platforms, they buy my tissue microarrays, and they want my xenografts and so on. I have had my cell-line microarrays knocked off by companies. Academia? Yes. We actually distribute microarrays. We've got a large number of people who use those. The interest is pretty much the entire community.

Where are you getting the tissue?

We have a lot of collaborators and a lot of friends. So we have a large archive that we've assembled over the last five years that we haven't really tapped completely, and then we are working on certain projects. I am working on esophageal and gastric cancer from China right now; people have done expression microarrays on those, and now we've turned it around and people are doing tissue microarrays on them to find biomarkers involved in progression from benign to malignant. The xenografts come from organizations like the Developmental Therapeutics Program at the NCI. They make the material for me, so it's heavily collaborative work.

During your talk you described the instrumentation you are working with as "crude" but at the same time [the instruments] are fairly expensive. Are you satisfied with the tools that are available to people working on tissue microarrays?

The instrumentation market has traditionally been slightly under-funded. As a result, the companies involved lack the resources to develop the [tools] the way you'd like to see them developed. We've been collaborating to get new tissue instruments, and I have been putting an enormous amount of energy toward image analysis. Array instruments to build tissue arrays are brutal. Protein array instruments are just crude on all points. Because of the viscosity of proteins, they are very hard to work with. Image analysis is making strides. The algorithms are improving constantly. I don't want to say that they are crude. There's been a great leap forward in the past five years, and it's all been driven by bigger cameras and bigger computers.

Where do you think this area is going to be by the next time you come to PepTalk?

It's a broad question, but we are making better efforts with regard to [achieving] quantifiable protein arrays [so that] one can probe with antibodies in a more user-friendly approach. A lot of our efforts have been to make protein arrays accessible to investigators, and so that's what we are going to be focusing on a lot over the next year. Instead of having the infrastructure to build a reverse phase array, can you build a cheap desktop array? I hope to have some better biophysical understanding of why some methods work better than others.

To be honest, I am going to be working on formalin-fixed paraffin expression arrays a great deal, and if that happens then we'll come and work on FFP-based protein arrays. They go hand in hand for us. We can't separate them. Where is the field going to be? It's going to be settling in as to what platforms they think are really going to move forward. Right now we still have a lot of choices in arrays; what are the preferred protein array platforms? Are you going to use a printed array with an antigen that you are going to probe with antibodies, or is it an antibody array? I think we are going to see a refinement all around.

When you say "platforms" do you mean commercial platforms?

I think we are going to see it in all directions. Right now the biggest problem is that the commercially available protein arrays are just too expensive for most investigators. Sure, biopharma can afford them, but individual researchers and academics can't. The other problem is that you wind up locked into fixed platforms. But I think as people develop their own robotics, instruments, and substrates to print on, you're going to see a lot more in-house printing.

If you look back on expression arrays, what did you have initially? You had the big guys who were giants, and then saw a wave of custom arrays. Then the big guys got cost efficient and pushed the custom arrays out of the market. Now what are we seeing? We are seeing the return of custom array companies like NimbleGen and CombiMatrix finding their niche and coming back into the market. The same thing is going to happen in proteomics.
