
NCI's Stephen Hewitt on Developing Tissue Arrays as a Tool for Biomarker Discovery, Validation

Stephen Hewitt
Clinical investigator
Laboratory of Pathology, National Cancer Institute

At A Glance

Name: Stephen Hewitt

Position: Clinical investigator, Laboratory of Pathology, National Cancer Institute, since 2000.

Background: Residency, Anatomic Pathology Training Program, Laboratory of Pathology, NCI, 1996-2000. Chief of NCI's Tissue Array Research Program.

MD, University of Texas Health Science Center, Houston Medical School, 1996.

PhD in genetics, University of Texas Health Science Center, Houston Graduate School of Biomedical Sciences and the University of Texas MD Anderson Cancer Center, 1995.


At the Cambridge Healthtech Institute's PEPTALK conference held last week in San Diego, Stephen Hewitt gave a talk on using tissue microarrays as a tool in translational medicine. ProteoMonitor caught up with Hewitt after his talk to find out more about his background, and about the development and uses of protein tissue arrays.

How did you get into developing protein arrays, and more specifically tissue protein arrays?

I have a PhD in genetics and I'm an anatomical pathologist by training. I had gone into pathology because that's where the tissue was. I had learned from doing my PhD that if you wanted to get the tissue, you needed to be close to it, so pathology was the natural path there.

I was doing my training in pathology, and at about the time I was finishing up my training, the technology for tissue microarrays was developed. At that time, Rick Klausner was the director of the NCI, and the original technology was developed across the street at the National Human Genome Research Institute.

They said, 'This is great technology. We should use this for cancer. Let's create a lab.' So they created the Tissue Array Research Program lab, and Lance Liotta was my boss at the time, and he looked at me and said, 'Do you want to run the lab?' And I said, 'Sure.'

So suddenly, I went from a background in genetics and pathology into proteomics. That's where I was and where I landed, and when we opened up the lab and started doing tissue microarrays, we had resources at hand and questions at hand, and we said, 'OK, let's start doing a few other things.'

Then someone said to me, 'Is a tissue array a protein array?' and I said, 'Prove to me it's not.' And that's pretty much how I got to where I am.

When did you start running that lab?

We started the lab in July of 2000. We just celebrated five years. I've had the same chief tech with me for the entire time. I've got two postdocs and two technologists, and that's it.

What kind of problems were you working on when you first started the lab?

The lab has always been more of a technology shop than a hypothesis-driven shop, so the first technology issues were, 'Could we produce a platform that represented what we wanted to study? Could we accurately represent the tumors of interest in the tissue microarray platform at high enough density and with enough samples that they were of biological utility to ask those questions?' And the next question was, 'Could we put the samples down in a fashion so that we could do biologically meaningful assays?'

So first it was, 'Could you build it?' Then once we'd built it, 'Could you manipulate it to do something more?'

I think the technology questions have evolved since then. Initially the question was, 'Could I build an array?' Then the question was, 'What could I do with an array?' And then the question was, 'Could I make an array more reproducible?' or 'Could I make an array available for phosphoproteins?' Or someone came to me and said, 'We want to do IR — infrared spectroscopy.' I said, 'Well, OK, I think you're crazy. OK. Let's try.'

So we did that. And we're considering doing mass spec in an array platform shortly, working with fixed material to kind of turn the paradigm on its head. Rather than analyzing a few samples and trying to make conclusions, we're going to analyze tens of samples, and then ask more rigorously validated questions.

When you first developed the protein arrays, were there a lot of other people working on that kind of array?

When we started getting into the hard core protein array work, Lance had just developed his reverse phase array, which meant you had to do laser capture microdissection to derive your material. Mike Buck, who had helped Lance develop LCM, had been working with some technologies to transfer protein. And he said, 'This is good, but frozen tissue's not so common. … Well, you're good at formalin-fixed, why don't you give it a try.'

So we just set down a lot of parameters: Needs to work with formalin-fixed, paraffin-embedded tissue, needs to work off of a glass platform. We sat down and said, 'This is just a problem. Let's go solve it.'

And that's what we did. I was working with a guy who had helped develop the original antibody arrays. And we had a lot of other people at the institution at the time who had been working in all of the protein platforms. So it was just the right combination of people at the right time. We were able to trade ideas around and try things.

People use the term, 'Think outside the box.' That's not necessarily what we did. We had a tendency to say, 'Well, the dogma said this, but what does the dogma know? So let's try something that's unconventional.' We were so used to breaking rules. People thought we couldn't blot material from a glass surface without using electrophoresis. We could. They said, 'Well how do you do it?' I said, 'Capillary action.' But it didn't have to be trans-capillary action. It could just be contact capillary action.

We've had a lot of instances where people said, 'You can't do X.' And we've said, 'Show me the data.' And they've said, 'Well that's the general thinking.' And we've said, 'Well, the general thinking is wrong.' We've proven it. We've done a lot of that, and that's been a lot of our success.

What were some of the challenges of developing a formalin-fixed tissue array?

Really it was being able to provide a quality product. So you had the technical issues of the material you were working with, you had technical issues of manufacture, you had reproducibility, and you had to manage expectations. A lot of the challenges we had were dealing with what the basic scientist wanted, versus what the pathologist thought he was going to get, and educating the consumer. A lot of the technology behind the tissue microarrays isn't that hard, but it's educating the consumer on how to use that technology.

And the concept of high-throughput pathology was just novel. No one had thought about it. Once we had that concept of high-throughput pathology, it changed everything. You went from a descriptive endeavor of, 'Well, this is an immunohistochemical stain, and let's stain some tumor cells,' to 'I want to quantify. I want to put it into a database that says: 'Stains tumor cell. Nuclear pattern. Intensity three.''
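As a purely illustrative sketch of the kind of structured record Hewitt describes ('stains tumor cell, nuclear pattern, intensity three'), the Python below shows what a queryable tissue-microarray score might look like. None of it comes from his lab; the field names, the antibody, and the 0-3 intensity scale are assumptions for illustration.

```python
# Hypothetical sketch: a structured record for scoring one tissue microarray core,
# so staining results can be counted and queried rather than described free-form.
from dataclasses import dataclass
from collections import Counter

@dataclass
class CoreScore:
    core_id: str     # position of the core on the array, e.g. "A01"
    tissue: str      # e.g. "renal cell carcinoma" or "normal kidney"
    antibody: str    # stain / antibody applied to the section
    pattern: str     # localization: "nuclear", "cytoplasmic", "membranous"
    intensity: int   # semi-quantitative 0-3 scale (assumed convention)

def intensity_distribution(scores):
    """Count how many cores fall at each staining intensity."""
    return Counter(s.intensity for s in scores)

if __name__ == "__main__":
    scores = [
        CoreScore("A01", "renal cell carcinoma", "anti-CAIX", "membranous", 3),
        CoreScore("A02", "renal cell carcinoma", "anti-CAIX", "membranous", 2),
        CoreScore("A03", "normal kidney", "anti-CAIX", "membranous", 0),
    ]
    print(intensity_distribution(scores))  # e.g. Counter({3: 1, 2: 1, 0: 1})
```

The point of a record like this is only that high-throughput pathology needs something you can tabulate across hundreds of cores, not a descriptive paragraph per slide.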

People had been messing with image analysis for years, and it had all been, 'Well, I have a pretty picture, and I analyzed this pretty picture.' Once we had tissue microarrays and bigger computers, it suddenly became like everything else. You could crunch numbers and really develop some data behind it.

When I first started doing the image projects, we were doing image development work for people who weren't biologists, but were coming from physics and other fields where they were used to more quantitative sciences, not observational sciences.

You go from, 'Oh, yeah, it's there,' to 'Yeah, it's there and it has some meaning.' The problem with pathology and tissue microarrays is that proteomics is not just the presence of a protein, or a functional protein, or a modified protein. It's, 'Do you have the protein?' Then the next question is, 'Where is the protein?' And 'Does the protein move and/or interact with anybody?' It's more complex than a lot of the transcriptomics approaches of, 'Well I made transcript. Making transcript's enough. I'm done.'

When you start getting into proteomics, you're confronting functional proteomics.

What do you see as the major applications of the arrays that you've developed?

I specialize in clinically relevant questions. Not that I'm averse to basic science — a lot of what I do still is basic science. But nearly everything we do we try and wrap in the question of, 'Am I either finding the biomarker that relates to a disease, or am I finding the druggable target that relates to treatment of disease?'

We're doing far less of the raw discovery for discovery purposes. If I have a hypothesis, can I predict the outcome of a disease, and can I modulate the outcome of a disease? For me it's a matter of, 'OK, I want to make an observation. OK, I want to validate an observation. OK, I want to compare an observation between disease processes.' So, if I'm working in kidney cancer, does it pertain to bladder cancer? If I'm working in colon cancer, and I have one observation and I want to repeat it in a different cohort, those have been the touchstones of the work we're doing. And I think our mantra is, 'More samples, more samples, more samples.'

So throughput's got two aspects. It's either more samples or more analytes. I have a tendency to focus on more samples.

What do you think are the obstacles of getting these arrays into clinical use?

Most of what I do is discovery platforms for validation in clinical use. Very little of what I'm building as an array is in clinical use. I guess our transfer technology that we developed would have clinical use, more on whole sections than in an array situation, but really the challenge of arrays in clinical use is a very simple challenge. That is, the American healthcare system is built around a model of point-of-care testing, where you don't want to have to collect a bunch of samples to perform an assay. You want to perform the assay on the sample you got that day. And so that's many times the biggest challenge we face. Most of the arrays I deal with day-in, day-out are many patient samples, one analyte.

What is your lab looking to work on in the future?

All bets are off. My PhD advisor quipped at my PhD defense, 'Stephen, I would have let you graduate earlier had you stopped starting new projects,' to which I replied, 'Yeah? But I published them all.'

So right now we are wrapping some experiments with some antibody arrays. We may at the end of that decide that we want to put more effort into those.

We are developing some additional technologies for protein arrays for tissue that we hope are more appropriate for point-of-service testing, where you would be able to take an analyte, grind it up, and put it into multiple wells in a multiplex format, where you're not having to invest in a $1,000 platform that would hold 20 samples just to test one patient. You want to be able to do point-of-service testing.

I'm back to transcriptomics. We're focused specifically on the question of, 'Can we work within the confines of clinically relevant microarrays for diagnostics?'

I think we're going to pursue high-throughput mass spectrometry. We're looking at FTIR, we're looking at Raman spectroscopy. All those are more physical chemistry approaches to protein arrays, but if I can develop a methodology that allows me to identify protein modification in an array platform as a discovery tool, that's proteomics.

I think we need to question ourselves a lot more in all of these fields. There are aspects of what we're doing that we hold up as dogma. For example, when we're working with antibodies, we all consider the Western blot the gold standard for the quality of an antibody. Is that any good? Maybe not. Yes, in that assay it detects a single band. But there are other ways to skin that cat, and we need to be pursuing those at a more chemical level to try to define those interactions. That's one of the problems we've been having, whether it's an antibody array or a protein array. An antibody-protein interaction is not the same from platform to platform to platform.
