U Pitt's Watkins on the Value of Imaging in Cell Biology, Drug Discovery

Simon Watkins
Professor, cell biology, physiology, and immunology; Director, Center for Biologic Imaging
University of Pittsburgh

At A Glance

Name: Simon Watkins

Position: Professor, cell biology, physiology, and immunology; Director, Center for Biologic Imaging, University of Pittsburgh

Background: Assistant professor, University of Pittsburgh, 1991-2000; Postdoc, Pasteur Institute and Harvard Medical School (Dana Farber Cancer Institute); PhD, University of Newcastle upon Tyne

At IBC's inaugural High-Content Analysis conference held last month in Washington, DC, Simon Watkins, director of the University of Pittsburgh's Center for Biologic Imaging, gave a talk that stood out from many of the others because it was slightly unusual for a drug-discovery meeting. It was a first-hand account of how he and his colleagues, using a sophisticated homemade cellular imaging platform, stumbled upon a previously unknown method by which immune cells might communicate with one another: via tiny elongated structures known as nanotubes. Watkins' research, while not necessarily "high-content" in the drug-screening sense of the term, nevertheless underscored how important cellular imaging can be to both basic research and drug discovery.

Last week, CBA News caught up with Watkins to discuss his recent findings, the implications they have for immunology, and how his group's approach to imaging might fit into the world of drug discovery.

Your most recent work focuses on intercellular communication within the immune system, correct?


So your group has found evidence that many of these immune cells are connected by what are termed nanotubes. Was this term coined by your group?

That term was coined a couple of years ago, and we used it because it's the nearest thing to what we think we see. The work we do deals specifically with dendritic cells, which are antigen-presenting cells within the body. They tell your body when an antigen has entered it, and they take that information up to the lymph node region during an immune response.

These nanotubes had been seen previously, correct?


So explain what your group saw with the calcium dye response experiments.

We went at it a different way in that we did not know that these tubes existed when we did the work. We found this calcium response, and went back to the literature to see what could explain this flux we saw, and found these papers on the tubes. We then investigated whether we could actually find these tubes in the cell populations we were studying, and lo and behold, we did. That helped us understand why and how this cellular communication works.

Can you recap what you actually saw?

We were looking at activation of the cells by pathogens, and we were delivering reagents down microinjection tips. In one experiment the tip was blocked, which happens quite frequently, and by accident I touched a cell. I was looking at metabolic response, and one of the things you normally do is use calcium dyes to measure what's happening. So we used this calcium dye, and I accidentally touched a cell, and we saw this flux running right across the field, rather than in just the cell I touched. That led to this "eureka" moment, to try to figure out what on earth just happened. After that I started studying the literature, and after a few minutes we found this paper. We took that to the next step, which was to try and explain, with my colleague Russ Salter, the functionality of these tubes and how they are important in biology.

Do you know yet how the cells communicate with one another via these tubes?

We know it's an electrical signal because we've got the calcium flux. We know there is a signal going down, but the thing is, we don't know how many signals go down these tubes, or whether the cells can actually deliver materials from one to another down these tubes, or whether they're just signaling mechanisms.

What implications are there for the way the immune system works?

Imagine a situation where you get an infection in your foot, and you only stimulate one of these cells. That cell has to move all the way up to the back of your knee, for example, where your lymph nodes are; or, if you had an infected finger, it would be your armpit. For a very small cell, about 10 microns across, traveling half a meter is a very long way to go. So we think that many cells get activated via this cross-talk between them, so you've got a much higher likelihood of the signal successfully reaching the lymph node and being amplified to an immune response.

This work was certainly high-content, in that you used highly detailed cellular microscopy. But the term high-content has recently begun to refer to higher-throughput methods…

When you say that, though, it's not really the case. High throughput is in the range of 120 images per second. So when you say "high content," do you mean high-content in terms of looking at many, many cells? Or do you mean high content in that we look at one cell or one group of cells in a very high-speed fashion? I'm taking another view of high content, which is not traditional, which is that high content means high information content. That information can be spatial, temporal, spectral — and all these different axes reflect content. So we might collect spatially in three dimensions, we might collect five colors, and we might collect the data set as fast as we can, which is essentially 1.8 gigabytes per minute.
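Watkins' figure of roughly 1.8 gigabytes per minute can be sanity-checked with back-of-the-envelope arithmetic. The sketch below is purely illustrative; the frame size, bit depth, channel count, and frame rate are hypothetical parameters chosen to show how such a rate could arise, not the actual settings of his instruments.

```python
def data_rate_bytes_per_min(width_px, height_px, bytes_per_pixel,
                            channels, z_planes, frames_per_s):
    """Estimate raw acquisition rate for a multichannel imaging run.

    All parameters are illustrative assumptions, not Watkins' actual
    instrument settings.
    """
    bytes_per_timepoint = (width_px * height_px * bytes_per_pixel
                           * channels * z_planes)
    return bytes_per_timepoint * frames_per_s * 60


# A hypothetical configuration: 512x512 frames, 16-bit pixels,
# five colors, single z-plane, 12 frames per second.
rate = data_rate_bytes_per_min(512, 512, 2, 5, 1, 12)
print(f"{rate / 1e9:.2f} GB/min")  # about 1.89 GB/min
```

Under these assumed settings the raw stream lands close to the 1.8 GB/min figure quoted, which also explains how a morning of acquisition can reach the half-terabyte scale mentioned later in the interview.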

So hasn't imaging always been high content?

Well, imaging with microscopes has always been high content. But right now we're at this impasse, and this came out in some of the other presentations at this meeting, where data sets, and dealing with those data sets, have become one of the biggest issues in front of us. I can generate half a terabyte in the morning, and dealing with that data, from the viewpoint of moving it, storing it, and analyzing it, has become extremely difficult. There are products out there, like the Molecular Devices [former Universal Imaging] product, and there are a few others, but they still don't deal with these massive data sets. It remains to be seen how we develop solutions. For example, I'm working with IBM research in Westchester [County, NY] at the moment to build a content-management solution. There is the Open Microscopy Environment, which I think will become more and more prevalent in what we do, and more of us will be using that as the backbone of our imaging platform. Together with cyber-based communication and very fast storage, we'll be able to start approaching a way of storing, moving, and archiving that data. The problem right now is that even the instruments we use are extremely expensive, anything up to three-quarters of a million dollars. The actual storage systems are even more expensive. There is a price barrier that needs to be brought down before we can use these in a routine fashion.

You said in your talk that you've sort of eschewed the microscope-in-a-box approach, and have built your own complex imaging system…

Yes, we have 22 microscopes, seven of which use complex light sources.

Do you think that there are any inherent problems with the microscopes in boxes?

The reality for us is that, first of all, technology moves at a much faster pace than the people who build integrated systems, so you have to be able to change out parts. For example, cameras have changed dramatically over the last few years, from standard cooled CCD cameras to high-speed, very low-temperature electron-multiplying CCDs. These need to be changed out rapidly, so you've got to keep up with technology, which is both costly and complex. If you're using a single, unique platform without the ability to expand, you're going to be somewhat limited right from the beginning, because you have to use what the vendor decides you have to use. If you change your experimental paradigm outside what the vendor chooses to supply you with, for example, UV or IR imaging, you're not going to be able to do that with a microscope in a box. You need the flexibility to train, build, and adapt the instrumentation you use to the current question you're after. The downside to that is that you need highly skilled, highly trained people who understand the photonics and the optics of the systems, and placing an instrument like this in a core technology facility would not work.

The nanotube work — do you think that's the kind of thing researchers might miss if they use automated image-analysis programs as opposed to being able to look at all the images?

We saw this on a screen, and I think it all depends on how you sample. We sample at very high speeds, with very high resolution, on a very small area, so we look at very small numbers of cells. So we see intimately what's happening with those cells. If, for example, you're scanning a 96- or 384-well dish, and you're scanning it once a minute, and the event you're looking at happens once a second, then you're not going to see it. You've got to sample appropriately for what you're trying to find. Now, if you don't know what you're going to find, you obviously have to maximize everything to maximize your chances.
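Watkins' sampling argument can be sketched numerically. Under the simplifying assumption that a transient event occurs at a random moment and each well is imaged as an instantaneous snapshot at a fixed interval, the chance a single snapshot catches the event is roughly its duration divided by the sampling interval (the function and numbers below are illustrative, not from the interview beyond the "once a minute" versus "once a second" figures):

```python
def detection_probability(event_duration_s, sampling_interval_s):
    """Rough probability that one instantaneous snapshot, taken every
    sampling_interval_s seconds, catches a transient event lasting
    event_duration_s seconds. A simplifying sketch, not a full model
    of scanning microscopy."""
    if sampling_interval_s <= event_duration_s:
        # Sampling faster than the event lasts: you will see it.
        return 1.0
    return event_duration_s / sampling_interval_s


# The scenario from the interview: a one-second event,
# a plate scanned once per minute.
p = detection_probability(1.0, 60.0)
print(f"chance per scan: {p:.3f}")  # about 0.017
```

Even this crude model makes the point: at one scan per minute, a once-a-second event is caught less than 2 percent of the time, so sampling must be matched to the timescale of the phenomenon being sought.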

Drug companies have obviously adopted imaging recently more than they had in the past, but at the same time, the MO for drug companies has always been "faster." Is there a happy medium?

I think you need to find a paradigm where you're mixing the research capabilities of imaging, and also high-content and high-throughput platforms. I think it behooves the drug companies to find extremely gifted research imagers — in other words, not people who just press buttons, but people who think about the biology, think about the algorithms, and build and test systems around that. The way we do it now, for example, is to build algorithms, which are molecular algorithms inside a cell, and this is scaled up to a 96-well plate. But we do not rely on anyone but ourselves to build those algorithms. We rely on a combination of new dyes, or pre-existing dyes, and image analysis platforms on which we can build the algorithm. But if you expect one box to do it all, and it's a box designed by a company without flexibility, it will be very difficult. So I believe industry needs to find a hybrid, where there are highly trained individuals — in academia it would be faculty-level researchers — that can dissect these problems in a cogent fashion.


