UCSF’s Finkbeiner Discusses Huntington’s Disease, Automated Microscopy

At A Glance

Name: Steven Finkbeiner

Position: Assistant professor, neurology and physiology, University of California, San Francisco; Assistant investigator, Gladstone Institute of Neurological Disease

Steven Finkbeiner is the lead investigator on a study published in last week’s Nature that helped resolve a long-standing mystery surrounding the progression of Huntington’s disease. Many of the group’s findings were made possible by a custom-made robotic microscope and related analysis software. Last week, Finkbeiner discussed his research and possible commercialization of the microscope with Inside Bioassays.

Which came first: your interest in automated microscopy or your interest in neurological diseases?

The history is that I was trained at Yale as a PhD student in the laboratory of a guy named Steven Smith, who was an imager there and is now a professor at Stanford. He was the guy I learned imaging from, along with some computer programming and some motorized control work. And there we used imaging primarily in its conventional application, which was time-lapse video microscopy to follow events in the short term. And then I went and did a fellowship with a guy at Harvard in molecular neuroscience, and that’s when I became interested in Huntington’s and neurodegeneration. The genesis for the idea was partly watching people spend literally hours in front of a microscope scoring cells in a manual fashion, and thinking about how much faster we could go if it was automated, and number two, if there was some way to make it less user-dependent. I felt like the data might be less susceptible to bias. Those were two major motivations. And then the third came when we just kept thinking more and more about the way conventional work’s been done, which — in neurodegeneration in particular, but I imagine it’s widely applicable for any disease-related question where cell death’s involved — is that if you take a snapshot on one day, and then another snapshot sometime later, it’s really tricky to infer what’s happened in between, if you’re not following the same cells. And the more that really sank in, and the more I felt that a lot of the questions we were trying to answer could be interpreted in different ways with the same data set, the more I felt like if I wanted to do this seriously, I needed to come up with a new solution. That was where really focusing on the ability to follow cells longitudinally over arbitrary lengths of time came in.

Tell me a little about how you use this platform in your research right now.

We use it in two general ways. The first is to be able to ask and answer mechanistic questions that deal with cell death, or, for the other half of our lab, with another long-term phenomenon: learning and memory. In both of those cases, we’re trying to watch what it is that we’re studying unfold, and then apply a suite of statistical techniques called survival analysis to help us relate changes that occur in some intermediate stage to the fate that they predict. First of all, it’s to figure out whether there’s a relationship, and if there is, whether it predicts that the fate will occur sooner or later, or is more likely. And also, it just helps to give us a quantitative handle on how important that factor likely is. If it’s highly predictive, that suggests that it’s a major factor, and if it isn’t, then it might be a minor one. And the other application is more of the high-throughput stuff. There, it’s especially suited to looking at the role that small molecules, genes that we introduce, or siRNAs that inhibit specific genes might play, and to figuring out which molecules or genes might improve survival or make it worse.
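To make the survival-analysis idea concrete, here is a minimal sketch of the kind of question Finkbeiner describes (does a measured factor predict when a cell dies?), written in Python with the open-source lifelines package. The data frame and its column names are hypothetical stand-ins for measurements an automated microscope might produce; this is not the lab’s actual pipeline.

```python
# Minimal sketch of survival analysis on longitudinal single-cell data.
# Column names and values are hypothetical; real inputs would come from the
# automated microscope (one row per neuron followed over time).
import pandas as pd
from lifelines import CoxPHFitter

cells = pd.DataFrame({
    "hours_followed": [24, 48, 72, 96, 120, 72, 48, 96, 60, 110],
    "died":           [1,  1,  0,  1,  0,   1,  0,  1,  1,  0],  # 0 = censored (still alive)
    "protein_level":  [5.2, 4.8, 1.1, 6.0, 0.9, 3.7, 2.8, 5.5, 1.9, 3.0],
})

# Fit a Cox proportional-hazards model: does the measured covariate predict
# the risk (hazard) of death, and how strongly?
cph = CoxPHFitter()
cph.fit(cells, duration_col="hours_followed", event_col="died")
cph.print_summary()  # hazard ratio > 1: higher levels predict earlier death
```

In a model like this, a hazard ratio above 1 for the covariate would mean higher levels predict earlier death, while a ratio below 1 would suggest a protective factor.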

What have been your major findings so far for the specific disease states you’ve been studying?

The story that’s [just been published] in Nature — the basic finding of that study is that for seven or eight years, in Huntington’s disease, people have known that you can see abnormal deposits of the huntingtin protein form inside neurons. Those deposits are called inclusion bodies. It’s been very controversial what role they play in the disease. When they were first found, the feeling was that they might actually be causing the disease — that somehow huntingtin, by aggregating into these structures, led directly to cell death. But then we, and some other people, presented some results that later made it less clear. It seemed like, at least in some cases, maybe you could get death without them, and maybe there were even some manipulations that paradoxically led to their disappearance, but a worse outcome. So it really was very confusing.

Basically, what we could do with this microscope [was] to introduce a version of the huntingtin protein with a disease-causing mutation in it, fused to green fluorescent protein, and we put in together with that a version of red fluorescent protein that allowed us to see the cell’s morphology and see that it was alive. Then we put it on our microscope system and watched these cells over periods of days to weeks, and basically could see the whole process unfold. Some cells would form inclusions on some days, and some on other days, and then they would die at various times. What we could do with our image analysis and statistical approach was to ask the question: Do inclusion bodies predict death or survival, or are they not predictive at all? And what about the other form of huntingtin that we see in cells, which we call diffuse huntingtin, for lack of a better name?

The surprise was that it was levels of that diffuse form that predicted death, and if cells formed an inclusion body, they actually lived longer than if they hadn’t produced it at all. The other thing that we [could] see is that once an inclusion body formed, within about a day or two, the levels of this other diffuse form fell to almost baseline level. With the survival analysis, we could calculate how the risk of subsequent death changed for that neuron, and we found that inclusion body formation dropped the risk of cell death to almost background levels. It also reduced the levels of diffuse huntingtin to levels that were almost immeasurable. So the take-home for us was that it seemed like the diffuse levels of huntingtin were really the toxic form, and that inclusion body formation might be an attempt to sequester it, and in any case, predicts survival.

The microscopy is really helpful because in many of these disease mechanisms, the interactions can be really complicated. Really understanding which processes are part of the problem and which are part of the cell’s coping response is very hard to figure out with conventional methods, but it is really important for people looking for drug targets and picking which pathways would be the most useful to target.
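The comparison at the heart of that result, whether cells that form an inclusion body survive longer than cells that never do, maps onto a standard two-group survival comparison. The sketch below, again using lifelines with invented numbers, shows the shape of such an analysis; the published study’s statistics were more involved (inclusions form at different times in different cells), so treat this purely as an illustration.

```python
# Illustrative two-group survival comparison: neurons that formed an inclusion
# body versus neurons that never did. All numbers are invented.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Follow-up time in hours and death indicator (1 = died, 0 = censored).
with_inclusion_t,    with_inclusion_e    = [96, 120, 144, 110, 160], [1, 0, 1, 1, 0]
without_inclusion_t, without_inclusion_e = [40, 55, 72, 60, 90],     [1, 1, 1, 0, 1]

km = KaplanMeierFitter()
km.fit(with_inclusion_t, with_inclusion_e, label="formed inclusion")
print(km.median_survival_time_)      # longer median survival in this toy data
km.fit(without_inclusion_t, without_inclusion_e, label="no inclusion")
print(km.median_survival_time_)

# Log-rank test for a difference in survival between the two groups.
result = logrank_test(with_inclusion_t, without_inclusion_t,
                      event_observed_A=with_inclusion_e,
                      event_observed_B=without_inclusion_e)
print(result.p_value)
```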

The recognition and matching software you’ve written — is that a relatively new concept?

The system is definitely very much still in beta, and still in development. In fact, one of the reasons I’m happy to do this interview is I think it has huge potential, and I’ve outlined exactly what I think we need to do, but our focus has been primarily on trying to use it as soon as we brought it to a place where it could be used — to ask a biological question — so we could keep getting grants and all those other good things. But we have written pieces of software. I think the key thing for us is that the commercially available stuff we got from Universal Imaging [now part of Molecular Devices] — MetaMorph — some of it just didn’t work that well. The autofocus wasn’t very reliable, and we had a really hard time getting it to be able to return to the same cell, until we developed additional algorithms that helped us do that.

We have written software that will then take the images that we store offline and follow individual cells, but a lot of that stuff is still being worked on. But I can say that for the high-throughput stuff, we can do it in a fully automated way. We use a different optical strategy to collect the images and sidestep some of the problems that we run into with really high-resolution work. There, we do what I call a ‘quasi-survival analysis.’ But the neat thing is that it’s fully automated, you can do all the cell counting in an automated fashion, and because it returns to virtually the same field every time, you can do the survival analysis, which is just unbelievably sensitive — maybe two to three orders of magnitude more sensitive than other approaches to data analysis we’ve used to quantify cell death.

I’m really excited about its potential use for small molecules because I think it can really provide people with a very quantitative handle on how big the effect is that they’re looking at. And because it looks at the effect over a period of several days, it can be really sensitive. So I think it could be part of a development loop where you find something that has some activity, you go back and modify it, and you have a nice quantitative measure of what progress, if any, you’re making.
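For readers wondering what ‘returning to the same cell’ involves computationally, the sketch below matches segmented cell centroids between two imaging sessions by minimum-distance assignment using SciPy. This is a generic approach with made-up coordinates, not the additional algorithms Finkbeiner’s group developed.

```python
# Generic sketch of matching the same cells across two imaging time points
# by assigning each centroid at time t0 to its nearest counterpart at t1.
# This is NOT the Finkbeiner lab's algorithm, just a common starting point.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.optimize import linear_sum_assignment

# Hypothetical (x, y) centroids, e.g. from automated segmentation, in microns.
cells_t0 = np.array([[10.0, 12.0], [55.0, 40.0], [80.0, 75.0]])
cells_t1 = np.array([[81.5, 74.0], [11.2, 13.1], [54.0, 41.5]])

# Pairwise distances, then a one-to-one assignment minimizing total distance.
dist = cdist(cells_t0, cells_t1)
rows, cols = linear_sum_assignment(dist)

MAX_SHIFT = 5.0  # microns; reject matches that moved implausibly far
matches = [(r, c) for r, c in zip(rows, cols) if dist[r, c] <= MAX_SHIFT]
print(matches)   # [(0, 1), (1, 2), (2, 0)]: the same cells tracked over time
```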

What type of hardware do you use, and is that something you would see as necessary for an upgrade?

I chose a Nikon Quantum TE-300 as a base — I wanted a bottom port, because I wanted to be able to do all this in plastic tissue culture dishes, in part because neurons tend to be healthier there than when they grow on glass, and I also wanted to be able to exploit the dish. I thought it would be easier to develop a method to get back to the same neuron if I had the cells relatively fixed to the dish. The problem is, when you do fluorescence through plastic, you take a big hit in sensitivity, and you can be more limited in the fluorophores that you use. Some of the short-wavelength ones tend not to work quite as well in plastic. But I’ve got to say, we made some modifications to make it as optically efficient as possible, and we’ve been really successful — we can use blue fluorescent protein, which is really hard to use. So I think the system seems pretty sensitive. We have a Hamamatsu Orca II [CCD] camera. I chose it because of its linearity and sensitivity. And then we use some standard parts — two Sutter filter wheels, a xenon light source from Sutter. We had to do some customization — we had to cut a hole in the vibration isolation table, and I had to weld some feet to be able to handle all this stuff.
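To show how the parts listed here (motorized stage, filter wheels, camera) typically fit together in acquisition software, here is a schematic loop that revisits saved stage positions and images two channels. The Scope class and all of its methods are hypothetical placeholders, not a real vendor API and not the lab’s control code.

```python
# Schematic acquisition loop for revisiting saved stage positions and imaging
# two channels (e.g., a GFP-tagged protein and a red cell-fill marker).
# Scope and its methods are HYPOTHETICAL stand-ins, not a real driver API.
import time

class Scope:                      # hypothetical hardware wrapper
    def move_stage(self, x, y): ...
    def set_filter(self, name): ...
    def autofocus(self): ...
    def snap(self, exposure_ms): ...
    def save(self, image, path): ...

saved_positions = [(1200.0, 3400.0), (1850.0, 2900.0)]  # microns, recorded at t0
channels = [("GFP", 200), ("RFP", 150)]                  # (filter name, exposure in ms)

scope = Scope()
for i, (x, y) in enumerate(saved_positions):
    scope.move_stage(x, y)        # return to the same field as the last session
    scope.autofocus()             # re-focus; reliability here was a key challenge
    for filt, exposure in channels:
        scope.set_filter(filt)
        img = scope.snap(exposure)
        scope.save(img, f"pos{i}_{filt}_{int(time.time())}.tif")
```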

Can you talk about any commercialization plans?

UC San Francisco has filed a patent for it. And yes, we are deeply interested in commercializing it. The technology transfer process at UC works pretty glacially, so frankly, I’m taking things into my own hands, and I think opportunities like this are good ones. I think it would be a huge help. When I’ve gone to meetings, I’ve had a lot of people ask if I could sell it, because I think scientists realize the power that this might bring to the table, and I’d be very interested. We have a lot of ideas about how it could be commercialized as is, and we also have a bunch of ideas about how its power could be enhanced with a really modest amount of effort, primarily in software development — but building on programs that we’ve already developed. In high-throughput mode, we can definitely go through 10,000 samples in a workday, if we needed to. So it would definitely work in that fashion — it’s just not something we’ve emphasized, because as an academic lab, we still are primarily focusing on cellular mechanisms.

 
