Part one in a two-part series.
Kurt Herrenknecht, Head of Cellular Applications, Evotec Technologies
Dietrich Ruehlmann, Product Manager, Imaging and Analysis, BD Biosciences
Michael Sjaastad, Director of Marketing, Imaging, Molecular Devices
Joe Zock, Senior Customer Support Scientist; Manager of HCS User Services, Cellomics
LONDON — As one of the few conferences to focus solely on high-content screening, the Marcus Evans Practical Experiences in High Content Screening conference held here last week drew representatives from several of the major HCS vendors, as well as pharma and academia to discuss current trends in this rapidly growing industry. Conference participants weighed in on such topics as image analysis, RNAi, informatics bottlenecks, and the application of HCS to primary drug screening. Cell-Based Assay News decided to sharpen the focus even more and discuss some of these topics in a roundtable discussion with representatives attending the conference from four major HCS providers. Here is what they had to say.
Imaging and cellular analysis is clearly at the start of a growth curve … why? How has it gotten there?
Herrenknecht: I think that one advantage is that it simply gives you more information than a biochemical-based screening approach. I think that this is the simplest answer to this question, because you see what happens in the cell, and you can look for some cellular reactions. Without an imaging approach, you have no chance for a real localization of the cellular components. The only way to monitor this, and to quantify it, is with an imaging approach, and at least to my knowledge, there is no biochemical way to monitor this. I think this is the big advantage that people see.
Zock: Absolutely. One of the major reasons is that you have to crawl before you walk before you run. And we're just to the point now where there are some fantastic, robust, useful tools. As a group of vendors, we've been able to develop what I believe are core, useful tools, and also, been able to niche them. They are starting to end up in places where people are saying, 'The value of this is X, and we're going to use it, and get our data, and move on.' That's just now blossoming, especially in the last 18 months — with what I've seen from the [Evotec] Opera, the BD Pathway — it's really incredible.
Herrenknecht: I'd like to add to this. I was at last year's high-content screening conference in San Francisco, and I've been to this year's [conferences]. Just from 2004 there has been such a huge difference. It was OK in 2004, but it was great, it was fantastic, in 2005. We have really seen that it has taken off, but why has it taken off? The technology has been developed; we are all on sort of second-generation instruments; the instruments have now reached a stability and user-friendliness such that they can be used for their intended purpose. And, I think it has found acceptance in upper management of the pharmaceutical industry, who are saying, 'Yes, now the time is right. We see the limitations that a pure biochemical approach brings.' I think in 2004 we saw an acceptance of this concept of high-content screening.
"And I think high-throughput [biochemical] screening has just passed the chasm, and I think HCS will be the HTS of tomorrow"
Ruehlmann: I fully agree with Kurt and Joe's comments in terms of the acceptance level, but I also think that counteracting this is a move to smaller and more complex assays, and high-throughput [biochemical] screening has not necessarily developed. I think you mentioned this, Joe, about the 'hangover phase,' which is very true. Normally when you look at adoption profiles of new technologies, you see an initial blip of the early adopters and the crazy guys who are not worried about the outcome, and are just into the technology. And then you see a [chasm], and then it takes off and does its usual hockey-stick thing, if it works. And I think high-throughput [biochemical] screening has just passed the chasm, and I think HCS will be the HTS of tomorrow. The screens will be smaller, they will be a lot more complicated, there will be a lot more force from the vendors to develop solutions for more complex biology, and HTS may eventually go away.
Sjaastad: I was an early provider, and then I went back for four years as a user, and I think that the early adopters have done a great job. Other companies, especially Cellomics, did a great job of making early adopters comfortable with the technology. But coming back as a provider now, I sense a comfort level in accepting this technology when I talk to people. I have to talk about very different things, and that's why I sense that there is going to be growth.
The idea has been thrown around that this is being used almost exclusively in secondary screening. Do you agree with this, and if so, what is it going to take to get it into primary screening?
Ruehlmann: I disagree with this, but I disagree in that I think we should all abandon the distinction between high-content screening and high-throughput screening. You're going after intangible targets. You're going after — which AstraZeneca has shown with [GE Healthcare] IN Cell 3000 data — you're going after a million compounds, and you cannot go after this target by any other means. Of course there are going to be biochemical assays, but I think that as soon as the bioinformatics develops, and you can mine old legacy data, your screens become smaller, and they become amenable to HCS systems. I really believe that the idea of differentiation between HCS and HTS will go away very soon. In applications, I think, it has gone away. From my own experience, you can go into an assay development lab, and they happen to have three [Amersham] LeadSeekers or two HTS systems in there. At the same time, in a functional genomics lab, you find HCS systems. So I think the boundaries are dissolving.
I understand this concept of HTS and HCS merging, but do you think that high-content imaging will be fast enough to suit the tastes of pharma, to do these industrial-scale screens?
Zock: I have a question in response. We've been screening with high-throughput methods for a while now. Where are all the drugs? That concept of faster, or 'screen more, get more,' doesn't really exist — it's a fantasy. So this new paradigm of taking a more in-depth look at the context that you're putting these assays through is going to bear fruit. And it's going to bear fruit in ways that we're not even thinking about right now. That's why we're starting to look at data mining, and clustering — using the data that's generated is going to become increasingly important. Nobody designed a sequence stitcher until the human genome was on its way, and they were making more sequences than they knew what to do with. So now is the time to be working on that.
Sjaastad: I think the capacity to screen as many compounds as you want to is there — you can scale up the systems that exist until it's cost-effective. And I agree with Joe that the assays are successful. I'm aware of customers that have put drugs in the clinic using these assays, and they haven't had to scale up to ultra-high-throughput amounts of compounds. They've had to use multiple imagers, but it's very cost-effective.
"We've been screening with high-throughput methods for a while now. Where are all the drugs?"
Ruehlmann: I don't disagree with that, but taking a slightly different angle, I think that in screening applications, where we are actually looking at biological effects of entities — be they analytes or small molecules — I think that the need for throughput hasn't been reached yet. I think people will have to build faster systems, or systems that are scalable, so you can have five or six systems running in parallel, because the number of compounds screened will be offset by the complexity of the screens that we run. You will not run 20,000 or 50,000 compounds anymore, but you will run 100 different screens, against 100 different cell lines, each of those against 10,000 compounds. I don't think the throughput needs are reduced; I think the complexity has increased. So the systems still have to be fast. Technology gets faster, cheaper, and smaller, always. So I don't think this is going to be an issue in the future; I just think that the throughput of the screening [method] will stay the same or go up.
On that point, another popular idea is that the image analysis and the informatics are really going to drive this toward increased throughput, and that the instruments are not going to drive this as much. Someone said that "It's not rocket science, it's just a microscope in a box," and that the informatics and software are going to be most important. Do you agree?
Zock: I disagree. If it was as simple as that, it would have been done already.
"Let's face it — we do not differ very much in the instruments we provide."
Herrenknecht: I disagree to some extent. I think the key is to extract data. Let's face it — we do not differ very much in the instruments we provide. These are all good instruments, and they are all capable of doing a decent throughput, and there are various cameras, lasers, lenses, whatever. But I believe it comes down to this in the end: What are you going to do with your images? What sort of information do you extract from them? And I think this comes down to software.
Ruehlmann: I still think there's some mileage in the technology, and it's not just a microscope in a box. The systems have become, on the hardware side, very sophisticated, whether they have, for instance, in our system, spinning disc confocal, or confocal with lasers, or [optical grating] illumination, or something completely different — these all differ substantially on the technology side, and each will eventually find its niche and application. But I think it's very dangerous to think it's just a microscope, because what happens then is that you stop thinking about the next step in the technology. I think we have to think ahead, and think about whatever technology you want to think about — but look at the hardware side. I don't think we have maxed this out yet. Nobody has, yet, to my knowledge, applied transmitted light to a screening application. A lot of vendors have it, but nobody has actually done a screen with it.
Zock: People seem to be asking for that, and we ask them what they would use it for. And they say they have no idea. Well, [we say,] then maybe we'll do something else right now.
Ruehlmann: But as soon as there is a killer application, the technology will be pushed forward. And at the moment, all the killer applications that exist, if there are any, are in the fluorescence space. Now, if someone can promise me that they are going to live in the fluorescence space for the next ten years — no; that would be crazy. They'll live maybe in the fluorescence space, and maybe somewhere else, and in that case, we do have to follow up with something to the instruments.
Sjaastad: I think the most successful use of this technology is when you see the combination of good focus and good science — the assay development is critical, as well as reagents. The instruments are now reliable and scalable, software is going to help us extract more information, and informatics will help us make better use of it. But you have to get all three right.
Next week the roundtable participants discuss new HCS assay formats, reagents driving HCS, selling to academia versus pharma, and the integration of HCS into so-called systems biology.