Max Planck's Ivan Baines on Collaborating with Industry for High-Content Screening

Ivan C. Baines
Scientific coordinator, director of services and facilities
Max Planck Institute of Molecular Cell Biology and Genetics

At A Glance

Name: Ivan C. Baines

Position: Scientific coordinator, director of services and facilities, Max Planck Institute of Molecular Cell Biology and Genetics, Dresden, Germany, since 1999

Background: Director of business development at Cenix BioScience, Dresden, Germany, 2002-2003

Part one of a two-part interview.

The Max Planck Institute of Molecular Cell Biology and Genetics serves as the hub of perhaps the largest industry-academic collaboration in the field of high-content screening for drug discovery: It is the coordinating institute in the High-Throughput Technology Development Studio project, a collaboration that is funded by the German government through the BioMeT Network and includes the Technische Universität Dresden; HCS vendor Evotec Technologies; image-analysis software provider Definiens; and RNAi firm Cenix.

As one of the coordinators of the project, Ivan Baines has a unique perspective on how HCS transcends industry and academia, and how the two groups can work together to further the field. In the first installment of this two-part interview with CBA News, Baines discusses the beginnings of the collaboration, why MPI-CBG works with the vendors it does, and the importance of flexibility when it comes to working with HCS vendors.

Maybe you can start with how the collaboration started, when it started, and what the idea was behind it?

The collaboration was announced on Sept. 19, 2003, in a joint press release from Definiens, Evotec, and MPI-CBG. We also subsequently issued a press release with Cenix. In fact, the collaboration began before that, and really was a proactive effort by the Max Planck Institute, because we were interested in using our expertise and experience in genome-scale cell-based screening as a basis for asking another question: whether you can multiplex different screening technologies to generate a more in-depth data set. In the first instance, we wanted to perform, in parallel, screens using RNAi, compounds, and potentially expression analysis. We also incorporate some of the more specific genetic screening techniques; for example, there's one called TILLING, or Targeting Induced Local Lesions in Genomes. It's a method for generating conditional and non-conditional mutations in particular proteins in a guided fashion, which was developed for Arabidopsis and then moved over to zebrafish. The idea was that we would run the same assay across multiple technology platforms, and then we'd collate the results of the screen into a standard-format database to permit a more seamless meta-analysis across the different data sets.

Specifically, we knew in the first instance that there was a short-term goal of being able to generate a catalog of RNAi-induced cellular phenotypes across multiple assays, and that we could use it to interpret compound-induced phenotypes. So the idea there is not new today — nor was it then — but it clearly has exciting potential. The belief is that if you measure enough parameters — and this is the essence of multi-parametric analysis — with a high-enough level of resolution, you should be able to precisely match an RNAi-induced phenotype — the signature, if you like, of an RNAi specific to one gene, to the signature of a compound-induced phenotype. At that point in time, you could screen compounds without knowing their target. On the basis of making a sufficient number of measurements across a limited number of assays, you'd be able to already know what the target was.

Because whatever result you saw with the compound, you would be able to match it to the siRNA library results — the phenotypic results?

Exactly. You're cross-referencing and matching phenotypic signatures from RNAi and compound screening. So the questions we wanted to ask were: How many parameters would you need to measure, at what resolution, across how many technology platforms, before you can make this cross reference with the required level of confidence?
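The cross-referencing Baines describes can be sketched in code. This is a hypothetical illustration, not MPI-CBG's actual pipeline: each phenotype is reduced to a vector of multi-parametric measurements, and a compound-induced signature is matched to the most similar RNAi-induced signature in a catalog (here by cosine similarity; the gene names and parameter values are invented for the example).

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length parameter vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def match_target(compound_sig, rnai_catalog):
    """Return the gene whose RNAi signature best matches the compound's."""
    return max(rnai_catalog, key=lambda gene: cosine(compound_sig, rnai_catalog[gene]))

# Toy catalog: each signature is a vector of phenotypic measurements
# (e.g., nuclear size, endosome count, spindle length) in arbitrary units.
rnai_catalog = {
    "GENE_A": [0.9, 0.1, 0.4],
    "GENE_B": [0.1, 0.8, 0.7],
}

compound_sig = [0.85, 0.15, 0.35]
print(match_target(compound_sig, rnai_catalog))  # → GENE_A
```

In practice the questions Baines raises — how many parameters, at what resolution, across how many platforms — determine whether such a nearest-signature match can be made with the required confidence.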

So that was, if you like, the basic science-driven query. We were also interested in the struggles of the pharmaceutical sector and the need to improve the predictive accuracy of primary screening. We all know about the trials and tribulations of pharma as development costs across the traditional drug-development pipeline are clearly prohibitive. That works for the blockbuster model, but if you want to develop drugs for the $100-million to $500-million market, you can't really spend $800 million developing each drug.

Pharma has further realized that it is not sustainable to perform clinical trials across multiple different disease areas. There is an urgent need, therefore, to improve the predictive accuracy of the first screen. In other words, we need to be better in predicting which compounds at the earliest stage will be successful later. Of course, that's an easy thing to say, but how on earth will one actually do it? And how would one prove that it can be done?

So this is the second question: Again, will increasing the number of measurements and improving the resolution of those measurements improve one's ability to make a prediction with regards to efficacy and toxicity of a drug candidate? So those are the two goals, one basic and one applied.

So the corporate collaborations are primarily focused on these areas.

We recognized that we wanted to have the latest, most state-of-the-art automated microscope to be able to do this multi-parametric analysis — to be able to make these measurements — at high resolution. How did we choose Evotec? We actually sent an Excel sheet with, I think, 17 questions to each of the major providers of automated microscopes. We queried them on the basis of the specifications, and there were a number of particular capabilities we wanted that were not provided at that time. We also asked: Would you be willing to develop this specific functionality, and secondly, would you be willing to collaborate with us in a real sense? Not 'You sell us your instrument and then disappear into your R&D department.' Based on the responses — and we had about 19 screens we wished to perform — basically, the Evotec Opera, at that time, was able to satisfy the requirements of 14 of the 19 screens, and the next best instrument could support eight of the 19. And we were further convinced of Evotec's genuine interest in collaborating with us in a real sense — daily contact, discussion of specifications, opening up the labs, and so on. That determination has proved completely correct. We've been delighted with our collaboration, and I think it has been mutual. And they have developed functionality specific to our needs — for example, excitation and emission lines, different filter sets, the ability to work with transmitted light — a whole host of things that they've done to meet the needs of our screens.

One of your colleagues, Eberhard Kraus, said at the Marcus Evans HCS conference in June that Evotec would be providing more upgrades this year to further this work.

There actually is a new Opera available, and it's out in the field. We were at the front of the line, but we decided to continue with our current Opera, which is working extremely well, because we didn't want the downtime. We're going to swap it out sometime over the next three or four months, and use that time to add some additional functionality that we're particularly interested in. So at no time will we really be using exactly the same instrument as is available — but very close.

Is it the case that if you had a different objective in which HCS was to be implemented, then the Evotec platform might not necessarily be the best choice?

Yes, basically we look at the net cost across all the possible applications. We have 19 very diverse screens, but absolutely, you can imagine a situation like the Cellomics KineticScan, which does kinetic measurements but is non-confocal, and so supports different applications that necessitate kinetics but not confocality. We actually have added a collaboration with Cellomics since that time (see related article, this issue).

What do you think about the concept that HCS vendors may have to provide more flexibility, to be willing to conform to their customers' needs on an ongoing basis — such as you mentioned with Evotec — rather than an "out-of-the-box" solution, in order to sell more platforms?

Absolutely. It's totally essential. I should say that one of the other major providers was ruled out early on due to a lack of flexibility. We rejected the Amersham instrument on that basis.

Next week, Baines discusses the Max Planck Institute of Molecular Cell Biology and Genetics' software and RNAi collaborators, how technologies from the participants complement one another, and bringing pharma on board.
