J&J's Sergey Ilyin Discusses Integrating HCS with Other Technologies


At A Glance

Name: Sergey Ilyin

Position: Bioinformatics / Translational Technology Group Leader, Johnson & Johnson Pharmaceutical Research & Development

Background: Bioinformatics Group Leader, Senior Scientist, J&J — 2002-2005; Postdoc/Scientist, CNS research, J&J — 1999-2002; Postdoc, molecular biology, University of Delaware — 1997-1999; PhD, molecular biology/neuroimmunology, University of Delaware — 1997.


As leader of the bioinformatics and translational technology group at Johnson & Johnson Pharmaceutical R&D, Sergey Ilyin's work overlaps several divisions of J&J's drug-discovery efforts. His group is responsible for analyzing, applying, and integrating data from a variety of drug-discovery tools, including high-content image-based screening, microarray analysis, and qPCR. Ilyin took a few moments last week to share with CBA News his perspective on some recent trends in integrating technologies in drug discovery.

How are you involved with high-content screening and cell-based assays at Johnson & Johnson?

My responsibilities include managing a group called Bioinformatics and Translational Technologies. Part of our responsibility relates to things like biomarkers and target validation. Obviously, to validate targets, we need to have relevant in vitro and in vivo models. To that end, we were involved very early on with various technologies for high-content screening, and also, of course, integrative screening technologies; for example, in addition to looking at translocation, we can see the effect of a compound or siRNA on gene expression using microarrays or PCR, so we can connect the dots and understand what happens biologically within a system.

Can you talk a little bit more about how you combine data from various screening methods? How is that working right now, and what improvements can be made?

First of all, I might split this question into a series of sub-questions. One would be 'Why are we doing it now when we were not doing it before?' Previously, by which I mean a couple of years ago, PCR was not a practical way to look at compounds or other things because of a limitation in throughput. When I was in graduate school, we had to use ultracentrifugation to purify RNA samples, and total throughput was about 12 samples per day. Eventually the throughput improved so we could process maybe 50 or 60 samples. But recently, systems have been introduced that allow us to process thousands of RNA samples at very low cost. One of these technologies, for example, is plates that use the principle of RNA capture, with oligoG incorporated in the bottom of the plates. With the introduction of this technology, PCR became very practical, and it was very timely because these PCR assays make microarray data practical to use. So if you identify a signature in a microarray experiment, you can use this signature for screening using very inexpensive and high-throughput qPCR assays. These assays can help either with compound efficacy or with safety; for example, you might identify a signature that relates to certain adverse effects. So this is why qPCR is used more and more recently.
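For illustration only, here is a minimal sketch of how a microarray-derived signature might be turned into a qPCR-based compound score. The gene names, numbers, and scoring scheme are hypothetical assumptions, not the assays described in the interview.

```python
import math

# Hypothetical signature from a microarray experiment:
# +1 = gene goes up in the phenotype of interest, -1 = gene goes down.
SIGNATURE = {"GENE_A": +1, "GENE_B": -1, "GENE_C": +1}

def fold_change(treated_ct, control_ct, treated_ref_ct, control_ref_ct):
    """Standard 2^-ddCt fold change from qPCR Ct values, normalized to a reference gene."""
    ddct = (treated_ct - treated_ref_ct) - (control_ct - control_ref_ct)
    return 2.0 ** (-ddct)

def signature_score(fold_changes):
    """Sum of signed log2 fold changes over the signature genes."""
    score = 0.0
    for gene, direction in SIGNATURE.items():
        fc = fold_changes.get(gene)
        if fc:  # skip missing or zero values
            score += direction * math.log2(fc)
    return score

# Example: qPCR fold changes measured for one compound (made-up numbers).
compound_fc = {"GENE_A": 3.2, "GENE_B": 0.4, "GENE_C": 1.8}
print(signature_score(compound_fc))  # higher score = closer match to the signature
```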

Another aspect here relates to the ability to integrate information from, for example, high-content screening with qPCR information. That requires you to build data pipelines, which is just data importing and data storage, and it took us some time to figure out appropriate ways to do it. And of course, there is a third dimension to this, which is the development of microfluidics systems, which can make these approaches even easier and more affordable.
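As a minimal sketch of the kind of data joining such a pipeline has to do, the snippet below merges imaging and expression readouts on a shared compound identifier. The file names and column names (compound_id, gene, fold_change, translocation_index) are assumptions for illustration, not details from the interview.

```python
import pandas as pd

# Join high-content screening readouts with qPCR readouts on a shared compound ID.
hcs = pd.read_csv("hcs_readouts.csv")    # e.g., compound_id, translocation_index
qpcr = pd.read_csv("qpcr_readouts.csv")  # e.g., compound_id, gene, fold_change

# Make one column per gene, then merge so each row holds both imaging and
# expression readouts for a compound.
qpcr_wide = (
    qpcr.pivot_table(index="compound_id", columns="gene", values="fold_change")
        .reset_index()
)
combined = hcs.merge(qpcr_wide, on="compound_id", how="inner")
combined.to_csv("integrated_readouts.csv", index=False)
```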

When you refer to qPCR, what types of technologies are you referring to?

I would say that today, the most widely used system comes from Applied Biosystems. It's essentially TaqMan technology in systems that allow you to process 384 samples at any given time. But at the same time, there are newcomers to the market; in particular BioTrove, which offers what I would say is cheap technology: microfluidics technology [that] allows us to reduce reaction volumes to 33 nanoliters, and allows us to run more than 3,000 reactions, basically, on one chip. And you can process multiple chips at one time. So it essentially gives you unlimited throughput.
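As a rough back-of-the-envelope comparison of reagent volumes: the 33-nanoliter and 3,000-reaction figures are from the interview, while the roughly 10-microliter volume assumed for a 384-well reaction is an illustrative assumption.

```python
# Total reaction volume per run (384-well reaction size is assumed, not from the interview).
plate_total = 384 * 10e-6   # 384 wells x ~10 uL (assumed), in liters
chip_total = 3000 * 33e-9   # >3,000 reactions x 33 nL, in liters
print(f"384-well plate: {plate_total * 1e3:.2f} mL, microfluidic chip: {chip_total * 1e3:.2f} mL")
```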

How does this tie into live-cell assays?

BioTrove is not yet adapted to live-cell assays, but it is a very active area of research for us right now. It's not something we have in production, but we actually filed some applications for technologies based on the BioTrove platform. We have in production systems that combine high-content analysis on systems like Cellomics with additional analyses using qPCR, such as Applied Biosystems' technology.

So none of these is a great stand-alone technology for drug discovery?

Right. Let me give you an example: suppose you take various chemotherapy compounds, and you also have some compound that is actually a nuclear-targeted therapy. You may actually get the same readout in proliferation assays; most of these compounds will demonstrate efficacy. However, the mechanism of this efficacy is very different, so in cellular assays, just looking at proliferation would be misleading. In this case, we would like to use assays that actually give us information about mechanism of action. We were able to develop very nice and robust assays from information derived from microarrays and refined using very robust qPCR assays. So our approach would be to obtain all traditional measures of efficacy, such as proliferation, receptor binding, and, if appropriate, translocation of target molecules. But we would also go into the dimension of molecular networks.
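A minimal sketch of that triage logic, keeping only compounds that both inhibit proliferation and look on-mechanism; the compound names, thresholds, and score scales are hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class CompoundResult:
    name: str
    proliferation_inhibition: float  # 0..1, from the cellular efficacy assay
    pathway_score: float             # from a qPCR signature assay (arbitrary scale)

def triage(results, efficacy_cutoff=0.5, pathway_cutoff=1.0):
    """Keep compounds that are both efficacious and consistent with the intended mechanism."""
    return [
        r.name
        for r in results
        if r.proliferation_inhibition >= efficacy_cutoff and r.pathway_score >= pathway_cutoff
    ]

results = [
    CompoundResult("cytotoxic_1", 0.8, 0.1),  # kills cells, but off-mechanism
    CompoundResult("targeted_1", 0.7, 2.3),   # kills cells via the intended pathway
]
print(triage(results))  # ['targeted_1']
```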

Many proponents of high-content screening believe that it can reveal pathways and mechanism of action …

Well, it is true. But the only thing is [that] generally it only reveals the things you're specifically looking for. The ability to go into another dimension is very valuable. I think at this point, we can connect with another interesting dimension: working with purified molecules, such as enzymes, we can look at the affinity of compounds for targets. Then we obviously have to go to the next level and use cellular systems to make sure compounds pass membranes, don't have strong cytotoxic profiles, et cetera. And generally, from here, we skip to animal testing. But I think we skip another whole important dimension, which is using some intermediate system. The throughput of animal studies, in many cases, is very limited.

A very interesting experiment that we did recently relates to the fact that if you put two different cell types together (and I can't elaborate on the specific cell types we used), the cells interact and often express novel molecules that were not present in either cell type before they were put together. That might change the biology significantly. We found that in areas such as oncology, for example, we can use three-dimensional models that mimic, for example, invasion of tumor cells into blood vessels. This becomes a very practical and important tool to select compounds for future testing in animal models.

In some ways, there is a similarity to high-content screening: the cells are labeled with different fluorescent tracers, you get a fairly complex image, and image analysis becomes very important. In this case, you have an additional dimension, which is to say that you might be dealing with a three-dimensional image. This is not simple; it is still a big challenge for us, and we have not yet fully automated this process. This is a frontier that we might find very challenging.

Is there any initial progress being made in this area?

We are looking at a number of vendors, and also seeing if we can learn something from other areas of medicine and science that deal with 3D information. Obviously systems such as CT and PET deal with these types of images, but we have not yet determined if we will be able to borrow concepts from these areas.

In the past five or 10 years, many of these technologies experienced a growth curve and leveled off quite a bit — microarrays, for example. HCS seems to be at the beginning of that growth curve now. Do you see any of these technologies standing alone eventually, or are they all destined to be integrated?

I think the future is in integrative technologies that will allow us to obtain multi-dimensional information. That would allow us to make better decisions. I think that we'll actually see even more technologies and platforms; for example, I would think that in many fields there will be that intermediate step between cells and in vivo models. I think it may even get more complicated than 3D models. In other words, as we learn how to build artificial surfaces and some artificial organisms, people will move into that area more and more. A single cell type in culture just doesn't reflect real physiology, especially if you want to look at interactions. In many cases, therapeutics are intended to block interactions between several cell types. So you need to have that interaction reproduced in in vitro systems where you can still achieve good throughput. Even though we still have problems with image analysis, we can still do more than we can with animal experiments.

Let me come back to microarrays a bit, though. In my humble opinion, there was a big bubble in the industry when microarrays were introduced, and in reality, I don't think we have yet learned how to use them: how things can be done, and what can be done. The initial speculation was that microarrays might completely change discovery and drug development. But eventually people realized that the role of microarrays is somewhat more limited. Nonetheless, they're very useful for investigating specific questions. The only drawback is that the investigator has to understand the technology and be able to ask these questions intelligently. The good news is that over the past five or 10 years, we have been able to make microarray information much more practical. So on one hand, we have the development of various qPCR assays, which are very important to validate microarray findings; and on the other hand, we have novel informatics tools that make microarray analysis less cumbersome. So nowadays, bench scientists can analyze microarray data and perform computations that were previously limited to people in informatics. Furthermore, the development of functional genomics tools, such as siRNA, has been important. Now, hypotheses derived from microarrays can be quickly tested using in vitro systems. So I think that microarrays will still have a profound impact.

Also, what we define as microarrays is changing rapidly. People got excited about the ability to essentially perform high-content screening using chips, and nowadays you can see people printing siRNA in microarray format and trying to print compounds. Of course, it's all difficult, and many of these technologies aren't fully validated. But nonetheless, they represent a consistent trend, and some of the success stories will continue, and eventually many things will be transitioned to microarray/microfluidics format, with which you can perform multiple assessments at relatively low cost.

But doesn't that get away from so-called 'high-content,' and move back towards strictly 'high-throughput'?

I think that high-content methods can be moved to microarrays. I'm not sure about 3D models, but I think that would be a somewhat separate area anyway. That technology would be used to reduce the cost of animal studies, and it helps address animal-rights concerns.

With the current stringency related to quality of compounds, you obviously want to connect all the dots before you put your compound in a human.
