
Bristol-Myers Squibb's Wardwell-Swanson on RNAi and HCS in Drug Discovery


At A Glance

Name: Judith Wardwell-Swanson

Position: Senior Research Investigator II, Applied Genomics Department, Bristol-Myers Squibb

Background: Scientist/Senior Research Scientist, Wyeth-Ayerst Research, 1994-2000; Scientist, Hoffmann-La Roche, 1983-1991; Research assistant/associate, University of Connecticut Health Center, 1980-1983; BS, biology, Bucknell University, 1980.

Having been responsible for bioassay development and cell biology studies at three major US pharmaceutical companies, Judith Wardwell-Swanson has witnessed the evolution of up-and-coming drug discovery technologies such as RNAi and high-content screening. She has also been making her way around the conference circuit, having recently presented at last month's EmTech 2005 conference in South San Francisco and January's High-Content Analysis conference in San Francisco. Last week, Wardwell-Swanson shared with Cell-Based Assay News some of her thoughts on the use of RNAi and high-content screening together at her current company, Bristol-Myers Squibb.

What research area are you responsible for at BMS?

I am in applied genomics, and we support a broad spectrum of disease area biology. There is no particular one that we're married to, but my lab does spend a large amount of its time supporting oncology target discovery.

How did you come into this area professionally?

I was a cell biologist and molecular biologist. I joined this group largely because of my cell biology background, and the fact that this group was beginning to do a lot of target validation in cellular models following a pretty extensive effort to discover targets using technologies such as microarray expression profiling analyses. I joined this group because of that expertise and to begin validating some of those targets coming through.

How do RNAi and image-based screening play into the context of drug discovery?

Basically we're using RNAi as a genetic tool to get a handle on new targets that would be new entry points for drug discovery. Just as people were using overexpression paradigms in the past, this gives us a new angle where we're looking at loss of function, and exploring targets' roles as their activity is diminished using a knockdown approach.

Prior to this, what were the major techniques for target discovery and validation in a pharma setting?

Target discovery was largely being done by using microarray technology — expression profiling of diseased versus normal tissue. For target validation, we were largely using cDNA overexpression, where we'd look at the overexpression of a protein and look for a phenotypic outcome. In addition, we used dominant-negative overexpression, where a mutant or dead form of the protein can mimic loss of function of that protein. But RNAi has made that a much simpler process because you don't need to clone genes, and it's actually quite easy to deliver RNAi, as it turns out, to most cell models. Prior to HCS, we prosecuted our targets one by one utilizing a battery of biological assays and several different instrumentation platforms, including confocal microscopy and a variety of plate readers.

With this type of work, is it implied that high-content or image-based cellular assays would be used as the readout technology?

I think it's quite an obvious endpoint for RNAi, largely because in the past, when we were doing smaller-scale target validation, we had used fluorescent microscopy and fluorescent endpoints to validate targets. So when we were suddenly able to validate more targets with the advent of RNAi, it meant that we needed a more automated fluorescent microscopy approach, and that's really what high-content screening is about.

Even though automated imaging has made great strides over the past few years, it's still not near the throughput of large-scale biochemical assays. How is this approach fitting in to the whole high-throughput paradigm of pharma companies?

It's certainly true that the throughput is not as fast on a per-well basis. But when you think about the ability to multiplex endpoints in a single well — and we commonly multiplex as many as four different endpoints in a well — you're actually running four assays in parallel. And there, you do see a greatly increased throughput; especially if you think about the ways those assays are typically run in high-throughput screening, where they're run as a series. If we can run four assays in a single well, then we're increasing the throughput in that regard. We may not be processing as many wells per day, but we're getting a lot more information out of those wells. And I think it fits very nicely side-by-side with biochemical assays. I don't think there is a need to abandon traditional high-throughput biochemical assays — I think this is just a very nice parallel approach.
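The throughput argument above is simple arithmetic: fewer wells per day can still yield more assay datapoints per day once several endpoints are multiplexed in each well. A minimal sketch of that calculation, using purely illustrative numbers (the well counts below are assumptions for the example, not figures from BMS):

```python
def assay_datapoints_per_day(wells_per_day: int, endpoints_per_well: int) -> int:
    """Total assay readouts produced in one day of screening."""
    return wells_per_day * endpoints_per_well

# Serial biochemical screening: each well reports one endpoint.
biochemical = assay_datapoints_per_day(wells_per_day=100_000, endpoints_per_well=1)

# High-content screening: fewer wells imaged per day, but four
# endpoints multiplexed in every well, as described in the interview.
high_content = assay_datapoints_per_day(wells_per_day=30_000, endpoints_per_well=4)

print(biochemical)   # 100000
print(high_content)  # 120000
```

With these hypothetical rates, the imaging platform processes less than a third as many wells yet delivers more total readouts, which is the "side-by-side" case Wardwell-Swanson makes.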

Other pharma researchers have said that cell-based assays — not necessarily high-content — now constitute as much as 50 percent of their screening efforts. Do you see a similar figure?

I'm not in the lead discovery area, so it's difficult for me to comment on how that breaks down currently. I know that there is a lot of interest in increasing the amount of cell-based screening that goes on, mainly to avoid these disconnects you get between compounds that are active in biochemical assays, but then have severely diminished activity in cell-based models, or even in vivo. So I know that there is a trend towards increasing the number of cell-based assays.

Have image-based high-content screens gotten to the level of automation where a researcher feels comfortable setting something up and just walking away? Or is there still a lot of looking through microscopes?

I think once the assay is established, it's pretty much plug and play. We do go back and look at our hits, and we go back and look at suspicious results, and I think that's actually an advantage to be able to go back and look at the images. Sometimes you can pinpoint fluorescent compounds or cytotoxic compounds from the images. So we don't completely abandon the images, but at the same point we've been able to get assays that were well validated to run fairly unattended.

What types of high-content screening technologies have you evaluated, and which ones are you using at BMS?

We evaluated a lot of different types of instrumentation. We evaluated the lower-resolution, higher-throughput [TTP Labtech] Acumen Explorer, which is more of a PMT-based instrument. And we evaluated the [GE Healthcare] IN Cell 3000, which is a confocal-based instrument. And we also evaluated several xenon halogen lamp-based instruments, such as the Cellomics ArrayScan — and that's what we're currently using.

Is that the cream of the crop right now?

We still view [Cellomics'] data management and data analysis tools as being more advanced than some of the other vendors. And I think there's a big advantage for us in having that data-management piece in place, for this kind of data. As you can well imagine, we're generating terabytes of data, and to be able to manage it effectively from the day you install the machine is quite important.

Can you discuss any technologies — either in RNAi or high-content screening — that you think will improve the quality or throughput of your research?

I'm very optimistic that the viral-based delivery of short-hairpin RNAs will extend the knockdown duration, and make our functional assays even more robust than they currently are. So that's something we're really keeping our eye on. And I think it also really has the potential to allow us to use RNAi in a much broader spectrum of cell types, including primary cells. So I think that's going to be a very important thing to focus on. As far as high-content screening goes, I think the instruments will continue to improve, and I think the speed and throughput are likely to improve as time goes on. I think those improvements will make that technology much more appealing to the lead discovery parts of the organization, where the concern is certainly throughput.
