Much has been made of the contributions from gene expression analysis to finding new targets, but behind the scenes scientists in big pharma have also made progress in using genomics to help weed out potentially toxic drug compounds before they enter the clinic. Toxicogenomics, as this field is known, has taken particularly strong root at Abbott Labs.
The motivation for starting a toxicogenomics laboratory is relatively straightforward, according to Jeff Waring, senior research scientist for cellular and molecular toxicology at Abbott. Traditionally, evaluating the toxicity of a candidate drug molecule involved testing large numbers of laboratory animals — usually rats and non-rodent species — in a time-consuming effort to identify drug compounds likely to cause serious harm in patients. “Toxicology was a fairly strong bottleneck because it occurred at such a late stage of the drug discovery process,” Waring says.
So in 1999, Waring and his colleagues at Abbott got the green light to try something different. Microarray analysis, they reasoned, could pick out, from rat serum or cell cultures, changes in gene expression indicative of a toxic response at a much earlier stage in the game, saving researchers both time and money. Working with Rosetta Inpharmatics and Iconix Pharmaceuticals, Waring and his fellow researchers developed custom microarrays and databases for detecting gene changes associated with toxicity, and in the years since have validated the approach, he says. One success involved identifying gene changes that correlate with the elevated cholesterol and triglycerides induced by some HIV protease inhibitors. By screening follow-on HIV protease inhibitors for these gene changes, it may be possible to identify new drugs less likely to have these side effects, he says.
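The screening idea described here — flagging follow-on compounds whose expression changes resemble a gene-change signature already linked to a toxic response — can be sketched in a few lines. This is an illustration only, not Abbott's actual method: the gene values, compound names, and correlation threshold below are all invented for the example.

```python
import numpy as np

# Hypothetical toxicity "signature": log2 fold-changes for a small panel
# of marker genes previously associated with an adverse response.
tox_signature = np.array([2.1, -1.8, 1.5, 0.9, -2.3])

# Hypothetical expression profiles for candidate compounds, measured
# over the same gene panel.
candidates = {
    "compound_A": np.array([2.0, -1.5, 1.7, 1.1, -2.0]),  # resembles the signature
    "compound_B": np.array([0.1, 0.2, -0.3, 0.1, -0.1]),  # little systematic change
}

def signature_correlation(profile, signature):
    """Pearson correlation between a compound's profile and the toxicity signature."""
    return float(np.corrcoef(profile, signature)[0, 1])

def flag_risky(candidates, signature, threshold=0.8):
    """Flag compounds whose profile correlates strongly with the signature."""
    return [name for name, profile in candidates.items()
            if signature_correlation(profile, signature) > threshold]

print(flag_risky(candidates, tox_signature))  # only compound_A is flagged
```

In practice such signatures span hundreds of genes and the comparison uses more robust statistics, but the logic is the same: compounds whose expression changes track a known toxicity pattern are deprioritized before animal testing.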
But like many microarray groups, Waring says he and his colleagues struggle to ensure that their results are reproducible, given the wide variation across microarray platforms and labs. In a study published this year by the International Life Sciences Institute, Abbott and other pharma groups compared gene changes associated with toxicity using a common sample. The results show that cross-platform and cross-laboratory issues are manageable but remain important, he says.
Waring also sees several new technologies that may prove useful for toxicogenomics in the future. Abbott is looking beyond an array-only approach and investigating a multiplexed, high-throughput platform for gene expression analysis, Waring says. In addition, he says, using mass spec or NMR spectroscopy to identify metabolic changes in animals indicative of toxicity — an approach known as metabonomics — could be a promising avenue for expanding toxicogenomics. “There’s still some technical issues involving collecting samples [suitable for analysis], but the technology has great potential,” he says.
As for whether the ability to eliminate toxic compounds early has made finding effective compounds easier, Waring says data are still coming in. “It’ll be a few more years before we see the ultimate effect,” he says, “but we have managed to more easily screen out the bad actors, so in that way it’s shown to be a good savings.”
— John S. MacNeil