Genomics Goes Downstream


By John S. MacNeil

 

Traditional toxicology can be pretty brutal. Dose a rat or mouse with various concentrations of a new drug under development, and once the animal has died from exposure or been killed for the study, employ histopathology and liver tissue analysis to gauge the compound’s relative toxicity.

Could genomics offer a less invasive, and perhaps faster, option? We’ve all heard about how the human genome has served as a catalyst for discovering new drug targets, as well as better tools for understanding biological pathways associated with particular diseases. But what’s less apparent is whether genomics — and the technologies derived from it — have made an impact on the downstream processes of drug development. In other words: Has the suite of tools and technologies derived from genomics actually helped push drugs into the clinic?

The question is debatable, to be sure. There are examples of genomics applications that theoretically should be able to contribute to drug development: toxicogenomics, for example, and more recently, pharmacogenomics. But it’s not yet clear whether these new approaches to deciding which compounds to test in clinical trials, and choosing which patients should receive them, will actually have as great an impact as some scientists and pharma industry executives have foretold.

Tox heats up

In assessing the state of toxicogenomics, one of the first places to start is with Mountain View, Calif.-based Iconix Pharmaceuticals, a seven-year-old drug discovery and tools provider that four years ago teamed up with MDS Pharma, Incyte, and what was then Amersham Biosciences to begin building a database of gene expression profiles associated with toxic responses in rats to certain drugs. The motivation behind the project, according to Kyle Kolaja, Iconix’s vice president for chemogenomics and toxicology, was to create a resource that Iconix and outside customers could use to predict the toxicity of new drugs from validated gene expression profiles correlated with known toxic responses.

To do this, scientists at Iconix spent the first couple of years selecting more than 630 compounds and running in vitro and in vivo gene expression experiments using microarrays on rat tissue samples including liver, heart, kidney, and primary cultured hepatocytes. In addition, the researchers populated the database, now known as DrugMatrix, with data from the literature on the biological pathways and toxicity associated with various drugs, as well as with pathology data and in vitro pharmacology data supplied by MDS Pharma. Iconix then overlaid the database with software tools designed to help researchers involved in drug discovery predict the safety of new compounds based on the accumulated knowledge in DrugMatrix.
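
The core operation a resource like DrugMatrix supports is matching a new compound’s expression signature against a library of annotated reference signatures. The sketch below illustrates the general idea in Python; the genes, values, compound names, and correlation-based scoring are hypothetical stand-ins, not the proprietary Iconix tools.

```python
# A minimal sketch of reference-signature matching, the kind of query a
# resource like DrugMatrix supports. All data, names, and the scoring
# scheme here are hypothetical; the actual Iconix tools are proprietary.
import numpy as np

# Hypothetical reference signatures: per-gene log2 ratios (treated vs.
# control) for compounds with known toxicity annotations.
reference_signatures = {
    "known_hepatotoxicant_A": np.array([2.1, -0.3, 1.8, 0.9, -1.2]),
    "known_hepatotoxicant_B": np.array([1.9, -0.1, 2.0, 1.1, -0.8]),
    "known_nontoxic_C":       np.array([0.1,  0.2, -0.1, 0.0, 0.3]),
}

def rank_by_similarity(query, references):
    """Rank reference compounds by Pearson correlation with the query;
    high correlation to a known toxicant is a warning flag."""
    scores = {name: float(np.corrcoef(query, sig)[0, 1])
              for name, sig in references.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Expression signature of a new, untested compound (hypothetical).
new_compound = np.array([2.0, -0.2, 1.7, 1.0, -1.0])
for name, r in rank_by_similarity(new_compound, reference_signatures):
    print(f"{name}: r = {r:+.2f}")
```

Real systems layer statistics, curation, and pathway context on top of this kind of matching, but the shape of the query is the same.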

Kolaja says the Iconix database is unique because of its size — it now contains information on more than 1,500 compounds and gene expression data from more than 15,000 microarray experiments — and because of the speed with which researchers can use the resource to gain insight at an early stage into how patients could respond to a new drug. “You can’t tell [potential customers] to give us six months to come up with an answer,” Kolaja says, “and in some of the best cases we’ve been able to analyze new compounds and send out data on predicted response within a week to two weeks.” Of course the limitations, he adds, are that in some cases toxic responses are not reflected in gene transcription, and that in other cases toxicity only manifests itself months to years after a patient begins taking a drug — think Vioxx, for example, he says.

“For us the theme [of toxicogenomics] has been knowing a lot more about a drug or chemical a lot earlier in the process, and that can allow you to feel more comfortable with a particular compound, or it can make you less comfortable and [lead you to] deprioritize it based on that knowledge,” Kolaja says. “It’s certainly not a replacement technology; I’d put it more in the category of complementary,” he adds. “But it allows much greater insight into the process of decision-making for candidate selection throughout the process.”

Gaithersburg, Md.-based Gene Logic, among others, has also developed a toxicogenomics platform, based on microarray analysis of gene expression dysregulation caused by compound treatments. The drug discovery services platform, called ToxExpress, provides information on predictive, mechanistic, and investigational toxicology, the company says.

In Action

In practice, pharma customers of Iconix and Gene Logic say applying toxicogenomics in drug development has helped scientists choose potentially more promising drug candidates at an earlier stage in the development process than was possible before. Researchers at Bristol-Myers Squibb and Abbott Laboratories, who work with Iconix, and at Millennium Pharmaceuticals, who work with Gene Logic, express optimism that investing in the approach to predicting toxicity of new compounds will pay off, but are quick to point out that improvements to the technology and its implementation will help speed their return on investment.

At BMS, scientists led by Mark Cockett, the company’s vice president for applied genomics, and Bruce Car, executive director of discovery toxicology, began delving into toxicogenomics about three years ago when they started collaborating with Iconix. Since then, the company has implemented toxicogenomics at several stages of the drug development process, most extensively at the boundary between discovery and development, at which point BMS profiles all the compounds selected from the discovery operation for further development. In addition, BMS researchers use the technology at a later stage to perform gene expression experiments on compounds that passed through the initial toxicogenomic assessment but later in the development process exhibit a more worrisome toxicologic profile.

Cockett says that the goal of using toxicogenomics to profile compounds that enter development is to help interpret mechanisms of drug activity, and to predict how a patient will respond. Doing this requires performing microarray-based gene expression studies on tissues derived from rats and other laboratory animals, and using bioinformatics tools to compare those expression profiles with profiles stored in BMS’s own database and in Iconix’s DrugMatrix. These toxicogenomic studies are always performed in the context of traditional toxicology protocols, involving live animal studies and histopathology, says Car. He adds that the toxicogenomics work performed at this stage of the drug development process has revealed new information relevant to a drug’s viability in about one-third to one-half of all cases. Specifically, toxicogenomic evaluations at BMS have helped researchers understand novel mechanisms of carcinogenicity, provided evidence for, or ruled out, several mechanisms of toxicity, and allowed researchers to compare liabilities in backup molecules to those of successful clinical candidates, Car says.
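
Before any database comparison, each compound must first be reduced to a signature. A common first pass, sketched below with hypothetical genes and intensities (BMS’s actual pipeline is not public), is to average replicate arrays for treated and vehicle-control animals and take per-gene log ratios.

```python
# A minimal sketch of deriving a compound's expression signature from
# microarray data prior to database comparison. Gene names and
# intensities are hypothetical illustrations only.
import numpy as np

genes = ["Cyp1a1", "Gsta2", "Hmox1", "Abcb1"]

# Rows = replicate arrays, columns = genes (hypothetical intensities).
treated = np.array([[820.0, 150.0, 410.0,  95.0],
                    [790.0, 160.0, 380.0,  88.0],
                    [845.0, 142.0, 430.0, 101.0]])
control = np.array([[205.0, 148.0, 102.0,  90.0],
                    [198.0, 155.0,  95.0,  93.0],
                    [210.0, 150.0, 110.0,  86.0]])

# Per-gene log2(treated mean / control mean) is the compound's signature.
signature = np.log2(treated.mean(axis=0) / control.mean(axis=0))
for gene, val in zip(genes, signature):
    flag = "  <- strongly induced" if val > 1 else ""
    print(f"{gene}: {val:+.2f}{flag}")
```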

When a compound exhibits toxicity via a novel mechanism — that is, it passes through initial toxicological screens but later on in the process researchers determine that it’s toxic — BMS scientists under Car and Cockett revisit the compound with the help of toxicogenomics. This process, they say, involves investigating the gene expression profile associated with that novel toxicity, in the hope that this information can be applied to future studies of new compounds. “Essentially toxicogenomics helps you dissect out the mechanisms,” says Cockett.

Implementing toxicogenomics at BMS has also involved a significant effort in revamping standard operating procedures and upsetting traditional mindsets about how toxicology is performed, Car and Cockett say. It was no small effort, Car says, to put into practice and harmonize new methodologies at three discovery and three development sites. “The people who are conducting the studies in the traditional tox sites have been used to doing very traditional analyses, and we’ve introduced a cutting-edge technology that requires expertise in terms of bioinformatics that is really quite foreign to a traditionally trained toxicologist,” Car says.

But all this raises the question of whether toxicogenomics has actually had an impact on streamlining new drug selection and development. At BMS, Car says there are two ways to evaluate the program’s success: by analyzing a specific drug development program and determining whether the toxicogenomic study helped discover or explain something faster and with greater insight than other approaches; and by looking at the overall attrition rate of new compounds brought into the process to see whether new methodologies like toxicogenomics have managed to bring that rate down from its historic highs.

With respect to individual programs, Car and Cockett were hesitant to provide specific details, but noted that toxicogenomics has helped terminate a development compound, and has provided essential benchmarking criteria for advancing compounds from discovery into development. And when it comes to attrition rates, Car says BMS has reduced attrition attributable to pharmacokinetics, metabolism, and toxicology in non-clinical development from 60 to 80 percent to virtually zero. Adds Cockett, “Our success rate in going from selecting a drug candidate to getting to ‘first in man’ has improved dramatically.”

Downstream moves upstream

Over at Abbott Laboratories, scientists have also worked with Iconix to develop capabilities in toxicogenomics. And in a similar manner, Abbott has inserted the technology at more than one point along the drug development process. Toxicogenomics has found a role in ranking compounds in terms of their potential for success in later-stage development, as well as in helping researchers further downstream pick apart the biological mechanisms at play in a particular drug’s activity, says Eric Blomme, Abbott’s director for cellular and molecular toxicology.

Most recently, Abbott has expanded its work in toxicogenomics even further upstream in the drug development process, to where the company categorizes it essentially as a part of drug discovery. “In development, where we already [do traditional] toxicology studies, it’s really a complement,” says Blomme. “But in discovery, it’s not really a complement, because before there was nothing. We were dealing with dozens of molecules and we had to make a decision on which molecule to take forward based only on the [pharmacokinetic] or the pharmacology activity. By adding in the toxicology component, that helps us choose the best molecules.”

Abbott’s early-stage toxicogenomics work involves developing new types of in vitro screens to make it easier to weed out potentially dangerous compounds even earlier in the process, Blomme says. In addition, putting toxicogenomics to work at this stage of discovery provides Abbott scientists with another approach to investigating the biology of the drug target itself. Given the number of targets big pharma is working with, he says, learning at the gene expression level what makes a good or bad target with respect to toxicology can lead to better target selection in future drug discovery programs.

In assessing the contribution of toxicogenomics to improving the odds of success in drug discovery and development, Blomme says he believes toxicogenomics as it’s applied at Abbott will help his group achieve a 50 percent reduction in the rate of attrition for two- to four-week preclinical studies. Although this hasn’t happened yet, “this is a metric we can obtain,” he says. “When we start collecting data, that will be one of the factors in determining the success of [toxicogenomics],” he adds.

In the long run, Blomme says toxicogenomics will never be able to replace traditional animal studies, but he sees its contribution as another means of making slightly better decisions. And given the scale of the drug discovery operation in a large pharma company, such slight improvements translate into a marked impact on the overall drug development process. “For toxicity, we still don’t have as many data and therefore it’s very difficult to give a number as far as [the effect on failure rates],” he says. “But we don’t need to be much better to significantly impact the process, quite frankly, because the failure rate is so high.”

Meanwhile, in biotech …

Millennium Pharmaceuticals in Cambridge, Mass., has taken a more circuitous route to establishing a toxicogenomics program, but has nonetheless managed to build expertise in the methodology to rival that of its larger pharma competitors. Peter Smith, Millennium’s senior vice president, says that when he joined the company four years ago, he and his colleagues surveyed the landscape of available toxicogenomics know-how for hire, and concluded that building expertise in-house would be more efficient than trying to license in or acquire toxicogenomics capabilities from an outside provider.

Rather than spend $7 million over three years in a collaboration to acquire access to toxicogenomics expertise, Smith says Millennium chose to capitalize on its own scientists’ abilities, and ultimately created a spin-off company called Horizon that specialized in toxicogenomics. Millennium, in turn, sold Horizon to Gene Logic in 2004, in return for three-year access to Gene Logic’s toxicogenomics database. “We actually took that database and stripped it to raw data, and have been feeding that raw data into our own systems and coming up with algorithms on the basis of the raw data,” Smith says.

Currently, toxicogenomics at Millennium involves employing in vivo rat liver systems and in vitro hepatocyte rat and human cell cultures as the raw material for gene expression studies using Affymetrix DNA microarrays, Smith says. With the help of Gene Logic’s ToxExpress database of in vivo rat liver gene expression profiles, and its own accumulated knowledge, Millennium scientists have primarily applied toxicogenomics to helping rank the compounds under development at the hit-to-lead stage of the process in a semi-high-throughput mode, he adds. Additionally, Smith says toxicogenomics has found a role at Millennium in investigating the gene expression profiles associated with compounds exhibiting unique toxicity.
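
At the hit-to-lead stage the emphasis is triage rather than mechanism: score many compounds quickly and advance the cleanest. One deliberately simplified way to picture such a ranking is counting how many genes in a toxicity-associated panel each compound perturbs; the compound names, gene panel size, and threshold below are invented for illustration, not Millennium’s actual algorithms.

```python
# A minimal sketch of semi-high-throughput hit-to-lead triage: score
# each compound by how many toxicity-panel genes it perturbs beyond a
# fold-change threshold, then rank. All names and numbers are made up.
import numpy as np

THRESHOLD = 1.0  # |log2 fold change| above which a gene counts as perturbed

# Hypothetical signatures over a six-gene toxicity-associated panel.
compounds = {
    "CPD-001": np.array([0.2, -0.1, 0.4, 0.1, -0.3, 0.0]),
    "CPD-002": np.array([2.3, -1.5, 1.8, 0.2, -0.4, 1.1]),
    "CPD-003": np.array([1.2, -0.2, 0.3, 1.4, -1.1, 0.5]),
}

def tox_score(signature, threshold=THRESHOLD):
    """Count toxicity-panel genes perturbed beyond the threshold."""
    return int(np.sum(np.abs(signature) > threshold))

# Lower score = fewer perturbed tox-panel genes = higher priority.
for name, sig in sorted(compounds.items(), key=lambda kv: tox_score(kv[1])):
    print(f"{name}: {tox_score(sig)} perturbed tox-panel genes")
```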

At the moment, Millennium is evaluating the effectiveness of its approach to toxicogenomics, and the results are mixed, Smith says. The scientists working in toxicogenomics have generated some very interesting and potentially useful results, he says, but from a value standpoint the investment will only start paying off once Millennium can make the process truly high throughput. “We’ve calculated the actual cost of the manpower, the cost of the chips, the cost of preparing and running the studies, and we’ve concluded that we need to do this in a high-throughput mode, and we probably need to use high-throughput PCR to really make it a value for us,” Smith says. “These are all new conclusions that we’re in the process of coming to.”

Improvements still needed

Cockett and Car at BMS and Blomme at Abbott agree that there are ways to make toxicogenomics a more effective component of a robust drug discovery and development strategy. Improving the consistency and quality control of samples derived from live animal studies would translate into more meaningful gene expression studies, Cockett at BMS says. Like Smith at Millennium, he thinks that limited throughput for toxicogenomics experiments constrains its potential contribution to the drug discovery process. Currently, BMS researchers are working with Affymetrix to design a new GeneChip system that would allow the company to increase its throughput of samples for toxicogenomics analysis five- to 10-fold, Cockett adds.

Blomme at Abbott also made clear that a higher-throughput system for carrying out toxicogenomic studies would positively impact its value to the company. Defining predictive gene expression profiles and developing algorithms to use them in future studies is a relatively well-established methodology, he says, but the limitation to its cost effectiveness lies in the relatively slow laboratory systems for carrying out the experiments. “We need a more cost-effective, faster platform,” he says, “with robotics and automated systems for increasing throughput.”

One could argue that adding data on protein expression profiles associated with toxicity could augment the predictive power of toxicogenomics, says Smith at Millennium, because information at the protein level might tie more closely with actual biochemical mechanisms. However, scientists agreed that proteomics technology is at a much earlier stage in its lifespan as an analytical technique than gene expression profiling. “Proteomics hasn’t panned out yet in its ability to be comprehensive,” says Cockett, “whereas the genome is sufficiently described — for rats, mice, humans — and there are mature tools and techniques for analyzing gene expression results.”

FDA steps in

In spite of the limitations, however, it’s pretty clear toxicogenomics is here to stay. Even the Food and Drug Administration is in the game, with FDA currently licensing the Iconix DrugMatrix database as a means of helping train reviewers on the significance and proper interpretation of gene expression data in the context of toxicology. In addition, scientists at FDA’s National Center for Toxicological Research in Jefferson, Ark., have developed a piece of software called ArrayTrack designed to help reviewers evaluate gene expression data submitted as part of applications for new drug approval, says Dan Casciano, NCTR’s director.

And given FDA’s encouragement to pharma companies to submit gene expression data as part of toxicology and efficacy studies, it follows that toxicogenomics will continue to play a significant — if at the moment limited — role in the drug development process. Already, Cockett and Car at BMS and Blomme at Abbott say their companies are interested in, if not already participating in, FDA’s effort to educate its reviewers on the potential value of toxicogenomic data.

“The idea is that we would provide the FDA interdisciplinary review group with the opportunity to get some material to develop their skills,” says Blomme. “It’s in the best interests of industry to help the FDA develop their understanding of what data they may receive and develop the tools to understand them. Likewise, I think we will benefit from going through the process.”

 

Helping Drug Development? SiRNA Is the Drug

Applying genomics to drug development usually means developing a tool or technology useful in learning something about a new compound with therapeutic potential. But what if that tool is also the therapeutic agent? It just so happens that this is the case with RNA interference, and in recent months several groups of researchers in academia and industry have begun making significant progress in pushing siRNA-based therapeutics into the clinic.

That a short interfering strand of RNA can be designed to knock down the expression of a troublesome gene implicated in a disease is an established fact. The hard part in turning siRNAs into drugs has been delivering the reagent to where it’s needed in the body. In the bloodstream, siRNAs tend to be cleared quite rapidly by the kidneys and destroyed, and even a reagent that escapes the kidneys must still pass through cell membranes to reach the mRNAs it’s designed to knock down.
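
The sequence-design side of the problem, at least, is tractable in software: published design heuristics score candidate target sites on features such as GC content. The sketch below applies one such rule of thumb, a moderate GC window of roughly 30 to 52 percent, to a made-up mRNA fragment; real pipelines, Alnylam’s and Sirna’s included, weigh many more factors, and delivery remains the harder problem.

```python
# A minimal sketch of one published siRNA design heuristic: scan an
# mRNA for 19-nt target sites with moderate GC content. The sequence
# is invented, and this is only one factor among many in real design
# rules; no company's actual pipeline is shown here.
def candidate_sites(mrna, length=19, gc_range=(0.30, 0.52)):
    """Yield (position, site) pairs whose GC fraction falls in range."""
    mrna = mrna.upper()
    for i in range(len(mrna) - length + 1):
        site = mrna[i:i + length]
        gc = (site.count("G") + site.count("C")) / length
        if gc_range[0] <= gc <= gc_range[1]:
            yield i, site

toy_mrna = "AUGGCUAAACGAUUUCGAUAGCUUAGGCAUCGAUAAGCUU"
for pos, site in candidate_sites(toy_mrna):
    print(f"position {pos}: {site}")
```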

But scientists at Alnylam Pharmaceuticals in Cambridge, Mass., have developed one approach to delivering siRNA reagents systemically, by intravenous injection. In a paper published last November in Nature, Alnylam scientists described how chemically modified siRNAs silenced a gene, expressed in the liver and intestine of mice, that encodes apolipoprotein B, a protein associated with high cholesterol.

To do this, researchers led by Hans-Peter Vornlocher at Alnylam’s Kulmbach, Germany-based lab modified the siRNAs using a technology licensed from Isis Pharmaceuticals, and coupled the siRNAs to a cholesterol molecule that helps protect the reagent from elimination by the kidney and promotes cell uptake. According to Nagesh Mahanthappa, Alnylam’s senior director for business development and strategy, this strategy shows particular promise for diseases associated with the liver, such as hepatitis C, and metabolic disorders.

Meanwhile, at Sirna Therapeutics, headquartered in San Francisco, researchers have developed analogous strategies for modifying siRNAs to improve their pharmacokinetic properties, as well as a method for coupling an siRNA reagent to a lipid-based nanoparticle that can target hepatocytes quite efficiently, according to Barry Polisky, Sirna’s senior vice president for research and CSO. In addition, Sirna is developing a topical delivery system that the company hopes will prove useful in targeting a gene that, when silenced, prevents unwanted hair growth.

— JSM