FEATURE: As Pipelines Wither, Pharma and FDA Explore Whether Microarrays Are Ready for Primetime

This is the first in a two-part series that examines how--or whether--drug companies and the US Food and Drug Administration can use microarray data in the drug-approval process. Today's installment covers whether microarray technology has a place in the later stages of drug development, while tomorrow's article will explore how the FDA and pharma firms plan to usher the technology into the mainstream.

 

NEW YORK, April 15 - A little knowledge can be a dangerous thing.

 

Just ask drug developers and regulators, who are struggling to decide whether and how the US Food and Drug Administration should begin incorporating microarray data into its drug-evaluation process.

 

If it lives up to its promise, increasingly powerful microarray technology could revolutionize toxicology testing and provide entirely new insights into drug mechanisms.

 

But when can the pharmaceutical industry start trusting the technology? When should it make the leap from investigative, experimental technique to industry standard? And should the FDA start using microarray data to make critical decisions about the safety and efficacy of new drugs?

 

In short, are microarrays ready for regulatory primetime?

 

Yes, say some drug developers, who point to the potentially invaluable knowledge that microarrays could provide.

 

No, say others, who fear that the uncertainty, sloppiness, and lack of reproducibility that still plague the chips will inadvertently kill off potentially valuable drugs.

 

A dozen industry and agency experts recently chewed over this question at an FDA forum about new genomic technologies in regulatory decision making. Many agreed that microarrays could have a glorious future in drug development, but several voiced the same concern: Abruptly thrusting this technology into the regulatory sphere is a bad idea.

 

"Unfortunately, many people who work in this area are in a bit of damage control at this moment, because it was perceived as an incredibly powerful technology when it was first introduced," William Pennie,  director of molecular and investigative toxicology at Pfizer, said in an interview with GenomeWeb. "If we don't consider the technology's lack of maturity--if we buy into the hype around it ... we're positioning ourselves to do more damage than good. We'll be chasing red herrings, we'll be trying to find significance in insignificant data."

 

Pennie, who was one of the speakers at the FDA forum, emphasized microarrays' current upstream value, particularly in screening, ranking, and exploring potential compounds. The technology helps "open the black box," he said.

 

But Pennie and others warn that taking the data too seriously too soon could backfire.

 

For one thing, there's simply too much data, much of it worrisome and little of it conclusive. Now that near-global chips make it possible to run tens of thousands of assays simultaneously, hundreds of false positives are almost guaranteed. Distinguishing real toxicity warnings from the false positives is no simple matter for either pharmaceutical researchers or FDA reviewers.
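The arithmetic behind that worry is straightforward multiple-testing math. As a rough, hypothetical illustration--the array size, significance cutoff, and the Benjamini-Hochberg correction below are assumptions for the sketch, not figures or methods cited at the forum--a few lines of Python show why a genome-scale chip flags hundreds of genes by chance alone, and how a false-discovery-rate correction can rein that in:

# Illustrative multiple-testing arithmetic; the numbers are assumed, not taken from the article.
n_genes = 20_000   # probes on a hypothetical genome-scale expression chip
alpha = 0.05       # conventional per-gene significance threshold

# Even if a compound altered no gene at all, testing every probe at alpha = 0.05
# would flag roughly n_genes * alpha of them by chance.
expected_chance_hits = n_genes * alpha
print(f"Expected false positives with no real effect: {expected_chance_hits:.0f}")  # ~1,000

def benjamini_hochberg(pvalues, q=0.05):
    """Return indices of tests kept at false-discovery rate q (Benjamini-Hochberg)."""
    n = len(pvalues)
    order = sorted(range(n), key=lambda i: pvalues[i])  # smallest p-values first
    cutoff_rank = 0
    for rank, idx in enumerate(order, start=1):
        if pvalues[idx] <= rank / n * q:                # keep if p_(rank) <= (rank / n) * q
            cutoff_rank = rank
    return order[:cutoff_rank]                          # every test up to the last passing rank

The sketch only frames the problem Sistare describes: a correction like this limits the expected share of false hits, but it cannot tell a reviewer which of the surviving gene-expression changes actually signal toxicity.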

 

"It's a very difficult challenge to explain each and every one of the changes that may occur," said Frank Sistare, director of the division of applied pharmacology research for the FDA's Center for Drug Evaluation and Research. "The situation that industry is in is as follows: If I ask a question which I'm not obligated to ask, and if I thereby raise issues and concerns that I can't address, will that hurt me more than help me?"

 

Sistare, who is also co-chair of the FDA's genomic and proteomic intercenter working group, said that FDA reviewers are starting to get a dribble of data from small, focused arrays, mostly limited to toxicology and tumor typing.

 

The agency would like to see more of it, he said, but pharmaceutical developers may fear a double bind. If they start using toxicology chips further downstream in clinical development, they are likely to find gene-expression changes that they can't interpret but are obliged to report to the FDA--a nasty little hurdle when one considers that perhaps only 10 percent of data coming from the largest arrays is really understood, Sistare said.

 

By playing it safe and using microarrays only on reference toxicants or "dead" drugs, though, drug companies may be throwing the baby out with the bathwater for fear of what the FDA or other regulatory agencies might say.

 

That hesitancy could cripple the adoption of microarray technology in the pharmaceutical industry, Pennie told the FDA forum, which convened in February. "We're using [microarrays] in a rather conservative way...when there's a real opportunity for learning if we start to apply the technology on real issues."

 

The delicate challenge now facing the agency is how to create a regulatory climate that encourages pharmaceutical companies to use this technology--and allows them to make use of the data it generates.

 

It's a tightrope act, and an important part of genomics' future contribution to pharma hangs in the balance. "This is a fresh technology, still in the process of evolution," said Sistare. "If we try to be terribly proscriptive now, we can do more harm than good."
