The Need to Make Microarrays Mainstream


FDA and pharma experts alike worry that gene arrays may generate too much misleading data that could accidentally sink a tender young drug. But if drug developers shy away from using gene chips, many say the pharmaceutical industry will suffer.


By Kathleen McGowan


A little knowledge can be a dangerous thing.

Just ask drug developers and regulators, who are struggling to decide whether and how the US Food and Drug Administration should begin incorporating microarray data into its drug-evaluation process.

If it lives up to its promise, increasingly powerful microarray technology could revolutionize toxicology testing and provide entirely new insights into drug mechanisms.

But when can the pharmaceutical industry start trusting the technology? When should it make the leap from investigative, experimental technique to industry standard? And should the FDA start using microarray data to make critical decisions about the safety and efficacy of new drugs?

In short, are microarrays ready for regulatory primetime?

Yes, say some drug developers, who point to the potentially invaluable knowledge that microarrays could provide.

No, say others, who fear that the uncertainty, sloppiness, and lack of reproducibility that still plague the chips will inadvertently kill off potentially valuable drugs.

A dozen industry and agency experts recently chewed over this question at an FDA forum about new genomic technologies in regulatory decision making. Many agreed that microarrays could have a glorious future in drug development, but several voiced the same concern: Abruptly thrusting this technology into the regulatory sphere is a bad idea.

“Unfortunately, many people who work in this area are in a bit of damage control at this moment, because it was perceived as an incredibly powerful technology when it was first introduced,” says William Pennie, director of molecular and investigative toxicology at Pfizer. “If we don’t consider the technology’s lack of maturity — if we buy into the hype around it — we’re positioning ourselves to do more damage than good. We’ll be chasing red herrings, we’ll be trying to find significance in insignificant data.”

More harm than help?

Pennie, who was one of the speakers at the FDA forum, emphasizes microarrays’ current upstream value, particularly in screening, ranking, and exploring potential compounds. The technology helps “open the black box,” he says.

But Pennie and others warn that taking the data too seriously too soon could backfire.

For one thing, there’s simply too much data, much of it worrisome and little of it conclusive. Now that near-global chips make it possible to run tens of thousands of assays simultaneously, hundreds of false positives are almost guaranteed. Distinguishing real toxicity warnings from the false positives is no simple matter for either pharmaceutical researchers or FDA reviewers.

“It’s a very difficult challenge to explain each and every one of the changes that may occur,” says Frank Sistare, director of the division of applied pharmacology research for the FDA’s Center for Drug Evaluation and Research. “The situation that industry is in is as follows: If I ask a question which I’m not obligated to ask, and if I thereby raise issues and concerns that I can’t address, will that hurt me more than help me?”

Sistare, who is also co-chair of the FDA’s genomic and proteomic intercenter working group, says that FDA reviewers are starting to get a dribble of data from small, focused arrays, mostly limited to toxicology and tumor typing.

Tightrope act

The agency would like to see more of it, he says, but pharmaceutical developers may fear a double bind. If they start using toxicology chips further downstream in clinical development, they are likely to find gene-expression changes that they can’t interpret but are obliged to report to the FDA — a nasty little hurdle when one considers that perhaps only 10 percent of data coming from the largest arrays is really understood, Sistare says.

By playing it safe and using microarrays only on reference toxicants or “dead” drugs, though, drug companies may be throwing the baby out with the bathwater for fear of what the FDA or other regulatory agencies might say.

That hesitancy could cripple the adoption of microarray technology in the pharmaceutical industry, Pennie told the FDA forum. “We’re using [microarrays] in a rather conservative way ... when there’s a real opportunity for learning if we start to apply the technology on real issues.”

The delicate challenge facing the agency now is how to create a regulatory climate that encourages pharmaceutical companies to use this technology — and allows them to make use of the data it generates.

It’s a tightrope act, and an important part of the future of the genomic contribution to pharma is in the balance.

“This is a fresh technology, still in the process of evolution,” says Sistare. “If we try to be terribly proscriptive now, we can do more harm than good.”

Careful collaborations and fireside chats

To navigate these straits, drug developers and regulators concur that they’re going to have to do something that doesn’t always come naturally: sit down and really talk.

The next year will bring a regular gabfest of data sharing and industry/agency summits designed to help hammer out just how data from new genomic technologies can be organized, standardized, and phased into the regulatory process.

“Essentially, philosophically, the biggest barrier is fear,” Sistare says. “Fear that we do not have the sound judgment and skills to correctly interpret the data, and that we will misinterpret the data.”

In an effort to conquer those fears, FDA planned a mid-May workshop with pharma to discuss microarray and polymorphism data and to focus on data quality and the preclinical and clinical applications of these new tools.

FDA is also giving its drug reviewers a crash course in pharmacogenetic and genomic data, and is now developing new guidances to clarify some of the regulatory ambiguity around microarray data.

“Part of this process that we’re going through with pharma is to make sure we each understand each other, try to put that together, and put that on paper,” says Sistare. “That would be the ideal.”

Lose the magic

Another promising joint venture is a collaborative microarray test project coordinated by the International Life Sciences Institute’s committee on the use of gene expression and proteomics in risk assessment.

That committee, chaired by Pfizer’s Pennie, has organized about 35 labs from industry, government, and academia to conduct parallel gene-expression experiments to establish basic, reproducible microarray data on some well-known prototypic genotoxins, hepatotoxins, and nephrotoxins.

The project, like a shakedown on a global scale, will ideally produce a fuller picture of both the possibilities and limitations of gene-chip toxicology, Pennie says. Lab work is now wrapping up, and the committee is developing a public gene-expression and -analysis database with the European Bioinformatics Institute that is scheduled to kick off in 2003.

This sort of open-minded, experimental approach is exactly what the industry needs to carry microarrays and other genomic technologies through the rigors of standardization and validation, agreed many of the panelists at the FDA forum.

It’s not just an extraordinary opportunity for the genomic industry — it’s also an obligation. “We’re in an era of an amazing set of technologies,” Thomas Cebula, director of the FDA’s division of molecular biological research and evaluation, said at the forum. “It’s as [science-fiction writer] Arthur C. Clarke said: ‘Any sufficiently advanced technology is indistinguishable from magic.’ [We need] to take some of the magic out of it and replace it with a firm underpinning of good science.”
