
FEATURE: Nothing to Fear But Fear Itself: How FDA and Pharma Will Make Microarrays Mainstream

This is the second in a two-part series that examines how--or whether--drug companies and the US Food and Drug Administration can use microarray data in the drug-approval process. Today's installment explores how the FDA and pharma firms plan to usher the technology into the mainstream. Yesterday's article covered whether microarray technology has a place in the later stages of drug development.


NEW YORK, April 16 - How do you transform a promising new technique from a fascinating experiment into an industry standard? In the case of microarrays and related genomic technologies, the answer seems to be carefully, collaboratively, and with a whole lot of chatty meetings.


Drug developers and drug reviewers are now grappling with the tricky regulatory question of how to make use of the onslaught of rich but confusing microarray data. Experts on both sides of the regulatory fence worry that gene arrays may generate too much misleading data that could accidentally sink a tender young drug. But if overcautious drug developers shy away from powerful gene chips completely, the pharmaceutical industry as a whole will be held back.


To navigate these straits, drug developers and regulators concur that they're going to have to do something that doesn't always come naturally: sit down and really talk.

The next year will bring a regular gabfest of data sharing and industry/agency summits designed to help hammer out just how data from new genomic technologies can be organized, standardized, and phased into the regulatory process.


"Essentially, philosophically, the biggest barrier is fear," said Frank Sistare, director of the FDA's division of applied pharmacology research for the Center for Drug Evaluation and Research. "Fear that we do not have the sound judgment and skills to correctly interpret the data, and that we will misinterpret the data." Sistare spoke at a February FDA forum on genomic technologies in regulatory decision making.


In an effort to conquer those fears, the agency has planned a mid-May workshop with pharma to discuss microarray and polymorphism data. The meeting will focus on data quality and the preclinical and clinical applications of these new tools.


The FDA is also giving its drug reviewers a crash course this spring in pharmacogenetic and genomic data, and is now developing new guidances to clarify some of the regulatory ambiguity around microarray data.


"Part of this process that we're going through with pharma is to make sure we each understand each other, try to put that together, and put that on paper," said Sistare. "That would be the ideal."


Another promising joint venture is a collaborative microarray test project coordinated by the International Life Sciences Institute's committee on the use of gene expression and proteomics in risk assessment.


That committee, chaired by William Pennie, an official at Pfizer, has organized about 35 labs from industry, government, and academia to conduct parallel gene-expression experiments to establish basic, reproducible microarray data on some well-known prototypic genotoxins, hepatotoxins, and nephrotoxins.


The project, like a shakedown on a global scale, will ideally produce a fuller picture of both the possibilities and limitations of gene-chip toxicology, said Pennie, who directs Pfizer's molecular and investigative toxicology research. Lab work is now wrapping up, and the committee is developing a public gene-expression and -analysis database with the European Bioinformatics Institute that is scheduled to kick off in 2003.


This sort of open-minded, experimental approach is exactly what the industry needs to carry microarrays and other genomic technologies through the rigors of standardization and validation, agreed many of the panelists at the February forum. 


It's not just an extraordinary opportunity for the genomic industry--it's also an obligation. "We're in an era of [an] amazing set of technologies," Thomas Cebula, director of the FDA's division of molecular biological research and evaluation, said at the forum. "It's as [science-fiction writer] Arthur C. Clarke said: 'Any sufficiently advanced technology is indistinguishable from magic.' [We need] to take some of the magic out of it and replace it with a firm underpinning of good science."
