Gov't Study Identifies Roots of Variety In Array Data; PGx Upshot Observed


A recent study by a consortium funded by the US National Institute of Environmental Health Sciences has identified the sources of variation between gene-expression microarrays, Pharmacogenomics Reporter has learned.

In addition, a second consortium may be set up after a US National Institute of Standards and Technology meeting in May to provide the next step in the process of microarray standardization. At this meeting, which will take place in Boulder, Colo., on May 16, the NIST will discuss the idea of forming a gene-expression metrology consortium that may provide reference data and models of array chemistry to help array manufacturers make their platforms compatible between experiments and brands.

Together, these consortia may help encourage the use of microarrays in drug discovery and development, which is a lynchpin in the broader use of pharmacogenomics technologies by pharma.

In its study, which appears in the May issue of Nature Methods, the NIEHS, through its Toxicogenomics Research Consortium, compared microarray data and methods using data from 64 researchers at seven locations.

"What [the US Food and Drug Administration's Interdisciplinary Pharmacogenomics Research Group] has gained from us is something that they don't have to do — we're showing what the sources of variability are and how to control them, and that gives them a step to move forward without them having to do it themselves," Brenda Weis, an author on the TRC's paper, told Pharmacogenomics Reporter this week. Weis is program administrator in the division of extramural research and training at the NIEHS.

To that end, the NIST's metrology consortium might supply "fundamental reference data on hybridization affinities between particular sequences of DNA or RNA-DNA duplexes that would help to feed into good product development" in the private sector, in addition to chemical modeling, said Mark Salit, project leader of metrology for gene expression at NIST. Salit said he hopes the effort will lead to objectively based product development in the private sector.

Microarray Variability

According to the TRC's study, the most effective way to cut down on inter-experiment variability was through the use of a standard experimental approach. "That's a pretty huge finding," said Weis. "When we launched the study … no one really had an idea of what were the sources of variability in those experiments and what kind of data could be generated," she said.

In fact, the arrays used in the study were the highest sources of variability. "It has a lot to do, we believe, with the level of quality control and engineering that goes into the manufacture of arrays, and that lends itself to our finding that the commercial arrays produced the most consistent results," with some exceptions in large academic laboratories, Weis said.

Another major source of variation between experiments was the slight differences in researchers' laboratory techniques, the study found. "There was a lot of belief beforehand that there were other steps in the process that would contribute significantly — let's say the labeling and hybridization — [but] we didn't find that that contributed nearly as much as" manufacturing differences and lab techniques, said Weis.

The third contributor to variability was the software package used to scan the array, and the "feature extraction parameters" researchers chose while using the software, she said. With a standard set of these parameters, even more variability can be avoided, she added.

The seven laboratories tested two different RNA samples — a rat-liver RNA sample and a mix of RNA from five rat tissues — using their own protocols with microarrays manufactured in-house. The TRC analyzed the results through a systematic analysis of different SNPs, said Weis. "Some of them were using Affymetrix' [GeneChips] and [GE Healthcare's] CodeLink, and we had two standard arrays, which we manufactured; one was done by one of our research labs, and the other was developed in a partnership with Icoria and Agilent," she said.

Could the study help lead to standard microarray methods? "This was a huge outlay of time, resources, and expertise to conduct our study, and the findings are quite definitive, and that's going to help [pharma and regulators] make decisions about which way to go," said Weis. "They are most interested in that."

"We do a lot of interface with pharma, FDA, and US Environmental Protection Agency, and they're waiting to hear what our findings are," Weis said. "I can bet you that internally, these pharma companies are using their own standard reference materials," she said. Members of the consortium sit on "a lot" of committees with drug makers and tool vendors such as Merck, Pfizer, Agilent, Amgen, Gene Logic, and others who are interested in getting "answers they can build on," she added. "We've given them some best practices to use."

"Because we now understand the sources of [laboratory and assay] variability, we can focus on the sources of biological variability," said Weis. The TRC has completed the second phase of its study, which addresses biological variability across model systems, and "we're looking for conserved responses, because that's the best way to understand biological processes," Weis said. "We've collected our data and are getting ready to publish that in the next year."

Another Step Down Standardization Road?

Regardless of whether it creates the second consortium next month, NIST will lay out its plan for easy platform comparison. Traditional chemical metrology tools should prove useful for describing microarray gene-expression analysis, said NIST's Salit. "Those sort of tools look like being able to estimate uncertainties for gene-expression profiles — in other words, what are all the components that give rise to variability, and can we quantitatively estimate them and come up with a confidence interval on a particular gene's expression level?"
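The uncertainty estimation Salit describes can be illustrated with the classic one-way random-effects decomposition: split the observed spread of a gene's measurements into a between-lab component and a within-lab (replicate) component, then combine them into a standard uncertainty and confidence interval on the mean expression level. The sketch below is illustrative only; the lab names and log2 expression values are hypothetical, and a full metrological treatment would use t-quantiles with effective degrees of freedom rather than the normal approximation shown here.

```python
import math
import statistics

# Hypothetical replicate log2 expression values for one gene,
# measured at three labs (names and numbers are illustrative).
measurements = {
    "lab_A": [8.1, 8.3, 8.0],
    "lab_B": [8.6, 8.5, 8.7],
    "lab_C": [8.2, 8.4, 8.3],
}

n_labs = len(measurements)
n_reps = len(next(iter(measurements.values())))

# Within-lab variance: pooled variance of replicates inside each lab.
within = statistics.mean(
    statistics.variance(vals) for vals in measurements.values()
)

# Between-lab variance component: spread of the lab means, with the
# replicate-noise contribution subtracted out (floored at zero).
lab_means = [statistics.mean(vals) for vals in measurements.values()]
between = max(0.0, statistics.variance(lab_means) - within / n_reps)

# Variance of the grand mean combines both components.
grand_mean = statistics.mean(lab_means)
var_grand = between / n_labs + within / (n_labs * n_reps)
std_uncertainty = math.sqrt(var_grand)

# Approximate 95% confidence interval (normal approximation).
ci = (grand_mean - 1.96 * std_uncertainty,
      grand_mean + 1.96 * std_uncertainty)
print(f"expression = {grand_mean:.2f} +/- {1.96 * std_uncertainty:.2f} (log2)")
```

In a real cross-platform study the decomposition would have more strata (platform, lab, protocol, scanner software), but the principle is the same: each stratum contributes a quantifiable share of the error bar on the final expression value.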

The group will likely also focus on validation, said Salit. "That might get into finding quantitative ways to assess not only uncertainty, but bias," such as cross-hybridization, error bars for expression levels, and calibration issues, he said.

"The major microarray manufacturers have shown interest in finding out more about this," as well as whether competitors can work on the project without conflicting proprietary interests, Salit said. He declined to say whether any manufacturers had shown reluctance to join in. "Some have probably been less enthusiastic than others," while some companies may have more of an interest, he said, without elaborating. "The enthusiasm has been encouraging," he added.

"Ultimately, they'll be able to better understand the quality of their microarray gene-expression results, and from that draw a better inference about performance of a drug candidate," Salit said.

— CW
