A number of standardization efforts are inching forward through multiple consensus-building initiatives as the microarray industry, with its 10-year-old technology, eyes entry into the $20 billion-a-year molecular diagnostics marketplace.
This move toward standardization, in one of the fastest-growing segments of the molecular biology tools market, is also critical if the technology is to remain viable in a future where the instruments and the data coming off ‘omics’ technology platforms will have to interoperate and function in a more interconnected data environment.
The demand for interoperability is driven by scientists’ desire to build systems-level biological knowledge. But that is a nascent market, one that is just beginning to organize and is not near commercial viability.
The microarray industry’s standardization effort may well illustrate the balancing act that molecular biology tool makers face: nurturing an ongoing market while deploying resources, and absorbing the costs, to develop tools that are believed to be critical for a future that today exists only at the conceptual stage.
Today, a path more attractive than mass-manufacturing microarrays that cost $600 apiece (which is what big shops such as Affymetrix, Agilent, and GE are doing) is selling physicians inexpensive, easy-to-use instruments built on microarray-based technology. But that path runs directly through the US Food and Drug Administration.
Regulatory approval is the hoop that the industry will have to jump through en route to the high-growth clinical market, and there is no doubt that this technology will one day enter that marketplace. The question is when.
The microarray industry is involved in many wide-ranging efforts to carve out supporting technologies, protocols, and certifications to take the “trust-me element” out of the data that comes from microarray-based gene-expression analysis. The industry wants to move the technology to higher levels of reproducibility and reliability that might one day enable researchers to use the data to help guide clinical decisions.
Today’s microarray technology processes form a multidimensional matrix that could almost serve as the very definition of variability. Many factors can affect a microarray experiment: sample preparation, hybridization, the accuracy of the DNA sequence used to design the probe, the amount of ozone in the air, and the level of experience of the lab technician performing the work.
The National Institute of Standards and Technology, a unit of the US Department of Commerce, thinks it will take about five years to throw a statistical lasso around variability, tackle the molecular biology and bioinformatics, and develop a score for telling how well the technology can describe what genes are expressed in a complex mixture.
NIST, based in Gaithersburg, Md., has set aside $5 million to fund a new program to enhance the reliability of products and services based on gene-expression technologies. The program, which is part of a larger project to bring gene expression technologies to the clinic, will be run by Marc Salit, a NIST chemist.
The program contains a $1 million equipment budget to build a “world class quantitative PCR laboratory” for doing benchmark measurements for application to the project, according to Salit. The work will be done in conjunction with microarray companies, and NIST is looking to hire a number of investigators and technicians.
“We will be completely agnostic with respect to platform,” Salit said. “The potential return for this NIST investment of a few million dollars is at least 10 times that investment for the country. That is what NIST has seen its activities do for other industries with standards-related problems in the past.”
Those industries include semiconductors, software interoperability, and the petrochemical industry in its early days, said Fireovid.
“The distillation towers you see in New Jersey outside of New York, they had to have thermodynamic data in order to design the equipment,” Fireovid said.
In NIST’s fact-finding effort prior to funding the program, researchers pinned a $350 million value on the microarray market, counting only arrays. “The annual growth rate in dollars is not that big,” he said. “Growth won’t come from numbers, where prices are going to come down. But the real areas for growth are in clinical trials and diagnostics, where NIST hopes to make a big difference.
“If NIST doesn’t do this, it could impact the growth significantly,” he added.
NIST is also working at all levels of the FDA, from the commissioner’s office to a number of different FDA laboratories, Salit said.
ERCC in Action
The External RNA Controls Consortium consists of a mix of scientists and multi-omic-platform manufacturers such as ABI and Agilent. This group has been meeting for a year to design RNA controls to include with experiments.
“Things are slow because of the validation process — validated control sets have to be [validated],” said Andy McShea, director of applied science at CombiMatrix of Mukilteo, Wash., a microarray platform seller. “The meetings are very good, [and] there is not much posturing. The [consortium] is getting close to a final validation.”