Expression Analysis is preparing to begin the second round of its proficiency testing program for microarray facilities, which compares the variability of Affymetrix gene expression results both between and within laboratories.
But while experts agree that standardization will be important to the industry, whether or not the US Food and Drug Administration will require proficiency testing as a basis for approval remains a big question.
Expression Analysis recently completed phase 1 of the program, which is not intended to grade facilities on a pass/fail basis, but rather to provide a basis for developing performance standards. The company believes that these performance standards are likely to be crucial to firms that eventually seek FDA approval of microarray-based diagnostic products.
Steve McPhail, president and CEO of the Durham, North Carolina-based microarray services firm, told BioArray News, “We thought that through development of this type of program, the benefits would be to participate in evaluating performance standards for use in microarray gene expression assays, to assess inter- and intra-laboratory comparability issues crucial to microarray process validation, to provide input into microarray data quality benchmarks that may be required for regulatory submission, and to contribute to the development of a standard analysis method for microarray data review.”
He added, “In clinical diagnostics, proficiency testing is done in all CLIA-certified laboratories. So, we felt as microarray testing progressed into clinical trials and use in clinical diagnostics, proficiency testing would become a requirement.”
Andy Brooks, director of the Academic Medicine Development Company (AMDeC) Microarray Resource Center, agreed. “I think it’s going to be essential,” he said. “It is going to be critical for the ultimate standardization of these technologies.”
AMDeC, a biomedical research center with 39 affiliated institutions throughout New York State, has a contract to collect data from participants in the proficiency program.
“From a basic science perspective, MIAME (Minimal Information About a Microarray Experiment) has come a long way in defining the vocabulary we need to be able to compare data,” said Brooks. “But ultimately, there is still a big disconnect over the efficiency and variability from lab to lab. And if labs participate in the proficiency testing program, not only could you look at your own data over time to measure your performance but you could also more effectively compare data between labs.”
The FDA, however, has not yet said that proficiency testing will be a requirement for approval of microarray-based diagnostic products for clinical or non-clinical use. John Leighton, a supervisory pharmacologist within the FDA’s Division of Oncology Drug Products who has extensive experience with microarray technology, said that it is entirely possible the FDA won’t require such testing as a basis for approval, at least not when it comes to a specific test.
“Probably what we’ll say in the end is, from the non-clinical perspective, that what you need to do is demonstrate certain characteristics of your assay,” he said. “So, we’ll take a step back and define what you need to do, and how you go about that will be up to you. That probably allows for the most flexibility,” and would allow the makers of different platforms to come up with their own testing for consistency in data.
But, Leighton added, “The thinking could be modified. The field is rapidly evolving, and it may turn out that one standard is appropriate for all tests.”
Four Rounds of Affy
In phase 1 of the program, Expression Analysis provided participants with relative “deviation-driven” measures to compare experimental outcomes. Thirteen facilities participated in the first phase of the program. Each facility received three samples from two separate rat tissue pools with signature spikes in each sample. The facilities all used Affymetrix GeneChips to assay the samples and then sent the data back to Expression Analysis for analysis.
“We looked at intra-laboratory reproducibility, we looked at comparability across labs of gene lists, and we looked at sensitivity across labs,” McPhail said. “So, each individual lab is compared to a greater laboratory population.”
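Expression Analysis has not published how it scores these three measures, so any concrete formulation is necessarily an assumption. Purely as an illustration, the sketch below shows one plausible way to frame them: per-probe-set coefficient of variation for intra-laboratory reproducibility, gene-list overlap between labs for comparability, and recovery of spiked-in transcripts for sensitivity. The function names and values are hypothetical, not the company’s actual method.

```python
# Hypothetical sketch only: Expression Analysis has not disclosed its scoring
# method, so these metric definitions are illustrative assumptions.
import numpy as np


def intra_lab_cv(replicates: np.ndarray) -> np.ndarray:
    """Coefficient of variation per probe set across one lab's replicate chips.

    replicates: array of shape (n_replicates, n_probesets) of signal values.
    """
    return replicates.std(axis=0, ddof=1) / replicates.mean(axis=0)


def gene_list_overlap(list_a: set, list_b: set) -> float:
    """Jaccard overlap of the differentially expressed probe sets called by two labs."""
    return len(list_a & list_b) / len(list_a | list_b)


def spike_sensitivity(called: set, spiked: set) -> float:
    """Fraction of spiked-in transcripts that a lab correctly detects."""
    return len(called & spiked) / len(spiked)


# Made-up example values for three replicate chips and three probe sets.
lab1 = np.array([[120.0, 85.0, 300.0],
                 [118.0, 90.0, 310.0],
                 [125.0, 88.0, 295.0]])
print(intra_lab_cv(lab1))                                   # per-probe-set CV
print(gene_list_overlap({"A", "B", "C"}, {"B", "C", "D"}))  # 0.5
print(spike_sensitivity({"S1", "S3"}, {"S1", "S2", "S3"}))  # ~0.67
```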
The second phase of the program is set to begin in mid-August and will look at the same metrics that were analyzed in the first phase. “This will give us the unique opportunity to look at data over time … [and] we’ll compare round 1 data to round 2 data,” McPhail said.
Expression Analysis plans to run four rounds of the pilot proficiency program. McPhail said the firm may then choose to modify the program and open it up to a larger laboratory population.
He also said that Expression Analysis may eventually open the program to platforms other than Affymetrix, but for now is limiting it to users of Affymetrix systems. “The reason for that is that Affymetrix has major market share at this point in time, and most of the clients with whom we have conversations are utilizing Affymetrix technology.”
He noted that a major obstacle in extending the program to other microarray platforms is comparability related to the gene list used by each platform. He said, “For it to make sense for Agilent or GE (Amersham), for example, we would need to have multiple laboratories. Otherwise, the statistical significance would be questionable without multiple labs participating.”
According to AMDeC’s Brooks, comparison across platforms isn’t possible right now. “I think the limitations in doing this [are] at the level of informatics, not at the level of the array platform itself,” he said.
He noted that he is chairing a panel this summer at an Association of Biomolecular Resource Facilities meeting that will focus solely on the comparison of microarray platforms and the best way to do it.
Expression Analysis’ proficiency program came out of work it was doing with the FDA on developing guidelines on the format, content, and context of microarray data. That project included a “mock submission” of data from a drug project that Schering-Plough had discontinued.
“It seemed as though one of the things the agency was really interested in was identifying levels of reproducibility and comparability between laboratories,” McPhail said. “So, we felt as though this proficiency testing might provide a good forum to begin addressing some of these questions.”
The FDA’s Leighton noted that the agency is working with a number of different vendors to come up with what’s appropriate for genomic analysis from a non-clinical study, and it also is working with various companies on the development of standards.
But reaching conclusions about what the FDA will require from companies submitting data from microarray platforms may be a long way off. Leighton noted that the agency has yet to publish a guidance document seeking comments on such potential requirements. “Is anything going to happen in the next six months? Probably not,” he said.
— EW