After nearly two years of receiving voluntary genomic data submissions as part of its March 2005 pharmacogenomics guidance, the US Food and Drug Administration has released a concept paper that could require all CLIA-approved labs that use microarrays to undergo proficiency testing.
Federico Goodsaid, a senior staff scientist in the genomics group of FDA's Office of Clinical Pharmacology, told BioArray News this week that the agency has outlined its suggestions for proficiency testing and a number of other topics in a concept paper called "Recommendations for the Generation and Submission of Genomic Data," and is now seeking public comment on the recommendations for an unspecified period. Goodsaid declined to discuss timing further.
Once the public comment period is over, the FDA will then consider adding the terms of the concept paper to the official PGx guidance. Goodsaid said it is possible that the PGx document could be amended to include the new provisions as soon as next year.
He added that the need for proficiency testing came from the agency’s experience with VGDS submissions as well as the recent publication of the results of the Microarray Quality Control project in September (see BAN 9/12/2006).
The MAQC evaluated the reproducibility and reliability of microarray data by comparing the data from the same human RNA samples run on different platforms at different testing sites. While the project was treated as a landmark breakthrough on microarray platform concordance issues — Nature Biotechnology devoted an entire issue to the MAQC results — Goodsaid said that it was only one impetus among many for pushing for proficiency testing.
“This is a need based on our own internal experience from VGDS, although the MAQC results added to and reinforced this need,” he said.
In the concept paper, which can be found here, the FDA is more specific about what is driving the need for proficiency testing. As the paper notes, many of the discrepancies in array data submitted to the agency stem from procedural failures at labs, rather than from the experimental platforms that produced the data.
“In many cases, poor quality of microarray data was due not to the inherent quality problems of a platform but to the lack of proficiency of the laboratory that generated the data,” the concept paper states. “Such a ‘procedural failure’ in a laboratory is much more serious than randomly failed hybridizations that lead to outlying arrays. This is because the laboratory may not recognize that it has a procedural failure problem,” it continues.
To remedy the issue, the “agency recommends that sponsors provide data that will enable FDA reviewers to objectively evaluate the competency of the laboratory that generated the data in a genomic submission.” The paper goes on to recommend achievement of laboratory proficiency through a number of approaches, including inter-laboratory testing of well-characterized RNA sources, such as those used by the MAQC project.
The FDA also proposes that proficiency testing programs "should involve replicate samples of two biologically different samples with known differences in transcript abundance." Additionally, it states that when "multiple laboratories are providing data generated using the same RNA samples and the same platform, the comparability of the detected differences in expression between sites can be assessed." According to the agency, this kind of proficiency testing would provide repeatable data comparisons that could enable microarray labs to pass muster for subsequent CLIA certification.
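As a rough illustration of what such an inter-site comparison might look like (this is a hypothetical sketch, not a procedure specified in the concept paper), a lab could compute per-gene log ratios between the two reference RNA samples at each site and then correlate those ratios across sites; all intensity values and function names below are invented for the example:

```python
import math

def log2_ratios(sample_a, sample_b):
    """Per-gene log2 ratios between two biologically different
    reference samples (e.g., the two MAQC RNA sources)."""
    return [math.log2(a / b) for a, b in zip(sample_a, sample_b)]

def pearson(x, y):
    """Plain Pearson correlation of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical mean intensities for five genes, two sites, same platform.
site1_a, site1_b = [100, 220, 80, 510, 45], [50, 200, 160, 250, 90]
site2_a, site2_b = [110, 230, 75, 490, 40], [55, 210, 150, 260, 85]

r = pearson(log2_ratios(site1_a, site1_b),
            log2_ratios(site2_a, site2_b))
print(f"inter-site log-ratio correlation: {r:.3f}")
```

A high correlation of the detected expression differences between sites would suggest procedural proficiency; a low one would flag the kind of "procedural failure" the paper describes.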
In summary, the agency recommends that future VGDS packages include details of microarray experimental design, including information on sample processing and labeling and on how the data were generated and analyzed. In its draft, the FDA also seeks to ensure that submissions include information on scanner calibration settings, software settings for image acquisition, how the data from individual microarrays were combined, the normalization method, data filtering, data analysis, and statistical tests.
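The concept paper asks only that the normalization method be reported, not that a particular one be used. As a hedged sketch of one widely used option for combining data from individual arrays, quantile normalization forces each array's intensity distribution onto the rank-wise mean distribution; the toy values below are invented:

```python
def quantile_normalize(arrays):
    """Quantile normalization: replace each value with the mean,
    across arrays, of the values sharing its rank. A common way to
    make individual microarrays comparable before combining them."""
    n_arrays = len(arrays)
    # Sort each array, then average the values at each rank.
    sorted_cols = [sorted(col) for col in arrays]
    rank_means = [sum(col[i] for col in sorted_cols) / n_arrays
                  for i in range(len(arrays[0]))]
    # Map each original value back to the mean of its rank.
    normalized = []
    for col in arrays:
        order = sorted(range(len(col)), key=lambda i: col[i])
        out = [0.0] * len(col)
        for rank, idx in enumerate(order):
            out[idx] = rank_means[rank]
        normalized.append(out)
    return normalized

# Two toy arrays of three probe intensities each.
print(quantile_normalize([[5.0, 2.0, 3.0], [4.0, 1.0, 6.0]]))
```

After normalization both arrays share the same distribution of values, which is precisely the sort of processing choice the draft asks sponsors to document.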
While the concept paper is at least a year away from being added to the official PGx guidance, some reference labs are already adopting proficiency testing. According to Laura Reid, director of R&D at Expression Analysis, her company is unlikely to see any impact from an updated guidance because it has already been doing proficiency testing for several years.
Still, Reid said that it’s possible that other labs that haven’t been doing proficiency testing will now be forced to institute changes that she expects will benefit the entire microarray community.
"It will certainly impact other facilities that aren't doing proficiency testing at this time," Reid told BioArray News this week. "I think it will have a great impact on all the microarray labs to start comparing our data and noticing the differences in the data and tracking down the variability," she said. "Sometimes the variability is a slightly different protocol and sometimes the variability is inherent in the assay. The more you do of these inter-laboratory assays, the more you get a handle on the variation and the quality of the microarray data."
Reid, a MAQC co-author, said that EA has been using the RNA samples from the MAQC project as well as a variety of other samples to do proficiency testing at its laboratories.