NEW YORK (GenomeWeb News) – The US Federal Government plans to move away from testing for toxicity on animals toward studying compounds using new genomics and other high-throughput screening technologies, in what National Institutes of Health Director Elias Zerhouni today called “high throughput 21st century toxicology.”
Under a new five-year program, researchers at two NIH centers and at the EPA will work together to harness the new technologies each organization is using and to develop new methods that can advance toxicology research.
Specifically, the NIH Chemical Genomics Center, the National Toxicology Program, and the EPA's National Center for Computational Toxicology will collaborate under a memorandum of understanding to implement methods for studying more compounds faster, in greater depth, and with greater relevance than has been possible in the past.
The plan, which National Human Genome Research Institute head Francis Collins and leaders from the NTP and the EPA outline in an article in tomorrow’s issue of Science, is essentially designed to push toxicology studies away from in vivo animal research and toward “in vivo and in vitro assays with lower organisms, and computational models for toxicology assessments.”
In a conference call today, Zerhouni said the collaboration is “the birth of a new approach” that will change the way the government tests for toxicity in the future and will make all of the results about compounds available to the public worldwide.
The prospective budget for the program has not been worked out yet, but the NIH and EPA plan to meet in March to outline a funding strategy and research timeline, Bob Kavlock, director of the National Center for Computational Toxicology, said in the conference call.
In the Science article, Collins, George Gray, and John Bucher call the program “a long-range vision for toxicity testing and a strategic plan for implementing that vision.”
The trans-agency effort is designed to meet several core challenges: the large number of substances that require testing, the need to incorporate recent technology into toxicology, a shift toward relying on human rather than animal data, and the need for greater efficiency.
“The old approach has given valuable information,” Zerhouni said today, but “it is expensive, uses animals, and it is not precise.”
The new effort should enable US agencies to test thousands of compounds under thousands of different conditions “much faster than we did before,” Zerhouni added. “Because of scaling up of technologies,” he said, researchers can “move from testing one compound at a time in animals to testing thousands in cells that are very specific to humans.”
The program will also focus on pooling resources from several different programs.
Currently, there are 2,800 NTP and EPA compounds being tested at the NCGC in more than 50 biochemical and cell-based assays, according to the Science article.
The EPA’s ToxCast program, launched in 2007, is currently profiling over 300 toxicants, most of which are pesticides, across more than 400 endpoints, including biochemical assays of protein function, cell-based transcriptional reporter assays, multicell interaction assays, transcriptomics on primary cell cultures, and developmental assays in zebrafish embryos, according to the article. That program uses computational methods, genomics, and cell biology to speed up testing and to enhance the capacity to screen compounds.
The program will “take advantage of a lot of things we now can do, some things from the [Human] Genome Project… [in order to] look at different cells from different organs from different animals,” Collins said during the call.