This article has been updated with quotes and additional information from NCGC Director Christopher Austin.
By Matt Jones
NEW YORK (GenomeWeb News) – The Tox21 program, which brings together the National Institutes of Health and the Environmental Protection Agency to develop new ways to predict the effects of chemicals on humans and the environment, has gained a new partner: the US Food and Drug Administration.
The FDA will work with Tox21, which is led by the NIH Chemical Genomics Center (NCGC) and the National Institute of Environmental Health Sciences' National Toxicology Program, to develop models that enable better prediction of how humans will respond to certain chemicals.
Specifically, FDA will provide the program partners with expertise and chemical safety information that can be used to improve current chemical testing methods, and it will work to prioritize chemicals that may require more extensive toxicological evaluation.
"This partnership builds upon FDA's commitment to developing new methods to evaluate the toxicity of the substances we regulate," FDA's Director of the Center for Drug Evaluation and Research, Janet Woodcock, said in a statement.
"The addition of FDA to this effort allows biomedical researchers and regulatory scientists to work together side by side to more rapidly screen chemicals and find more effective ways to protect the health of the public," NIEHS and NTP Director Linda Birnbaum added. "Using the best science to protect human health and the environment is the ultimate goal of this collaboration," Birnbaum said.
"Through the Tox21 collaboration, 2,000 chemicals have already been screened against dozens of biological targets and we are working to increase the number of chemicals to 10,000 by the end of the year," EPA's Office of Research and Development Assistant Administrator, Paul Anastas, said.
A central part of the Tox21 program is NCGC's robotic screening and informatics platform, which screens thousands of chemicals per day for their toxicological effects in cells.
NIH's NCGC Director Christopher Austin told GenomeWeb Daily News that the project, which could last around a decade, aims to transform toxicology from an empirical, animal-based discipline into "a predictive, mechanistic science." Reducing animal testing would be a side effect of that shift, he said, though cutting animals out of toxicity testing is not the core goal.
"What Tox21 is doing is a much larger, systems-wide project to develop in vitro assays that will be more predictive, and because they're more predictive, and mechanistic, and cheaper, and faster, it will make the animal studies no longer necessary."
One ongoing effort is looking at liver toxicity, a common side effect of drugs and chemicals in humans, and at identifying compounds that could cause such problems based on a signature for toxicity. The aim, Austin said, is to develop toxicity signatures for all the major organs.
The program began in 2008 in response to a National Research Council report on toxicity testing in the 21st century, and it was started with three central goals: to identify mechanisms of chemically induced biological activity, to prioritize chemicals that need to be evaluated, and to develop predictive models of in vivo biological response.
It was originally coordinated through a memorandum of understanding that combined the expertise of NTP and NIEHS, EPA's National Center for Computational Toxicology (NCCT), and NCGC's high-throughput screening capabilities; FDA has now joined the group.
Over the past year and a half, Tox21 has been working with a collection of around 2,800 chemicals from EPA and NTP, which it has tested across several hundred assays at various concentrations, Austin said.
Now, he said, the EPA and the NTP have conducted internal consultations to come up with about 7,000 environmental chemicals that are "of high interest" for a variety of reasons. "Most of these things are industrial chemicals, pesticides, things that are produced at high volumes by industrial organizations, so they end up in the ground water," he explained.
The NCGC also has added a list of around 3,500 small molecule drugs to the program — so the full collection, Austin estimates, will be around 10,000 or 11,000 chemicals that the group will test over the next several years.
The FDA's involvement fits into the program because the same questions that apply to assessing the toxicity of chemicals in the environment, such as pesticides, to which people are generally exposed by accident, also apply to pharmaceuticals, to which people expose themselves on purpose.
FDA also brings to the collaboration large sets of data on systematic human exposure to chemicals, namely pharmaceutical compounds, something that the EPA and NIH do not have. Using that knowledge will require some working out, Austin acknowledged, because there are questions about how freely FDA data on commercial compounds submitted for drug approval may be used in research.
The program will develop large amounts of data over the coming decade, Austin said, and all of it will be made publicly available.
"The whole intent here, much like the [Human] Genome Project, is that we will generate the data, [and] we will draw initial conclusions based on our own analyses, but these data are so complicated that we want the world to be able to compute on them," Austin said.
"The thing that is great about this project is that it brings together four different expertises from four different federal agencies, all of whom are working together very closely to generate gobs of data that then become public. This is something that doesn't happen in the federal government that often, and it really works," he said.