By Matt Jones
NEW YORK (GenomeWeb News) – A federal trans-agency initiative focused on using large-scale genomics, bioinformatics, and other technologies to predict how chemicals in drugs, consumer and industrial products, and food additives will affect the human body and the environment has completed its first testing phase and is now moving on to screen a total of 10,000 chemicals.
The Tox21 program, a collaboration of the National Institutes of Health, the Environmental Protection Agency, and its newest member, the US Food and Drug Administration, officially launched in 2008 with the aim of developing new ways to predict toxicity for thousands of chemicals and sharing that information with researchers and drug developers around the world.
Led at NIH by the NIH Chemical Genomics Center (NCGC) and the National Institute of Environmental Health Sciences' National Toxicology Program, Tox21 wrapped up its first phase about a month ago, when it completed screening of around 2,500 unique chemicals and began ramping up to screen thousands more, NCGC Director Chris Austin told GenomeWeb Daily News in an interview.
The purpose of the project's first phase "was to evaluate whether this approach to toxicity testing was valid – could one generate relatable, reusable data using high-throughput screening processes," said Austin, who is also a senior adviser for translational research at the Office of the Director of the National Human Genome Research Institute.
The Tox21 program has several linked central aims: to research, develop, and translate new methods for characterizing toxicity pathways; to discover new tools for identifying the mechanisms by which chemicals induce biological activity; to prioritize which chemicals most need evaluation; to develop more effective models for predicting how chemicals will affect biological responses; and to identify the chemicals, assays, and informatics approaches needed to support these new testing methods.
Over the course of phase I, about 150 different screens were run separately on the collection of around 2,500 chemicals at NCGC, Austin told GWDN. He explained that avoiding single-concentration screening was critical for toxicological evaluation, because different doses of the same compound can trigger dramatically different responses.
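Testing each compound across a series of concentrations lets analysts fit a full concentration-response curve rather than a single pass/fail readout. As a rough illustration only – this is not Tox21 code, and the compound data are made up – fitting a four-parameter Hill equation to multi-dose activity measurements might look like this in Python:

```python
# Hypothetical sketch of concentration-response fitting of the kind that
# multi-dose screening enables; model choice and data are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, n):
    """Four-parameter Hill equation: response as a function of concentration."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** n)

# Simulated percent-activity readings for one compound at a 7-point titration.
conc = np.array([0.001, 0.01, 0.1, 1.0, 10.0, 100.0, 1000.0])  # micromolar
resp = np.array([1.0, 2.0, 5.0, 22.0, 61.0, 88.0, 95.0])       # % activity

params, _ = curve_fit(hill, conc, resp, p0=[0.0, 100.0, 5.0, 1.0])
bottom, top, ac50, n = params
print(f"AC50 ~= {ac50:.2f} uM, Hill slope ~= {n:.2f}")
```

A compound that looks inactive at one fixed dose can show a steep response elsewhere on such a curve, which is why single-concentration screening can mislead.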
"Given those 150 screens, combined with hundreds of other assays by partners at EPA – as part of their ToxCast program – it was very clear that one could develop predictive algorithms of animal toxicity," he said.
The ToxCast program, which completed its own proof-of-concept phase in 2009 and is now screening 1,000 chemicals for input into the ToxCast Database, is providing Tox21 with access to its HTS data and chemical library, adding to the data available on the 10,000 chemicals the program is targeting.
The success of the phase I efforts appears to have justified the program and gives it extra momentum as the partners move forward to tackle the much larger group of chemicals.
Austin said the partners have now begun their first testing protocol, which will use the "huge, unprecedented amount of data" generated in Tox21's first phase to determine whether certain assays correlate with animal toxicity.
He explained that NCGC will use the results from a series of in vitro assays to develop a fingerprint for a response such as liver toxicity, and then use those assays to predict liver toxicity for chemicals that have not yet been tested. Ultimately, Austin and his colleagues hope to learn whether groups of assays, taken together, can predict toxicity in an animal model or in a human.
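For illustration only, a predictive model of that general kind could be sketched as follows; the random-forest classifier, the simulated chemicals-by-assays matrix, and the animal-outcome labels are all assumptions for the example, not Tox21's actual pipeline:

```python
# Illustrative sketch (not Tox21's method) of using in vitro assay
# "fingerprints" to predict an in vivo endpoint such as liver toxicity.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_chemicals, n_assays = 500, 150

# Simulated stand-ins: rows are chemicals, columns are assay readouts,
# and y holds hypothetical animal-study outcomes (toxic / nontoxic).
X = rng.normal(size=(n_chemicals, n_assays))
y = (X[:, :10].sum(axis=1) + rng.normal(size=n_chemicals)) > 0

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {scores.mean():.2f}")

# Once trained on chemicals with animal data, the same model could score
# untested chemicals from their assay fingerprints alone.
model.fit(X, y)
```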
In preparation for the large volume of new high-throughput screening studies it is about to undertake, NCGC recently boosted its capabilities by installing Wako Kalypsys and Aspect Automation robotics instruments in its labs.
"Tox21 has used robots to screen chemicals since 2008, but this new robotic system is dedicated to screening a much larger compound library," NHGRI Director Eric Green said in a statement last month.
The new robots will enable Tox21 and NIEHS' National Toxicology Program to test chemicals "smarter, better, faster," NIEHS and NTP Director Linda Birnbaum added.
"We will be able to more quickly provide information about potentially dangerous substances to health and regulatory decision makers, and others, so they can make informed decisions to protect public health," she added.
The collection of 10,000 chemicals the Tox21 partners plan to work through includes compounds found in every pharmaceutical drug currently available, Austin added, saying it will be a resource of "enormous value" to researchers and to the drug development community.
Austin also said that the Tox21 group is "constantly developing new algorithms" for studying the HTS data and seeking meaning in it, and that the partners plan to make available their analysis of some of the data, which he said is "far more complicated than genome sequencing data."
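The article does not describe those algorithms, but one generic way to look for structure in a chemicals-by-assays activity matrix is to cluster compounds whose response profiles track each other across assays. The sketch below is purely hypothetical and uses simulated data:

```python
# Illustration only: clustering chemicals by assay-response similarity is one
# common exploratory approach to an HTS activity matrix; Tox21's actual
# algorithms are not described in this article.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
profiles = rng.normal(size=(50, 150))  # 50 chemicals x 150 assay readouts (simulated)

# Group chemicals whose response patterns correlate across assays.
Z = linkage(profiles, method="average", metric="correlation")
clusters = fcluster(Z, t=5, criterion="maxclust")
print("Cluster sizes:", np.bincount(clusters)[1:])
```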