Toxicity Database Collaboration Between EPA, FDA Could Be Boon for Computational Toxicology Efforts


The US Food and Drug Administration's Informatics and Computational Safety Analysis Staff has released a database of toxicity data for more than 7,000 chemicals that is expected to drive development of computational methods to predict the toxicity of compounds.

The Genetic Toxicity, Reproductive and Development Toxicity, and Carcinogenicity Database, released in late January, is the result of an informal collaboration between ICSAS, which is a research unit of the FDA's Center for Drug Evaluation and Research, and the US Environmental Protection Agency, which launched its own computational toxicology effort in 2004 [BioInform 01-12-04].

"We're building a relationship with the EPA, where we now have a lot of their non-proprietary information to add to our databases," Robert Daniel Benz, ICSAS database manager, told BioInform. "The bigger the database, the better your prediction because the computer has more information to base its decision on."

EPA contributed genetic toxicity data on more than 5,000 chemicals, which ICSAS combined with genetox, reproductive tox, and carcinogenicity data from FDA studies and other sources to create the complete database.

The database, along with three others that ICSAS has released to date, forms the foundation for several recent studies that Benz and his colleagues have published in an effort to raise the profile of computational toxicology both within the FDA and externally.

"We are establishing the credibility of this field by publishing papers and making presentations based on the same research," Benz said.

Most recently, the ICSAS team published two papers in the March issue of Regulatory Toxicology and Pharmacology based on the new database.

Benz said that the first paper essentially reaffirmed the effectiveness of the FDA's current panel of carcinogenicity tests, while the second paper helped validate the use of predictive software as a potential replacement for wet-lab studies.

"It was surprising how well these modules predict," he said, "but it probably shouldn't have been surprising, because the bigger your database, the more information you can give the computer, the more likely it is that the computer will give you the right answer, and that's basically what happened."

Commercial Partners

Leadscope, a cheminformatics software company based in Columbus, Ohio, recently announced that it has signed a Cooperative Research and Development Agreement with the FDA to distribute the ICSAS databases in ToxML format.

The agreement also extends to a suite of similar databases from the FDA's Center for Food Safety and Nutrition.

Chihae Yang, vice president of toxicology and predictive modeling at Leadscope, said that the company is selling the databases for $250 per user, of which 70 percent goes back to the FDA to support further development of the resources.

Leadscope benefits from the CRADA because it "gets access to new customers who access this data, and then we will be able to produce better models and better software products because we have better data available on the market," Yang said. The company also markets a product called Leadscope Database Manager that was developed to manage and visualize information from multiple FDA databases.

Benz said that the ICSAS group supports its research through similar CRADA agreements with MultiCASE, MDL, and two other informatics companies. "Under the CRADA law, we provide development services to the software companies. We provide them with toxicological and clinical information that we get from FDA files … and then they sell their product and the databases that we construct that go with the product, and under the CRADA law, we get payments back to our group and we use that money to hire students and contractors" to supplement the activities of the four full-time members of the ICSAS staff.

Moving into the Mainstream

Benz said that the tiny computational toxicology group has been in place for about a decade, "and we're making slow progress into becoming a mainstream online function in the review process here at the Center for Drugs."

The group's mission is to work toward "eliminating the need to test animals and possibly even to test people when it comes to establishing the safety of chemicals," he said. While acknowledging that in silico approaches will likely never fully replace wet-lab studies, he said that current technologies should at least be able to reduce the time and cost associated with animal testing.

FDA management seems to agree. The agency's Critical Path whitepaper, which outlines its agenda for speeding drug approvals, includes in silico methods "such as predictive toxicology" among a number of "opportunities" for speeding safety assessment, and cites estimates that computational modeling could reduce the overall cost of drug development by as much as 50 percent.

"The general attitude that I perceive within CDER here is, 'That's very interesting, and we want to know more.' So there's nobody against this, but they want to have some proof and some assurance that what we're doing works," Benz said.

Benz said that his team is currently "accumulating all the approvals that the Center for Drugs did last year, and we're going to predict all the toxic and clinical effects of those drugs and compare that to the decisions that were made in 2005, and use that as sort of a proof of principle."

In addition, he said that ICSAS is working to access the EPA's proprietary toxicity data in order to improve the predictive modeling software without releasing the information into the public domain.

ICSAS currently serves primarily as a consulting service for FDA reviewers who need some last-minute in silico tests run on a compound to corroborate experimental evidence.

"No decision is made at the Center for Drugs solely on the basis of computational toxicology at this time," Benz said. "Our dream is that someday that will happen, and possibly, fairly soon."

Computational methods are already making inroads at the FDA. The ICSAS counterpart at the FDA's Center for Food Safety and Nutrition is making some safety decisions "based solely on computer predictions," Benz said, noting that these decisions primarily involve food packaging and other items that don't pose a high degree of risk.

"The safety concern is quite minor [at CFSAN], and because of that, making a decision based solely on what a computer can come up with is enough, is acceptable," he said. At CDER, on the other hand, "we're going to have people taking these medications every day for the rest of their lives, so we're going to need a whole lot more evidence than just something a computer program can come up with right now."

One goal, he said, is for the ICSAS team to move from its current role as a consulting service to more of a routine step in the review process in order to educate FDA reviewers about the value of computational predictions. Under that scenario, "For any chemical that is submitted to the Center for Drugs … the first thing that will happen is that we'll run it through our computer programs. We'll make our different predictions of our different endpoints, and we'll post that on our internal webpage, and then the different people who have to consider the safety information will have that available to them."

The ICSAS team will also continue to publish its findings over the next year.

Benz said that he's become "infamous" within the field of computational toxicology for forecasting in 1995 that computer predictions would be the sole method of approving chemicals by 2007 — a forecast that is "constantly, but good-naturedly, being thrown in my face."

Nevertheless, Benz is still holding out some hope that his prognostication will prove true. "I'm thinking maybe by next year we'll convince people that what we're doing is mature enough that people can rely on it to make some decisions — solely on the basis of computer information, and not as we're doing now as an adjunct to the other information they have to make decisions."

— Bernadette Toner ([email protected])
