Not So Clear

Researchers participating in the US Department of Defense's challenge to detect bioterror threats from DNA samples say that the program rules and scoring process are not clear, ScienceInsider reports. "The way they organized the competition and the way they scored it was just horrible," David Ainsworth, a PhD student at Imperial College London whose team has made it to the next round, tells ScienceInsider.

The Defense Threat Reduction Agency's Algorithm Challenge is offering a $1 million prize for a bioinformatics approach that can quickly and accurately identify species and genes from raw DNA. Participants received nine datasets containing DNA from a mix of species, ran their programs on them, and uploaded the results to an automated scoring system; to move on to the next round, they had to reach a minimum accuracy score.
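
The story does not spell out how the automated scorer computed accuracy, and participants say the formula was unclear to them, but for a rough sense of what such a metric can involve, here is a minimal, hypothetical sketch that compares a submitted list of species calls against a known truth set using precision, recall, and F1. The function name and the example organisms are illustrative assumptions, not part of the challenge.

```python
# Hypothetical illustration only: the challenge's real scoring formula was not
# disclosed, so this sketch simply compares a team's predicted species list
# against a known truth set using precision, recall, and F1.

def score_submission(predicted_species, true_species):
    """Return precision, recall, and F1 for a set of species calls."""
    predicted = set(predicted_species)
    truth = set(true_species)

    true_positives = len(predicted & truth)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(truth) if truth else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1


if __name__ == "__main__":
    # Toy example: a submission that finds two of three spiked-in organisms
    # and reports one false positive.
    predicted = ["Bacillus anthracis", "Yersinia pestis", "Escherichia coli"]
    truth = ["Bacillus anthracis", "Yersinia pestis", "Francisella tularensis"]
    p, r, f = score_submission(predicted, truth)
    print(f"precision={p:.2f} recall={r:.2f} F1={f:.2f}")
```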

However, ScienceInsider reports that the process contained a number of bugs and that participants were unsure how the accuracy scores were calculated.

At first, no group made the cutoff, but after the rules were revamped, three of the 103 teams advanced to the next round, ScienceInsider notes.

The challenge's organizers say it was difficult to set the scoring bar at the right level. "You don't want to make it so high that nobody can win, but you don't want to set the bar so low that you have 200 people tied for first place," says Christian Whitchurch, the project manager of the Algorithm Challenge.