SAN ANTONIO, TX (GenomeWeb) – Core labs can help address the issue of research reproducibility, according to speakers at this year's Association of Biomolecular Resource Facilities meeting here.
"I think cores are an incredibly important and vital route toward increasing the reproducibility of the published literature," said Ian Sullivan from the Center for Open Science during his talk in a session devoted to research rigor and reproducibility.
In 2012, two researchers from Amgen announced in a Nature commentary that they had been unable to reproduce the majority of landmark cancer studies. They reported they had tried to reproduce 53 studies, but could only confirm the results from six, suggesting only 11 percent of these studies might be reproducible.
A subsequent effort to quantify the problem in the life sciences, published in PLOS Biology in 2015, found irreproducibility to be less prevalent, but still estimated that about 50 percent of studies cannot be reproduced, which Stefan Uebel from the Max Planck Institute for Biochemistry called a "shocking figure" during his presentation.
To address the problem, the US National Institutes of Health in 2014 issued guidelines for reporting preclinical research. The guidelines encourage, for instance, the use of community-based standards and data sharing, among other practices.
"We see an opportunity where cores can play a role [to] encourage reproducibility," said the University of Iowa's Kevin Knudtson, who chaired the session, during his introduction.
But, as Vanderbilt University Medical Center's Susan Meyn noted, a survey of nearly 250 people conducted by ABRF's Committee on Core Rigor and Reproducibility (CCoRRe) found that while half of respondents were familiar with the NIH guidelines on reproducibility, about a quarter were only somewhat aware of them and another quarter were not aware of them at all.
The survey, she said, pointed to a lack of training, expertise, and mentorship as factors contributing to that lack of awareness. But, she added, core labs can play a role in educating the wider research community.
At Vanderbilt, she is part of a strategic planning initiative that includes an emphasis on rigor and reproducibility. "We were inspired by CCoRRe to think about how we might implement simple guidelines, templates, even beyond cores to the research enterprise," she said during her talk.
Meyn assembled a working group of core lab directors at Vanderbilt to establish guidelines for core rigor, reproducibility, and transparency across disciplines. Their draft guidelines, for instance, call for cores to document methods, ensure staff are properly trained on instrumentation, validate software and reagents, explain limitations of data interpretation, and more.
Max Planck's Uebel, along with his colleagues from the European biophysics association ARBRE-MOBIEU and the Protein Production and Purification Partnership in Europe (P4EU) network, developed more field-specific advice. They homed in on three minimal quality control tests for recombinant protein samples. Protein samples, he noted, can be plagued not only by aggregation but also by degradation, incorrect protein concentration, and even purification of the wrong protein.
To combat this, they suggested that each sample undergo at least three tests: purity, by, for instance, SDS-PAGE; aggregation by size-exclusion chromatography; and identity and integrity by mass spec.
In an ARBRE-MOBIEU survey of about 130 proteins contributed by nearly 50 labs, Uebel and his colleagues found that of the proteins that passed those minimal tests, only 6 percent failed in their downstream applications. But of the proteins that did not pass, about a quarter failed in their downstream applications.
The Center for Open Science's Sullivan noted that cores' efforts could help shift what is considered 'normal' in a field and push researchers toward more reproducible science.
It also helps, he added, if such efforts are easy for researchers to adopt. His organization has developed a framework, called OSF, to enable private collaboration between, say, a core lab and a researcher. That way, all the necessary methods and other information about minimum standards are easily accessible when it comes time for the researcher to write a manuscript. The framework also generates permanent URLs that can be included in the manuscript to link to longer, detailed methods that would make reproducing a study easier.