Next-generation sequencing has established a strong foothold in the R&D departments of pharmaceutical firms, both in target discovery and validation and for developing DNA-based biomarkers.
But the technology is still evolving rapidly, and many pharmas and biotechs have opted to outsource their sequencing needs rather than build large in-house facilities. During a panel discussion at Hanson Wade's NGS Translate conference in Cambridge, Mass., last week, representatives from big pharma and a genomics service provider talked about the rationale for outsourcing and how they go about finding the right partner.
"Without question, in the last three to four years, we have seen a huge shift in the pharma industry, [which has been] outsourcing genomics services and shutting down their own labs," said Jon Williams, vice president of Covance Genomics Laboratory in Seattle, which provides genomics services to large and small pharmaceutical and biotech firms as well as academic and non-profit customers. The decision to outsource, he said, is not primarily based on cost savings, but on where to focus internal resources.
While finding a good outsourcing partner takes some time, it allows pharmaceutical researchers to remain focused on their goals, said Brian Dougherty, who leads next-gen sequencing and other genomics efforts in oncology at AstraZeneca in Cambridge, Mass. "If our core job is to come up with new drugs, do I really need to build a genome center to do that internally?" he said. "You've got to figure out what's the best way to leverage buying equipment and [hiring] people internally … versus what you can do externally better, faster, cheaper, and more effectively."
The outsourcing trend in next-gen sequencing is very much in contrast to how pharmaceutical companies adopted microarrays when they first appeared, several panelists noted. "Everyone did it internally, everybody made their own spotted arrays, everybody had different standards, and nobody could talk with one another or compare datasets," Dougherty said. Standards emerged slowly, and users eventually converged on the Affymetrix platform.
According to James Cai, head of disease and translational informatics at Roche Pharmaceuticals in Nutley, NJ, pharma companies used to run 95 percent of their microarray experiments internally. "But today, when we look at RNA-seq, exome sequencing, the proportion is really [the] reverse: 95 percent is done outside," he said.
Roche does maintain limited internal next-gen sequencing capabilities, he added, but those are reserved for small projects that require a fast turnaround time.
Having some kind of internal NGS facility is important, Dougherty stressed, because the technology is complex and internal scientists need to know what to expect from a service provider. "If you don't have the expertise in next-gen sequencing, you have no idea what the quality is of what you're getting back — you have no base reference," he said. In order to be a "sophisticated consumer," maintaining a small NGS lab is a good idea, "and that's been a model that a number of groups have turned to."
But while many pharmas appear to be ready to outsource sequencing, most prefer to do the data analysis in house. "So far, we are doing most of our analysis in house, or have the ability to do it in house," said Jason Hughes, who leads the computational genomics team in the informatics IT group at Merck in Boston. Like Roche and AstraZeneca, Merck has been outsourcing much of its sequencing, allowing it to try different technologies, he said.
In Covance's experience, "most large pharmas want to do their own informatics," Williams said, whereas mid-size pharmaceutical and biotech firms often lack the in-house capabilities or expertise.
In the future, companies might find it "tempting" to outsource primary data processing and analysis, Dougherty suggested, while keeping integrative analyses and data interpretation internal. Alternatively, as software tools improve, they might just purchase a software analysis package and keep the primary data analysis in house, he said.
According to Cai, it is unclear whether data analysis will follow the outsourcing trend, as many providers are starting to offer analysis services along with the sequencing. "So far, we are not ready to [outsource] everything yet, but it will be interesting to see in a few years," he said.
When looking for a service provider, companies consider a number of criteria, including cost, quality, level of service, and speed.
According to Dougherty, low cost often comes at the expense of quality, and cheap sequencing can drive the total project cost higher because the data require much more in-house work.
To assess a potential partner, both word-of-mouth and pilot projects are important. At AstraZeneca, for example, scientists from several departments that use next-gen sequencing sometimes get together to exchange their experiences with different vendors and technologies. "For me, one of the most important things is to hear good or bad outsourcing partner experiences," Dougherty said.
Others pick up the phone and ask colleagues at other pharmaceutical companies how they have fared with a particular firm, or they pick up information from talks at conferences. "If they have good experience with a vendor or provider, that helps us," Cai said.
For large projects, Cai said, Roche always runs a small pilot project first "just to get a taste" of the provider. AstraZeneca has taken a similar tack for finding the right partner. "It's a bit like speed-dating, you try lots of different vendors," Dougherty said. In the end, there might be several suitable partners, depending on the project, "because they each have their strengths and weaknesses."
Comparing prices and services to make procurement decisions is not always easy, he said, because unlike microarrays, next-gen sequencing involves many variables. "It's very difficult for you to price it in unless you're doing nothing but exomes at a certain coverage and [using] a very specific analysis," he said.
Williams said his service lab has had "lots of dances" with prospective clients, the first of which is usually a discussion about technical capabilities, often followed by a small pilot project. "It is really, really hard to recover from a bad experience, both from our side and also from the client side," he said. "We try to avoid those."
One difficulty in judging vendor quality is the lack of reference materials that could be sent to multiple vendors for comparison, several panelists pointed out. Efforts like the National Institute of Standards and Technology's "Genome in a Bottle" (CSN 9/5/2012) will help. "Having a standard in next-gen sequencing will be an advantage, especially from the informatics perspective," Cai said.
In the absence of reference standards, scientists come up with other criteria to judge the quality of a provider's data. For an exome sequencing project, for example, they might look at the concordance with chip-based genotyping, the depth of coverage, and the percentage of the exome covered. Companies might also send vendors replicates of the same sample and do some limited upfront genotyping on their samples to be able to detect mix-ups.
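The sketch below illustrates, in simplified form, the kinds of checks the panelists described: concordance of exome-derived genotypes with array-based genotyping and the depth and breadth of target coverage. It is not any vendor's or pharma's actual QC pipeline; the function names and thresholds are hypothetical, and it assumes genotype calls and per-base depths have already been parsed out of the delivered VCF and alignment files into plain Python structures.

```python
# Illustrative sketch only: minimal versions of the QC metrics mentioned above,
# assuming genotype calls and per-base target depths are already in memory.

def chip_concordance(exome_calls, chip_calls):
    """Fraction of array-genotyped sites whose exome call matches the chip call."""
    shared = [site for site in chip_calls if site in exome_calls]
    if not shared:
        return 0.0
    matches = sum(1 for site in shared if exome_calls[site] == chip_calls[site])
    return matches / len(shared)

def coverage_summary(per_base_depth, min_depth=20):
    """Mean depth and fraction of targeted bases covered at >= min_depth."""
    if not per_base_depth:
        return 0.0, 0.0
    mean_depth = sum(per_base_depth) / len(per_base_depth)
    covered = sum(1 for d in per_base_depth if d >= min_depth)
    return mean_depth, covered / len(per_base_depth)

# Toy example: three overlapping genotyped sites and a short stretch of target bases.
exome = {("chr1", 1000): "A/G", ("chr1", 2000): "T/T", ("chr2", 500): "C/C"}
chip = {("chr1", 1000): "A/G", ("chr1", 2000): "T/C", ("chr2", 500): "C/C"}
depths = [35, 40, 12, 28, 55, 8, 30]

print("Chip concordance:", chip_concordance(exome, chip))  # 2 of 3 sites agree
mean_d, frac = coverage_summary(depths, min_depth=20)
print(f"Mean depth: {mean_d:.1f}x, fraction of target >= 20x: {frac:.2f}")
```

A real evaluation would also compare replicates of the same sample across vendors and check the limited upfront genotyping against the delivered data to flag sample mix-ups, as described above.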
"One of the issues, though, is 'How much time do you spend doing pilots and drafting statements of work?'" Dougherty said. "The analysis you have to do on the data you get back is incredibly time-consuming, so you have to have the right balance between benchmarking as well as getting your work done."