Name: Thomas Turi
Title: Vice President of Science and Technology, Covance Discovery and Translational Services, since 2008
Experience and Education:
Senior Director of Translational Biomarkers and Mechanistic Biology, Pfizer, Groton, Conn.
PhD in molecular genetics, University of Cincinnati College of Medicine
BS in biochemistry and chemistry, University of Illinois at Urbana-Champaign
With more than $1.9 billion in annual revenues and over 10,000 employees in more than 30 countries, Covance is among the world's largest drug discovery and development services companies.
Headquartered in Princeton, NJ, Covance supports the entire process of drug development, including early preclinical work, safety toxicology studies, clinical trials, and approval and post-marketing. More than 90 percent of its clients are from the pharmaceutical and biotechnology industry.
Tom Turi is the vice president of science and technology for Covance Discovery and Translational Services. Part of that division is the Covance Genomics Laboratory in Seattle — a research group and facility that was previously part of Merck subsidiary Rosetta Inpharmatics, which Covance acquired in 2009. CGL offers a variety of genomics services, including gene-expression profiling, SNP genotyping, and miRNA profiling.
Earlier this month, Covance announced a collaboration between CGL and the Institute for Systems Biology to study the regulation of gene expression in glioblastoma multiforme (see other article, this issue), which will make use of the lab's next-gen sequencing platform.
Turi recently spoke with Clinical Sequencing News about CGL's next-gen sequencing services and expertise, and the growing role of sequencing in the drug discovery process. Below is an edited version of the interview.
Can you provide some background on the Covance Genomics Laboratory?
We acquired it in 2009 to expand our genomic capabilities. Within Covance, we did offer genomics solutions through our Covance Central Labs. This [acquisition] really expanded our capabilities in terms of high-throughput genomics, as well as the service offering to be able to support preclinical as well as clinical applications. CGL has what we think is unrivaled experience and expertise in this space.
The capacity that we have at CGL in Seattle is more than 300,000 samples annually in its current footprint. The only way to be able to [process that many samples] is to have a highly industrialized and automated process that is tightly controlled at every stage. We automate as much of the process as we can; we have full control of every [step] and we run it as an industrial, Six Sigma-type of operation.
We have a longstanding history and tradition in terms of developing new technologies. Early on, that facility was responsible for developing its own microarray technology, which was subsequently outlicensed and commercialized through Agilent. When we acquired it, one of the chief plans we had for it was to obtain CLIA accreditation, which we achieved in 2010. And second, to expand the existing sequencing capabilities.
How are you equipped with next-gen sequencing technology?
When we acquired the lab in 2009, there was already expertise and know-how related to DNA sequencing technologies. It was really geared toward research-based applications. We recognized there was a need to further industrialize and expand those existing capabilities. Towards that end, we made a significant set of investments in both instrumentation as well as the computational platforms. We currently run both Illumina and [Life Technologies SOLiD] platforms. We have the GAII as well as HiSeq platforms, and we have the ABI SOLiD system as well. I don't want to get into actual numbers but we have all of the platforms necessary to be able to serve our clients' needs and demands.
Because the DNA sequencing technology space evolves very rapidly, and there are a number of third-generation technologies, we are in discussions with all the third-generation technology providers and evaluating when those will be ready for industrialized applications in the type of environment that we run under. And when those become ready for industrial-scale operations, we will be one of the early adopters of those technologies as well.
What are your criteria for deciding that a platform is mature enough to bring it in house?
It really needs to be ready as an end-to-end solution: the front-end sample processing, the assays that are available on those platforms, and the actual platform itself being robust enough to withstand the rigors and throughput that we will put these instruments through in terms of performance, reproducibility, and reliability.
Why don't you run the 454 sequencing platform?
When we acquired the facility, 454 was not one of the platforms that was originally there. It's a robust instrument, it's one of the first second-generation platforms out there, you get exceptionally long reads, and it has certain applications it's well suited for. However, talking with our clients, the types of applications that we typically encounter don't necessarily require those long-read capabilities. That's one of the reasons why we haven't adopted it. If we saw an uptick in requests for that type of platform, we certainly would adopt it. We're technology-agnostic — it really comes down to the applications and the needs of our clients.
What kinds of next-gen sequencing assays do your clients currently request?
What clients are really looking for is an end-to-end solution provider with respect to genomics and DNA sequencing. What they really want is a partner who can support things like experimental design, who has the greatest breadth in terms of sample processing — from cells, tissue, and formalin-fixed paraffin-embedded samples. They want a provider and partner that can support a whole host of assay types, whether it be targeted sequencing, whole exome, RNA-seq, whole-genome sequencing, methylation-type of analyses. That's just the data-generation piece. And then they are also asking for someone that has the computational biology expertise that can assist in analysis as well as interpretation of the data.
Are certain NGS applications more popular than others among your clients?
Most requests are probably coming in the following three areas: whole exome/targeted sequencing; RNA-seq or transcriptome analysis because of the richness that it provides and how it complements our capabilities on standard microarray analysis; and then, lastly, whole-genome sequencing.
I'll add that it's not just the assays, but the application of next-generation sequencing.
We see an increased prevalence across the R&D continuum — of course, initially, it was mostly in research-based applications. But now, we are beginning to see a fair amount of interest by our clients in terms of clinical applications.
What types of clinical applications?
The area where it has the most prevalence or penetration is in oncology — people want to do mutational analysis of tumors. But we are beginning to actually see a fair amount of interest in applications of next-gen sequencing approaches in other disease areas as well.
Are these mostly targeted sequencing projects, or also whole-genome sequencing?
Initially, it was targeted, looking at maybe a few dozen to a couple of hundred genes. We are also now beginning to see either whole-exome or, in the case of oncology, where people want to have a much broader, deeper look at the molecular basis for these tumors, whole-genome [analysis] beginning to creep in there. The perfect example of that is the collaboration with the Institute for Systems Biology on glioblastoma that we have entered into (see other article, this issue). It's focusing on one of the most deadly forms of cancer.
The sequencing data is just one component of a holistic, integrated approach to analyzing the molecular basis of these tumors. [It also involves data integration through computational biology.] That's a key area that is sorely needed for next-generation sequencing. Our ability to generate data, in many instances, outstrips our capacity to analyze and interpret the data. The key element there is data interpretation. We have recognized that for some time now, and we have been looking for partners to assist in that area. We recently announced a collaboration with Ingenuity Systems to aid in further biological interpretation of next-generation sequencing data.
What percentage of your assays does next-gen sequencing make up right now? Do you still run a lot of microarrays?
It's probably fair to say that the microarray platform is still a fair amount of the capacity. But sequencing is rapidly, rapidly growing as people begin to migrate off of microarrays and onto the sequencing platform, especially in the area of transcriptome analysis. The data richness that one gets from RNA-seq far exceeds what you get from standard microarray analysis, and our clients see that and are rapidly adopting RNA-seq as kind of the new standard for transcriptome analysis.
So costs have come down far enough for that?
I think for clinical experiments, the cost differential is negligible. For exploratory work, where clients are not quite sure, we will still do the initial experiment on the microarray platform, so it's not going away by any means. But when people find an interesting result, and they want a much richer and deeper analysis of the transcriptome, RNA-seq is the way to go.
We will be able to capitalize on that and the capabilities that we have in terms of sample processing, starting from very small samples, to be able to recover and retrieve those samples and apply them both on a microarray platform as well as a next-gen sequencing platform.
Besides computational analysis of the data, where do you focus your technology development now?
We're constantly trying to enhance the front end in terms of sample processing from different sample types. Because we support the entire R&D continuum, the types of samples that we are asked to process are highly variable — everything from preclinical animal tissues or cells all the way to a diverse set of clinical samples, be they punch biopsies, whole blood, fine-needle aspirates, or just a few cells from hair follicles. So we are constantly trying to refine and optimize our sample processing capabilities to be able to handle these disparate types of samples, getting down to lower and lower starting amounts of material. We are also seeing new types of assays being brought forward all the time. It's not just targeted sequencing but also different forms of RNA-seq-type applications, and epigenomics-type applications. We are always trying to add new types of assays into the mix.
How interested has big pharma been in next-gen sequencing, and are your clients already using next-gen sequencing as part of clinical trials?
Certainly the pharmaceutical and biotech community does have an interest in next-gen sequencing. They have been early adopters. We are seeing an increase in the use of next-gen sequencing in clinical trials, especially in areas you might expect, like oncology. As I said before, we are seeing an interest in other disease areas as they try to understand the genetics that may lead to different patient responses.
Companies like Complete Genomics are specializing in whole-genome sequencing services and are targeting pharma companies as their customers. Do you view them as competition, and how do you differentiate yourselves from them?
They are competitors. However, what differentiates us is that our focus has always been and always will be on drug discovery and development. Having a tightly integrated set of capabilities that are tailored to drug discovery and development, as well as our global footprint, and being able to support clinical trials globally, clearly differentiates us from all the other competitors. We support about a third of the world's clinical trials currently at Covance. I don't think any of our competitors in the next-generation sequencing space can say that.