By Justin Petrone
Name: Paolo Fortina
Title: Director, Laboratory of Cancer Genomics, Kimmel Cancer Center, Thomas Jefferson University
Background: 2006-present, professor of cancer biology, department of cancer biology, Kimmel Cancer Center, Thomas Jefferson University Jefferson Medical College, Philadelphia; 2003-2006, professor of medicine, department of medicine, Thomas Jefferson University Jefferson Medical College; 1998-2002, associate professor of pediatrics, department of pediatrics, the Children's Hospital of Philadelphia, University of Pennsylvania School of Medicine, Philadelphia; 1992-1997, assistant professor of pediatrics, University of Pennsylvania School of Medicine; 1986-1988, research fellow, the Children's Hospital of Philadelphia
Education: 1991, PhD, University of Turin School of Medicine, Turin, Italy; 1984, MD, University of Turin School of Medicine
The Cancer Genomic Microarray Facility at the Kimmel Cancer Center of Thomas Jefferson University last week installed Febit's Geniom RT Analyzer for use in targeted resequencing and microRNA analysis.
Paolo Fortina, director of the Laboratory of Cancer Genomics at the Kimmel Cancer Center, said in a statement that the addition of the RT-Analyzer, which supports Febit's HybSelect sequence capture application, would be a "powerful complement" to the lab's existing technologies, which now include a Life Technologies Applied Biosystems SOLiD sequencer.
For Fortina, second-generation sequencing and targeted resequencing are just the latest genomic research tools to be deployed to aid in the treatment of cancer. Over the years he has used a variety of technology platforms, including microarrays, with the aim of creating better molecular diagnostics. Now, like many lab directors, he has to decide where to put valuable resources in order to stay on top of the changing tools landscape.
BioArray News spoke with Fortina last week about how he is adjusting to the changes in the array and sequencing markets. Below is an edited transcript of that interview.
Can you give an overview of your background?
I received my medical degree in 1984 at the University of Turin in Italy, and started my career as a research fellow in pediatrics at the Children's Hospital of Philadelphia. My research involved studies of the regulated expression of human globin genes in normal and diseased states and, more generally, the detection of nucleic acid alterations in pediatric and X-linked diseases.
In 1990, I joined the faculty of the Department of Pathology and Clinical Laboratories as Director of the Molecular Diagnostics Core, and in 1991, the Department of Pediatrics at the University of Pennsylvania School of Medicine.
In 1997, my laboratory contributed to the identification of the connexin 26 gene as the gene responsible for autosomal recessive non-syndromic neurosensory deafness. In 2003, I moved to the Jefferson Medical College of Thomas Jefferson University in Philadelphia to implement a Center for Clinical Genomics as professor of cancer biology in the Kimmel Cancer Center. I have been working on microarray-based technologies and have contributed to the development of instrumentation and novel SNP-based assays to improve clinical diagnoses. In the area of microfabrication and microarrays, I have collaborated with biotechnology companies and have been sponsored by Molecular Dynamics, Applied Biosystems, Nanogen, Pyrosequencing, Hitachi High-Technologies, Callida Genomics, Affymetrix, Beckman Coulter, and Febit.
Within the array field, Affy has been the company I have worked with most of the time, most recently with the DMET [drug metabolism enzymes and transporters] panel.
What do you think about DMET?
DMET has great potential for improving the way we can treat our patients. However, it is a new product and may need some improvement, both in new markers and in the way the software allows the user to extract and analyze the data. It seems that clinicians are not very well aware of the information that this technology offers. In addition, there are a limited number of publications on DMET technology. In December 2008, one study on warfarin was published in Blood, and more recent studies on clopidogrel appeared in the New England Journal of Medicine and in the European Heart Journal. We have been using this technology to study patients with cancer; however, at this time we do not have enough data to reach a conclusion.
What are your main goals as director of the Laboratory of Cancer Genomics?
As director, my main goal is to test and implement new technologies that make diagnosis easier and more cost-effective. These tests can ultimately improve the quality of care we offer to our patients. I would also like to provide more opportunities for our investigators in the area of genomics.
You recently acquired a Febit Geniom RT-Analyzer. What were your reasons for getting this system?
The RT-Analyzer allows us to target specific chromosomal regions. We have limited experience with it, but I do believe the technology has some promise. There are alternative technologies, like RainDance. But like Febit, RainDance requires a platform and hardware in the laboratory. An alternative approach is offered by Agilent's SureSelect, which does not require equipment and has interesting features, such as a solution-phase workflow.
The common denominator with these technologies and others for target isolation and enrichment is that they are all new and need to be tested and optimized. We had an early discussion with Febit and agreed to work together. We plan to do a number of studies, possibly in collaboration with other institutions within the region and around the country. If something better should come up, we are open to it.
Do you think everybody will eventually be doing targeted resequencing?
Targeted resequencing of chromosomal regions of interest is just one of the many applications that next-generation sequencing platforms offer. Resequencing facilitates genome analysis and improves the way we can study specific genes and define mutations, rearrangements, or genomic imbalances. Regardless of the platform used or the application, the reality is that next-generation sequencing is maturing. However, there are challenges, such as improving existing information technology to handle the massive amount of data. Quality control, data transfer and storage, as well as computational analysis, are just some of the issues that need to be well thought out when acquisition of this technology is being considered.
How are you integrating next-generation sequencing into your studies?
The Applied Biosystems SOLiD 3 was purchased about three months ago. At this time we are running some ChIP-based experiments on human breast cancer samples.
Do you think you'll move more of your projects over to the sequencer?
If this question were posed to me in six months or a year, I'd be able to give a more exhaustive answer. Microarrays have been used for more than a decade and have contributed to the understanding of our genome. Today we know what we can achieve with array technologies. As NGS technology and data analysis improve, there will be complementarity between these two technologies. Additional players in the sequencing arena will come to market in the coming months, and it is expected that costs will go down, facilitating new projects to be pursued on NGS platforms.
Some say that the market is evolving more slowly than expected because of the time it takes to do an experiment.
When people come to me and ask if it will really take a week to prepare a library, I say, "It's true. It will take a week. So what?" In 2000, we had chips with 10,000 SNPs. Today, arrays come in different flavors with millions of markers, for a wide variety of applications. Protocols used to take several days; today we have assays for genome-wide analysis that require less than three days. Protocols in NGS-based applications may require more time, but I do not see it as a major hurdle.
You mentioned higher-density platforms. Affymetrix has just launched a new genotyping platform and Illumina has a roadmap for even higher-density arrays to be launched next year. Will you buy either?
The GeneTitan would expedite the process and reduce the turnaround time, but we have neither the finances nor the need right now. We will continue using our two Affymetrix platforms, including a DX2. At this time I have no plan to acquire an Illumina platform. Both are mature technologies, as demonstrated by thousands of publications, and it is becoming evident, as recently shown, that small gene sets combined with novel microarray analytical methods such as Bayesian Model Averaging might become powerful tools for developing diagnostic tests. Therefore, I would be more interested in targeted gene- or disease-specific arrays.
Some say the idea of sequencing in diagnostics is far-fetched, that it's just too expensive.
That is what they said about currently available technologies from Affymetrix, Illumina, and Agilent back in the late 1990s and early 2000s. Today, high-density arrays are moving into cytogenetics laboratories and may eventually replace fluorescent in situ hybridization technology. I agree that the Illumina Genome Analyzer II, Roche 454, Helicos Heliscope, and the ABI SOLiD 3 Plus are expensive tools. However, I have no doubt that the price of hardware will come down. Assays will improve in terms of biochemistry, as will the software packages for data analysis. There is a great deal of potential in diagnostics using NGS technology.
It is hard to make predictions, though, as technology moves forward in this field. An optimist will note that technology has been improving for the past 2,000 years and anticipate that next-generation sequencing will be used for diagnostics. But remember, there are also federal and state regulations that need to catch up with the technology. It took some time to get US Food and Drug Administration-cleared array-based tests. Also, the number of approved tests utilizing these genomic-based techniques is limited compared to the number of diseases that could be diagnosed.
You mentioned that it's hard to make predictions in the field. How do you decide what technology to use?
There are multiple considerations when deciding to invest in a new technology: the company's reputation and track record in introducing reliable platforms and assays, technical innovation, the publication record when available, the quality of service and support, and sharing information and opinions with our peers. Last but not least, the timing of market introduction, as well as some common sense or personal judgment, kicks in at some point.
People often ask me why I use Affymetrix and not another platform. Affymetrix came out earlier than other currently available platforms, and, most of all, I had the opportunity to develop a relationship with a number of scientists in the company's R&D department. I opted for the ABI SOLiD 3 because it is common opinion, and my personal belief, that the biochemistry underlying base interrogation is less prone to sequence misreading, and therefore provides an additional advantage in diagnostic applications.
What are your opinions on the use of arrays in medicine?
Array and sequencing technologies can definitely improve the way we do diagnosis, but they have one hurdle: the capability of performing meaningful data analysis. The technology is available and ready, but we are not yet sure how to make sense of the data it yields. In addition, these technologies, especially next-generation sequencing, are expensive. You need the financial capability to acquire the instrument and the instrument support, which is why at this time they are limited to major academic centers and medical institutions. As long as you have support from an institution, implementing the technology is possible. But if we look at the cost and implementation of array technology in 1994 or 1995 and compare it to now, I think we can predict that genome analysis will also become easier and more cost-effective, and thus accessible to a larger number of users.