International Genomics Consortium, Arizona's Desert Flower, Gains Momentum

BERKELEY, Calif., Sept. 11 - Dwindling revenues may dampen R&D spending among pharmaceutical companies these days, but a handful of drug firms still expect to pony up around $35 million in the coming months to support Arizona's nascent International Genomics Consortium.


Arizona's state and local governments, catching on, gave the tissue-sample group a major boost this summer when they agreed to help fund its homestead in Phoenix. They have also spent an additional $92 million on the IGC's Translational Genomics Research Institute, which is designed to find the genomic basis of disease.


Now it's up to the public-private IGC to standardize the country's disparate and disheveled cancer-biopsy databanks and help biopharma streamline its R&D.


E pluribus unum


Veteran cancer researcher Daniel Von Hoff first floated the IGC concept in April 2000. His idea was simple but compelling: US medical centers would standardize the ways in which they obtain patient consent, along with the methods they employ to collect, store, and analyze tissue samples. All analyses would be annotated uniformly and deposited in a database accessible to both public and private researchers.


When Von Hoff began his quest two years ago, the map of the human genome was nearly complete and researchers found themselves with access to a windfall of genotype data. But the phenotype and gene-expression data that became available were collected and stored differently among medical centers and remained extraordinarily disaggregated and non-standardized. Moreover, individual medical centers tended to regard phenotype data as proprietary and shared them with biotech and pharma companies only selectively, through clinical trials.


Von Hoff and NIH genome researcher Jeffrey Trent reasoned that a consortium uniting medical centers and drug companies could maximize the strengths of both groups. To ensure the project would remain open to the public, their only condition was that drug-company partners helping to fund the venture would not get preferential access to the database.


"The genome project and the SNP Consortium created environments of [genotype] data that can be used and reused in research," says Arthur Holden, CEO of First Genetic Trust and chairman of the non-profit SNP Consortium, which last week signed a data-sharing agreement with the IGC. "SNP data is just the letters or the alphabet of genomics; it's not the words or paragraphs. That's why the IGC is so important. It establishes a platform, like the SNP Consortium, where phenotypic and gene-expression data can be reused."


The IGC concept gained momentum throughout 2000, nudged by the rainmaking efforts of Phoenix-based attorney Dick Mallery. Mallery, who became interested in the project after his wife succumbed to a rare form of cancer, thought the IGC could generate support for biomedical research in Arizona.


By early 2001, Affymetrix got wind of the fledgling group and agreed to help pay to organize it and to develop a pilot project. It was around that time that Trent and Nic Dracopoli, executive director of clinical discovery technologies at Bristol-Myers Squibb, asked Holden to bring to the IGC his experience with the SNP Consortium.


"This [collaborative] model worked fine for the SNP Consortium," says Dracopoli, who stresses that BMS has not yet made a final decision to fund the IGC. "We work with patient [data] only in our clinical trials. Human samples are not research tools. They have to be managed differently."


Not only differently, but also with new and improved bioinformatics tools. Organizers expect that the IGC's first project, the Expression Project for Oncology, or expO, will produce a 10,000-spot microarray for each tumor sample it collects and analyzes. Each spot will carry an intensity value, paired with the patient's basic clinical information and treatment regimen, the IGC says.


Early estimates suggest that each patient-sample record will contain approximately one gigabyte of data. Processing and distributing the data to participating institutions, meanwhile, may require a parallel supercomputing cluster capable of 7.5 trillion calculations per second.
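
To make the figures above concrete, here is a minimal sketch of what an expO-style patient-sample record might hold, using only the numbers cited in the article (10,000 spots per array, roughly one gigabyte per record). The field names and schema are purely illustrative assumptions, not the IGC's actual design:

```python
# Hypothetical sketch of an expO-style patient-sample record.
# The article specifies only the rough shape: ~10,000 microarray spot
# intensities paired with basic clinical information and treatment history.
from dataclasses import dataclass, field

SPOTS_PER_ARRAY = 10_000      # spots per tumor microarray (article figure)
BYTES_PER_RECORD = 1024**3    # ~1 GB per patient-sample record (article estimate)

@dataclass
class ExpressionRecord:
    sample_id: str            # de-identified sample ID (hypothetical field)
    tumor_type: str           # basic clinical annotation
    treatment_regimen: str    # patient's treatment history
    spot_intensities: list = field(default_factory=list)

record = ExpressionRecord("S-0001", "lung adenocarcinoma", "cisplatin",
                          [0.0] * SPOTS_PER_ARRAY)
print(len(record.spot_intensities))   # 10000
```

The intensity vector alone is small; the bulk of the one-gigabyte estimate would presumably come from raw scanner images and annotation, which this sketch omits.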


The scale of the project is simply too massive for any single research organization, public or private, to undertake, making a collaborative effort the more likely approach. As it happens, collaborative-research arrangements are also more appealing to drug companies during the current economic slump.


In the 1990s, booming research budgets allowed big pharma to invest in nifty new genomics tools and technologies as fast as they were rolled out. Today, with the bubble burst, drug companies aren't exactly looking for new ways to invest research dollars, says BMS' Dracopoli. More often, they are inclined to invest in late-stage drug candidates or in research, such as tumor-expression data, that can be quickly plugged into existing oncology pipelines and proteomics programs.


With its extensive network of medical centers, the IGC promises to give pharma companies economies of scale and access to a wider range of patients than any of them could realize on their own. Not surprisingly, BMS, Pfizer, and eight other drug-discovery companies have said they will work with the consortium and the 19 medical centers that have signed letters of intent to participate in expO.


"Pharma companies are looking for non-proprietary sources of reliable [gene-expression] data that offers good standardization," observes Alan Williamson, a retired pharma executive who worked at Merck and Glaxo. "No single company would wish to collect the extensive data that IGC can offer, but all would like to have access there.


"Because owning the data is not important to pharma companies, sharing makes sense," he says.


Helping hand


A handful of genomics firms, among them GeneLogic, Genomics Collaborative, and CuraGen, were built around compiling and selling biocontent and tumor-expression data. But the scope of the public-private IGC project does not lend itself to rapid growth for most of these companies. Instead, the consortium will likely help them grow slowly by enhancing their access to varied medical centers and researchers.


"Recruiting patients and collecting [tumor] specimens is a lot more difficult than people think," according to Mike Pellini, CEO of Genomics Collaborative and a member of the IGC pathology committee. The core problems, he says, are patient consent and confidentiality. Medical centers tend to process and admit patients differently, which leads to inefficient and non-standard data collection. For example, if a patient gives her consent to participate in a study, the use of a tissue sample derived from that patient, and the patient information itself, is often limited to that single study.


"What you want is the consent and re-consent to use the data in comparison population studies," says Holden. By applying the data to population studies, researchers can verify and validate the conclusions drawn from smaller studies that have wrapped up.


Though the IGC's goals to unify researchers and standardize protocols are certainly ambitious, the clinical reach of expO remains surprisingly modest. While an estimated 3 million biopsy tumor samples are harvested each year in the US, expO aims to collect only 30,000 of these over the next three years. By comparison, GeneLogic, expO's corporate kindred spirit, has gathered just 10,000 tissue samples in the past six years.
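
The collection figures above can be sanity-checked with back-of-envelope arithmetic, assuming the numbers cited in the article (3 million US biopsies per year, a 30,000-sample target over three years, and roughly 1 GB per record):

```python
# Back-of-envelope arithmetic on the article's collection figures.
annual_biopsies = 3_000_000   # estimated US biopsy tumor samples per year
expo_target = 30_000          # expO collection target
years = 3                     # collection window

# Fraction of all US biopsies over the period that expO would capture.
fraction = expo_target / (annual_biopsies * years)
print(f"{fraction:.2%}")      # 0.33%

# Implied total data volume at ~1 GB per sample record (binary GB -> TB).
terabytes = expo_target / 1024
print(f"{terabytes:.1f} TB")  # 29.3 TB
```

Even at a third of one percent of the annual biopsy stream, the implied archive of roughly 30 terabytes helps explain the supercomputing requirements the organizers anticipate.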


To be sure, a shortage of tissue samples is not the chief roadblock facing cancer research. Collecting a sufficient number of quality, uniformly annotated samples with consistent RNA integrity is.


"Everyone understands it's very confusing to [compare] results of gene expression when different storage and collection techniques are used," says Affymetrix executive Grace Colón, who served as the IGC's interim COO. "The good part is researchers already have been able to draw [clinical] conclusions in spite of the confusion."


Yet to advance cancer research, hundreds of samples have to be collected and fully validated with robust signatures and interpretations. Insiders say the IGC is the logical public-private forum to establish protocols that use similar analysis algorithms.


"Moving to standardized [research] protocols is key for the future of cancer therapy," says Colón.


"The bread is about half baked and soon will come out of the oven," says Holden, speaking of the IGC's development. "The IGC's objective is for the greater public good, and a public-private consortium is the only cost-effective way to achieve it."


The complete version of this article appears in Genome Technology, GenomeWeb's sister publication.
