CAMBRIDGE, Mass., Dec. 14 – Representatives of pharmaceutical companies know precisely what they want from the genomics sector, and they aren’t shy about letting it know.
At a two-day symposium here sponsored by the Massachusetts Institute of Technology, academic scientists from around the region presented novel technologies to peers, pharma reps, and government officials.
The meeting, which is held every two years, is designed to bring the three faces of life sciences—academia and the private and public sectors—up to date with the state of the art.
This year’s gathering, generously titled The Future of the Pharmaceutical Industry, offered a novel twist: A luncheon was transformed into a kind of market experiment in which 10 or so individuals representing different segments of the life science universe shared a cordial lunch and were asked to discuss how each group might help the others.
The lunch—a spicy tomato soup followed by either grilled chicken or ravioli—was short. The rap session, which organizers stressed was off the record, was not.
Behind one table sat three life-science graduate students, two big-pharma reps, an industry financier, an MIT professor acting as a “facilitator,” an MIT note taker, and a reporter. The discussion, punctuated by stabs at breadbaskets and saltshakers, quickly and predictably settled into What Genomics Can Do to Help Pharma Flesh Out Its Drug Pipeline.
To be sure, the meeting comes at a peculiar time for pharma and biotech. The traditional drug-development industries have been producing fewer new products while racking up a significantly larger bill than usual doing so. Research recently released by the Tufts Center for the Study of Drug Development, for example, shows that the biotech revolution seems so far to have slowed the drug-discovery process.
The report, released one month ago today, tracked the clinical drug-development process during the last two decades and found that the time spent on clinical trials has more than doubled since the mid-1980s, from an average of 33 months to an average of 68 months.
The average cost of developing a new prescription drug has also jumped, to $802 million from $231 million a decade ago, according to a similar report put out by Tufts two weeks later. This figure includes the cost of human trials, preclinical studies, expenses for product failures, and the impact of long development times on investment costs.
And although the drug industry remains the most profitable in the world—during the closing years of the last decade it generated profits, as a percentage of revenues, at four times the median rate for all Fortune 500 firms, according to a Kaiser Family Foundation report released that day—an editorial in this month’s Nature Biotechnology by David Horrobin, CEO of Laxdale Research in Stirling, Scotland, had this to say: “With rare exceptions, most of the top 20 multinational pharmaceutical companies are not generating in-house the new products needed to sustain the rates of growth they have enjoyed in the past.
“No serious industry onlooker could dispute this depressing picture,” the commentary continues. “Although a few pharmaceutical companies may survive in their present form, most cannot…. A few brave companies are recognizing the obvious: large companies excel at sales and marketing but are hopeless at innovative research.”
But at lunch in Cambridge on Thursday, corralled and occasionally prodded by the facilitator to keep the issues broad, the lunch mates focused on the part of the pipeline where genomics tools and technologies can best be applied to pharma’s goals.
It’s not a money issue, stressed one pharma official. It’s a time issue. Early-stage research at his company, one of the bigger drug firms in the US, is often stymied not by extravagant costs (there aren't any at that stage) but by self-imposed time constraints and that nagging notion that a competitor is one step closer than they are to hitting paydirt.
One topic of discussion was technology that may rapidly and efficiently put to work the vast stores of combinatorial chemical compounds his and other pharmas have collected over the years. A godsend, said the other rep, would be an inexpensive tool that would permit their researchers to test a compound on multiple targets. But she quickly added that outcomes may differ significantly when the compound is tested in patients.
To be sure, the toast of the first conference day was pharma’s rich libraries of combinatorial chemical compounds and other small molecules, and not the genomic sector’s own gene and protein databases. The meeting’s moderator, Anthony Sinskey, a co-director of MIT’s Program on the Pharmaceutical Industry, stressed that the millions of chemicals sitting on pharma’s shelves need to be investigated, and that it is up to the array makers, microfluidics developers, and informatics providers to help them predict and measure outcomes and bring products to market.
Some genomics types in the audience grumbled privately that data culled directly from the human genome and from ensuing proteomics research might be sidelined. The industry financier at the lunch table said as much, airing the theory quietly espoused by much of big pharma: that pharmacogenomics, the holy grail of genomics, may shrink markets and cost the big drug companies their place on the mountain even as it revolutionizes health care.
But Horrobin, in his editorial, writes that combinatorial chemistry “[has] been around for over a decade, but to date [has] yielded relatively few products considering the extraordinary size of the investment. Advocates say, ‘give us time, give us a higher throughput rate for our assays, give us a higher synthesis rate for our chemical reactions, and the results will follow.’ But there should have been more success by now.
“Could it be that there is something wrong with the technology in principle, and that the target choices and the target configurations are fundamentally flawed?”