Tufts Center Director Kaitin Discusses Pharmacogenomics Future


At A Glance

Name: Kenneth Kaitin

Position: Director of the Tufts Center for the Study of Drug Development

Background: Following a postgraduate stint at Stanford University, Kaitin joined Tufts, where he has remained for the past 17 years.

Age: 49

 

If you’ve heard of the Tufts Center for the Study of Drug Development, you know its famous pronouncement that it costs $800 million for a drug to win regulatory approval in the United States.

That number, together with the 10-year R&D road to get to market, has become a shield for biopharma companies defending their steep drug prices and a kind of battle cry for those promoting pharmacogenomics.

According to one market study cited by the Tufts Center, if drug companies improve their clinical success rates by around 30 percent, they can expect to shave $200 million off their overall to-market cost. Similarly, if pharmas reduce by 19 percent the time it takes to move a candidate through the FDA approval process, they will see a $100 million savings. Pharmacogenomics promises both to increase candidate success rates and to shave valuable time off the approval process.
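To make those figures concrete, here is a minimal back-of-the-envelope sketch in Python. The $800 million baseline and the two savings figures come from the study as quoted above; treating the two savings as additive is an illustrative assumption, not a claim made by the study.

```python
# Back-of-the-envelope arithmetic for the savings quoted above.
# Dollar figures come from the market study as reported here;
# assuming the two savings simply add together is illustrative only.

BASELINE_COST_M = 800          # Tufts estimate: cost to win US approval ($M)

SUCCESS_RATE_SAVING_M = 200    # from a ~30% improvement in clinical success rates
APPROVAL_TIME_SAVING_M = 100   # from a 19% cut in FDA approval time

combined_saving = SUCCESS_RATE_SAVING_M + APPROVAL_TIME_SAVING_M
implied_cost = BASELINE_COST_M - combined_saving

print(f"Combined saving: ${combined_saving}M")              # $300M
print(f"Implied cost per approved drug: ${implied_cost}M")  # $500M
```

On these numbers, a company that realized both improvements would bring its per-drug cost down from $800 million to roughly $500 million.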

Amid skepticism about these statistics from consumer groups and some SNP tool providers alike, Kenneth Kaitin, director of the Tufts Center, has been carrying his own torch for pharmacogenomics. SNPtech Reporter recently caught up with Kaitin to talk about the future of pharmacogenomics in drug development, how pharmas may have “over-embraced” the new technologies but fear rejecting them, and those sneaky games big pharma likes to play with tool vendors.

What is your impression of the pharmacogenomics atmosphere today? How do pharmaceutical companies perceive the discipline?

A lot of the firms that embraced pharmacogenomics in the early 1990s have now backed off from the technology. That’s because it was fairly clear the technology was raising the cost of drug development, especially at the discovery level, and drastically increasing the number of leads. But there was still no good screening technology that enabled firms to make decisions about which products to drop from the pipeline.

I think the change we’re going to see now is an increasing number of screening technologies [that] will enable firms to make better decisions on all of those millions of leads that they’re getting through pharmacogenomics, through combinatorial synthesis, through high-throughput screening, and all of these new technologies. [These technologies] were supposed to revolutionize the way pharmaceutical firms develop new products. But actually [they] turned out to help firms reach the bottleneck in their drug-development pipeline faster. I think a lot of firms have identified this as a problem. You have the celebrated example of Henry McKinnell, the CEO of Pfizer, being interviewed … and saying that the reason pipelines are so sparse right now is that they over-embraced new technologies, including pharmacogenomics, and [the technology] just didn’t pan out. So their new approach is to back off that, go back more to rational drug design and some of the other tried-and-true methods of bringing new products to market.

But I don’t think anybody can run away from the new technology. It’s too powerful and too radically different from the old ‘hit-or-miss’ approach of rational drug design, and I think there will be an increasing reliance on pharmacogenomics, but that hinges on adequate technologies for screening compounds.

Who are some of the screening players that you have noticed?

This is a good example of a technology that is probably best left to an outsourcing provider, including the high-throughput screening end of pharmacogenomics. But, in particular, the screening technologies are going to be left up to ... [the] software developers — it’s primarily a software issue — and they are going to be the ones that industry is going to rely on to essentially hand over a lead-optimized product. Right now what industry is getting is a lot of leads, but not an optimized lead, not the lead that stands the best chance of being a successful marketed product.

Is there a technology out there today that can do this for drug makers?

I hesitate to say that a vendor I saw at a meeting who claims to have the ultimate software actually does have the ultimate software. It’s critical that they say that in order to attract firms to buy or pay for the software so they can work out the kinks.

This is something that characterizes the pharmaceutical industry, to its detriment: it tends to play outsourcing vendors off against each other. So when a new technology comes down the line, instead of an agreement between a large pharma firm and a software vendor to work together, work the kinks out of the system, and come out with the best product ... what they tend to do is outsource to a whole group of providers and, as each one runs into a problem, drop that vendor and stop working with them. So nobody gets the chance to work the kinks out of their system.

You see that with other technology, like electronic data capture, which should by now be the standard in the industry. Yet because none of the providers of that technology ha[s] had the opportunity to work closely with the firms and work on the problems, it’s still a technology that most companies want to avoid. Too many companies are saying, ‘We need this technology but we’re not interested in helping you do a better job. We’ll just keep looking until we find one company that is doing a better job.’

It sounds like some of the problems that pharmacogenomics is having within big biopharma have nothing to do with drug discovery.

It’s [the] funny nature of the pharmaceutical industry. It’s a very insular type of industry that for many years refused to share anything with other firms. You can never find out what a firm is working on: they jealously guard their industrial secrets. And that’s not just proprietary compounds but technologies, project-management systems — with everything they do, they make sure nobody is looking at it too closely.

So when they look at these new outsourcing providers that they should be opening up to and working with, they still keep them at arm’s length and don’t let them get too close, and as a result everything moves a little more slowly than it should.

But I really think that is going to change as the industry itself is changing, and a lot of the people that are coming into the industry … don’t have that tradition of secrecy and lack of sharing of information. …

Do you think this growing glasnost within pharma will result in greater outsourcing of pharmacogenomics?

I don’t think there’s any question. It’s now impossible for many of these pharma firms to invest in these technologies themselves. They don’t want to invest in them for several reasons. For one, it’s expensive. But the other [reason] is that they’re not absolutely sure that this is the way they want to go. So, what better way to achieve the best of both worlds than to outsource?

It sounds like there’s more than a bit of hope for providers of pharmacogenomics tools and services.

Oh, I don’t think there’s any question about it. We’re not talking about increasing the number of leads on the order of doubling it. We’re talking about increasing it by many orders of magnitude. And the potential is tremendous, as long as there is some way of finding out which of these various leads is the one that will provide your pot of gold.

A lot of these firms are now looking at their development pipelines. And, unfortunately, those pipelines have rigid sides that can’t expand to accommodate the increased numbers of leads. There was a very short-sighted approach to the adoption of pharmacogenomics technology back in the early 1990s: the perception that we’ll have a lot more leads, so everything will be better. What good is that? All it does is provide a clog. …

Will pharmacogenomics take off? Not for a while, I don’t think. Though many firms say they like to embrace technology, they’re still unsure how to justify looking for a smaller patient population. That’s what it boils down to. Are they going to embrace a technology that means a smaller market for them? Why would they do that?

I think the future of this technology is that the blockbusters of tomorrow will be those drugs based on particular mechanisms of action — for example, immunosuppressants, or drugs that deal with inflammation. They may work on a small population of people with a particular genotype, but at the same time there are a lot of diseases in which inflammation is a problem.

 
