Gerald Barnett, Director, Office for Management of IP, University of California, Santa Cruz
Gerald Barnett recently left his post as director of IP at the University of Washington in Seattle for the sunnier (and drier) West Coast surroundings of the University of California, Santa Cruz. BioInform recently caught up with Barnett to discuss his experiences handling the bioinformatics output of his former employer, and his plans for managing the bioinformatics technology portfolio at UCSC.
How much of your tech transfer work has been in bioinformatics?
When I was at Washington about half of our work was in bioinformatics. I’m anticipating that that’s not going to change here in my work at UCSC.
How would you describe your experiences handling IP and tech transfer in this field so far?
As computer science converges with the biotech industry you’re getting two very different approaches to intellectual property that aren’t miscible — they’re not converging well.
In the computer science arena you have a very strong pattern of open source, open architecture, and this broad sense of networked relations as a critical way of deploying new technology, so you get things like the GPL and the BSD license for code. Those are well established.
On the biotech side, there’s obviously the reliance on patent work, a strong sense of proprietary controls, an expectation of potentially huge investments in clinical trials, validation, and screening of compounds to get to something that can anchor your company for twenty years.
So those two regimes come together as you start getting into bioinformatics, and you get an actually very healthy debate — if it’s conducted right — between the group who thinks that all bioinformatics materials ought to be made available open source on the general principles of sharing, and error identification, and everybody should be free to do what they want to with it since it’s mostly public science anyway; and the group who understands that from the point of view of how you build competitive positions and you invest in them, some kinds of controls are appropriate.
There are good arguments on both sides, but they’re just incompatible points of view. Yet, in bioinformatics you have to find some patterns of compatibility.
Is that possible?
There’s no need to have mutually exclusive positions. You can have data sets and informatics tools that go out in one version under an open source license, and for the open source community, those things stay open source. But you could also have situations where those data sets or code tools need to come in behind a company firewall, and rather than turning everything in the company into open source, a separate license allows the company to build its own proprietary versions for its own competitive interests. From a university perspective, especially a federally funded university, why should you have to choose between those two uses? You somehow get the religion that open source is an unqualified community good, but that isn’t consistent with a broad range of examples showing that allowing private investment and private risk-taking to reap the benefits of private development has also proven to be a good.
Therefore, from a licensing point of view or an IP management point of view, I think the important thing is to continue to exercise judgment to decide how best to make research findings available. Sometimes that means scattering it widely without restriction, and sometimes that means being very careful with it.
The fundamental question of technology transfer in a university is not, ‘Is this an invention protectable by patent with commercial potential?’ but, ‘Should what we’ve discovered or created be taught to other organizations?’ And if it should be taught, should we actually have a responsibility to teach it, to maintain it, to shape it, and to guide people to understand what that means?
Intellectual property then becomes a tool of relationship rather than a tool of exclusion.
It sounds like you take a very flexible approach.
My sense is that diversity of models is a good thing. If you look at only one regime, it tends toward a monoculture. So if you say let’s open source everything, well what happens with open source? The reality is, if you look at Linux, it reduces to a monoculture — everybody’s doing Linux. Yes, there are different flavors of Linux, but it’s just Linux — nobody’s doing anything else.
Monoculture can be very helpful because there are standards that arise, there’s a wide body of trained individuals who work in that monoculture. But at the end of the day you’ve lost all the diversity and creativity that would come with alternative ways of viewing things.
So there are different patterns of engagement that suggest that as you manage data sets, or manage software that provides particular kinds of utility in research, there will be various ways it comes out: some strongly proprietary, and some totally, altruistically indifferent.
In your bioinformatics experience to date, do any particular instances stand out as examples of good licensing or bad licensing agreements?
[Look at] the University of Washington’s Phred/Phrap distribution. It’s been provided source-available to academics without charge and at modest fees to company partners with the hope that they will extend their use of it under some fairly liberal terms. I think that’s been a flagship distribution by somebody who’s a leader in his area of bioinformatics, Phil Green, and I think it has worked very well in building industry relations.
If you compare it with how WuBlast from Washington University has circulated, it might be an interesting contrast. WuBlast has its own sphere of commercial activity, but because its price point turns out to be, last I heard, $50,000 rather than five, the level of transactions has been much lower. It may be that Washington University is happy and pleased with the nature of the relationships it’s formed, and I wouldn’t say that’s been poorly handled, but I’d say it’s a very interesting contrast to having 200-plus industry sites all working collaboratively with a lab vs. perhaps five or ten. Is it the pricing or is it the intellectual property? I don’t think it’s the intellectual property in this case, I think it just has to do with choices about pricing for access.
I think when you get into things like how Celera has been handling its database activities, that’s caused real problems for some people involved in public science, and it’s caused quite a debate. I think it’s an interesting model. I think it’s a model that should persist. I think, though, that people need to understand how it affects the activity of public science and how it engages in a marketplace that’s also highly competitive. So it’s another example of things that come to mind and I try to reason about them — what does that do for the organization? And instead of saying, ‘That’s bad, they should just give it away,’ I go, ‘No, it exists, we should learn from how it behaves, and use that to gauge our own behavior.’
How do you intend to ensure that the UCSC Genome Browser is maintained in the long term?
I have to keep in touch with David Haussler and Jim Kent and have some understanding of what their profile and interest in maintaining the materials are. If they have a strong interest and they have the resources, then in my mind it’s very much up to them how these materials are made available. [When] it appears that they’re not able to or they don’t have the interest in maintenance [...], then you want a change in the proprietorship. And it could be that you make it public, that is, everybody works with it; it could be that you hand it off to another institution; it could be that it’s time to wind it down and nobody’s really going to come forward and take it up.
At that point, the more you have put in the hands of the users who are knowledgeable, the more likely it is that they will be able to continue their use even if you’re no longer there for them. Even if you don’t end up with a single point of control, you do end up with groups using it in-house, even if it’s never going to be a product that people invest in to redistribute. The key element is saying how do you create these environments, and how do you have a management protocol that’s flexible enough to accommodate personal judgment?
That’s the biggest challenge for university licensing programs built on the patent model — to be efficient they can’t afford to have a protocol that respects personal judgment. They want a commercialization plan, they want to see a big enough return that it not only pays for the transaction itself, but it pays for all the losers they’ve been working with that haven’t gone anywhere. And if you have a winners-pay-for-losers mentality, then you get increasingly isolated in working for bigger and bigger winners. What a tech transfer office sees as losers are, in fact, in a university setting, the heart and soul of scientific progress. And to treat them as losers, rather than as distinctive assets to be put into play through a variety of means, really underestimates the importance of a university’s disseminational responsibility, and at the end of the day creates a more inefficient patent licensing operation.
Patent licensing is a pretty tough game to play. For biotech, that game was a lot easier to play in the 80s. Tech transfer offices that witnessed a degree of success in their biotech licensing in the 80s, and therefore built their licensing practice around it, are, I think, coming to find that it’s not so effective a decade and a half later. [These offices will] have a big wake-up call around bioinformatics. As bioinformatics becomes an area of interest, the challenge isn’t to avoid intellectual property, it’s to avoid inflexible application of business methods appropriate to 1980s biotech startups but inappropriate to 2002 bioinformatics exchanges.