PHILADELPHIA – Pharmaceutical companies may be struggling with many of the same IT challenges that they were a decade ago, but it appears that at least one thing has changed: They’re willing to talk about it.
At Cambridge Healthtech Institute’s Bridging Pharma and IT conference held here this week, IT executives from a number of large and mid-size pharmaceutical and biotech companies gathered to discuss various ways they are addressing the industry’s perennial IT headaches, including data integration, knowledge management, proving return on investment, and balancing a stable infrastructure with the need for innovative methodologies.
Many attendees said they have been grappling with these issues for years, and several noted that the conference served as a welcome venue to pool their experiences.
While the pharmaceutical industry is traditionally stingy when it comes to sharing information with potential competitors, the consensus among participants was that pharma increasingly views IT as a pre-competitive technology, and that the best way to overcome some of these hurdles may be by facing them together.
Embedding IT, Enabling Innovation
Ingrid Akerblom, executive director of research information services at Merck, noted that “sharing best practices is critical for the industry” as it strives to improve productivity and lower development costs.
Akerblom and her colleague Kevin Chapman, senior director of synthetic chemical research at Merck, gave a joint presentation on the use of “embedded IT” in the company’s discovery operations. Akerblom noted that an agile, innovative approach to IT is necessary in early discovery, but that this philosophy doesn’t fully square with the later stages of the pharmaceutical pipeline, which demand stability and reliability.
In order to address this problem, Merck has “embedded” IT staff within its research groups in order to quickly address informatics demands as they arise. Referencing the title of the conference, Chapman noted that in discovery, “bridging” IT and research is not enough. “If IT is on the other side of the river, we’re over before we even started,” he said. The company has realized that “you can’t separate IT from research,” he added.
One challenge of this approach is managing the lifecycle of technologies that are developed within research. As these methods enter the mainstream research workflow, they must be “spun out” to operational IT staff so that the research IT team can focus on new challenges, Chapman said.
Akerblom agreed. Once new methods are mature, she said, “You need to develop them into a deployable system that fits into the broader IT environment.”
Gerhard Noelken, director of research informatics at Pfizer Global R&D, said that his team has also grappled with the issue of balancing innovation and stability. In Pfizer’s case, the company is currently standardizing its IT infrastructure for lead optimization across seven research sites. The system centralizes all the data and experimental protocols necessary for screening across the company, and integrates a number of legacy informatics systems into a single portal.
Noelken described the centralized platform as an “enabler for innovation,” because local sites can plug new applications into the framework as needed. “We’re not taking away innovation,” he said. Rather, he noted, the infrastructure provides a stable foundation “in order to free up people’s time for innovation.”
Allergan, meanwhile, encourages innovation in discovery informatics through a combined strategy of embedded IT and rapid prototyping. Robert Cain, principal scientist at Allergan, said that the company’s IT staff is an integral part of the research staff, with offices located among the employees they support and regular attendance at research group meetings.
Jeff Pierick of Allergan’s IT group described the development of a data warehouse at the company, which relied on an iterative development cycle. Rather than the traditional “waterfall” approach to development, in which a project progresses linearly through a set of predefined steps — from requirements analysis to specification to design to prototyping to testing to delivery — rapid prototyping works in a circular manner, cycling quickly through each of those steps in close collaboration with end users.
Pierick said that each of these cycles took around three or four months, and allowed the end users to “feel more involved in the process.” In addition, the development team was able to quickly address problems with the system before they were hard-coded into the final product.
In the first prototype, delivered at the end of 2004, the user interface was a “spectacular failure,” Pierick said. By May 2005, the team delivered a new version with a redesigned interface that users were much happier with. The final version was released in January of this year, he said.
Cain noted that the system is now the “primary tool” used in medicinal chemistry, and has reached “half saturation” within the company’s biology group. Without the iterative approach, he said, “none of this would have happened. We would have been stuck with that initial system.”
Still Building, Not Buying
Another common thread across biotech and pharmaceutical IT groups is the reliance on in-house systems as opposed to third-party tools. Pfizer’s Noelken, for example, said that his group develops around 80 percent of its applications in-house.
At Millennium Pharmaceuticals, around 80 percent of the company’s bioinformatics is developed in-house, according to David Sedlock, director of research informatics, while around 30 percent to 40 percent of the company’s cheminformatics tools are developed internally.
Most companies are building their own informatics systems to fill in the gaps between best-of-breed components, but others are finding that commercial systems fail to meet their needs altogether.
Zhengping Huang, group leader of bioinformatics at Odyssey Thera, said that his team has developed a “fully home-grown information system” to analyze and manage high-content screening data. The company’s Evotec Opera systems run about seventy 384-well plates per day, generating about 400,000 images that require around 100 GB of storage.
Huang said that the “default” informatics solution from Evotec was “inadequate for our scale,” so the company had to port Evotec’s Windows-based analysis algorithms to C/C++ libraries that could run in a Linux-based high-performance computing environment.
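The scale Huang described can be sanity-checked with simple arithmetic. A minimal sketch, using only the figures cited above (70 plates per day, 384 wells per plate, roughly 400,000 images and 100 GB of storage per day); the per-well and per-image numbers are derived estimates, not figures from the talk:

```python
# Daily high-content screening volumes cited in the article.
PLATES_PER_DAY = 70
WELLS_PER_PLATE = 384
IMAGES_PER_DAY = 400_000
STORAGE_GB_PER_DAY = 100

# Derived estimates (illustrative, not from the presentation).
wells_per_day = PLATES_PER_DAY * WELLS_PER_PLATE              # wells imaged daily
images_per_well = IMAGES_PER_DAY / wells_per_day              # images per well
kb_per_image = STORAGE_GB_PER_DAY * 1_000_000 / IMAGES_PER_DAY  # avg image size, KB

print(wells_per_day)            # 26880 wells per day
print(round(images_per_well))   # ~15 images per well
print(round(kb_per_image))      # ~250 KB per image
```

At roughly 250 KB per image, these numbers make the stated need for a Linux-based high-performance computing environment plausible: a single workstation would struggle to analyze 400,000 images within a day.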
Other companies are trying other alternatives. Michael Hanley, vice president of discovery research at Amylin, outlined the advantages of “in-housing” — an external professional services team that works very closely with the company’s scientists — for certain informatics projects. Hanley said that the approach combines many of the benefits of outsourcing — nimbleness, rapid turnaround, lower overhead — with those of an in-house development team, including familiarity with the corporate culture, loyalty, and security.
In 2000, Amylin contracted Kelaroo, a small life science solutions provider, to support some of its MDL and Accelrys systems. That agreement was expanded three years ago when Amylin asked Kelaroo to develop GoldMiner, an online data-mining system for the GeneSeq patent and sequence database.
Since then, Amylin has treated the Kelaroo staff as “full-time insiders with contractor freedom,” Hanley said.
As an example of the degree of independence Amylin granted under the project, Robert Feinstein, vice president and CSO of Kelaroo, said that even though Amylin owned the GoldMiner IP, the company recently arranged to transfer the IP to Kelaroo so that the firm could commercialize the software and “become a more easily self-sustaining company.”
Although the move could potentially provide Amylin’s rivals with the very same software it paid to develop, Hanley said that the company doesn’t view the decision as aiding its competitors. “We see it as a way of establishing standards in the industry,” he said.