Beachhead Consulting's Fisler Discusses HCS Market Trends

Richard Fisler
Beachhead Consulting

At A Glance

Name: Richard Fisler

Position: Director, Beachhead Consulting

Background: Manager of business development, 454 Life Sciences, 2002-2003; Business director, functional genomics, PerkinElmer (and previously director of marketing, biochip division, Packard Biosciences) 2000-2002; Business unit manager, Applied Precision, 1999-2000; Senior product manager, ultrasound, Siemens Medical Systems, 1989-1999; MS, biomedical engineering, New Jersey Institute of Technology, 1989.

Richard Fisler has spent 15 years in the medical and biotechnology instrument and consumables business. He has experience in a wide range of markets, including medical imaging, live-cell imaging, microarray scanners, and informatics. At Beachhead Consulting, Fisler advises clients in several areas of the life science tools market, and has recently paid close attention to the developing high-content screening arena. At IBC's High-Content Analysis conference held earlier this month in Washington, DC, Fisler chaired one of the sessions, and delivered a report on recent HCS market trends. The full report can be downloaded from Beachhead's website. Fisler took a few moments last week to further discuss some of his company's findings with CBA News.

You've dabbled in a wide variety of biotechnology markets. How did you develop your interest in the HCS market?

I spent time in the live-cell microscopy market, and the company I worked for was Applied Precision. They had what we called high-throughput microscopy, or somebody else may have coined that term, but the researchers were really cutting-edge early adopters, like Peter Sorger at MIT. They were really pushing us to make faster point visiting, but within a slide. We wanted to look at 100 cells, and we wanted to look at them over time, and we wanted to go beyond looking at one cell at a time, because we wanted to deliver more data. We modified the stage to be able to hold a microplate, and we saw the great response to this offering. We also worked with the Institute of Chemistry and Cell Biology at Harvard to see how we could modify our microscope to be faster and handle microplates. We weren't looking at pharmaceutical screening, but at the academic side: How do we study lots of cells in three dimensions over time? I left that company and went to work for Packard Biosciences and PerkinElmer, and a lot of the products that made those companies successful were delivered to high-throughput screening labs: robotics, reagents, and imagers. We really saw, like a lot of people did, the merger of those two technologies. Early on in my consulting, this was one of the high-growth areas that attracted the interest of some of our clients as well. I also have an engineering background, and am always interested in how to design instruments that can be used by the mainstream community: not high-end microscopes that require the head of an imaging core to operate, but systems that combine these technologies and are usable by an entire customer base.

Beachhead covers other life science tools markets, as do you, correct?

Beachhead covers all technologies in the life sciences. If a company has a product or technology that they want to understand market dynamics for or to quantify potential value for, we'll perform specific research to create strategies for market entrance. On our own, we also follow a number of markets, including expression microarrays and derivative markets, such as splice variants, [comparative genomic hybridization], as well as technologies in the screening area, and molecular imaging. Those are the most common types of projects we've been doing of late.

In your presentation, you remarked that HCS, along with functional proteomics, is still in its nascent stage and among the smallest of all life science tool markets, at less than $250 million annually. Do you think HCS has the potential to be on par with some of the other technologies you listed (e.g., high-throughput screening, genotyping, flow cytometry, sequencing), most of which are at least quadruple that figure?

The way that we've segmented the market, we've separated high-content screening and high-throughput screening, which may not be appropriate anymore. When we look at high-throughput screening technologies, and those other markets that we've sized, we're really just looking at high-throughput robotic systems and the reagents and imaging systems that go along with them. We may want to modify that, as HCS systems are being used in high-throughput screens today. What you may see is not so much high-content screening growing; you might just see it all merging, because a higher percentage of the high-throughput screens that are done will be done by HCS. High-content screening becoming a billion-dollar industry, I think, would be difficult. One of the challenges is getting the systems easy enough to use. We didn't coin it, but we like to use the term 'the big green button,' like a Xerox machine. You want to be able to push the button and get your assay result, and you can't do that yet, so the throughput is limited. A couple of things could happen. Pharma could get away from traditional high-throughput screening and say, 'Running millions of compounds through these robotic systems isn't giving us enough answers; we need to get more and more of these imaging systems going,' which is one of the reasons HCS is so attractive. At the same time, the prices of those imaging systems are coming down. Initially they were $700,000 to $1 million; now the lowest-end systems are around $180,000, and we are seeing systems for around $250,000 to $300,000. Those prices are only going to come down. But a billion-dollar industry would be tough, particularly as the reagents used for HCS right now are not big revenue producers; they are self-built, with users engineering cell lines to produce GFP and its derivatives.
I think that if antibody companies can come up with good, multiplexed, specific antibody sets for specific pathways that would allow people to use HCS systems to look at three, four, or five elements within a pathway, and follow them over time, then you may see higher growth. That's a big focus in the industry, but there is a lot of work to be done to engineer those systems.

Nevertheless, you cited that there has been a 30-percent growth clip in HCS instrumentation since 2003 …

And between 2002 and 2003 there was a huge blip. We didn't put that in the graph because it skewed the data, but if you go from 2002 to 2005, it's probably more like 45-percent growth.

But it was still about half the growth rate that people initially estimated, you said. Why do you think that is?

There is a lot of excitement about the potential of this technology, and others, protein arrays being another one. What analysts forget is that research groups, specifically in pharmaceutical companies, have methods that are working. Getting new technologies adopted across the board is expensive, and you've got to prove that there is a return on that investment. It's not cheap to change the way things are done. You're changing not only the instrument, but the reagents, the assay development and design, and the type of data that you're getting. You've got to now see what that data means in a biopharmaceutical sense. What does it mean that I can now measure the morphology of a cell? How does that impact whether or not this compound is going to go to the next stage of development? It takes a long time for people to understand how the new information you're getting can be used, and then you have to package that information in a way that can plug into the drug-development process so people can make decisions. Pharma companies need to be able to make actionable decisions based on the data that is coming out of these experiments. They can run IC50s or EC50s, and have a cut-off, and say whether a compound is going forward or not. Here you've got morphology changes, [protein] translocation, the presence or absence of micronuclei, and you've got to quantify all of these results, and be able to say, 'Based on these parameters, we make this decision about this compound,' rather than say, 'Gee, that's interesting. What do I do with that?' And nobody has done that in a standardized process yet. As more and more people run full screens using high-content systems, they'll learn more about how to use the information. At the same time, in the last two or three years, the software has gotten easier to use. Companies like Cellomics and [GE Healthcare] have developed more software algorithms and assay kits that address a wider variety of biological problems.
If you look at the catalogs that are available today versus two years ago, there are three to four times as many assays and as much information available as there were a few years ago. We also have companies like Roche, GSK, and AstraZeneca really pushing this into high-throughput screening processes. And lastly, the advent of RNAi has come along at the same time as an application that is driving the adoption of a lot of new approaches to research, some of which utilize HCS systems. If I were to project going forward, it would be higher growth than the past three years. Not higher than from 2002 to 2003, but the number of systems and assays that are run is going to grow in the 50- to 60-percent range. I can't say now if that translates into dollars, because the prices keep coming down.

Just last week CBA News reported on Beckman Coulter getting rid of its high-content screening business, and this week we're reporting that Vitra Bioscience is out of business. Is this the first bump in the road in the growth of high-content screening?

No, I think it's an isolated thing. I think the sales channels are challenging, and there is a lot of competition. Companies need to decide for themselves if they want to invest, and if they have the core technology platform needed to succeed and build into a product line. Sometimes it takes more risk than companies are willing to take at that point in time. Beckman has invested a lot of money in different companies over the last 12 to 18 months. They invested $140 million in Agencourt. So you really need to see how it fits into a company's strategy at a given time.

You talked about several challenges to broader adoption, including image analysis, standards development, lack of system interoperability, and cost. Do any of those stand out as most critical in your mind?

I actually think that the most critical is showing pharmaceutical relevance of the data. That is probably the biggest issue, and the next one would be usability, followed by cost. Interoperability was an important topic at that conference, but it's kind of a secondary thing. It comes later. I spent a lot of time working in the Digital Imaging and Communications in Medicine committee for medical imaging interoperability in the 1980s and 1990s, and it came last. We had all the imaging systems we needed, and then interoperability came around.

A lot of these vendors have been saying they need to cooperate to further the field, but at some point they have to differentiate themselves to make money. Where can they differentiate themselves the most? Image analysis? Instrumentation?

I think in providing a complete package, a complete assay, meaning the reagent, the software, and the methodology to do a complete experiment, and then 'What does it mean?' When they talk about working together, I think it's more working together to provide that relevance to the output of the assay, and understand what it means in the compound program. And then a company differentiates itself by putting together the whole package. The other way to differentiate yourself is by providing service. People forget that at every technology conference. It may be very boring, but it's the company that provides the best service that wins, because these systems don't work all the time; no system does. If you're a vendor and you come out with the best application specialist group to train the customers and help them understand the data, and the best field service group, people will buy your product, even if you don't have the best microscope objective in there. And it may be slower. But speed isn't the big deal anymore in this market. We tell people all the time, 'You can make a great box, but if it can't be serviced, people will not like you.'
