
Jed Harrison on Microfluidics in Drug Discovery and his Biotech Start-Up

Jed Harrison
Chief Scientific Officer
Advanced Integrated Microsystems

At A Glance

Name: Jed Harrison

Position: Chief scientific officer, Advanced Integrated Microsystems; Professor of chemistry, University of Alberta, since 1984.

Background: BS, chemical physics, Simon Fraser University — 1980; Graduate studies, Massachusetts Institute of Technology.

One of the early pioneers in the field of microfluidics, Jed Harrison is one of the original developers of so-called "lab-on-a-chip" technology. Though he came from an analytical chemistry background, Harrison has spent much of his recent career attempting to apply microfluidics to biology, and more particularly, drug discovery. In 2001, he co-founded Advanced Integrated Microsystems, a University of Alberta spin-off trying to develop microfluidics platforms for cell-based assays and proteomics.

Last week, the University of Alberta was awarded US Patent No. 6,900,021, "Microfluidic system and methods of use," on which Harrison is the lead inventor, and which relates to methods for testing compounds against cells. Also last week, Harrison took a few moments to discuss with CBA News the opportunities and challenges of integrating microfluidics with modern drug discovery.

You were one of the original developers of microfluidics technology. How have your interests evolved to where you are today?

Well, we started in the field of microfluidics about 15 years ago, doing some of the first work on separations on a microchip. Early on, we started looking at the applications in chemistry, but after a short period of time it became clear that the applications in biology could become much broader. Part of the simple logic of that is the fluid volumes you work with in a microchip format are about the same as those associated with an individual cell. So it provides an opportunity to do a lot of single-cell manipulation and single-cell assays. We started that work with cells around 1994 or so, and we filed a patent around 1997, so it took a long time to prosecute.

This patent is for in vitro studies of the effects of compounds on cells. But you've also done work in microfluidics for proteomics and other types of applications. Is the cell-based work leading the way here?

The cell-based work is one branch of what we've been up to. The protein separation and protein analysis is basically another branch. When we started taking a look at some of these technologies and how to commercialize them, the path to a product — or really, a market — seemed clearer with the protein sample processing for mass spectrometry and proteomics than it did with the cell-based assays. We actually began to have some challenges seeing exactly how the cell-based screening and analysis could be translated into products that we felt the marketplace would accept. I can give you some sense of why that is. We concluded that if we could use the microchip devices in toxicity studies, then there was a clear market for them. But the use of cell-based toxicity studies in the pharma industry was not nearly as developed as we would need it to be for the products to be adopted. So we looked at that as being perhaps too long a development time for us. The other side of it is that the devices lend themselves well to using very small quantities of material, and doing very large numbers of analyses. But we concluded that cell-based assays on agar-based …

Some sort of supportive media …

Yes. There are a variety of cell-based assays based on that concept, and they're quite well-done already. The microfluidic devices begin to play a role when you want to do more sophisticated things — [such as] get a more sophisticated analysis, get a response time, build in a dose-response curve simultaneously. We ran into a bit of a conceptual problem with that, because once you start wanting to make measurements of dose response and response time kinetics, it usually means that you're no longer at the high-throughput stage. You're already at the stage where you know this molecule is interesting, at the second stage of drug screening. Once you get to that stage, it's less critical that you have ultra-small volumes and ultra-small assays. So we ran into a conceptual problem there.

So in cell-based assays, you're seeking and getting more information, but this technology might have its largest applicability in a high-throughput, and not high-content, setting?

In terms of the cost advantage. Personally, as a scientist involved in it, I still think that there is a large range of potential of where these technologies can go in cell-based assays and drug screening. But identifying the right market niche where your investor's dollar is going to pay off quickly enough — for a small company, it was hard for us to do. So for that, we're still looking at the best applications and development; whereas in the proteomics area, the [needs] associated with analyzing proteins and peptides and their fragments are [many] — the need to automate it is large, and the need to make it faster is huge. So it was much easier to see how the advantages of microfluidic systems could really play a role quickly there, in terms of leading to products. So we ended up with the company focusing on that as our initial primary goal.

So now you're looking at both areas?

That's right. We still regard the cell-based technology as the next potential product area that we would be looking towards. And we've done a lot of thinking about what is not an advantage with the microfluidics, and where the advantages do lie, and so we hope that in the next product development round we can find the right niche for that.

Do you think microfluidics in general has been adopted in drug-discovery research as quickly as you thought? Are there still obstacles to overcome?

To be honest, the adoption was much faster than I expected it to be at the early stage, and now it's much slower than I expected it to be in the long-term stage. In other words, when we started this 15 years ago, I was surprised that only five years later companies were started up to commercialize it. But now that we're 15 years into it, I certainly expected that more of it would have been adopted in products than it has been. That said, there are now various microfluidic products out there, so that's starting.

But there aren't a lot for cell-based assays …

There are not for cellular assays, but I would say that's only about a 10- or 12-year time lag right now. I would say that the first really notable work done with cells on the chips was done by [Peter] Wilding at the University of Pennsylvania in the early 1990s. Now would be about the right time, it seems to me, for things to really start percolating in industry. For it to have happened sooner would have been expecting a lot. It's a fundamental change in the way you think about packaging the fluidics. Enough of a knowledge base has to accumulate in terms of the various tools in the box; understanding what the advantages and disadvantages are; and what the potential gains are. Enough clever ideas have to come out before it's going to put possibilities out there that push it successfully into the commercial stage, and I think we're just approaching the edges of that now.

So what are the clever ideas that your company is putting forth? What are you proposing might be valuable about your technology?

We've put enough thought into figuring out what's wrong that I don't want to entirely talk about that. [Laughs] But generically, I think there are a few key things. One of them is that many biological measurements are extremely approximate. The error bars on the values measured are quite large. To a certain extent, that's a function of the biological system, and people accept that. But I think that people have also come to accept those error bars unnecessarily. One of the problems is that we measure ensembles, or complete groups of cells. If, with the microchip systems, we can instead measure a large group of cells one at a time, normalizing each cell's result before averaging rather than normalizing the ensemble as a whole, we get a big improvement in the quality of the data. And that quality of data will aid us substantially in the interpretation of phenomena, experiments, and results. That is a core area where microfluidics can play a big role.

To flesh that out a bit, if you take a cell and make a measurement on it, it's just not enough information. You can measure how much protein it has got in it — but unless you know what its past history was; what the size of the cell is; for argument's sake, what the cell's mass is; what state of its life cycle it is in — then that information is nearly useless. Since I come from an analytical chemistry background, it's the equivalent, let's say, of bringing in an ore sample and reporting on the amount of iron in the ore without ever measuring whether it was a 10-pound sample or a 10-ounce sample. With the microfluidics systems, you can really integrate a substantial number of measurements to evaluate each cell that you're measuring, so you can normalize your results before you average them. And information becomes much more readily interpretable when the error bars on each measurement are much smaller.
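The statistical point Harrison is making can be sketched in a few lines. The numbers below are entirely hypothetical — a toy model in which each cell's raw signal is its per-unit-size protein level times a randomly varying cell size, plus measurement noise — but it shows why dividing by a per-cell size measurement before averaging shrinks the error bars:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-cell data: 10,000 cells with log-normally
# distributed sizes, all expressing the same protein level per unit size.
n_cells = 10_000
true_level = 2.0                                   # protein per unit size
size = rng.lognormal(mean=0.0, sigma=0.5, size=n_cells)
signal = true_level * size + rng.normal(0.0, 0.1, size=n_cells)

# Ensemble-style estimate: total signal over total size. The average is
# fine, but the cell-to-cell spread of raw signals is dominated by the
# size distribution, not by the biology of interest.
ensemble_estimate = signal.sum() / size.sum()
raw_spread = signal.std()

# Per-cell approach: normalize each cell by its own measured size, then
# average. The remaining spread reflects measurement noise only.
per_cell = signal / size
per_cell_estimate = per_cell.mean()
per_cell_spread = per_cell.std()

print(f"ensemble: {ensemble_estimate:.3f}  (raw spread {raw_spread:.3f})")
print(f"per-cell: {per_cell_estimate:.3f}  (spread {per_cell_spread:.3f})")
```

Both estimators recover the true level here, but the per-cell spread is several times smaller than the raw spread — the "smaller error bars" Harrison describes.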

So in some ways, I expect to see this technology have an impact in scientific discovery, where people begin to better understand how a particular biological pathway works because they get much more information when they make each measurement. And that, of course, will lead to better drug discovery, because we understand what we're targeting much better.

I think it can also play a substantial role in toxicology screening, but that first requires that the use of cells for toxicology studies becomes further and better developed. That is something I think that's going on, and my impression is that many companies are looking at the cost of doing toxicity studies in animals, and are trying to develop cell-based approaches so they can do them at least one level beforehand. I think that if that technology develops well enough, then microfluidics can play a very big role in automating that kind of assay and making it fast.

With the chips, we can deliver different doses in a very controlled fashion; we can measure the response time as well as the characteristic average response; and I believe that getting that kind of kinetic and dose-response information on an individual-cell basis, quickly, could play a significant role in toxicology work. And that's an area that we'll be looking at.
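The kind of on-chip dose-response measurement described above can be illustrated with a small simulation. Everything here is assumed for illustration — the Hill-model parameters, the dose range, and the idea of a gradient generator exposing many cells to each dose channel at once:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed "true" dose-response parameters for the simulated compound.
ec50_true, hill = 1.0, 1.5            # EC50 in arbitrary units
doses = np.logspace(-2, 2, 9)         # 9 dose channels across a gradient
cells_per_dose = 200                  # cells read out per channel

def hill_response(d, ec50, n):
    """Fractional response of the standard Hill dose-response model."""
    return d**n / (d**n + ec50**n)

# Per-cell readout: model response plus per-cell measurement noise,
# averaged within each dose channel of the chip.
resp = np.array([
    (hill_response(d, ec50_true, hill)
     + rng.normal(0.0, 0.05, cells_per_dose)).mean()
    for d in doses
])

# Crude EC50 estimate: interpolate, in log-dose, where the averaged
# response curve crosses 50%. (resp is increasing, as np.interp needs.)
log_ec50 = np.interp(0.5, resp, np.log10(doses))
ec50_estimate = 10**log_ec50
print(f"estimated EC50: {ec50_estimate:.2f}")
```

Because every dose channel is measured simultaneously on many cells, a single chip run yields the whole curve — the "dose-response curve built in simultaneously" mentioned earlier in the interview.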

So what else is next for your company in its quest to achieve a marketable product?

We're at the stage of taking an internal laboratory system and releasing it to beta-site users over the next couple of months, and getting good feedback on that over a six-month period. And then we're in negotiation with some suppliers to distribute the system if that testing works out satisfactorily.
