
Q&A: LGC's Jim Huggett on the Growth of Digital PCR and its Potential in Molecular Diagnostics


NAME: Jim Huggett

POSITION: Science leader, Nucleic Acid Metrology, Molecular and Cell Biology, LGC


Established in 1842 as the Laboratory of the Government Chemist, UK-based LGC today is an international company specializing in laboratory services, measurement standards, genomics, reference materials, and proficiency testing marketplaces, and serving a variety of markets including food and agriculture, government, pharmaceuticals, and sports.

LGC also serves as the UK’s designated National Measurement Institute for chemical and bioanalytical measurement — similar to the role played by the National Institute of Standards and Technology in the US. In this role, LGC collaborates with key industrial, academic, regulatory, and international metrology stakeholders to provide expertise in measurement research to a wide range of markets.

As leader of the nucleic acid metrology unit for LGC, Jim Huggett is intimately familiar with real-time PCR as a measurement tool, and in recent years has also become heavily engaged in the digital PCR arena.

In the past two months alone, Huggett has co-authored a study published in PLOS One evaluating digital PCR for absolute RNA quantification; given a keynote presentation at Cambridge Healthtech Institute's Digital PCR conference in San Diego; and co-authored an editorial in Clinical Chemistry examining the potential of digital PCR in molecular diagnostics.

PCR Insider recently caught up with Huggett to discuss the latter topic, in particular. Following is an edited version of the conversation.


Please tell me a little bit more about your role at LGC.

We do bioanalytical and chemical measurement research funded by the UK National Measurement Office, and our group's interest is biomolecular measurement. Metrology — the science of measurement — is the term we use. Scientific metrology is second nature in physics and engineering, and the chemists are really on top of it, but biologists are fairly new to this. And that includes some of the clinical research. It's ironic, really, because biology is where all the variation is, so you'd think that's where it's needed most. But it's an exciting time to be in this area.

[LGC] does a range of research [on] the measurement capabilities of different technologies. Digital PCR is a great example. It's being heralded as fantastic, so we [ask] "what factors are really going on here?" One of the things that digital PCR has got all of the measurement institutes sitting up about is that it appears to provide this absolute measure with no need for a calibration curve. This means it can be [and has been] used to quantify and assign values to reference materials, like other instruments used in chemistry; for instance highly accurate mass spectrometry.
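The calibration-free measurement Huggett describes rests on Poisson statistics: the sample is split into many partitions, positives and negatives are counted, and the mean copies per partition is recovered from the fraction of negative partitions. As a minimal sketch (the 0.85 nL partition volume is an illustrative assumption, not a figure from any specific instrument):

```python
import math

def dpcr_copies_per_ul(positive, total, partition_volume_nl=0.85):
    """Estimate target copies per microliter from a digital PCR run.

    Assuming molecules distribute randomly across partitions (Poisson),
    the mean copies per partition is lambda = -ln(negative fraction),
    so no calibration curve is needed -- only the partition counts and
    the partition volume.
    """
    if positive >= total:
        raise ValueError("saturated run: every partition is positive")
    negative_fraction = (total - positive) / total
    lam = -math.log(negative_fraction)   # mean copies per partition
    copies_per_nl = lam / partition_volume_nl
    return copies_per_nl * 1000          # 1 uL = 1000 nL
```

For example, with 10,000 positives out of 20,000 partitions, lambda is ln 2 (about 0.69 copies per partition), because roughly half the positive partitions in fact held more than one molecule. This correction is why the raw positive count alone understates the true concentration.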

Digital PCR is being explored across a wide range of applications, but your particular interest is in its application to molecular diagnostics?

Personally, yes, although [LGC] applies it to other areas. One of my colleagues, Malcolm Burns, looks at it in the context of testing for genetic modification in foods. Our group also looks at the technology and how it performs in research, for things like measuring microRNAs. But this does generally have a clinical or pre-clinical theme.

In your Clinical Chemistry editorial, the molecular diagnostic studies that you cited both used droplet digital PCR, as opposed to a fixed architecture version of digital PCR. At this point in time, do you think droplet-based methods are the most realistic solution in terms of molecular diagnostics?

I certainly think the more fixed architecture [digital PCR] has potential in this area. It reminds me of Blu-Ray versus HD DVD — which is going to win? The emulsion chemistry, the droplets, offer a very elegant way of getting around the dynamic range problem that some of the earlier chip-based instruments cannot compete with because of the physical nature of the chip. What's going to be interesting is how things develop over the next five years. Life Technologies has a new chip-based instrument [the QuantStudio 3D] that can compete [in dynamic range] with the Bio-Rad [QX200 Droplet Digital] instrument, but then again it's always probably going to be easier to get more droplets, as exemplified by RainDance's RainDrop instrument. But there is additional complexity when you're doing that because of variability in the number of droplets you produce, and downstream analysis may be more complicated. It's a trade-off. I don't know which is going to be the winner, if either is, but certainly the last year or so has been led by a particular manufacturer, [Bio-Rad], that has done a lot of work on developing digital PCR, illustrating what it can do, and building a large customer base.

There have been questions surrounding the complexity of digital PCR even in a research mode, in terms of the workflow and analyzing the results. What are your thoughts on this complexity and how that might mesh with molecular diagnostic applications?

What you're talking about in terms of complexity are what I'd call natural, obvious things [related to] any new technology. It's got to start somewhere, and it's fair to say that all of the digital PCR formats take a hit on complexity compared with real-time PCR, which is established, works fantastically well, and at the moment can do the majority of things that digital professes to offer.

Cost, speed, throughput, and dynamic range are clearly disadvantages of digital PCR when compared with real-time PCR, if we're talking about molecular diagnostics. A key consideration there, though, is: are we talking about quantification or [a] 'yes or no' answer? For example, is there a particular pathogen present in a blood sample or not, versus how much [is present]? The disadvantages with the complexity are due to the fact that if you're using a chip-based method, you've got to load the chip. It's much simpler to load a 96-well plate. With the Fluidigm [BioMark] instrument — and I suspect, though I've not used it, the earlier version of Life Tech's OpenArray — that was similar: you'd load it, run it, and then get your results in essentially a real-time fashion. One of the early recognized benefits of real-time PCR was, as everybody was saying, it was automated; no more gels.

All of the higher-throughput [digital PCR] instruments that are offered now require some downstream processing to analyze the experiment. That is another level of complexity, certainly if you're thinking about a diagnostic setting. And the other issue is the throughput. I think the Bio-Rad instrument probably has the highest throughput now in that it can do 96 [samples] in one go, but that takes in excess of five hours with all of the steps.

And then add to that the cost — digital PCR is very expensive at the moment compared to real-time PCR.

You also pointed out in your article how important the MIQE (minimum information for publication of quantitative real-time PCR experiments) guidelines are for real-time PCR, especially when using it for molecular diagnostics. Is that kind of standardization as important with digital PCR? If it's absolute quantification, then theoretically it seems you wouldn't need such stringent standardization.

That's an excellent question. The issue with digital is that everyone is saying it's absolute, and doesn't require a calibrator, so don't worry, carry on as you are. Add to this that the instruments are tailored to spit out numbers — they just give you the results without some of the crucial information like the proportion of positive partitions that may be present. You can get this information, but there is a real danger that digital PCR is going to fall into what I call the 'kit-based culture' that applies to a lot of molecular biology, especially qPCR, where the people who perform the technique do not necessarily need to know how it works because it's all being packaged up for them. This leads to problems with dissemination of what's happened and how the experiment has been performed.

Digital PCR appears to be more reproducible than qPCR. You can run it, you get a … value, and you do not need a calibration curve. You do it again and you get a very similar result. I'm very sure if we had a plasmid and I prepared it [at LGC], and sent some to you [to do] the same experiment, we'd get similar results with the DNA. The trouble is, as you start to make it more complex, you start to apply the fact that you don't generally work with tubes of DNA; you generally work with biological samples, and you need to get at that DNA, which requires a very complex process of extraction, which is highly variable … not only in its yield, but also in terms of some of the co-purified stuff that comes through.

If you take that into a diagnostic scenario, biological samples are going to be different. If you look at samples like plasma, which is used quite a lot for cell-free DNA — obviously digital PCR can lend itself to that for rare mutation detection. Well, plasma can vary quite a bit in DNA content. It can actually be wildly variable depending on the conditions.

You've got all of these complexities that must be captured upstream of actually performing the digital PCR method if it is to have maximum impact. And certainly in the real-time PCR fraternity, a lot of this information is just seen as peripheral detail — just "we performed an extraction using a Qiagen method," [for example]. Frequently no information is provided on how the sample was stored, how it was collected, what else was done with it. Differences naturally occur in different protocols or labs … because the techniques have multiple steps, but this is often not captured. I think that leads to a lot of chaos.

[Because of this] I don't think real-time PCR, or PCR in general, has really impacted diagnostics in the way it had the potential to. A great example is tuberculosis. It's been well-demonstrated that PCR can diagnose TB, but it has taken a good 25 to 30 years for an instrument to be generated that can do this, and which people are actually taking seriously, and that's the Cepheid GeneXpert. A lot of this is due to the fact that … people don't think in the context of standardization. It should be as important to me that you can repeat my work as it is to you. And I don't think that's the current message in the research community.

To add to that, digital PCR detects the DNA that is present, and it is almost certainly close to the truth. If you start looking at RNA, for example, which requires the additional reverse transcription step, you get a much-reduced signal. We've demonstrated this quite well in the PLOS One [study] that RNA molecules are present but are not being detected. That leads me to question how well DNA is being detected. You get a signal, and you're counting molecules that you're fairly certain are specific, but it's difficult to model where they're not being detected — we call this molecular dropout, another area we're working in. Molecules are present and are not being detected for a variety of reasons, and obviously that's impacting the quality of the absolute measurement.

To sum up, I would say digital PCR is not calibration-free. It may not require a standard curve like real-time PCR, but to say an instrument is calibration-free is probably a dangerous area to go.

Your editorial also discussed the fact that, as next-generation sequencing begins to replace real-time PCR for a lot of applications, that there is an opportunity for digital PCR and NGS to complement one another. Can you expand on that?

One of the things that digital PCR does is ride the wave of PCR. Real-time PCR has demanded that we have some spectacular reagents. The requirement for high precision in your Cq/Ct [values] has really [motivated] some of the companies to develop reagents that work very well and are very sensitive.

Digital PCR simply takes those and applies them to a different format, and it works. The issue with sequencing is that it's very early days still. It's taken PCR 30 years to now really start to take over and compete with some of the classic methods used in diagnosis. There are areas where it is used and has been used for more than 10 years, but these tend to be areas where nothing else can do it, or [do it] as quickly. If you think of viral load quantification, it's very difficult to do using non-molecular methods. Or, drug-resistance testing in bacteria is often done using conventional microbiological methods, but it may not be fast enough to be clinically informative. So where it's needed, [PCR has] worked very well, but it has taken a long time to get there.

If you look at the development of sequencing, there are a lot of positive developments, but it is still learning and is considerably more complex than PCR. I think PCR and particularly dPCR still have much to offer.
