Real-Time PCR, Volume VII

Table of Contents

Letter from the Editor
Index of Experts
Q1: How do you develop standard protocols for your real-time PCR reactions?
Q2: What additives do you prefer to use in your real-time PCR reactions, and why?
Q3: What quality assurance/control measures do you have in place to ensure reliable data?
Q4: How do you design suitable primers and make sure they are specific?
Q5: What computational tools do you use to analyze your data?
Q6: When you encounter problems with your real-time PCR reactions, what is the first thing you suspect and how do you go about seeing what is wrong?
List of Resources

Letter from the Editor

As a tool, real-time PCR's popularity only seems to be increasing. It's now an integral part of assays and tests looking for bacteria, viruses, or even just SNPs — and, according to a quick PubMed search, it has been mentioned in more than 13,000 papers. And why not? It both amplifies and quantifies the DNA or cDNA. With this in mind, for this installment of our technical guide series, we're reprising one of our favorite topics, good old real-time PCR.

In this issue, our experts not only discuss how they develop standard protocols for real-time PCR and how they design primers that do the trick, but they also tackle troubleshooting and which part of the experiment first falls under suspicion when something goes awry. Greg Shipley says it's usually not the machine that messes everything up (sorry, folks), and Deb Grove says that sometimes you just have to roll up your sleeves and make up fresh reagents. Not only do they give advice, our tireless experts also point you to resources to keep your real-time PCR experiments running smoothly — or at least as smoothly as can be expected.

— Ciara Curtin

Index of Experts

Genome Technology would like to thank the following contributors for taking the time to respond to the questions in this tech guide.

Alina Akhunova
Director of the Gene Expression Facility
Kansas State University

Zarema Arbieva

Director of the Research Service Facility, Core Genomics Facility
University of Illinois at Chicago

Deborah Grove

Director of the Nucleic Acid Facility
Pennsylvania State University

Mikael Kubista

Professor, Head Of R&D
TATAA Biocenter

Gregory Shipley

Director of the Quantitative Genomics Core Lab
University of Texas Health Science Center, Houston

Q1: How do you develop standard protocols for your real-time PCR reactions?

The most common application of real-time PCR in our facility is the study of gene expression, in which we measure relative gene expression levels in test versus control samples. Depending on the purpose of each experiment, we select the type of chemistry and decide whether the assay will be singleplex or multiplex. When we have a limited amount of material or are interested in increasing throughput, we multiplex our assay.

Our standard RT-PCR protocol includes the following steps:

RNA isolation and RNA quality test. For RNA isolation, we use various commercial kits depending on the type and amount of tissue from which the RNA is isolated, following standard RNA handling procedures. It is critical to test the RNA sample for integrity and quality. The intactness of the RNA can be assessed using standard formaldehyde agarose gel electrophoresis or a Bioanalyzer (Agilent Technologies).

Reverse transcription. We use a two-step RT-PCR reaction in which RNA is first converted into cDNA and then amplified with gene-specific primers. We find that purification of the cDNA after the reverse transcription reaction with the Qiagen PCR purification kit gives us the most reproducible results.

Selection of genes of interest, reference genes, and primer/probe design. The choice of housekeeping genes is critical for accurate quantification and can significantly influence the outcome of an experiment. Some studies recommend using multiple reference genes. When their selection is based on the analysis of preliminary experiments with Affymetrix chips or other microarray platforms, we find that a single housekeeping gene is enough for relative quantification.

Amplification efficiency test and optimization of the qPCR assay. We test the efficiency of primers designed for both target and reference genes using serial dilutions of a template (control or treated cDNA sample). The optimized assay should have high amplification efficiency (95 to 105 percent), consistency across technical replicates, and a single peak in the melting curve.
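
The efficiency calculation behind this test can be sketched in a few lines of Python. This is a generic least-squares fit of Ct against log10 template quantity; the function name and the example Ct values are illustrative, not the facility's own data.

```python
def pcr_efficiency(log10_quantities, ct_values):
    """Fit Ct = slope * log10(quantity) + intercept by least squares,
    then derive efficiency as 10^(-1/slope) - 1, in percent."""
    n = len(log10_quantities)
    mean_x = sum(log10_quantities) / n
    mean_y = sum(ct_values) / n
    sxx = sum((x - mean_x) ** 2 for x in log10_quantities)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(log10_quantities, ct_values))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    efficiency = (10 ** (-1.0 / slope) - 1.0) * 100.0
    return slope, intercept, efficiency

# Ten-fold serial dilution: a perfect assay gives a slope near -3.32
logs = [5, 4, 3, 2, 1]                      # log10 template quantity
cts = [15.1, 18.4, 21.8, 25.1, 28.4]        # illustrative Ct values
slope, intercept, eff = pcr_efficiency(logs, cts)
print(f"slope={slope:.2f}, efficiency={eff:.1f}%")  # close to 100 percent
```

An assay whose computed efficiency falls outside the 95 to 105 percent window described above would be redesigned rather than used.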

qPCR reaction and data analysis. The levels of expressed genes may be measured by absolute quantification or by relative real-time qRT-PCR. To achieve optimal relative expression results, appropriate normalization strategies are required: normalization to sample size, to the total amount of extracted RNA, to the quantity of cDNA, or to a reference gene. The relative expression of a gene of interest in relation to a reference gene can be calculated from "delta delta Ct" values or by the Pfaffl method. The "delta delta Ct" method can be used when both the target and reference genes have similar (within 5 percent of each other) and nearly 100 percent amplification efficiencies. When the amplification efficiencies of the target and reference genes differ, the Pfaffl method can be used for relative gene expression analysis.
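
The two calculation methods mentioned above can be sketched as follows. The function names and Ct values are illustrative; note that when both assays amplify at exactly 100 percent efficiency (E = 2.0), the Pfaffl formula reduces to the delta delta Ct result.

```python
def ddct_ratio(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Delta delta Ct: assumes ~100% efficiency for both assays."""
    ddct = (ct_target_test - ct_ref_test) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)

def pfaffl_ratio(e_target, e_ref,
                 ct_target_ctrl, ct_target_test,
                 ct_ref_ctrl, ct_ref_test):
    """Pfaffl: uses the measured amplification base of each assay,
    where E = 2.0 corresponds to 100% efficiency."""
    return ((e_target ** (ct_target_ctrl - ct_target_test)) /
            (e_ref ** (ct_ref_ctrl - ct_ref_test)))

# Target drops 3 cycles in the test sample while the reference is unchanged:
print(ddct_ratio(21, 18, 24, 18))            # 8.0-fold up-regulation
print(pfaffl_ratio(2.0, 2.0, 24, 21, 18, 18))  # same result at E = 2.0
```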

— Alina Akhunova

We performed a couple of studies on validation of Affymetrix-based expression analysis data. We used conditions recommended by the manufacturer (we use a SYBR Green-based one-step real-time RT-PCR kit from Qiagen) and then fine-tuned primer concentrations and cycling parameters by trial and error. Among other things, by monitoring melting curves we learned that a DNase digest may negatively affect the quality of the real-time reactions: the curves become wider and show significant tracing of higher-molecular-weight products. We therefore prefer to work with primers that produce different-sized products from cDNA versus genomic DNA, as opposed to treating the RNA.

— Zarema Arbieva

I use a standard protocol based on the Applied Biosystems Universal 2X Mix. When I started running real-time PCR in 1997, I optimized for Mg, primer, and probe concentrations. With the advent of an assay mix of this type, there is a standard concentration of primer and probe (400 nM primer and 200 nM probe) that gives PCR efficiencies between 90 and 105 percent with high-quality RNA or DNA. If the efficiencies fall outside these ranges, there could be a problem with primer design or RNA quality.

Because my primer/probes are designed using Primer Express, a standard cycling protocol works for every assay.

— Deborah Grove

It is very important to standardize the entire process: sample collection, preservation, transportation, storage, extraction, reverse transcription, and qPCR. In fact, the steps preceding qPCR contribute most of the variation. The European Union is funding the SPIDIA project starting this autumn, whose goal is to standardize generic pre-analytical tools and procedures for in vitro diagnostics. When standardizing the final real-time PCR, we eliminate as many manual steps as possible using robotics. In our laboratories we presently have the epMotion from Eppendorf and the CAS-1200 from Corbett. We are currently evaluating the WellAware system from BioTX.

— Mikael Kubista

SOPs are developed empirically, starting with a basic protocol, with each variable investigated and optimized to fit the workflow in the lab. Although we all perform real-time qPCR, many variables in the actual procedure may differ from lab to lab. For example, we do not use kits for RT or the PCR master mix, but most labs do. Thus, we have an SOP for how to dilute separate 100 mM dNTPs into the final stock solution we use for RT and PCR. We have SOPs and a database for how to mix the RT and PCR master mixes for each assay (some use more or less probe, for example). And we have many more SOPs for all the procedures we use in the process. No procedure in the lab is too mundane for an SOP. Perhaps a better question would have been, 'Do you have SOPs in place for all the procedures surrounding real-time qPCR?' The answer would have been, 'Yes, we do.' It is critical both to have them and to follow them, which are not the same thing and not always done in every lab. Each person who runs a protocol will put a slightly different wrinkle into the procedure. Most of the time these are benign, but sometimes they are not. Thus, having an SOP and using it properly can be two different things.

— Gregory Shipley

Q2: What additives do you prefer to use in your real-time PCR reactions, and why?

Chemical additives can improve the reliability of complicated PCR assays. The most commonly used additive in our facility is dimethyl sulfoxide, which we usually add at a concentration of 3 to 5 percent for amplification of high-GC-content targets. DMSO lowers the melting temperature of the primers and reduces the effect of secondary structure on PCR. However, if the RT-PCR assay does not improve at DMSO concentrations of up to 10 percent, we optimize the assay from the beginning, starting with the design of new PCR primers and probes.

Another additive we use is bovine serum albumin. At a final concentration of 0.04 percent, it can relieve the inhibition of PCR by low-purity templates.

— Alina Akhunova

We do not use any additives — just adjust primer concentration and the amount of RNA per sample. The kit that we use works pretty well, as long as optimal primer pairs are selected in the preliminary selection process.

— Zarema Arbieva

The only additive I routinely use is betaine. In conjunction with our sequencing of plant DNA, I have found that betaine increases base reads of GC-rich templates. This can be extended to real-time PCR, and the addition of betaine to GC-rich plant cDNAs from maize, turfgrass, and cocoa has worked well. Occasionally, I find it necessary to increase the annealing/extension temperature from 60 to 62 or 63 degrees.

— Deborah Grove

We try to avoid additives. We use a lot of commercial kits, and you never know how an additive may affect them. But when we work with small volumes, we may add some of an inert dye that we have developed at TATAA Biocenter. It colors the solution, making it easy to see where the sample goes, without interfering with the PCR. If samples contain very little material, it may be a good idea to add a carrier, typically as early as the RT reaction.

— Mikael Kubista

We make our own RT and PCR master mixes from scratch components. For expression assays, we add nothing beyond the standard components required. We've been doing business this way since 1996 when there were no kits and it all works very well. For SYBR Green I assays or those involving genomic DNA, we add glycerol to a final concentration of 8 percent and Tween 20 to 0.1 percent. This was a mix recommended oh so long ago on the qpcrlistserver (now hosted on Yahoo), if I recall correctly, by Eric Lader when he was at Ambion.

— Gregory Shipley

Q3: What quality assurance/control measures do you have in place to ensure reliable data?

We always include a no template control (NTC) to exclude the possibility of contamination of the RT-PCR reaction. If the Ct value of the NTC is 35 or above and the sample Ct value is 10 cycles lower than that of the NTC, we assume there is no contamination in our RT-PCR. However, if the sample Ct value is not sufficiently different from that of the NTC, or the sample Ct value is higher than 35, we recommend performing additional RT-PCR reactions with an increased amount of sample cDNA. If there is no contamination, the Ct value of the experimental sample should decrease, whereas the Ct value of the NTC should remain the same.
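
The decision rule described here can be written down explicitly. The threshold values come straight from the answer above; the function name and return strings are made up for illustration.

```python
def contamination_check(sample_ct, ntc_ct,
                        ntc_cutoff=35.0, min_separation=10.0):
    """Apply the NTC rule: the run is treated as clean when the NTC Ct is
    at or above the cutoff and the sample Ct is at least `min_separation`
    cycles lower (and itself below the cutoff)."""
    if (ntc_ct >= ntc_cutoff
            and sample_ct <= ntc_ct - min_separation
            and sample_ct < ntc_cutoff):
        return "clean"
    return "repeat with more cDNA"

print(contamination_check(24.0, 36.0))  # clean: 12 cycles of separation
print(contamination_check(33.0, 36.0))  # too close to the NTC; repeat
```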

We also include a no reverse transcription control (NRTC) for assessing the amount of contaminating DNA in RNA samples. The NRTC contains starting RNA that is not transcribed by the reverse transcriptase. We include this control only if an intron-spanning primer design is not possible, or if treatment of the RNA samples with DNase is not performed because of the limited amount of starting material for RNA isolation. In the latter case, DNase treatment can significantly reduce RNA yields.
We use positive controls, such as standard curves, to monitor the efficiency of the assay.

— Alina Akhunova

In addition to the quality control parameters implemented in the SDS 2.3 software package (Applied Biosystems), we perform an additional review of melting curves for all samples. Wells that produce melting curves indicative of contamination or non-specific amplification are removed from analysis. In doing so, melting curves for all reactions performed with one particular primer pair are compared in order to get a good sense of what an aberrant melting curve looks like.

We also check the quality of the standard curves and do not perform analysis if the correlation coefficient is not good enough; typically a satisfactory curve has a correlation of 0.95 or higher. We also like to see a slope around -3, because this gives better separation of the standard curve dilutions and better differential expression p-values at the end. We also make sure that samples fall well within the range of the standard curve.

We use a couple of housekeeping genes and select them based on more or less stable expression levels across all conditions. Affymetrix expression values are available for choosing adequate control genes.

— Zarema Arbieva

The primary assurance is validation of the efficiency of the genes of interest, as well as the reference genes, with several biological samples. This is performed over a range of RNA concentrations, usually 10 pg to 20 ng in the final PCR.

Of course, minus-RT reactions are sometimes necessary. For some organisms, sequence information is not always available to allow primers/probes to be placed over an intron, so a DNase step is necessary. In some cases, sequences are intron-less. And if the intron is short, it is possible to get amplification from genomic DNA as well.

NTCs are required to be sure that a reagent is not contaminated.

— Deborah Grove

Quality assurance is of greatest importance at TATAA Biocenter, and several of our services are accredited. First of all, we perform extraction, master mix preparation, preparation of PCR reactions, and post-PCR work in separate rooms to avoid contamination.

When we perform absolute quantification experiments, we always consider the extraction efficiency. This can be done by spiking the samples with RNA or DNA from another species. In relative quantification experiments, the target gene is always quantified relative to a reference gene extracted from the same sample, which compensates for any non-specific variations in extraction efficiency.

Before running RT reactions, the quality of the extracted RNA is tested using the Experion from BioRad or the Bioanalyzer 2100 from Agilent. The extracted material is also quantified so that equal amounts can be added to all reactions. A no-RT control is included in all RT reactions to ensure that the sample is free from genomic contamination.

A no template control is included in all PCR runs to confirm that no unspecific products, such as contamination or primer-dimers, are present. When using unspecific DNA-binding dyes like SYBR Green I or BEBO, we always run a melting curve to test for the presence of any unspecific products that contribute to the signal. Unspecific products generate peaks with a different Tm than the target. Before putting probe assays into use, we always run an experiment using unspecific dyes to ensure that no undesired products are competing with the target template. An alternative is to combine the probe with a dye with a different emission spectrum, such as BOXTO, which allows monitoring the presence of aberrant products in a separate channel. As part of the optimization process, the PCR product is run on a gel to verify that only one clear band is visible.

— Mikael Kubista

There are a couple of places where QC is important. We do not isolate the RNA/DNA for investigators. This lets us off the hook if something goes wrong with sample preparation. I recommend that folks have us run an Agilent 2100 chip analysis on their samples, but few bother to do so. It is clear from recent work by Stephen Bustin that RNA quality is a very important issue that should be considered carefully in any real-time qPCR experiment.

For the real-time qPCR assays themselves, we run five-log standard curves on every plate so we can compare PCR efficiencies from run to run for the same assay, make sure the y-intercepts stay about the same, and monitor for PCR inhibition by comparing the amplification curves of the standards against the samples. The y-intercept is a combination of the concentration of the standard (or calibrator) and how well it is amplified (PCR efficiency). When the y-intercept stays the same from run to run, within tight limits, you know the assay is working consistently from plate to plate. When it starts to slip, we make a fresh dilution of our 10X standard stock. The dilutions run on the plates are made fresh each day.
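
The run-to-run y-intercept monitoring described here can be sketched as a simple drift check. The 0.5-cycle tolerance below is an arbitrary illustrative limit, not a cutoff stated in the answer.

```python
def intercept_drift(history, latest, tolerance=0.5):
    """Flag a run whose standard-curve y-intercept deviates from the
    historical mean by more than `tolerance` cycles."""
    mean = sum(history) / len(history)
    return abs(latest - mean) > tolerance

runs = [38.1, 38.3, 37.9, 38.2]          # y-intercepts from earlier plates
print(intercept_drift(runs, 38.0))       # consistent with history
print(intercept_drift(runs, 39.4))       # drifting: dilute fresh standard stock
```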

Since we pipet triplicates for each sample and duplicates for each standard, we can monitor the standard deviations and keep a close eye on how well the robots are working. We don't have a specific robot QC test at this time.

— Gregory Shipley

Q4: How do you design suitable primers and make sure they are specific?

Both primer design and amplicon selection are important for efficient real-time PCR. We try to design primers with the following parameters: 18 to 22 bases long, with a melting temperature between 50°C and 65°C, a GC content of 50 to 60 percent, and Gs and Cs on the ends. During primer design we try to avoid secondary structure (hairpins, primer dimers, self-annealing), runs of Gs or Cs longer than three bases, and regions of cross-homology in the template, to prevent non-specific amplification. The optimal amplicon size is 75 to 200 base pairs with a GC content of 50 to 60 percent. We avoid templates with long repeats of single bases and secondary structure at the annealing temperature. Primers are designed for a selected candidate gene taking into account its intron-exon structure. From our experience, the best strategy is to design at least one of the primers to span the junction between two neighboring exons. This ensures that the qPCR primers amplify only RNA templates. If a gene does not contain introns, or if no primers spanning exon-exon junctions satisfy the primer selection criteria, we recommend treating the RNA sample with RNase-free DNase to remove potential DNA contamination.
We use Beacon Designer software for primer design. It takes care not only of primer design but of amplicon selection as well. The Primer3 program has also proven to be an efficient tool for designing primers. This software is easily customizable for various purposes, and Primer3 significantly simplifies primer selection when one needs to design thousands of primers.

To verify specificity, we compare primer sequences against databases of repetitive elements and expressed sequence tags using the BlastN program.
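
Some of the sequence-level criteria from this answer (length, GC content, G/C runs) can be screened with a few lines of Python. This is only a first-pass filter: the Tm estimate uses the rough Wallace rule rather than a nearest-neighbor model, and the function names are made up for illustration.

```python
def gc_fraction(seq):
    """Fraction of G and C bases in a primer sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Rough Tm estimate (Wallace rule: 2(A+T) + 4(G+C) degrees C),
    adequate only as a first-pass screen for short primers."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def screen_primer(seq):
    """Check a primer against the criteria listed above: 18-22 bases,
    50-60 percent GC, no run of G or C longer than three bases."""
    problems = []
    if not 18 <= len(seq) <= 22:
        problems.append("length")
    if not 0.50 <= gc_fraction(seq) <= 0.60:
        problems.append("GC content")
    if "GGGG" in seq.upper() or "CCCC" in seq.upper():
        problems.append("G/C run")
    return problems

print(screen_primer("ATGCGTACGCTAGCGTAGCA"))  # passes: empty list
```

Secondary structure, primer-dimer, and cross-homology checks still need dedicated tools such as Beacon Designer, Primer3, or BlastN.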

— Alina Akhunova

We use the PrimerQuest Web-based application (Integrated DNA Technologies) for primer selection and design primers with exon-intron boundaries in mind. For general validation studies, we position primers around the Affymetrix target sequences (unless testing differences between family members is a goal), in order to avoid potential technical discrepancies between Affymetrix- and real-time-based testing. All primers are validated in PCR reactions with genomic DNA and cDNA to confirm the specificity of amplification. At that point we also optimize the annealing temperature and primer concentrations for the actual real-time reactions.

— Zarema Arbieva

I have used Primer Express since 1997. I can also use it for MGB probes. I do a quick test of the primers I have selected with SYBR Green to see if they amplify and if there is one product.

I also Blast primers and probes but have only rarely had specificity issues. Usually the specificity issues come up when someone wants a particular region that has high homology across various strains or species. In that case, I know the issue at the start and I require plasmids of similar sequences to test for "cross-talk."

When I receive the probe, I validate over a range of RNA concentrations. If the efficiency is not where I want it to be, I synthesize a couple of different primers and try these with the probe. Having my own DNA synthesizer allows me to put my choices on the machine immediately.

— Deborah Grove

Thorough primer design is the single most important step in setting up a new real-time PCR assay. Since the cost of oligonucleotides has dropped drastically over the last few years, we often design and order multiple primer pairs and test which combination of forward and reverse primers works best.

To design the primers we often use the commercial software Beacon Designer or AlleleID (both from Premier Biosoft) or e-prime (from Polyclone Bioservices), depending on the application. We also work with the freely available software Primer3. Short amplicons tend to amplify more efficiently than longer amplicons. For probe-based assays, we typically use an amplicon length of 70 to 150 base pairs, and for dye-based assays we use an amplicon length of 100 to 250 base pairs. We always start the search for primers using very stringent primer selection conditions, especially considering primer complementarity, as this may lead to primer-dimer formation in the PCR. If no acceptable primers can be found, we gradually relax the criteria until the software identifies at least one acceptable primer pair. Depending on the application, there may also be special considerations to take into account. For gene-expression analysis we try to place one of the primers across an exon-exon junction to avoid amplification of any contaminating genomic DNA in the sample. If this is not possible, we try to place the primers in different exons, making it easy to identify any erroneous PCR products arising from genomic DNA.

The primers generated by the design software are validated in silico using, for example, the freely available software NetPrimer. It is also critical to Blast the primers against relevant databases to ensure that they target only the intended sequence. It is important to remember, however, that extensive simulation and in silico design are never a guarantee that a primer will work well. Once a primer pair has been designed, it is validated experimentally. This is done by running PCR on representative samples and verifying the presence of the expected product, as well as the absence of erroneous products, using melt-curve analysis and gel electrophoresis. If there are any doubts, the PCR product is sequenced. The efficiency of the assay can be estimated by generating a standard curve based on sample dilutions, and the formation of primer-dimers can be assessed by running a no template control. Some applications may require additional validation. If the samples contain material from several organisms, we assess possible cross-reactivity to ensure that the primers are specific for the targeted species.

— Mikael Kubista

We use probe-based assays that I design myself using either Beacon Designer or AlleleID, depending on what the assay entails. Thus, template specificity is much easier to ensure, as both the primers and the probe have to bind to the template for signal generation. Having said that, for any assay it is critical that the person designing it spend time doing their in silico homework. A good start is using Blast to determine whether there are splice variants for the target, and alignments to make decisions on what kind of assay you want to have. In the case of microbial assays, you can find related species by sequence homology. Next, if there are variants or related species, perform an alignment to see exactly where the differences reside and look for homologous sequence stretches. For splice variants, you might want a variant-specific assay or a generic one that avoids the unique sequence of each variant. You may also find short regions of homology with non-related transcripts that need to be avoided, as well as SNP sites within primer/probe regions. I design four primers around a single probe and try them in all four possible combinations to find the optimal primer pair for an assay empirically. Sometimes the original primer set is optimal, but sometimes it is not. I do the same for SYBR Green I assays. The only quick test here is melt-curve analysis, looking for a single template peak and minimal primer dimers. We don't have the time or hands to do the rest of the assay QC ensuring that there really is only a single template amplified. A single melt peak means the components have the same melting characteristics, but it does not guarantee that a single template species was amplified. To do so requires either sequencing or asymmetric restriction digest analysis.
We haven't run a gel in years. With assays designed from the Roche Universal Probe Library, I go with the pair given but use my knowledge of primer sequence structure to pick the best assay. I have gone back and designed new primers when the original pair didn't perform up to snuff. I don't always use the first primer pair that pops up in the Roche program; you can see more designs if you click on the "more assays" button. We use SYBR and UPL assays for a quick look to gather information on speculative hypotheses about gene regulation or to validate microarray data. When folks get serious about running a lot of samples with an assay, I make a quantitative assay for them with a standard and probe.

— Gregory Shipley

Q5: What computational tools do you use to analyze your data?

It depends on the throughput of the experiment. For estimating the difference in transcription between one control and one sample, we simply use Microsoft Office Excel. If we have to compare two or more treatments, groups, or conditions with multiple data points in the sample or control group, or if there are multiple reference genes and multiple target genes, we use the Relative Expression Software Tool (REST).

— Alina Akhunova

We simply use Excel and Visual Basic to automatically import the text files generated by SDS, construct standard curves, perform normalization, calculate relative expression values, and perform t-tests. We use t-tests to evaluate the significance of the difference between control and treated samples.

— Zarema Arbieva

I will do delta delta Ct or standard curves for absolute copy number for my customers, but they are basically responsible for the final analysis of the data we return to them. I direct people to the REST programs if their PCR efficiencies for the gene of interest and reference genes differ by more than 5 percent. They are also directed to other types of analysis if they are looking for very small differences among many samples.

— Deborah Grove

We use GenEx from MultiD Analyses. Using the GenEx Data Editor, we pre-process the data: we normalize with interplate calibrators, correct for variations in PCR efficiency among assays, correct for efficiency among samples (when relevant) using a spike (see the answer to question 3), normalize to sample amount if it is variable, average qPCR repeats, normalize with reference genes, average RT and sampling repeats, normalize to a reference sample when relevant, and convert the data to a log base 2 scale. Optimal reference genes are identified using the GenEx NormFinder and geNorm functionality. PCR efficiencies are estimated using GenEx standard curve analysis. When performing absolute quantification, GenEx reverse calibration also calculates confidence intervals for the concentration estimates. Relative quantification studies are analyzed using the GenEx t-test when possible; if the data are not normally distributed, we use the GenEx nonparametric tests. For expression profiling we use GenEx principal component analysis and hierarchical clustering, validating the results with GenEx self-organizing maps. For multi-marker diagnostics, we use the GenEx artificial neural network to train and validate predictive models.

— Mikael Kubista

I use GenEx but am just getting the hang of it. For most data sets run in the Core Lab, I do the post-run analysis (baseline and threshold in the case of ABI, or fit points for Roche) and pass the normalized data set (normalized to one or more transcripts suitable for their experimental system) back to the investigator. It is up to them to carry the analysis forward from there. We run the samples blind, so there is no chance of bias on our part.

— Gregory Shipley

Q6: When you encounter problems with your real-time PCR reactions, what is the first thing you suspect and how do you go about seeing what is wrong?

The failure of a real-time PCR reaction can result from multiple factors, including poor quality of the PCR templates, primers, or reagents. The quality controls included at each step of the experiment are critical for troubleshooting. We always check the quality of RNA samples and include no reverse transcription (to check for genomic DNA contamination) and no template (to check for contamination in the reaction) controls. During melt-curve analysis, we check whether non-specific products have been co-amplified. Sometimes we set up a regular PCR to test the performance of the designed primers. We perform at least three technical replicates, which should be consistent, and use commercially available kits for both the RT step and the subsequent PCR to maximize consistency.

— Alina Akhunova

Because our reactions include total RNA, the first suspect is that the master plates containing 2X RNA samples (we distribute them from 96-well master plates into 384-well reaction plates) were used too many times and the RNA started to degrade. We avoid putting 2X RNA stocks through more than two to three freeze-thaw cycles.

— Zarema Arbieva

It is usually just a technical error. Something wasn't made up right.

The other problem is a substandard RNA sample.

The reference gene(s) should give Cts within one cycle. When reference gene Cts span two or more cycles for a particular sample, it could be due either to the reference gene not being truly constitutive or to a problem with RNA isolation. The Nanodrop spectrophotometer scan gives some information as to quality. When customers started submitting plant, bacteria, and insect samples, I found that a peak at 230 nm was a good indicator of reproducibility problems. This might be due to leftover small molecules from cell walls, which were inhibitory. After samples were extracted so that the 230 nm peak disappeared, the problems disappeared. We do not routinely require a Bioanalyzer run.
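
Absorbance at 230 nm is usually assessed through purity ratios on a spectrophotometer scan. A sketch of such a check follows; the 1.8 thresholds are common rules of thumb for nucleic acid preps, not cutoffs stated in this answer, and the function name is made up.

```python
def purity_flags(a230, a260, a280):
    """Flag Nanodrop-style absorbance ratios that suggest carry-over of
    contaminants into an RNA/DNA prep."""
    flags = []
    if a260 / a280 < 1.8:
        flags.append("possible protein carry-over (A260/A280 low)")
    if a260 / a230 < 1.8:
        flags.append("possible salt/phenol/carbohydrate carry-over (A260/A230 low)")
    return flags

print(purity_flags(0.5, 1.0, 0.5))  # clean prep: both ratios are 2.0
print(purity_flags(1.0, 1.0, 0.5))  # strong 230 nm absorbance is flagged
```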

I also saw problems when samples were extracted only with TRIzol-type reagents. The microarray director had the same issues. I suspect there was some carry-over of RNases or something. Advising customers to add a filter step or more than one TRIzol step took care of the reproducibility problems we saw.

— Deborah Grove

The actions we take when a problem occurs with a real-time PCR assay depend on the nature of the problem. We always run appropriate controls in each experiment (such as positive controls, no template controls, and no-RT controls) to spot potential problems. If a validated assay suddenly performs poorly, it may be caused by the sample, the reagents, or poor instrument performance. If we suspect that the samples may be causing the problem, we test the assay on known high-quality samples, and we may also test the samples with other high-quality assays. If necessary, the samples may go through an additional clean-up before they are used in PCR. If the samples are fine, the problem may be in the PCR amplification, but it may also be caused by problems in the generation of fluorescence during the PCR. We change all reagents (primers, probe, master mix, water) to fresh batches. If necessary, the primers and probe are re-synthesized. We also test for possible problems with instrument performance by running high-quality assays and samples, as well as running the suspicious assay on a different real-time PCR machine.

— Mikael Kubista

This question assumes there are only one or two things that can go wrong with a real-time qPCR experiment. Clearly, there are many more than that, so the first thing I would do depends on the problem at hand. I could write pages on different problems and how to go about sorting them out; look at the troubleshooting guide at the end of any kit booklet for a sense of just how large an issue this can be. So I will answer with the following selected tidbits.

First, contrary to what users want to hear, 99.5 percent of the time it isn't a problem with the instrument. That is based on more than nine years of running this core lab and more than 12 years of experience running real-time qPCR experiments on four different instruments. When you have a catastrophic event (no data at all), it's most likely because you didn't have any template in the reaction. PCR reactions starting with DNA rarely fail; as stated above, nearly all the problems in an RT-PCR come in the RT step.

I can't stress enough how important good assay design is to good results. Spend your time making a good assay and you will not have many issues down the road. Start with a poor assay and you'll have a poor result every time. It's simple GIGO: garbage in, garbage out. The same logic applies to making good samples. Without clean samples, stripped of any RT or PCR inhibitors and with as little degradation as possible, you are asking for trouble down the line.

Contamination is a problem that crops up often, particularly in labs that are new to the game or when a new person comes into the lab. One of the main culprits is the water. Never pipet from your main water source. Rather, pour the water into a new disposable tube for use that day; pipet from that tube and toss it at the end of the day. Change (barrier) tips every time you pipet; an exception would be pipetting a PCR master mix into a new plate. It takes seconds to contaminate a stock and hours to fix the problem.
Last, don't try to find out what is contaminated. Rather, change all the stock tubes and see if the contamination goes away. If it does, toss the old stocks. Don't let a contamination event take on an experimental life of its own; use new stocks and move on. If the stocks are clean but the problem persists, it's the primer/probe stocks and you'll have to order new ones. "An ounce of prevention is worth a pound of cure" is a very apt rule for real-time qPCR.

— Gregory Shipley

List of Resources

Here's where our experts turn for help or to design the best-suited primers for their experiments.

Publications

Kubista M, Eliasson J, Lennerås M, Andersson S, Sjöback R. (2008). What is needed to measure a difference with real-time PCR? Eppendorf BioNews Application Notes. 29:7-8.

Lind K, Ståhlberg A, Zoric N, Kubista M. (2006). Combining sequence-specific probes and DNA binding dyes in real-time PCR for specific nucleic acid quantification and melting curve analysis. Biotechniques. 40(3):315-319.

Livak KJ, Schmittgen TD. (2001). Analysis of relative gene expression data using real-time quantitative PCR and the 2−ΔΔCT method. Methods. 25:402-408.

Nolan T, Hands RE, Bustin SA. (2006). Quantification of mRNA using real-time RT-PCR. Nature Protocols. 1: 1559-1582.

Pfaffl MW. (2001). A new mathematical model for relative quantification in real-time RT-PCR. Nucleic Acids Res. 29(9):e45.

Ståhlberg A, Håkansson J, Xian X, Semb H, Kubista M. (2004). Properties of the reverse transcriptase reaction in mRNA quantification. Clinical Chemistry. 50(3):509-515.

Websites

BatchPrimer3
http://wheat.pw.usda.gov/cgi-bin/SNP/primer3/batch_primer_design.cgi

Blast
http://blast.ncbi.nlm.nih.gov/Blast.cgi

EasyExonPrimer
http://129.43.22.27/~primer/

Netprimer
http://www.premierbiosoft.com/netprimer/netprlaunch/netprlaunch.html

PerlPrimer
http://perlprimer.sourceforge.net/

Primer3
http://frodo.wi.mit.edu/

PrimerQuest
http://www.idtdna.com/Scitools/Applications/Primerquest/

qPCR Listserv
http://tech.groups.yahoo.com/group/qpcrlistserver/

Relative Expression Software Tool
http://rest.gene-quantification.info

Acknowledgments

Many thanks to Linda Strömbom and Johanna Eliasson for their assistance on the answers submitted by Mikael Kubista.