A conversation about qPCR with Jo Vandesompele

Research profile: An interview with Dr Jo Vandesompele, internationally recognized expert on quantitative PCR, discussing the common issues researchers face when using qPCR, and the future direction of qPCR technology.

Jan 10, 2012

How did your experience with qPCR lead to Biogazelle?

I started using qPCR methods in my own research while working on a rare childhood cancer called neuroblastoma. I wanted to perform gene expression analysis on some tissue samples, but we only had very small tumor biopsies. A collaborator suggested that I use quantitative PCR. That was in 1998, when I was still a graduate student. At that time, I had not heard of the technology, so I did some research and quickly realized that it was indeed what we needed, since it is quantitative and, being PCR-based, very sensitive. Our hospital had just purchased one of the first commercially available real-time PCR instruments, the ABI 7700. I had access to that instrument, and my analysis of those samples was the start of my experience with qPCR.

Years later, I started working with Biogazelle’s other co-founder, Dr Jan Hellemans, who was employing qPCR for his PhD thesis research. When he came to me for help with qPCR data analysis, I explained my data analysis to him, going through all the formulas and calculations by hand. He thanked me and I didn’t see him for a while. When he returned weeks later, he had programmed all of my workflows and formulas into Microsoft Excel. I was really impressed and discussed with him other things we could do. Six months later, he came back with what became the first version of the qbase qPCR data analysis software.

Our qbase software was put online and in a couple of months it had been downloaded hundreds of times, which led to the suggestion to make qbase into a professional software package, and shortly after to the founding of Biogazelle.

What are common problems investigators have when using qPCR?

It really varies a lot with experience and the specific application.

The first problem that many investigators face occurs with experimental design. Setting up a qPCR experiment is so simple that it actually becomes dangerous. With the availability of high quality instruments and reagents, it is a very straightforward process to put samples into a tube, add an assay and enzyme master mix, and very quickly generate a lot of data. However, when it is time for data interpretation, researchers often contact our support, and we have to disappoint them because they did not include the right controls to account for important aspects of the qPCR workflow. There are probably some who do not think carefully about their experimental design and proper controls; however, most people do think about these things, but they just do not understand all the important issues.

The second challenge, also related to experimental design, is the selection of suitable reference genes for normalization, which we addressed in detail in our 2002 paper [1]. A common question is whether to use only one reference gene or multiple reference genes. Based on our own data, we recommend the latter. The more fundamental issue is then how to use those reference genes to achieve more accurate normalization.
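As a concrete illustration of the multi-reference-gene approach described in [1], the normalization factor is the geometric mean of the reference genes' relative quantities, and each target is divided by that factor. A minimal Python sketch, using hypothetical Cq values and assuming 100% amplification efficiency (neither is from the interview):

```python
import math

def relative_quantity(cq, cq_calibrator, efficiency=2.0):
    """Convert a Cq value to a relative quantity against a calibrator Cq,
    assuming a given amplification efficiency (2.0 = 100%)."""
    return efficiency ** (cq_calibrator - cq)

def normalization_factor(ref_quantities):
    """Geometric mean of the reference genes' relative quantities."""
    product = math.prod(ref_quantities)
    return product ** (1.0 / len(ref_quantities))

# Hypothetical Cq values for one sample: three reference genes and one
# target gene, all calibrated against Cq = 20.
ref_cqs = [21.0, 22.5, 20.8]
target_cq = 24.0

ref_rqs = [relative_quantity(cq, 20.0) for cq in ref_cqs]
nf = normalization_factor(ref_rqs)
normalized_target = relative_quantity(target_cq, 20.0) / nf
print(nf, normalized_target)
```

Using several reference genes this way buffers the normalization against variability in any single gene, which is the core argument of the paper.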

The third common problem is sample quality. Typically, for gene expression studies using microarrays or RNA sequencing, people do extensive sample quality analysis to avoid ruining an experiment that can cost thousands of dollars. However, they often do not perform this analysis for qPCR, partly because the experiments are relatively cheap, and because qPCR appears very forgiving in terms of sample quality due to the very small amplicons used for most assays.

For example, when studying paraffin-embedded, formalin-fixed samples that contain substantially fragmented nucleic acids, you can relatively easily use qPCR by designing very small amplicons. The small amplicons make it highly likely that even degraded samples will have enough of the target sequence to provide some amplification. And, because investigators are able to generate product, they believe that the result is reliable, but this is a major misconception. We recently published a paper in which we specifically addressed whether RNA quality is an important factor for qPCR experiments [2]. We methodically determined the degree of impact that RNA quality has on qPCR experiments and the ability to accurately analyze the data from those experiments. The results were dramatic and clearly demonstrated that RNA quality had a direct impact on the variability of both the selected reference genes and the measured significance of various biomarkers in more than 600 primary tumor samples. Unfortunately, many users are still unaware of the importance of RNA quality.

Much of the training Biogazelle offers through its qPCR courses is designed specifically to address these important issues. The courses are very different from those offered by other companies, which tend to focus on the experimental procedure itself. We conduct Biogazelle training from two different perspectives, namely experimental design and advanced data analysis. The first part of experimental design involves what is called a “power analysis”, which helps ensure that researchers have a sufficient number of samples and controls to draw statistically sound conclusions from their analyses. The second part of the training demonstrates how to set up plates, what controls are needed, and how to use a sample maximization approach versus an assay maximization approach. We end with biostatistical interpretation of the results.
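A power analysis can take many forms; as one generic illustration (not Biogazelle's actual procedure), a normal-approximation sample-size calculation for comparing two groups might look like this, with the effect size expressed in the same units as the standard deviation (e.g., quantification cycles):

```python
import math
from statistics import NormalDist

def samples_per_group(effect_size, sd, alpha=0.05, power=0.80):
    """Approximate number of samples per group needed to detect
    `effect_size` with standard deviation `sd`, using the standard
    normal-approximation formula for a two-sample comparison:
        n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / effect_size)^2
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_power = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    return math.ceil(2 * ((z_alpha + z_power) * sd / effect_size) ** 2)

# Hypothetical example: detect a one-cycle (i.e., two-fold) expression
# difference when the between-sample standard deviation is 1.2 cycles.
n = samples_per_group(effect_size=1.0, sd=1.2)
```

With these assumed numbers the formula calls for roughly 23 samples per group; reducing the biological variability lowers that requirement quadratically, which is exactly why such calculations belong in the design stage rather than after the data are collected.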

What do you see as the future of qPCR methods?

The ongoing trend is towards miniaturization, with increasingly small reactions and qPCR platforms that allow several thousand reactions in a single run. The technology has all the features needed to replace microarrays, being in principle superior with regard to specificity, sensitivity, turnaround time, accuracy, and dynamic range. Some outstanding issues still need to be resolved, the major one being sensitivity: having enough PCR template in these small reaction volumes. However, miniaturization is definitely a trend, and I expect further improvements in this area.

The other trend I see is digital PCR. Here, each reaction contains, on average, approximately one target molecule. Many reactions must be performed in parallel to accurately determine what proportion of the total number of reactions contains template for that target. The individual reactions are not quantitative, but by running hundreds or thousands of them you can very precisely determine the copy number in your original sample. It is extremely powerful: you can determine the concentration of your target of interest with high precision and accuracy.
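The proportion-based quantification described above is normally combined with a Poisson correction, because a positive partition may contain more than one molecule. A minimal sketch (the partition counts, volume, and function name are hypothetical, chosen for illustration):

```python
import math

def copies_from_digital_pcr(positive, total_partitions, partition_volume_nl):
    """Estimate target copy number from a digital PCR run.

    Because a positive partition may hold more than one molecule, the
    fraction of positive partitions is Poisson-corrected:
        mean copies per partition = -ln(1 - positive_fraction)
    """
    p = positive / total_partitions
    lam = -math.log(1.0 - p)              # mean copies per partition
    total_copies = lam * total_partitions
    copies_per_nl = lam / partition_volume_nl
    return total_copies, copies_per_nl

# Hypothetical run: 4,000 of 10,000 one-nanoliter partitions were positive.
total, conc = copies_from_digital_pcr(4000, 10000, 1.0)
```

With 40% positive partitions, the Poisson correction estimates roughly 5,100 copies rather than the naive 4,000, and the precision improves as the number of partitions grows.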

The most important issue with digital PCR is the large number of reactions you need to perform. In this regard, platforms that can quantify several thousand reactions at the same time make digital PCR a very promising technology and it may eventually replace conventional qPCR. Again, there are some outstanding issues, but for some applications it definitely has advantages over qPCR today.

What is the significance of the MIQE guidelines?

I am quite happy that there is so much attention given to the MIQE (Minimum Information for Publication of Quantitative Real-Time PCR Experiments) guidelines [3]. The initiative to produce the MIQE guidelines was started by Dr Stephen Bustin, and was the work of an unofficial consortium of qPCR experts who were all frustrated by the problems with qPCR methods in published papers. For example, it is very common that there is not enough information to repeat the experiment. In addition, investigators do not always address everything that needs to be accounted for in a proper qPCR experiment, or you cannot tell if they did because the details are not sufficiently reported. These issues are critical, as Professor Bustin rightly points out, because they actually corrupt the integrity of the literature with data that is of questionable quality.

The importance of these concerns in published qPCR studies compelled us to summarize the most essential criteria that should be addressed and reported when setting up a qPCR experiment. The resulting 85-parameter checklist should help researchers document and perform better qPCR experiments. Investigators can find information on the guidelines and partners at the MIQE website.

We do understand that the guidelines were developed by academic groups doing research, and that they may not be appropriate for all fields and applications, such as clinical diagnostics, digital PCR, and genotyping. This is why the consortium needs input from the community, so we can design extended or modified checklists for specific applications or fields.

How do you think MIQE will affect qPCR reagent suppliers?

MIQE is all about transparency and the ability to replicate studies. However, there have been extensive discussions in the consortium about what the minimum requirements for transparency should be. Based on the original MIQE paper, one might argue that authors who are using products from companies that do not provide primer and probe sequences are not complying with MIQE standards, and the consortium might want to prevent such companies from selling their products or promote some vendors over others. In an ideal world it is probably correct that we should report all of the relevant experimental information. However, some vendors have said that they cannot provide all of this information, and we have to recognize that we live in a commercial environment where we have to find compromise between intellectual property and the ability to replicate an experiment.

This is why we came up with a consensus paper that states that while it is still recommended to report primer sequences, it is not absolutely required [4]. Instead, the newly modified standard says that providing a context sequence that can be used to identify the applicable amplicon sequence ±15 bases is sufficient, as long as it allows others to replicate the experiment.
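To make the context-sequence idea concrete, here is a minimal Python sketch of how a reader could recover an amplicon from a reported context sequence (the function name and toy sequences are hypothetical, not real assay data):

```python
def locate_amplicon(reference, context, flank=15):
    """Locate an amplicon using a context sequence, i.e. the amplicon plus
    `flank` bases on each side, as allowed under the clarified MIQE-style
    reporting when primer sequences themselves are withheld."""
    if reference.find(context) == -1:
        raise ValueError("context sequence not found in reference")
    # Strip the flanking bases to recover the amplicon itself.
    return context[flank:-flank]

# Hypothetical toy sequences for illustration only.
amplicon = "GATTACA"
context = "C" * 15 + amplicon + "T" * 15
reference = "A" * 20 + context + "G" * 20
print(locate_amplicon(reference, context))  # GATTACA
```

The point of the compromise is visible here: the context sequence pins down the measured region unambiguously without disclosing the vendor's exact primer sequences.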

It is interesting to note that many commercial suppliers of qPCR tools are introducing their products as MIQE compliant. It demonstrates that these companies see the importance of what we are trying to accomplish with the guidelines and want to promote them. At the same time, it is important not to give too much weight to suppliers who state their product is MIQE compliant. It probably means that the product is a useful tool in the qPCR workflow, but it does not really add an extensive quality label to that product.

With that in mind, I do appreciate that some qPCR suppliers, like IDT, do provide all the recommended information, including primer and probe sequences. It is preferable that companies do that and they should be rewarded somehow for providing detailed information, at least through appreciation from the scientific community.

How has your lab used IDT products and services?

We have been using IDT products since IDT came to Belgium in 2008. We are great fans of your primers, and like that IDT offers quality guarantees and provides high-quality oligos at a good price. We generally do our own custom designs for our experiments, and IDT is the preferred supplier for most of our qPCR primers and synthetic templates. We greatly appreciate the consistent quality, the speedy delivery, and the fact that you can accommodate so many scales and delivery options.

We also like having the forward and reverse primers normalized by IDT when we order primers in plates so that we just need to add water. We can then use our 96-well robotic head to transfer the assays to experimental plates for qPCR. I think IDT is the only company that can do this normalization with consistent quality, at a good price, and with a short turnaround time.

Our lab has had very good experiences working with IDT and we have essentially converted the whole department to ordering from IDT.

Do you have any advice for young researchers?

Never give up and always follow your own ideas and dreams. Scientists should be people with creative minds, always open to new ideas and adventures. However, many researchers are fairly conservative in their approach to scientific study. I would encourage young scientists to be as creative as possible and follow their own gut feeling on how to design experiments, obviously within the constraints of their environment and funding possibilities. I think any idea is a good idea, so follow your own ideas.

Read more about the resources at Biogazelle.

Research Profile

Dr Jo Vandesompele is a professor of functional genomics and applied bioinformatics at Ghent University, where his lab primarily focuses on cancer genomics. He is also an internationally recognized expert on quantitative PCR and co-founder, together with Dr Jan Hellemans, of the biotechnology company Biogazelle. Biogazelle offers products and services to assist researchers with all aspects of performing real-time PCR, from experimental design and data generation through comprehensive, user-friendly data analysis. We recently had the opportunity to speak with Dr Vandesompele about his research, Biogazelle, and the field of qPCR.


  1. Vandesompele J, De Preter K, et al. (2002). Accurate normalization of real-time quantitative RT-PCR data by geometric averaging of multiple internal control genes. Genome Biology 3(7): research0034.1–research0034.11.
  2. Vermeulen J, De Preter K, et al. (2011). Measurable impact of RNA quality on gene expression results from quantitative PCR. Nucleic Acids Research 39(9): e63.
  3. Bustin S, Benes V, et al. (2009). The MIQE Guidelines: Minimum information for publication of quantitative real-time PCR experiments. Clin Chem 55(4): 611–622.
  4. Bustin SA, Benes V, et al. (2011). Primer sequence disclosure: A clarification of the MIQE guidelines. Clin Chem 57: 919–921.