from genengnews.com – by Shawn C. Baker, Ph.D., CSO, BlueSEQ (www.blueseq.com). Reprinted with permission from Genetic Engineering & Biotechnology News (GEN).
Is it time to switch?
With recent advancements and a radical decline in sequencing costs, the popularity of next generation sequencing (NGS) has skyrocketed. As costs become less prohibitive and methods become simpler and more widespread, researchers are choosing NGS over microarrays for more of their genomic applications.
The rising maturity of NGS systems and of ancillary protocols such as library preparation and data analysis tools has certainly contributed to their increasing popularity among the research community. Whether it’s a need for more accurate data, better resolution, pressure from granting agencies, or just plain fear of being left behind, it’s clear that the demand for sequencing technologies that deliver fast, inexpensive, and accurate genomic information has never been greater.
As outlined in a previous article (GEN Sep 1, 2012; Vol. 32, No. 15), NGS technologies have made great strides both economically and technically, and have gained in popularity since first appearing on the scene less than a decade ago. With the cost of sequencing a human genome soon to drop to just over $1K and a market trend toward cheaper instrumentation, NGS is more affordable than ever for projects with even the tightest of budgets.
The immense number of journal articles citing NGS technologies is sending a clear message to array users that NGS is no longer just for the early adopters. Once thought of as cost prohibitive and technically out of reach, NGS has become a mainstream option for many laboratories, allowing researchers to generate more complete and scientifically accurate data than previously possible with microarrays.
Microarrays’ Proven Track Record
With all the advancements that NGS has made, why is anyone still using microarrays? The answer is, “lots of reasons!” Microarray platforms have a proven track record spanning nearly two decades in the lab. And with practice comes mastery—researchers have grown comfortable both with operating the technology and analyzing the results. Microarrays are generally considered easier to use with less complicated and less labor-intensive sample preparation than NGS. The same goes for data analysis. While there are still many tools for data analysts to choose from, a general consensus has emerged on the major methods for processing the data. And, despite the rapid drop in the cost associated with NGS, arrays are still more economical and yield higher throughput, providing significant advantages when working with a large number of samples.
Time to Make the Switch?
So when is it time to make the switch from microarrays to NGS? What factors should be considered when deciding between these two technologies? While researchers facing such choices may feel overwhelmed, it boils down to just a few key areas, such as research goals (e.g., discovery vs. profiling), access to technology, maturity of applications, cost per sample, and desired throughput. These key aspects for the primary genomic applications are addressed separately below. For some applications, such as chromatin immunoprecipitation, the transition to NGS is nearly complete, while for others, like cytogenetics, the transition has barely begun.
Chromatin immunoprecipitation (ChIP) experiments were among the first to be switched from arrays (ChIP-chip) to sequencing (ChIP-Seq), as the new NGS technology provides much better peak resolution. The transition was swift as this application is less demanding in terms of what is needed from the sequencing platforms—only a small number of short reads are required. The rapid drop in the cost of NGS has accelerated the transition.
Researchers have been eager to use NGS for gene expression experiments as it provides a more detailed look at the transcriptome. Arrays suffer from a fundamental “design bias”—they only return results from those regions for which probes have been designed. Consequently, arrays are only as good as the databases from which they are designed. By contrast, the various RNA-Seq methods cover the entire transcriptome without any a priori knowledge of it, allowing for the analysis of novel transcripts, splice junctions, and noncoding RNAs. Despite the technical advantages of RNA-Seq, microarrays remain popular for two primary reasons. First, their longtime use as a genomics tool means many researchers are very comfortable using them—sample labeling, array handling, and data analysis methods are tried and true. Second, despite NGS advancements, expression arrays are still cheaper and easier when processing large numbers of samples (e.g., hundreds to thousands).
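The design-bias point can be made concrete with a minimal sketch. All of the transcript and probe names below are invented for illustration: an array-style analysis can only report signal for transcripts it carries probes for, while a sequencing-style analysis counts whatever shows up in the reads—including a transcript the array designers never knew about.

```python
# Illustration of array "design bias" (all names are hypothetical):
# an array reports only on probed transcripts; RNA-Seq counts any
# transcript that is actually present in the reads.

from collections import Counter

# Transcripts present in the sample, including one novel transcript.
sample_reads = [
    "GENE_A", "GENE_A", "GENE_B", "GENE_C",
    "NOVEL_TRANSCRIPT_1", "NOVEL_TRANSCRIPT_1",
]

# The array was designed before NOVEL_TRANSCRIPT_1 was discovered.
array_probes = {"GENE_A", "GENE_B", "GENE_C", "GENE_D"}

def array_profile(reads, probes):
    """Array-style result: signal only for probed transcripts."""
    counts = Counter(r for r in reads if r in probes)
    # Probed-but-absent transcripts still appear, with zero signal.
    return {p: counts.get(p, 0) for p in sorted(probes)}

def rnaseq_profile(reads):
    """Sequencing-style result: counts for everything observed."""
    return dict(Counter(reads))

print(array_profile(sample_reads, array_probes))  # novel transcript invisible
print(rnaseq_profile(sample_reads))               # novel transcript counted
```

The sketch also shows the flip side: the array still reports a zero for GENE_D, a probed gene that is absent, which is exactly the kind of pre-defined, predictable output that makes array data analysis simpler.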
For genotyping studies, microarrays are still widely adopted as they are substantially less expensive than NGS and much more conducive to processing the thousands of samples required for typical genome-wide association studies (GWAS). Unlike gene expression, array-based SNP assays are much less prone to design bias. However, as microarrays are limited in the number of SNPs they can contain, they tend to focus on relatively common variants. Some researchers feel more emphasis should be placed on SNPs with lower minor allele frequency (MAF). Although newer arrays are being designed with lower MAFs in mind, many researchers are starting to gravitate toward sequencing as it can capture both common and rare variants. However, the major stumbling block is the cost of whole genome sequencing (WGS). To lower the cost per sample and increase the throughput, many are focusing on exome sequencing. This strategy concentrates the sequencing power on just the ~2% of the genome (in humans and other mammals) that contains the coding sequences, ignoring the remaining noncoding regions.
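The appeal of exome sequencing follows from some back-of-the-envelope arithmetic. The figures below are illustrative assumptions, not quoted prices, and the sketch deliberately holds target depth constant (in practice exomes are often sequenced deeper): restricting sequencing to ~2% of the genome cuts the sequenced bases, and hence the per-sample cost, by roughly the same factor.

```python
# Back-of-the-envelope comparison of whole-genome vs exome sequencing.
# All numbers are illustrative assumptions, not actual prices.

GENOME_SIZE_BP = 3_200_000_000  # approximate human genome size
EXOME_FRACTION = 0.02           # coding sequence, ~2% of the genome (per the text)
DEPTH = 30                      # target mean coverage (held equal for simplicity)
COST_PER_GIGABASE = 50.0        # hypothetical cost in dollars per Gb sequenced

def sequencing_cost(target_bp, depth, cost_per_gb):
    """Cost to sequence a target region to a given mean depth."""
    total_bases = target_bp * depth
    return total_bases / 1e9 * cost_per_gb

wgs_cost = sequencing_cost(GENOME_SIZE_BP, DEPTH, COST_PER_GIGABASE)
exome_cost = sequencing_cost(GENOME_SIZE_BP * EXOME_FRACTION, DEPTH,
                             COST_PER_GIGABASE)

print(f"WGS:   ${wgs_cost:,.0f} per sample")
print(f"Exome: ${exome_cost:,.0f} per sample")
print(f"Exomes per WGS budget: {wgs_cost / exome_cost:.0f}")
```

Under these assumptions, one WGS budget covers about fifty exomes—which is why exome sequencing is attractive for cohort-scale genotyping even before WGS prices finish falling.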
The choice is more complicated with methylation projects. While NGS unquestionably provides a more complete picture of the methylome, whole genome methods are still quite expensive. To reduce costs and increase throughput, some researchers are using targeted methods, which only look at a portion of the methylome. However, microarrays remain a popular choice as their use reduces the cost and increases throughput even further. Because details of exactly how methylation impacts the genome and transcriptome are still being investigated, many researchers find a combination of both technologies best fits their needs—NGS for discovery and microarrays for rapid profiling.
For the diagnostics market, the transition from microarrays to sequencing has been much slower. Clinicians are a conservative group, and not so concerned with technological advancements. They are more interested in ease of use, consistent results, and regulatory approval. Microarrays tend to fit this bill, offering a stable, proven platform. Microarrays also offer simpler data analysis, as it is known ahead of time exactly what kind of data will be returned. With NGS, there’s always the possibility of revealing something new and unexpected. While that can be very useful in a research environment, clinicians generally aren’t prepared for this extra information. But the power and potential cost savings of NGS-based diagnostics are alluring, leading to their cautious adoption for certain tests such as non-invasive prenatal testing.
Perhaps the application that has made the least progress in transitioning to NGS is cytogenetics. Researchers and clinicians accustomed to older technologies such as karyotyping are only now starting to embrace microarrays. Companies are catering to this need by offering updated microarrays, which provide better resolution and lower costs. As with other applications, NGS has the potential to offer even higher resolution and a more comprehensive view of the genome, but it currently comes at a substantially higher price due to the greater sequencing depth required.
While dropping prices and maturing technology are helping NGS become the technology of choice for a wide range of applications, the transition away from microarrays is a long and varied one. Different applications have different requirements, so researchers need to weigh their options carefully when deciding whether to switch to a new technology or platform. Regardless of which technology they choose, genomic researchers have never had more options.