Despite Recent Deals, Future of Array and Next-Gen Sequencer Synergy Is Uncertain

Two recent deals that pair Roche’s 454 GS FLX sequencing instrument with NimbleGen arrays have underscored the idea that the two technologies are complementary. However, while most experts agree that arrays could play a future role as a front-end technology in sequencing experiments, not all are convinced that such a role would last.
 
Roche, which owns 454 and NimbleGen, announced the back-to-back deals over the past two weeks. Last week, the company said that the Oxford Biomedical Research Centre in Oxford, UK, will install several 454 Genome Sequencer FLX sequencing instruments and use them with NimbleGen’s arrays to conduct genome-wide studies focused on finding links between human diseases and genetic mutations.
 
Roche acquired 454 last year for $154.9 million and bought NimbleGen soon after for $272.5 million. At the time of the NimbleGen acquisition, company officials stressed that the two platforms could be used in combination for more powerful studies (see BAN 6/27/2007).
 
According to a statement from Roche, OxBRC, a partnership between Oxford Radcliffe Hospitals National Health Service Trust and the University of Oxford, will use NimbleGen’s chips both as a preparative tool for sequencing and for genome-wide studies of samples with clinical documentation for selected diseases in three areas.
 
The researchers will sequence several genes for private familial mutations; conduct mutation screening in a large number of genes using both the 454 and NimbleGen technologies; and look for genes and regions with genomic imbalances using NimbleGen’s comparative genomic-hybridization arrays.
 
The OxBRC deal comes on the heels of a similar arrangement with Barts and The London Medical School, a UK-based teaching hospital trust. Roche said earlier this month that the trust’s cancer genomics medical group has acquired NimbleGen’s sequence-capture array technology for use with its existing 454 GS FLX (see BAN 7/15/2008).
 
According to Roche, the group plans to combine the technologies to study genetic changes involved in the development of leukemia. The researchers plan to link the sequence information to an existing clinical and cytogenetic database to investigate the relationship between genetic changes and clinical features of the disease.
 
Both deals appear to vindicate Roche’s strategy of using NimbleGen’s sequence-capture method — which allows users to capture targeted regions of the genome for sequencing — as a time- and cost-saving mechanism for second-generation sequencing. However, while current users of both technologies admit array-based methods like sequence capture are likely to gain a customer base in coming months, their long-term prospects are less clear.
 
Roderick Jensen, director of the Virginia Bioinformatics Institute and a 454 and Affymetrix customer, told BioArray News in a recent interview that the popularity of integrating array-based sample-prep methods with second-gen sequencing is likely to increase in coming months, but that the steady decline in sequencing costs will make such methods less necessary, thereby diminishing the financial benefit for array companies.
 
“The game has been with next-gen sequencing technology to generate more and more information for less and less cost,” he said. “As you can imagine, there are lots of companies in the wings trying to do this with an abstract target of $1,000 per human genome, so this is a freight train coming really fast.”
 
Jensen referred to sequence capture as a “stop-gap measure that is going to be very valuable in the short run, but as sequencing gets cheaper and cheaper, you can just go ahead and sequence the whole thing; you don’t have to capture.”
 
Michael Zwick, an assistant professor of human genetics at Emory University who helped develop some of the methods being commercialized at NimbleGen (see BAN 10/16/2007), said that he is confident the sequence-capture method will be widely adopted by users of second-gen sequencers and that he has seen “a lot of interest” in the possibility of combining array and sequencing technologies.
 
Still, he told BioArray News last week that “if sequencing entire genomes becomes technically easy and computationally feasible, then perhaps these enrichment approaches will be used less frequently.”
 
At the same time, he stressed that “for many potential applications, there is no need to sequence the entire human genome, so targeted resequencing will likely be quite valuable well into the future.”
 
But Chad Nusbaum, co-director of the Genome Sequencing and Analysis program at the Broad Institute in Cambridge, Mass., said that array-based front-end approaches to sequencing projects are likely to impose high costs on researchers, in some ways undercutting the argument for using array-based approaches at all.
 
“I'm not a huge fan of the array-capture option personally, because it requires an array for each sample,” Nusbaum told BioArray News last week. “Even if you can reuse them a few times — which I don't think has been shown — the price gets steep if one is doing a lot of samples.”
 
According to Nusbaum, for many experiments people will want to do “tens or hundreds” of samples. “As sequencing gets cheaper, the array capture method quickly becomes relatively more expensive and, at some point soon, this will matter a lot,” he explained. “I think array-based capture may be widely used in the next year or so, but may become less popular in the long run if my personal forecast is accurate.”
 
Rather than use sequence capture, the Broad and some other groups are developing methods that perform target capture in the liquid phase using oligo probes. The Broad’s method is based on hybridization to biotinylated baits, Nusbaum said.
 
“These methods scale nicely because they are purely liquid handling steps, and can be done in microtiter plate format by robots, so they can realize the economies of scale,” he said. “The more samples one does, the cheaper they are to capture by these methods.”
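Nusbaum’s scaling argument can be illustrated with a toy cost model. All dollar figures and function names below are hypothetical placeholders chosen for illustration, not prices quoted by the Broad, Roche, or Agilent; the point is only the shape of the curves: array capture pays a per-sample array cost, while liquid-phase capture pays a one-time bait-synthesis cost and then cheap per-sample liquid handling.

```python
# Toy model of Nusbaum's scaling argument. All numbers are
# hypothetical placeholders, not real pricing from any vendor.

def array_capture_cost(n_samples, array_cost=450.0, labor_per_sample=50.0):
    """Array capture needs one array per sample, so total cost
    scales linearly with the number of samples."""
    return n_samples * (array_cost + labor_per_sample)

def liquid_capture_cost(n_samples, bait_synthesis=2000.0, reagents_per_sample=20.0):
    """Liquid-phase capture pays a one-time bait-synthesis cost,
    then inexpensive plate-format liquid handling per sample."""
    return bait_synthesis + n_samples * reagents_per_sample

for n in (1, 10, 100, 1000):
    per_array = array_capture_cost(n) / n
    per_liquid = liquid_capture_cost(n) / n
    print(f"{n:5d} samples: array ${per_array:,.0f}/sample, "
          f"liquid ${per_liquid:,.0f}/sample")
```

Under these assumed numbers, array capture is cheaper for a single sample, but the liquid-phase method’s per-sample cost falls rapidly with batch size, which is the economy of scale Nusbaum describes.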
 
Agilent Technologies last month licensed the Broad’s method, called genome partitioning, and plans to begin selling it as an extension of its existing oligo library-synthesis offering later this year (see BAN 7/8/2008).
 
While the method does not use arrays to perform sequence capture, it does make use of the firm’s microarray-fabrication infrastructure to design the probes and construct the oligo libraries.
 
Yvonne Linney, Agilent’s general manager of genomics, told BioArray News earlier this month that the process will require users to design a library of probes using Agilent’s eArray tool. The library is then manufactured by Agilent on its arrays, cleaved off, and transferred to a reagent-manufacturing site in Texas that is home to Agilent subsidiary Stratagene. There it goes through a process of in vitro transcription and biotinylation to become a biotinylated cRNA library.
 
Fred Ernani, Agilent marketing manager for emerging genomic applications, told BioArray News that the company believes its genome partitioning offering “addresses a substantial bottleneck in all next-generation sequencing workflows, and thus will be widely adopted.”
 
In addition to constructing oligo libraries for use in next-gen sequencing, Ernani said that Agilent will offer “on-array genome partitioning” using custom microarrays that will directly compete against NimbleGen’s sequence capture method.
 
However, he cautioned that a large amount of DNA is required for on-array genome partitioning, and that the method is more suitable for “proof-of-principle experiments and studies involving modest numbers of samples.”
 
Ernani admitted that second-generation sequencing is “still at an early stage of adoption” but argued that the firm’s array-based approaches would pay off in the end. “Although further enhancements in multiplexing, throughput, and read length will undoubtedly be achieved, we believe genome partitioning as a sample-preparation method is here to stay,” he said.
 
“A great deal of sequencing work will need to be done in order to identify and understand the role rare genetic variants play in disease,” said Ernani. “Target enrichment by some methodology will be necessary for the foreseeable future in order to ensure sufficient coverage and accuracy. In addition, investigators are continually devising new applications for partitioning and sequencing, for example capture of methylated DNA.
 
“The combination of new applications plus enhancements to synthesis chemistry and product format, such as longer oligos or higher complexity libraries, enhance prospects for this emerging business,” he said.
