Time for Next-Generation Arrays? Pharmas Lay Out Their Top Priorities


In an effort to set themselves apart from the “big A,” a number of companies poised to enter the gene expression chip market have promised to provide the “second generation” of microarrays.

But what exactly does “second generation” mean? That depends largely on whom one asks. In terms of gene content, most people mean a smaller number of probes, either on custom arrays, where users define part or all of the content, or on theme arrays that focus on a particular disease area or gene family. The number of genes on these arrays could range anywhere from a few hundred to a few thousand, staying well below whole-genome arrays. With regard to technology, many expect the second generation to enable higher sensitivity and throughput, which could be achieved by formats other than chip surfaces, or to render additional information, for example about splice variants.

But pharmaceutical and biotechnology researchers, whom the newcomers regard as their prime target, are divided over whether there is an increasing need for second generation arrays in their research, and have a range of expectations for these new technologies.

Some do not follow the seemingly logical progression from whole-genome arrays in the early target discovery phase to more focused arrays concentrating on a smaller number of genes of interest. Aventis Pharma, for example, prefers to use whole-genome arrays, which it receives from Affymetrix, for most of its gene expression studies, including target discovery and target validation, because this decreases the chance of missing something important. “There may be 100 genes you know about, but there may be additional genes ... which you don’t know about but that play a role in whatever you are studying. And if you don’t have those genes on the chip, you may miss [them],” said Kathy Call, global head of genomics technology transfer at Aventis’ Cambridge, Mass.-based functional genomics unit. Affymetrix’ sales numbers bear this out. “The volume of arrays that we sell that are whole genome arrays vastly outweighs [our custom arrays],” said Elizabeth Kerr, the company’s senior director of marketing for gene expression products.

Getting Focused

Others agree that using the entire genome is generally a good idea for discovery-type experiments, but also see value in using more focused arrays for high-throughput applications like screening compounds. “Whenever there are times where you want to go to as high a throughput as possible, a second generation array that has a smaller content...could allow you to get by with less sample and [could] possibly [be] automated to get much higher throughput,” said Paul Kayne, a senior research investigator at the applied genomics department of Bristol-Myers Squibb. At the moment, BMS uses arrays spotted in-house for these purposes, in addition to its Affymetrix whole genome arrays, but is also testing a number of commercial technologies, he said. According to Call, another promising application for focused arrays would be to study splice variants of a small number of genes of interest. “That’s about the only area I can really see us doing much in focused arrays,” she said.

Focused, though, doesn’t necessarily mean off-the-shelf, and not everyone is convinced of the usefulness of theme arrays. “There are certainly places for theme arrays, in particular if you can increase the throughput while reducing the cost,” said Kayne. However, Kerr admitted that Affymetrix’ cancer chip has not been a great success, and that a major difficulty with theme arrays was that every researcher wants a different subset of genes. “There is some potential in certain areas like toxicogenomics,” she said, “…but it is a challenge to make a single array with a subset of genes that addresses the needs of a broader community.”

Wish Lists

High throughput and automation rank high on companies’ wish lists for technology improvements. This could mean array carriers that fit into a microtiter plate footprint, making them compatible with liquid handling systems, or arrays printed in the wells of microtiter plates. Bead- or fiber-based “arrays” could also improve throughput, because “you have ways to sort things much faster,” said Kayne. Though companies value the reliability and consistent quality of Affymetrix chips, they agree the system currently lags in automation capability, an issue Affymetrix is working on, according to Kerr. Likewise, Agilent has made high throughput one of the major focal areas of its research, said Doug Amorese, the company’s R&D manager for bioresearch solutions.

Going hand in hand with smaller content and higher throughput is cost per assay. This becomes an issue when it comes to screening experiments. “We would design experiments differently if it cost a penny to run them vs. costing $50 per sample,” said Kayne. But this is only true if the quality remains the same. When it comes to compromising results for reduced cost, the verdict among researchers is unanimous: data quality ranks first. “Money is always important, but if you are going to be at the edge, ...cost is not the most important thing,” said Shane Weber, a senior scientist in technology development at Millennium Pharmaceuticals. His company switched from a cDNA platform to Affymetrix about a year ago, and has also been using its own nylon-based mini-arrays with a few hundred spots, which it is looking to replace with a sensitive and automation-friendly commercial system – not necessarily chip-based – to study low and medium abundance genes in compound screens, toxicology studies, or cell culture quality control assays. “We see two areas [for mini-arrays], one of 50 genes but hundreds of thousands of assays, [and] others which might be anywhere between 300 [and] maybe as many as 1,000, but where you are doing…maybe 1,000 arrays in an experiment,” Weber said.

Increased sensitivity also figures on many researchers’ wish lists, enabling them, for example, to conserve precious patient samples from clinical trials and to forgo sample amplification steps that may introduce bias in expression studies.

Sense And Sensitivity

One of Weber’s goals is to work with as few as 10,000 cultured cells and be able to detect genes expressed between five and 100 copies per cell, with as few sample manipulation steps as possible. Both Affymetrix and Agilent said they are working on ways to improve the sensitivity of their systems.

While researchers appreciate having someone else produce the chips and check their quality, quick turnaround time for custom arrays – advertised by some new companies – does not take the highest priority, as long as it stays within a month or so. “For planning experiments over six months or a year time period, if it takes them two weeks or six weeks to get it ready, maybe that’s not so critical,” noted Kayne.

– JK
