
Anthony Brookes on the Faults and Fanfare of the Genotyping Arena


Name: Anthony Brookes

Title: Vice chairman and clinical genomics unit coordinator at the Karolinska Institute’s Center for Genomics and Bioinformatics in Stockholm, Sweden.

When it comes to his views on the SNP-genotyping industry, Anthony Brookes sees the microtiter plate as being half full.

Brookes, a respected researcher and lab leader at the Karolinska Institute’s Center for Genomics and Bioinformatics in Stockholm, Sweden, developed a novel genotyping technology based on dynamic allele-specific hybridization that was courted for a time by Genome Therapeutics.

He is also known among colleagues as having a fine sense of the state of the genotyping field: he is a defender of academic labs and has experienced their sometimes profound economic challenges; he knows the myriad ways startups and spinoffs can struggle; and he has a keen understanding of the role that high-end platform companies play in the space.

Brookes, vice chairman and clinical genomics unit coordinator at the CGB, is also a researcher who possesses a great respect for his corner of pharmacogenomics, and he readily acknowledges its faults and foibles.

Brookes was in Key Biscayne, Fla., this week attending a meeting sponsored by the American Association for Cancer Research called SNPs, Haplotypes, and Cancer. SNPtech Reporter caught up with him during the last day of the meeting.

What’s your take on the state of SNP-genotyping technologies today?

We’re doing OK. It’s a very, very long way to go until we can do everything one could ever want to do. If you think there are 3 billion bases out there, who knows what fraction you’ll want to score. But I guess in an ideal experiment, you’ll want to score everything; you’d want to look at a whole genome, and you’d want to do that in thousands of individuals.

Obviously, those kinds of technologies are just not here today. So we’re a long way from that, but we’re also a long way from where we were five years ago, when studies of thousands of genotypes were the kinds of things people were reporting. There was a report recently … that reported something like 60 million genotypes in one paper. And that put it in perspective; things have really come a long way. And these new techniques … can genuinely create a million genotypes per day. Things like Illumina’s system, ParAllele’s system, Perlegen’s technology, Affymetrix’s new chips, like the 100K chips. These technologies are allowing us to generate a million or more genotypes per day.

Yet there’s still the issue of cost. Throughput is one thing; you could genotype 1 billion per day if you bought however many thousand Illumina devices. But throughput also depends on cost. So, that’s becoming the bottleneck as well: you need to have the unit cost go down as well as the potential throughput-per-device go up. And both of those things are happening, though overall, the price ... of a competitive study today is going up. What this means is that most academics can’t play that game anymore. They’re limited to alternative technologies, which they can afford to set up in-house and run at throughputs that match their budgets. Consequently, they can’t produce the enormous studies that big industry, or some of the bigger academic centers, can.

Is that an unfortunate development for SNP-genotyping? Is it a natural, Darwinian progression?

The field is changing and evolving, and these kinds of things happen. It’s not a bad thing to have a situation [in which] some people are able to do some things that they weren’t able to do before. It’s all progress. The total amount of genotypes and genome-variation analysis that’s going on per day is increasing constantly, and that’s a good thing.

There are two things that could be viewed as downsides: One is that individual researchers are feeling that they haven’t got anything to contribute that’s worth anything anymore. I’m at this meeting here [in Key Biscayne, Fla.] and several people have asked, ‘Well, given these technologies that we’re hearing about, what should I do? Because I can’t afford these things, and I’ve got my clinical materials.’ And they’re getting the feeling that it’s not worth doing their candidate gene studies … with a $50,000 or $100,000 budget, which for them is quite a lot. …

Also, it really puts the competitive studies in the hands of … companies. So by definition, those companies are going to stake claims over lots of discoveries that will take academia a while to find. It’s kind of like the sequencing history, where Celera went out there and tried to grab the genome quickly. And fortunately, academia was in a position at that stage to get together and do something comparable and competitive. But with genome variation, the academic efforts focus on the HapMap project. … And while it’s putting its effort into that, it obviously is less able to put a lot of effort into whole-genome association studies. And I can imagine that groups like the American Association for Cancer Research and others will, or probably are, considering getting those materials together and doing those studies. But you’re talking about studies that cost millions of dollars for each scan, and you need the infrastructure and everything else to make that happen. So it seems to me that academia on the whole is going to lose. …

What would you say is some of the most interesting research being performed today, and what does that tell you about the state of the industry?

There are still very different opinions out there. … It’s still very much a field in flux. It’s not as though it’s clear which way to go, whereas for doing the genome it was clear it was going to be that particular technology on some kind of capillary devices, and once everyone started racing with the shotgun approach it was clear which way to go. It’s not like that with genome variation. We’re far from that position. We’re many steps behind where expression-array studies are, in fact. A number of years ago, the expression-array community had cause to do really quite high-throughput experiments — many, many genes, many tissues. And then they started running into problems of, ‘How do we organize the data? What are standard exchange formats?’ Really, that’s about the stage that the genome-variation field is in now. It’s only just now got its hands on these pretty high-throughput techniques, and we’re just starting now to say, ‘Oh, crikey, we need better databases for organizing discoveries, for standardizing our methods for looking at precision estimates, and for understanding what it is that makes one method work better on one set of SNPs than another.’

You’re running a successful lab at Karolinska, and we’ve heard that your technology has been approached by at least one company. What kind of advice would you give to colleagues in academia who are considering commercializing a technology?

There are a lot of players out there putting a lot of money into research. Some of them are groups like Illumina, where they’ve got a powerful technology and they’re constantly striving to improve it further. Other groups are companies like Affymetrix, where what I said for Illumina is true, but they’re also perhaps experimenting with more and more different ways to use their more generic platform. So they might be trying all sorts of PCR tricks that we haven’t heard of. …

It’s very hard for one guy and his post-doc or his student to compete. You have to come up with something that’s really innovative, because whatever idea the academics are coming up with, you can bet these big companies are coming up with the same ideas as well — with the resources to take them through.

That said, I don’t want to disillusion anyone. Look at history: It’s from that academic base that many of the truly original breakthroughs come. …

Where did PCR come from in the first place? It came from an academic lab.


Note to readers: The final installments of the CFO Perspectives, scheduled to run over the next three weeks, will not appear. Despite numerous requests for interviews, the three companies — Transgenomic, Lynx Therapeutics, and DeCode Genetics — declined to participate in the annual, mid-year financial interview. SNPtech Reporter has therefore resumed the traditional Perspectives section, which offers insights from the industry's leading academic and regulatory insiders.

— Ed.
