Seeing Is Believing

If you haven't used imaging yet — and it's a fair bet you haven't — what are you waiting for? The research possibilities enabled by coupling high-throughput imaging with other large-scale biology tools, such as genomics and proteomics, are growing fast enough to warrant their own mini-revolution.

Before imaging is embraced by the many researchers whose work could actually be helped by the technology, it will have to overcome the stigma that has followed it since its inception: that it's subjective, inaccurate, more art than science. "Traditional microscopy was about somebody looking down a microscope and saying, 'Something moved,'" says Ger Brophy, general manager of advanced systems for GE Healthcare. "It was qualitative." To translate: not something you'd ever associate with the quantitative, high-throughput systems biology field.

But imaging vendors and interested academics have spent the last decade or two working to change all that. Today, bioimaging is highly automated, high-throughput, and increasingly quantitative. One significant change came when imaging capabilities went to 3D, says Roland Eils at the German Cancer Research Center. "That was the first time the bioimaging community realized that computational methods were essential, and not nice to have."

In fact, it was an informatics breakthrough that really put imaging on the map for biologists, says Brophy. A little company called Cellomics introduced algorithms that could scan through a series of images and report back to the user what had happened during the experiment. "Cellomics really broke the mold," he says. The company also proved that this technology was no flash in the pan: it was acquired by Fisher Scientific (now Thermo Fisher) for about $49 million in cash in mid-2005.

Of course, another reason for the uptake of imaging was that scientists became increasingly aware that traditional Petri dish studies were not as biologically meaningful as they had hoped. "A lot of the stuff you see in a tissue culture dish is not what you see in an in vivo situation," says Rick Horwitz, a PI at the University of Virginia. The need to see what's happening in real time and in live cells has become more and more pronounced.

The Here and Now

Today, it's clear that there's a surge of interest among scientists for using new bioimaging tools in conjunction with the genomics and proteomics research they're already conducting. "It's data in context," says Mark Collins, senior marketing manager for cell imaging at Thermo Fisher Scientific. It's "a new way to get more information in context about diseases and targets and compounds," he adds, noting that in particular, imaging tools offer scientists the "what, where, and when in, on, or around a cell or collection of cells."

One field where imaging has proven extremely promising is pathway analysis, says Mark Roskey, vice president of reagents and applied biology at Caliper Life Sciences. Researchers can assess pathways of interest, say those tied to a certain disease state, and then quickly get those pathways into models that can be validated through imaging and reporter gene assays.

Drug discovery programs have been quite keen to study kinase inhibitors, and imaging has already been brought to bear on these efforts, says Roskey. His team has worked with clients to take their in vitro kinase inhibition assays and develop them into in vivo tests. In a recent example, Pfizer and Sugen used Caliper's imaging technology to see whether tumors were shrinking in response to a kinase inhibitor that successfully got to market and is now sold as Sutent.

Roskey touts that work as a great example of how imaging is becoming part of the pipeline, and also notes that it opens the door for a much faster discovery and development process. By developing imaging-based biomarkers for use in animals as well as in humans, pharma researchers could link "the early-phase target identification and compound identification more quickly to preclinical studies in mice [and] rapidly accelerate the whole process," he says.

But imaging isn't just for the pharma set. Roskey says that 60 percent of the installations of Caliper's imaging platform go to academic customers. In fact, one of the fields that GE's Brophy expects to see move toward imaging is the postdoc staple, western blotting. Imaging can't give you molecular weight, he says, but it tells researchers where in the cell the protein is located — and for a lot of people, that could be much more valuable information.

Inherent Uncertainty

To be sure, today's interest doesn't guarantee a bright and shiny future for bioimaging. Roskey notes that when Caliper acquired imaging tools company Xenogen last year, existing Caliper customers weren't exactly banging down the door to get their hands on these tools. "We actually specifically went to our set of core Caliper customers" and showed them how imaging technologies could be combined with their research to give them better results, he says. That method proved successful: clients that now have instruments from both Caliper and its Xenogen subsidiary include Bristol-Myers Squibb, Pfizer, Merck, Johnson & Johnson, Biogen, and Amgen, to name a handful.

Brophy says that tool vendors have to do a better job easing customers into imaging. "Some of these techniques and technologies are a little bit nerve-racking" to the uninitiated, he says. "As technology providers, it's our job to remove that barrier."

Another barrier to truly widescale buy-in from the scientific community lies in getting better informatics tools out to users, says Thermo Fisher's Collins. "We know how to acquire images. We're getting better and better at labeling cells. ... We're getting very good at image analysis. Where we're falling down, I think, is in bringing good tools to end users to get the best [use] out of that data," he says. "The worry is that if we don't have those tools available soon and show the real value" that users will lose interest and not consider imaging a core part of their technology stable.

Part of the need for better data mining tools is in letting users look for what they want, says Brophy at GE. When image analysis tools were first released, they were only available for certain types of images and certain analyses within them. An early one was NF-kB nuclear transfer, he says. "Everybody would show you their NF-kB nuclear transfer images" — whether or not that was really essential to their research. Now, the job at hand is to broaden out those tools so that "you analyze what you need to and what you want to," Brophy says.
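To make that concrete, the NF-kB readout Brophy describes boils down to a per-cell ratio of nuclear to cytoplasmic fluorescence. The sketch below shows roughly how such a measurement might be computed with the open-source scikit-image library; the file names, channel assignments, and threshold choices are illustrative assumptions rather than any vendor's actual algorithm.

    import numpy as np
    from skimage import io, filters, measure, morphology
    from skimage.segmentation import expand_labels

    # Illustrative two-channel field: a nuclear stain and an NF-kB label.
    # The file names and channel layout are assumptions for this sketch.
    nuclei_img = io.imread("dapi_channel.tif")
    nfkb_img = io.imread("nfkb_channel.tif")

    # Segment nuclei with a global Otsu threshold and label each one.
    nuclear_mask = nuclei_img > filters.threshold_otsu(nuclei_img)
    nuclear_mask = morphology.remove_small_objects(nuclear_mask, min_size=100)
    nuclei = measure.label(nuclear_mask)

    # Approximate each cell's cytoplasm as a thin ring around its nucleus.
    rings = expand_labels(nuclei, distance=5)
    cytoplasm = np.where(nuclei > 0, 0, rings)

    # Per-cell nuclear-to-cytoplasmic NF-kB intensity ratio; values well
    # above 1 suggest translocation of NF-kB into the nucleus.
    for cell in measure.regionprops(nuclei, intensity_image=nfkb_img):
        ring_pixels = nfkb_img[cytoplasm == cell.label]
        if ring_pixels.size == 0:
            continue
        ratio = cell.mean_intensity / ring_pixels.mean()
        print(f"cell {cell.label}: N/C ratio = {ratio:.2f}")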

Turning Data into Knowledge

What's really at the crux of bioimaging stepping up in its own right is scientists' ability to generate relevant and useful data sets that provide answers not just today but in the years to come. "This is not only about analyzing data and just looking at it one time and then writing a paper," says Berta Strulovici, research vice president and head of automated biotechnology at Merck. Data sets should be solid enough that they can help answer questions scientists ask in the future. "Maybe today it doesn't tell you everything, but a year from now, with different information from a different type of experiment," you should be able to re-query your imaging database to pull out new knowledge and find new connections, she says. Scientists want to be able to return to their data and "ask new questions on the fly," says Brophy.
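One straightforward way to keep imaging data open to those future questions is to store the derived, per-cell measurements alongside the raw images in a database that can be queried long after the experiment has ended. The sketch below, using Python's built-in sqlite3 module, is a minimal illustration of that idea; the table layout and column names are assumptions, not any vendor's schema.

    import sqlite3

    # Hypothetical schema: one row per cell per experiment, holding features
    # that were derived from the images at acquisition time.
    conn = sqlite3.connect("imaging_features.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS cell_features (
            experiment   TEXT,
            treatment    TEXT,
            cell_id      INTEGER,
            nc_ratio     REAL,      -- nuclear/cytoplasmic intensity ratio
            nucleus_area REAL
        )
    """)

    # Measurements are written once, when the images are analyzed...
    conn.executemany(
        "INSERT INTO cell_features VALUES (?, ?, ?, ?, ?)",
        [("exp001", "compound_A", 1, 2.4, 310.0),
         ("exp001", "compound_A", 2, 0.9, 285.0),
         ("exp001", "DMSO", 3, 1.0, 295.0)],
    )
    conn.commit()

    # ...and a new question can be asked months later without re-imaging:
    # which treatments shifted the average nuclear/cytoplasmic ratio?
    query = "SELECT treatment, AVG(nc_ratio) FROM cell_features GROUP BY treatment"
    for row in conn.execute(query):
        print(row)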

"An image is really the raw data," says Collins. It used to be that the image was the answer — you might look at it to see a change in cell morphology or a reporter gene light up. But today's experiments generate billions of images at a time. Expecting to assess them by eye is a little bit like using a high-throughput DNA sequencer and then getting your sequence by manually reading individual peaks off the chromatogram. "No one actually wants to look at the images," says Eils at the German Cancer Research Center, who is working to develop microscopes that could produce 10 billion images per day. That's why image analysis tools and data mining technology represent the keystone for bioimaging.

The related challenge, of course, lies in data storage, access, and retrieval. "We don't know yet how to catalog such information and how to retrieve it in ways that will give us intelligence," Strulovici says. Scientists aren't even sure how much to store — with image files being so large, there's a fair amount of debate on whether the images themselves should just be thrown away after each experiment.

Peering into the Future

As imaging moves forward, there are a few key areas where experts expect to see continued evolution and improvement. Upping complexity is one of those: scientists are increasingly looking at live cells, tissues, or organisms in 3D and even 4D (that's 3D over time). "That's really pushing the envelope of what you can do with cell-based assays," says Collins.

Klaus Hahn, a scientist at the University of North Carolina, says "true spatiotemporal analysis of signaling via high content" would turn out to be "a really exciting wave of the future" in bioimaging. He also emphasizes the need for better multiplexing capabilities as researchers move to multi-parameter measurements within individual cells.

Another improvement will be in whole-organism imaging, predicts Brophy at GE. He sees leading-edge users moving into imaging small model organisms such as zebrafish to better understand their biology. After that, he expects to see more interest in tissue imaging and, with that, a move toward increasingly clinical applications for bioimaging.

Ultimately, the idea is to conduct high-throughput, high-speed bioimaging experiments, generate tremendous data sets that would be analyzed automatically, and from that the computer would spit out systems biology models based on the patterns that emerged, says Bob Murphy at Carnegie Mellon. His team has worked recently on tackling this challenge; the goal is to "describe the way in which patterns are generated so that you can synthesize new images which, in a statistical sense, are drawn from the same distribution that you use to train [the algorithm]," he says. "We believe those kind of generative models are going to be analogous for protein location patterns to hidden Markov models ... for sequence data."
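Murphy's own generative models are far richer than anything that fits in a few lines, but the core idea he describes, learn the distribution behind the measured patterns and then sample new instances from it, can be sketched with a standard mixture model. In the toy example below the "features" are fabricated and the model is scikit-learn's off-the-shelf GaussianMixture, so treat it as an illustration of the principle rather than his method.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)

    # Toy "measured" features describing where a protein sits in each cell:
    # distance of its objects from the nucleus and their size, in arbitrary
    # units. These are fabricated numbers standing in for real measurements.
    measured = np.column_stack([
        rng.normal(loc=5.0, scale=1.0, size=500),
        rng.lognormal(mean=0.5, sigma=0.3, size=500),
    ])

    # Fit a generative model to the measured features...
    model = GaussianMixture(n_components=2, random_state=0).fit(measured)

    # ...then synthesize new instances that are drawn, in a statistical
    # sense, from the same distribution the model was trained on.
    synthetic, _ = model.sample(200)
    print("measured mean: ", measured.mean(axis=0))
    print("synthetic mean:", synthetic.mean(axis=0))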

Image-based modeling, says Horwitz at the University of Virginia, is "in one sense a post-genomic paradigm for a cell biological process."

Machine-learning algorithm tackles proteome-wide image analysis

Bob Murphy is used to having to prove his ideas. A scientist and professor at Carnegie Mellon University, he learned early on from his experiences with imaging that any chance for the technology to be high-throughput was going to mean computational tools that would automatically analyze images for patterns. But what was obvious to him was a joke to most everyone else.

"At the time we began this, there was a very clear conviction among nine out of 10 cell biologists I talked to that machines weren't suitable for understanding ... protein patterns," he says. When his team set to work on the task in the mid-'90s, their first goal was just to establish that computers could analyze patterns as well as humans could. "It turned out they could recognize and analyze patterns even better than people," he says.

Today, Murphy and his group work on proteome-wide analysis projects that rely on the machine learning advances they helped establish. The emphasis is on "protein location within cells and tissues and understanding it in an automated way so that it can be done on a proteome-wide basis," he says. Working with an automated system that looks at all proteins, rather than one protein at a time, has been a tremendous boost in understanding patterns and function without having to resort to the traditional colocalization experiments most researchers perform. For a proteome-wide study, "the bottom line is [that colocalization] is a very inefficient way of doing that," Murphy says. "The combinatorics are going to kill you."
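The basic workflow behind that kind of automated pattern recognition is to reduce each cell image to a vector of numerical features and hand those vectors to a standard classifier. The sketch below uses a few simple intensity and moment features and a random forest on fabricated data; Murphy's published systems use far larger feature sets, so this is a schematic of the approach, not his pipeline.

    import numpy as np
    from skimage.measure import moments_central, moments_normalized, moments_hu
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def location_features(cell_image):
        """A very small feature vector for one fluorescence image of a cell:
        basic intensity statistics plus Hu moments of the bright pixels.
        Real systems use dozens to hundreds of texture and shape features."""
        img = cell_image.astype(float)
        mask = img > img.mean()
        masked = np.where(mask, img, 0.0)
        hu = moments_hu(moments_normalized(moments_central(masked)))
        return np.concatenate([[img.mean(), img.std(), mask.mean()], hu])

    # Placeholder data: random "images" with made-up location labels
    # (e.g. 0 = nuclear, 1 = mitochondrial), standing in for a training set.
    rng = np.random.default_rng(1)
    images = rng.random((60, 64, 64))
    labels = rng.integers(0, 2, size=60)

    X = np.array([location_features(im) for im in images])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=3).mean())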

Instead, Murphy uses bioimaging tools and computational algorithms to examine "possible patterns a protein could have just by looking at the proteins themselves," he says. The idea is that if you can see all patterns of all proteins, "you could essentially do colocalization without ever doing a colocalization experiment."

One of his current projects involves tagging proteins expressed in a particular cell type using an approach called CD tagging: "we use a retrovirus to insert a GFP in the middle of a protein, in between two exons," Murphy says. The group then makes a bunch of tagged clones and images them to figure out what the randomly tagged proteins are. The technique is also informative because "you can see, for instance, whether tagging in one place changes distribution relative to tagging somewhere else." That work generates some 50 to 100 clones per week using mouse 3T3 cells.

In related work, his team also used a data set from Jonathan Weissman showing GFP tagging of yeast proteins to build and train an algorithm that would analyze images, Murphy says.

RNAi screens lead to neuro breakthrough

At Merck, imaging has become a fundamental part of the drug discovery process over the course of the last four years. Berta Strulovici, research vice president and head of automated biotechnology at Merck, says that much of this takes place in the form of genome-wide RNAi screens to help understand gene function and compound mechanisms of action.

The Merck team uses confocal microscopy and laser scanning cytometry; the former is part of the screening effort to look at aspects of cell morphology, while the latter is especially useful in imaging stem cells for proliferation and differentiation, Strulovici says. Cell morphology scans can assess factors such as "integrity of the nucleus and stage of the cell in the cell cycle," she adds.
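In image-analysis terms, those morphology scans come down to segmenting nuclei and measuring shape and DNA-stain intensity per object. Here is a hedged sketch using scikit-image's bundled sample image; the size cutoff, the use of solidity and eccentricity as rough stand-ins for nuclear integrity, and integrated DNA intensity as a crude cell-cycle proxy are all illustrative choices, not Merck's protocol.

    from skimage import data, filters, measure, morphology

    # scikit-image's bundled DNA-stain image (human cells in mitosis) stands
    # in here for a screening image straight off the instrument.
    dna = data.human_mitosis()

    # Segment and label the nuclei.
    mask = dna > filters.threshold_otsu(dna)
    mask = morphology.remove_small_objects(mask, min_size=20)
    nuclei = measure.label(mask)

    # Per-nucleus shape and intensity features: solidity and eccentricity as
    # rough indicators of nuclear integrity, and integrated DNA intensity
    # (roughly proportional to DNA content) as a crude cell-cycle proxy.
    for n in measure.regionprops(nuclei, intensity_image=dna):
        integrated_dna = n.mean_intensity * n.area
        print(f"nucleus {n.label}: area={n.area}, solidity={n.solidity:.2f}, "
              f"eccentricity={n.eccentricity:.2f}, DNA content ~{integrated_dna:.0f}")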

"When we do an siRNA screen, [we can] identify new potential molecules or targetsfor drug discovery," Strulovici says. "When we look at the phenotype that was obtained, that gives us a hint as to what the function of that particular gene could be in the body."

One example of this, says Strulovici, came in the form of a particular neurotoxic peptide that no one could find the receptor for. Scientists performed an siRNA screen to look at the peptide's binding state in a host of different gene knockdown situations and successfully identified the receptor. The neuroscience work led to a paper related to Alzheimer's disease in PNAS. As Strulovici puts it, "We have actually done something quite interesting and found new molecular targets in Alzheimer's."

The boon Merck sees from imaging comes not just from adding a new tool to the arsenal, but also in large part from the speedup of experiments it makes possible. Strulovici says that a single experiment such as an siRNA screen using a multi-parametric readout allows researchers to "ask many types of questions at the same time" — questions that, if tested one at a time, would take months to answer.

The caBIG project makes a big bet on bioimaging

Scientists have come to the conclusion that success in large-scale biology requires collaborators from a range of different backgrounds. But even the most open-minded of researchers probably wouldn't have expected major advances in combining imaging with genomics and proteomics to come from, of all people, a medically trained radiologist.

Eliot Siegel, a professor and vice chair of radiology at the University of Maryland, has spent much of his career dealing with the vagaries of image-related informatics. Then, in 2005, when the folks at the National Cancer Institute's caBIG program decided to launch a controversial imaging component, Siegel was brought on board to help direct it. He now spends nearly half his time working with what has become known as the In Vivo Imaging Workspace, and the other half continuing his posts at Maryland.

Siegel's first challenge with the imaging workspace was confronting the fact that a good number of people did not want him there. "There was so much skepticism," he says, about whether imaging would ever be objective and accurate enough to add to the value of caBIG. For Siegel, a lifelong clinician, the foray into NCI was his first glimpse of how imaging was used and thought of in research. "In the research world, imaging was not valued nearly as much as it was clinically," he says. "I was really surprised at the fact that imaging wasn't used more as a biomarker for response." In particular, he was surprised to find that while clinicians regularly use imaging to tell if a tumor is growing or shrinking, the endpoint used in clinical trials for tumor treatments was far more likely to be whether a patient died.

The goal of the imaging workspace (populated by members from academia, government, and industry) was to create standards that would allow scientists to share images and tools as well as to extract information from various sources of data. In this way, Siegel thought it would be possible to turn imaging into a quantitative, reliable tool that would help scientists push the boundaries and get closer to the goal of personalized medicine. To that end, he and his imaging crew have engaged in a number of projects.

AIM, the workspace's Annotation and Image Markup standard, confronts the problem that every equipment manufacturer has its own proprietary way of letting images be marked up. There's no standard way, for instance, for a doctor to circle a patch of cancer on a lung image; and without one, there's no way to pool all those images into a single database and allow users to extract a comprehensive set of, say, lung images positive for cancer.
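AIM itself specifies a detailed information model, but the heart of the problem is simple enough to show in a few lines: a markup has to carry who annotated which image, what they drew, and what they concluded, in a form no single vendor owns. The record below is a hypothetical, heavily simplified illustration of that idea, not the actual AIM schema.

    import json

    # Hypothetical, simplified annotation record -- NOT the real AIM schema,
    # only an illustration of the kind of information a vendor-neutral
    # markup standard needs to capture.
    annotation = {
        "image_uid": "1.2.840.example.1234",   # identifier of the source image
        "annotator": "radiologist_017",
        "finding": "suspicious lung nodule",
        "roi": {                               # region drawn on the image
            "shape": "circle",
            "center_px": [212, 348],
            "radius_px": 19,
        },
        "created": "2008-05-14T10:32:00Z",
    }

    # Because the record is plain structured data rather than a proprietary
    # overlay, annotations from many sites can be pooled and queried, e.g.
    # for every lung image marked positive for cancer.
    print(json.dumps(annotation, indent=2))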

Another effort has resulted in XIP, the extensible imaging platform standard. The idea is to allow people to design tools that can be freely exchanged by users of many different platforms. Researchers can "develop their own software for image display and visualization," Siegel says, and then anyone running any XIP-friendly platform would find it simple to load and use the same software. A further effort to promote sharing has created middleware that allows users of any type of research platform to connect to the standard display representation used in medical imaging.

"Just overall, the phase one of what we've been doing has been to identify the major challenges and to address those challenges with the development of various informatics tools and informatics standards that can be applied directly to clinical trials and research," Siegel says. His team planned to launch phase two of the program last month; that phase aims to take these tools into clinical trials and research settings to see how they do. The big goal of this phase is to get buy-in from the community for the tools that have been developed by workspace members during the past two years.

What they've accomplished so far is not only the production of far more tools than anyone expected in such a short time, but also a welcome change in their reception by the rest of caBIG, Siegel says. "We've gotten some feedback that imaging has become one of the more important success stories at caBIG. ... The enthusiasm of the rest of the workspace has really just skyrocketed."

Live cell arrays enable ultra-complex, genome-wide studies

Roland Eils has no doubt about the tipping point that pushed imaging into the truly high-throughput realm. That advance happened when microarray technology met bioimaging, says Eils, a scientist at the German Cancer Research Center. Live cell arrays, such as those published by David Sabatini's group, represent a major innovation in throughput and complexity for the imaging field.

If you think about the standard genomic microarray, says Eils, "for each spot you only have to measure one parameter. ... For the cell arrays, in each spot we have hundreds of different ... parameters in time and space. The complexity is amazingly higher."
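One way to picture that jump in complexity: a conventional expression array reduces to a single number per spot, while a live cell array yields, for each spot, many parameters per cell tracked across every timepoint of the movie. The NumPy sketch below is purely illustrative; the array dimensions and the choice of parameters are assumptions, not the dimensions of any real screen.

    import numpy as np

    # Conventional microarray: one measurement per spot.
    n_spots = 384
    expression = np.zeros(n_spots, dtype=np.float32)           # shape: (spot,)

    # Live cell array: for every spot (one siRNA each), many parameters
    # measured per cell at every timepoint of the time-lapse movie.
    n_cells_per_spot = 50    # illustrative upper bound per spot
    n_parameters = 12        # e.g. area, intensity, nuclear shape, ...
    n_timepoints = 48        # e.g. one frame every hour for two days
    live_cell_array = np.zeros(
        (n_spots, n_cells_per_spot, n_parameters, n_timepoints),
        dtype=np.float32,
    )

    # The old "one value per spot" view is now just one of many summaries.
    mean_trace_per_spot = live_cell_array.mean(axis=(1, 2))    # shape: (spot, time)
    print(expression.shape, live_cell_array.shape, mean_trace_per_spot.shape)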

It's exactly that complexity that Eils and his group are looking to capitalize on. Like others, they use live cell arrays — microarrays in which hundreds of siRNAs are spotted on glass slides, on top of which researchers plate live cells — to study knockdown and overexpression over time. The latest advance has been doing this on a genome-wide scale, he says. One genome-wide siRNA screen can account for 10 billion images and as much as 20 terabytes of data, Eils notes.

And if all goes as planned, he's just going to make things worse. Eils says that with current technology, a genome-wide screen takes a few microscopes running around the clock for two or three months. He would like to see that throughput improved by two orders of magnitude so that a genome-wide experiment could be carried out in a single day. His group has been working to improve the microscopes and has managed to achieve one order of magnitude improvement, and he hopes that by focusing on new concepts in parallel optics, his team will be able to reach the second order of magnitude scale-up that he believes is necessary.

Eils, who has been involved in bioimaging for 15 years, really helped kick-start the high-throughput imaging field through his participation in MitoCheck, a European consortium that aimed to study the role of all genes in the genome related to cell proliferation. That project has wrapped up, and initial findings were published last year, he says. Now that all the technology has been set up, getting going on other projects should be even easier. Up next for Eils and his crew is a large-scale effort to study the interaction between host cells and the viruses that infect them. They'll look at "the integration of the virus into the host cell, then how the virus replicates and uses the host cell to replicate its own genome," he says. The goal is to find "which genes of the host cell we can knock down so we can block integration ... without killing the host cell," he says.