The Big Picture



Though genomics research can give a big picture of what's happening at the level of an organism, that work is only the beginning of answering real biological questions. Determining the location, structure, spatial relationships, and mechanisms of action of cellular components is the next step. To do that, more and more scientists are using high-throughput imaging techniques to complement data from large-scale biological experiments.

Manfred Auer, staff scientist at Lawrence Berkeley National Lab, uses both electron microscopy and optical imaging techniques to accomplish this. "The overall idea is to understand the cell in exquisite detail [so] that you can determine function from its architecture, but what you need to know is the composition as well," he says. While 'omics techniques give researchers a parts list, he adds, "what imaging can do is [take] the parts list and [put] it back into an actual image."

While bioimaging has made advances recently in the form of new approaches to advanced light microscopy, scientists are still grappling with issues such as improving resolution, making analysis more automated and high-throughput, and applying bioimaging to single cells.

The multimodal approach

In addition to electron microscopy, Auer is "particularly interested in correlative imaging," he says. "That is going to be the future of imaging: ... multi-scale, multimodal imaging — taking the same specimen [and subjecting it] to a variety of different optical techniques as well as electron microscopy." To that end, he's making use of the most recent developments in fluorescence imaging. Techniques like PALM (photoactivated localization microscopy) and STORM (stochastic optical reconstruction microscopy) offer sub-diffraction-limit resolution of less than 20 nanometers, which Auer calls "spectacular."

One researcher who's making use of these high-resolution fluorescence imaging techniques is Harvard's Jeff Lichtman, who has created transgenic mice whose neurons express hundreds of different fluorescent colors. That enables him to create a network diagram of the developing mouse brain. But while he can trace the neurons' wiring, he, like others who use light microscopy, is still limited by diffraction. Light microscopes are limited in their resolution, "but there is a whole new set of optical techniques that work with fluorescence that break through this diffraction barrier and give much higher-resolution images," Lichtman says. "It's a field of microscopy that has been invented in the last couple of years. Sometimes it's called nanoscopy because the resolution is now in the tens of nanometers, much closer to what traditionally electron microscopes would see."

He's combined what he calls his Brainbow mice with STORM to break the diffraction barrier. With STORM, "ultimately you can get a map of where the fluorescent molecules are at a resolution that far exceeds the resolution based on diffraction," he adds.
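The principle behind that sub-diffraction mapping can be sketched numerically. When only a single emitter is "on," its diffraction-limited spot can be located far more precisely than the spot's width by finding the spot's center. The sketch below uses an intensity-weighted centroid on synthetic data; real STORM software fits a point-spread-function model, and all numbers here are illustrative.

```python
import numpy as np

def gaussian_spot(shape, center, sigma=1.5, amplitude=100.0):
    """Synthetic diffraction-limited image of a single emitter."""
    y, x = np.indices(shape)
    return amplitude * np.exp(-((y - center[0])**2 + (x - center[1])**2) / (2 * sigma**2))

def localize(image):
    """Estimate the emitter position as the intensity-weighted centroid."""
    y, x = np.indices(image.shape)
    total = image.sum()
    return float((y * image).sum() / total), float((x * image).sum() / total)

true_pos = (7.3, 8.6)  # sub-pixel ground-truth position
est = localize(gaussian_spot((16, 16), true_pos))
# est recovers true_pos to a small fraction of a pixel, even though the
# spot itself is several pixels wide
```

Repeating this localization over thousands of stochastic on/off cycles, one emitter at a time, is what builds up the nanometer-scale map Lichtman describes.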

The problem with using these advanced optical techniques is that "it's a bit like [a] starry night," Auer says. "You have a night sky with lots of little dots there, but you don't know what to relate them to. So that's typically why it's desirable to overlay that with the electron microscopy — because in the electron microscope, you're getting all kinds of hallmarks in a cell." Auer's interests span the molecular mechanisms of hearing, breast cancer, microbial communities, and bioenergy. One area where imaging may have particular promise, he says, is in allowing researchers to see cancer stem cells in real time. "For most cancer imaging, we're looking at the wrong thing," he says. Using a combination of fluorescence imaging to mark the cells and electron microscopy to see what those cells are actually up to will be key to figuring out just how cancer cells, and stem cells in particular, differ from other cells.

Scott Fraser, director of the Beckman Institute's Biological Imaging Center at the California Institute of Technology, does a lot of live cell imaging, and he also applies a multimodal approach. Using a combination of laser scanning confocal microscopy and multiphoton imaging along with microscopic MRI, microPET, and supra-high resolution microscopy has greatly widened his scope of vision. "We like to do things so that they're multimodal, realizing that no one technique's going to be perfect," he says.

Fraser has been pushing for better labeling — one area of bioimaging that he says needs improvement. His lab is working to create brighter and more robust fluorophores for experiments using six or more such labels at a time. The team uses multispectral instruments such as the Zeiss LSM 510 META and its successor, the LSM 710. "The idea is that instead of just taking an image, you take the spectrum of each pixel in the image," Fraser says. "It's then a fairly straightforward mathematical operation to find out what mixtures of different labels are in each pixel within the image."
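The "fairly straightforward mathematical operation" Fraser mentions is linear spectral unmixing: each pixel's measured spectrum is modeled as a non-negative combination of the reference spectra of the labels present. A minimal sketch, using synthetic reference spectra rather than real instrument data:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical reference spectra: each column is one fluorophore's emission
# spectrum sampled at the detector's spectral channels (8 channels, 3 labels
# here; a multispectral instrument records such channels per pixel).
channels, n_labels = 8, 3
rng = np.random.default_rng(0)
reference = np.abs(rng.normal(size=(channels, n_labels)))

# True (unknown) label abundances for one pixel, and the mixed spectrum
# the detector would measure.
true_abundance = np.array([0.7, 0.0, 0.3])
pixel_spectrum = reference @ true_abundance

# Unmixing: solve reference @ x ≈ pixel_spectrum subject to x >= 0.
abundance, residual = nnls(reference, pixel_spectrum)
```

Run independently at every pixel, this recovers a per-label abundance image from a single multispectral acquisition, which is what makes six-plus simultaneous fluorophores tractable.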

High content, high throughput

That kind of innovation in fluorescence microscopy could prove especially useful to high-content screening, an approach that has relied on bioimaging for the past decade. High-content screening involves introducing a sample or compound to a large number of cells and then watching labeled molecules within the cells to see how they react to those substances.

Applications for this kind of functional analysis span both basic and clinical research. Anne Carpenter directs the Broad Institute's imaging platform, which supports screening facilities for both RNAi and chemical compounds. "We have seen high-content screening expanding quite rapidly in both the academic and the pharmaceutical worlds, and the types of samples that are being screened include not just chemical compounds, but also RNAi reagents, cDNA overexpression libraries, mutant libraries of some sort or another," she says. "It's a pretty versatile technology."

Carpenter's lab has written and made publicly available a data analysis package called CellProfiler. The tool takes in raw fluorescence microscopy data, identifies all the cellular compartments, and measures any number of features — including cell count, cell size, cell cycle distribution, organelle number and size, cell shape, texture, and levels and localization of proteins and phospho-proteins. "It's just an easy way for biologists to get their hands on these advanced algorithms and be able to apply them in a user-friendly package," Carpenter says.
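The core of such a pipeline — segment objects, then measure per-object features — can be sketched in a few lines. This is an illustration on synthetic data with basic SciPy tools, not CellProfiler's actual code, which uses far more sophisticated segmentation:

```python
import numpy as np
from scipy import ndimage as ndi

# Synthetic fluorescence image with two bright "cells" on a dark background.
img = np.zeros((64, 64))
img[10:20, 10:20] = 1.0   # cell 1: 10 x 10 pixels
img[40:55, 30:50] = 1.0   # cell 2: 15 x 20 pixels

mask = img > 0.5                        # threshold to foreground
labels, n_cells = ndi.label(mask)       # identify connected objects
idx = range(1, n_cells + 1)
sizes = ndi.sum(mask.astype(float), labels, index=idx)   # area per cell
centroids = ndi.center_of_mass(mask.astype(float), labels, idx)  # position per cell
```

From the labeled objects, the same pattern extends to any per-cell measurement — intensity in a second channel for protein levels, moments for shape, local statistics for texture — yielding the feature table downstream analysis works from.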

For screens that can produce many different and subtle phenotypes, Carpenter is developing machine-learning algorithms. Her software will measure hundreds of features in a blind way and then determine which are most relevant for identifying a phenotype of interest. "[It] lets your average biologist readily perform machine learning on an image-based screen," she says.
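The idea of measuring features "in a blind way" and letting the data reveal which ones matter can be sketched with a simple separation score; Carpenter's actual software uses proper machine-learning classifiers, and the features and labels below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_features = 400, 50           # real screens measure hundreds of features
X = rng.normal(size=(n_cells, n_features))   # blind per-cell measurements
y = rng.random(n_cells) < 0.5           # True = hand-labeled phenotype of interest
X[y, 7] += 3.0                          # the phenotype shows up only in feature 7

# Score each feature by how well it separates labeled phenotype cells from
# the rest: difference of class means in units of pooled standard deviation.
mean_diff = X[y].mean(axis=0) - X[~y].mean(axis=0)
pooled_sd = np.sqrt((X[y].var(axis=0) + X[~y].var(axis=0)) / 2)
scores = np.abs(mean_diff) / pooled_sd

most_relevant = int(np.argmax(scores))  # singles out the discriminative feature
```

A biologist hand-labels a few example cells, the software ranks the blindly measured features, and the top-ranked ones drive a classifier that scores every cell in the screen.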

With these advances come inherent challenges. One is statistical analysis. "It's still a bit unclear what the best way is to choose hits from a screen," Carpenter says. Another is the amount of information that images yield, and the fact that biologists are only just scratching the surface of mining data from them. One of the ways that the field is moving forward is in using images as a source of data for more systems biology approaches, she adds. "I'm really excited about the potential for images being used as another source of data for doing systems biology, and it's especially interesting when you consider combining image-based data sets with other large-scale data sets, including gene expression data and protein-protein interaction data."

Outside of academia, drug discovery continues to make use of high-content screening, with an eye on moving from what in many cases is still medium-throughput to high-throughput screening. Marjo Simonen works for Novartis, where she does primary screening, secondary screening, and some compound profiling. Her screens look for many different cellular phenotypes, including translocation of proteins from the nucleus to the cytoplasm or vice versa, changes in morphology, phosphorylation of proteins, and protein trafficking, to name a few. She uses GE Healthcare's IN Cell Analyzer 3000, but notes that her work is considered medium-throughput at about 250,000 compound screens per day. Her goal, having just ordered a 1,536-well plate, is to be doing full screens of more than 1 million compounds per day by next year. "The direction where high-content screening, in general, is going is towards more complicated assays, things that earlier were possible only in very, very small, low-throughput scale," Simonen says.

Genentech's Rami Hannoush, a senior scientist in the department of protein engineering, is also employing high-content imaging "mainly for following the mechanism of action of small molecules and characterizing what they are doing in cells," he says. He, too, works in medium throughput, and tracks four colors at a time. While the tools allow him to study proteins in all their glory — including activation levels, translocation activities, interaction with other proteins, and release from certain intracellular compartments — the biggest bottleneck is data storage and analysis. "Even though it's high content, the technique at this point in time is still not high throughput because the bottleneck is getting bioinformatics support and network infrastructure to be able to support the infinite amount of data acquired," Hannoush says. Screens typically have to process hundreds of thousands of images, and "coming up with quantitative parameters so that you can quickly narrow down on your hits without spending too much time looking at images" is a critical problem.

Key to enhancing high-content screening platforms is making the image analysis software "more approachable [to] people who perhaps aren't that familiar with image analysis concepts," says Mark Collins, senior marketing manager for Cellomics, now a part of Thermo Fisher Scientific. While the company's screening platform was originally sold to pharma and biotechs, Collins says that 40 percent of current customers are basic labs.

In the next two to three years, he sees the field moving into more complex image analysis of 3D structures, as well as making sense of image data in the context of other data such as RNAi screens, sequencing, and chemical structure information. The ability "to do more genome-wide analysis of the data post-image analysis" is key to bringing these tools out of their infancy, Collins says. "The opportunity to do functional genomics at the cell level using high content and RNAi is really what's driving a lot of interest in high content in the target identification and validation," he adds.


Bioimaging's Proving Grounds: Efforts to Build Brain Atlases

One area where bioimaging has seen a great deal of use is in creating high-content brain atlases. In 2006, the Allen Institute for Brain Science launched its first atlas of the mouse brain. This year, institute scientists began work on three new atlases, each of which will combine microarray with in situ hybridization data to create a visual map of gene expression patterns.

The mouse spinal cord atlas is to be completed by the beginning of next year, and will survey all 20,000 genes of the adult and juvenile mouse spinal cord. Funding came in the form of a consortium, which developed in response to researchers' needs. Members approached the Allen Institute, according to COO Elaine Jones, and said, "'There's no normal map, we don't even know where the genes are turned on in parts of the spinal cord,' so they asked us if we would do it."

The other two projects are a map of the developing mouse brain, for which scientists will collect expression data on 3,000 genes for four prenatal and three postnatal stages; and a map of the human brain, which will look at 1,000 different anatomical structures, and then narrow that down to 50 to 500 genes of interest. "Even though that's a low number versus 20,000, if you looked at all the drugs that are manufactured, it includes less than 100 different gene targets," Jones says.

At Janelia Farm, Hanchuan Peng is working on a "comprehensive, three-dimensional, very high-resolution brain atlas," hoping to piece together image data to construct the first 3D digital map of an entire insect brain at the single-neuron level. "The ultimate goal is to try to understand how the brain works," he says. Peng is collaborating with Janelia Farm and Stanford University scientists on a 3D digital nuclei atlas for C. elegans and is working on construction of a high-res digital atlas of the fruitfly brain. "Once we have this structural map, we will be able to further study the function of neuronal circuits and animal behavior in a more efficient way," he says.

His lab also focuses on building new image acquisition and analysis tools. While a lot of conventional tools exist for medical imaging analysis, bioimaging is a different story. "The image has a much bigger volume," Peng says, and medical imaging tools are either too slow or "have a lot of requirements about the fine tuning of the parameters." One tool he's developed is WANO, a 3D annotation tool at the single-cell level. Peng says visualization tools also need a bit of improvement. His lab is developing a new tool called V3D, short for visualization 3D. The next step is to move toward single-cell visualization technology, he says. "The problem is, even if you can see a lot of things happening, you're not going to be able to [identify] the single cell at the whole animal level," he adds.
