As HCS Industry Seeks Standardization, NIST To Study Matter; Need for Image Standards Tops List


It's happened in the microarray, proteomics, and bioinformatics industries. Now, a growing number of researchers in the high-content screening and analysis market are calling for standardization, and the National Institute of Standards and Technology is considering forming an industry-wide consortium to help guide the development of such standards.

Many feel that standards could help further the industry as a whole. HCS standardization could also be a boon to several startup biotechs playing in the space, in particular those specializing in image analysis and informatics.

However, the complex nature of high-content screening and the competitiveness of vendors vying for the biggest piece of the pie in the relatively young industry may make industry-wide standardization challenging — just as it has been in the microarray, proteomics, and bioinformatics industries.

At an HCS end-user forum at Cambridge Healthtech Institute's High-Content Analysis meeting held last week in San Francisco, John Elliott, a biotechnology research scientist at NIST, led a discussion group entitled "Assays, Reagents, and Cells," but much of the discussion veered toward HCS standardization issues.

Later, Elliott was part of another panel discussion that focused on developing industry standards for cellular assays, including reagents, cells, plates, and informatics.



Although this wasn't the first time the subject of standardization had been broached at an HCS conference, it was the first time a NIST representative had appeared at a major meeting, and, as such, the topic was top of mind for many attendees.

Elliott is part of a NIST group that deals specifically with cell and tissue measurements, but that also includes scientists interested in developing standards for microarrays, bioinformatics, and proteomics. Within that group, Elliott leads a smaller team focused on cellular imaging.

"Recently — and this CHI meeting was a good example of it — the technology platforms themselves have now come to a point where they're making great quantitative measurements," Elliott told CBA News this week. "There is now a need for implementing some standards to really add value to those measurements for comparability between platforms, or for at least having the ability to move data from one platform to another; and for having interoperability between laboratories, so measurements made in one laboratory can be standardized to measurements made in another laboratory."

According to Elliott, one of the biggest issues that came out of the meeting was adopting some type of image standard — either a specific image file format, image-analysis routine, or image file storage and recall protocol.

"That is one of the major issues because there are several third-party developers that are developing analysis software routines, and it would be nice to have some kind of standard way to be able to save image data from one microscopy platform so another vendor could use that," he said.

Such a standard might allow researchers to more easily capture images using one HCS instrument platform, analyze those images using software from another HCS platform vendor, independent software provider, or open-source software project, and mine the resulting data using yet another informatics package.
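As a rough illustration of what such interoperability implies, the sketch below shows the kind of vendor-neutral record an image standard might define, where acquisition metadata travels with a pointer to the pixel data so any third-party tool can interpret it. All field names and values here are hypothetical, not drawn from any actual specification.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical vendor-neutral image record: the metadata needed to
# interpret the pixels is stored in an open format (JSON here), so an
# analysis package from a different vendor can read it without
# proprietary code. Field names are illustrative assumptions.
@dataclass
class ImageRecord:
    instrument: str        # acquiring platform, e.g. "VendorA-HCS-1"
    plate_id: str          # microtiter plate barcode
    well: str              # well address, e.g. "B07"
    channel: str           # fluorescent label imaged in this channel
    excitation_nm: float   # excitation wavelength, nanometers
    emission_nm: float     # emission wavelength, nanometers
    pixel_size_um: float   # physical size of one pixel, micrometers
    pixels_path: str       # pointer to the raw pixel file on disk

def save_record(rec: ImageRecord, path: str) -> None:
    """Serialize the metadata to JSON alongside the pixel file."""
    with open(path, "w") as fh:
        json.dump(asdict(rec), fh, indent=2)

def load_record(path: str) -> ImageRecord:
    """Any analysis tool can reconstruct the record from the JSON."""
    with open(path) as fh:
        return ImageRecord(**json.load(fh))

rec = ImageRecord("VendorA-HCS-1", "PLATE-0042", "B07",
                  "DAPI", 358.0, 461.0, 0.65, "B07_ch1.tif")
save_record(rec, "B07_ch1.json")
assert load_record("B07_ch1.json") == rec
```

The point of the sketch is only that the metadata lives outside any one vendor's binary format; a real standard would also have to specify the pixel encoding itself.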

Some HCS providers have already started down this path. Cellomics, for instance, has developed a product it calls HCS Gateway, which makes its informatics software compatible with other HCS instruments. So far, other industry players such as GE Healthcare and Evotec have partnered with Cellomics to develop this compatibility with their instruments.

The desire for standardization could also benefit many start-up biotechs that have cropped up over the last few years with independent image-analysis and/or informatics offerings, because it may increase their pool of potential customers. These include companies such as Definiens, Vala Sciences, BioImagene, and Genedata.

Another standard the industry is calling for covers fluorescence intensity measurements, or "some kind of material to be able to calibrate microscopes, too. If you have two different microscopes, how do you put the measurements on the same footing?" Elliott said.
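One simple way such a calibration material could work, assuming each instrument responds roughly linearly, is sketched below: both microscopes image the same certified reference, and each derives a scale factor that maps its raw counts onto a common reference scale. The numbers and the linear-response assumption are illustrative only.

```python
# Minimal sketch: putting fluorescence readings from two microscopes
# on the same footing via a shared, certified calibration material.
# Assumes a linear instrument response; values are made up.

def scale_factor(measured_reference: float, certified_value: float) -> float:
    """Per-instrument factor mapping raw counts to the reference scale."""
    return certified_value / measured_reference

def to_reference_scale(raw_counts: float, factor: float) -> float:
    """Express a raw measurement in shared reference units."""
    return raw_counts * factor

# Both scopes image the same bead, certified at 100 reference units,
# but report different raw counts for it.
factor_a = scale_factor(measured_reference=2500.0, certified_value=100.0)
factor_b = scale_factor(measured_reference=4000.0, certified_value=100.0)

# A sample reading 5000 counts on scope A and 8000 on scope B turns
# out to be the same brightness once both are rescaled.
print(to_reference_scale(5000.0, factor_a))  # 200.0
print(to_reference_scale(8000.0, factor_b))  # 200.0
```

A real standard would need to address nonlinearity, spectral differences, and photobleaching of the reference material, but the basic idea is this rescaling.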

Jeff Price, co-founder, chairman, and CEO of Vala Sciences, said that he is also hearing calls for standardization on the imaging and data-storage side.

"What image format should you use, and how should you store the measurement data?" Price asked hypothetically. "Some of that data begins with how the experiment was set up: What were the fluorescent dyes, what were the wavelengths? On the other side, what kind of data do you have coming out of the screen? Is it cell-by-cell data, or is it well-by-well data?"
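The cell-by-cell versus well-by-well distinction Price raises can be sketched as follows: image analysis produces one record per cell, which a screening database typically rolls up into one summary row per well. The records and field names below are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Illustrative cell-by-cell output from image analysis:
# (well, nuclear_intensity) pairs, one per segmented cell.
cell_records = [
    ("A01", 120.0), ("A01", 130.0), ("A01", 110.0),
    ("A02", 300.0), ("A02", 320.0),
]

# Group the per-cell measurements by well.
by_well = defaultdict(list)
for well, intensity in cell_records:
    by_well[well].append(intensity)

# Well-by-well view: one summary row per well, with the cell count
# and mean signal. The per-cell detail is lost in this rollup, which
# is why keeping the cell-level data matters for later re-analysis.
well_summary = {
    well: {"cells": len(vals), "mean_intensity": mean(vals)}
    for well, vals in by_well.items()
}

print(well_summary["A01"])  # {'cells': 3, 'mean_intensity': 120.0}
```

The rollup is cheap to recompute, but the reverse is impossible, which is one argument for archiving the cell-level (or image-level) data rather than only well summaries.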

Price, who is also an associate professor at the Burnham Institute and an adjunct professor at the University of California, San Diego, is involved in a group within the International Society for Analytical Cytometry that is attempting to adopt its own standards and incorporate some aspects of HCS. Additionally, he is a member of a public-private consortium working on ways to incorporate various image-analysis algorithms under one umbrella.

"It would be nice to have a standard way for people to evaluate different algorithms and compare them, and there's no way to do that right now," he said. "You kind of buy a system, and you get the algorithms that are with it. There is some comparison going on at these meetings, but it's very hard for the average person to do it — you basically have to have a couple of instruments, or find someone that will collaborate with you."

Because NIST is not a regulatory agency, and thus has neither the means nor the desire to enforce standards, the challenge is, as always, getting everybody on board.

On one hand, the challenge of adopting industry-wide standards may be lessened by the fact that only a handful of full HCS platform vendors exist.

But on the flip side, according to Price, the complexity of HCS may make standardization a relatively difficult process.

"It's great that people are working on [the issue of standardization], but I think that it needs to continue to evolve, because there are tremendous demands in HCS because of the complexity of the images that don't exist with typical high-throughput screens," Price said.

For example, Price said, one has to take into consideration all of the steps that are involved in a typical high-content screen, such as designing the assay; building new algorithms; optimizing the algorithm to the biology; running the assay; collecting hits; and comparing those hits to other types of past experiments, such as high-throughput biochemical screens.

In addition, the area of image analysis is particularly complex to standardize, as researchers may want to run a "virtual experiment years later on the same image data, because [they] now have, for example, a new morphology algorithm that didn't exist before, and want to go back and look at genetic mutations, or abnormal nuclei," Price said. "There are lots of algorithms to still be developed in HCS, and the idea of being able to do that virtual screening later, if you save the data in a nice database, is very powerful."

Overall, Price said, he thinks adopting standards would further the industry "tremendously." However, he has his doubts that everyone will want to jump on board immediately. As an example, he cited the Open Microscopy Environment project, which has brought together some industry players in the field of standard research microscopy, and is just beginning to incorporate high-content screening and analysis into the fold.

"I think you might look at how the HCS people are responding to OME," he said. "Some people … are incorporating the OME standard, or are a member of the committee, but others are not."

Elliott said that NIST hasn't yet felt significant push-back from anybody in industry, and downplayed the idea that there could be an element of peer pressure to any companies not interested in conforming.

"This is conservative, and there is no pushing," he said. "It's industry-driven, and before real standards can be developed, the community has to be calling for them. Then NIST can play the role of bringing them together.

"The only way to do it is to get people together and slowly put together the standard, and try your best not to hinder anyone's development," he added. "Certainly it would not be the intent to keep people out because they don't want to follow the standard."

Currently, there is no timeline for the development of a consortium to discuss standardization issues, Elliott said, adding that NIST would "hopefully" have formed a committee in the "next few months." Much of the onus is now on industry players.

"Really, when industry is calling out for it, it puts us in a great position, because we just get them together and they dictate what the issues are," he said.

— Ben Butkus ([email protected])
