
Q&A: Portuguese Researchers Develop New Method for Acquiring, Storing Array Images


Name: António Neves

Title: Assistant professor, electrical engineering, University of Aveiro, Portugal; researcher, Signal Processing Laboratory, Institute of Electronics and Telematics Engineering of Aveiro

Education: 2007 — PhD, electrical engineering, University of Aveiro, Portugal; 2002 — BSc, electrical engineering and telecommunications, University of Aveiro


As the density and complexity of microarray images continue to grow, researchers are finding it a challenge to acquire and store them.

Facing their own array image-acquisition and -storage issues, biologists at the University of Aveiro in Portugal recently contacted engineers at the Signal Processing Laboratory of the Institute of Electronics and Telematics Engineering, also based in Aveiro, to develop new methods that can help them cope with these challenges.

Writing in this month’s issue of IEEE Transactions on Medical Imaging, the University of Aveiro team discusses a lossless bitplane-based method that can more efficiently compress microarray images.

[Neves A, et al. Lossless compression of microarray images using image-dependent finite-context models. IEEE Trans Med Imaging. 2009 Feb;28(2):194-201.]

The method, based on arithmetic coding driven by image-dependent multibitplane finite-context models, produces an embedded bitstream that allows progressive, lossy-to-lossless decoding.
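
The sketch below (Python with NumPy; not the authors' code) illustrates why such a bitplane-organized bitstream supports progressive, lossy-to-lossless decoding: a decoder that has received only the most significant bitplanes can already reconstruct a bounded-error approximation of the image, and once all 16 planes arrive the reconstruction is exact. The synthetic image is an assumption used only for demonstration.

import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 2 ** 16, size=(64, 64), dtype=np.uint16)  # stand-in for a scanned array image

# Split the 16-bit image into bitplanes, most significant plane first.
planes = [((image >> b) & 1).astype(np.uint16) for b in range(15, -1, -1)]

def reconstruct(received):
    """Rebuild an approximation from however many most-significant planes have arrived."""
    approx = np.zeros_like(image)
    for i, plane in enumerate(received):
        approx |= plane << (15 - i)
    return approx

lossy = reconstruct(planes[:8])   # only the 8 most significant planes received
exact = reconstruct(planes)       # all 16 planes received
print(int(np.abs(lossy.astype(np.int32) - image.astype(np.int32)).max()))  # bounded error, below 256
print(bool(np.array_equal(exact, image)))                                  # True: lossless recovery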

In the paper, the researchers compare the compression efficiency of their method with three image-compression standards (JPEG2000, JPEG-LS, and JBIG) and with the two newest specialized methods for microarray image coding, MicroZip and prediction by partial approximate matching.

According to the authors, their method yields better compression results than all of these approaches, confirming the effectiveness of bitplane-based methods and finite-context modeling for the lossless compression of microarray images.

To get a better understanding of the challenges of acquiring, storing, and retrieving array images and the methods developed to overcome them, BioArray News spoke with lead author António Neves, a researcher at the Signal Processing Laboratory and an assistant professor at the University of Aveiro, this week.

“Due to the good results obtained, we would like to go further in the development of this method and try to commercialize these algorithms so that they are available to all biology laboratories,” Neves said.

Below is an edited transcript of that discussion.


Tell me more about your background. Your CV says array image acquisition is a new area for you.

I started my PhD working on images with specific characteristics, and I began working with microarray images as part of that work. We were contacted by our biology laboratory, and they told us that storing the images they acquired from the scanners was a problem. Actually, they would sometimes just analyze an array and then delete the image. This is a problem because new analysis algorithms are appearing on a daily basis. So, that was the motivation for this work.

What was your prior experience in developing methods for array imaging and storage?

We have experience in image processing, particularly in image-compression algorithms. Most of our work has concerned the use of arithmetic coding, finite-context models, and binary-tree decomposition. We have developed several algorithms and approaches. It is difficult to enumerate them all.

In this case, we have tried to develop efficient lossless compression methods that are independent of the spot distribution in the image. We try to exploit the specific characteristics of the images, namely the noise in the least significant bitplanes and the fact that for each experiment we have two images with almost the same spatial information, together with our knowledge of finite-context models and arithmetic coding.
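
As a rough illustration of the remark about noise in the least significant bitplanes, the sketch below splits a synthetic 16-bit image into bitplanes and checks how well each plane compresses, with zlib standing in for the arithmetic coder the authors actually use. The structured high-order planes shrink considerably, while the noisy low-order planes are close to incompressible; the synthetic image itself is an assumption made only for this demonstration.

import zlib
import numpy as np

rng = np.random.default_rng(1)
# Synthetic stand-in: smooth spot-like structure in the high bits, noise in the low bits.
yy, xx = np.mgrid[0:128, 0:128]
spots = (20000 * np.exp(-((xx % 32 - 16) ** 2 + (yy % 32 - 16) ** 2) / 40.0)).astype(np.uint16)
image = spots + rng.integers(0, 256, size=spots.shape, dtype=np.uint16)  # additive background noise

for b in range(15, -1, -1):
    plane = ((image >> b) & 1).astype(np.uint8)
    packed = np.packbits(plane)                  # 1 bit per pixel, packed into bytes
    ratio = len(zlib.compress(packed.tobytes())) / len(packed.tobytes())
    print(f"bitplane {b:2d}: compressed to {ratio:.2f} of its packed size")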


What are some of the current challenges for imaging and storing microarray data?

In my opinion, the huge number of microarray experiments being produced, the fact that each microarray image has 16 bits per pixel, and the high resolution of these images lead to problems in storing and transmitting them. Because they are acquired at 16 bits per pixel, the images also contain a lot of noise, especially in the background.
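
To put the storage cost in perspective, here is a back-of-the-envelope calculation; the scan resolution and the assumption of two channel images per experiment are illustrative figures, not numbers from the interview.

# Hypothetical scan geometry used only for illustration.
width, height = 5_000, 2_000
bits_per_pixel = 16
images_per_experiment = 2          # e.g. the two channel images of one hybridization

bytes_per_experiment = width * height * bits_per_pixel // 8 * images_per_experiment
print(f"{bytes_per_experiment / 1e6:.0f} MB per experiment before compression")  # about 40 MB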

Moreover, most of the methods described in the literature are based on image segmentation for spot and background extraction, and they do not take into account the noise in the images. Most of them separate the image background from the spots. That is useful for processing the image, but for compression we found it to be a problem, because nowadays there are different methods of depositing spots on microarrays, so we decided to look at this problem from another perspective. For these reasons, it is very important to develop new and efficient compression methods.

Our choice of lossless methods is mainly due to the fact that the existing analytic methods for microarray analysis are still evolving, making it imprudent, at least for now, to discard the raw data and keep only the parameters obtained through analysis. Methods that provide progressive decoding also seem promising, because they allow the efficient transmission of these kinds of images to and from remote databases.

To what degree have new, higher-density arrays created challenges for existing imaging and storage techniques?

Methods based on segmentation are very dependent on the position of the microarray spots, so new arrangements of the spots on the arrays create further challenges. If the resolution of the images also increases, the image-processing algorithms will have to become more efficient for the images to remain usable.

How did you settle on the approach you described in the paper?

We started our work by studying the methods proposed in the literature. Moreover, we carried out a detailed study of standard image-compression algorithms applied to these images; those results showed that bitplane-based methods provide the best compression.

Using our knowledge in the field of image compression, we then developed a bitplane-based method using arithmetic coding and finite-context models. The algorithm processes the image on a bitplane basis, and the finite-context model uses information both from the bitplane currently being encoded and from the bitplanes already encoded.

We developed a three-dimensional model, using the information from the bitplanes that have already been encoded in the 3D context model; that is the innovative part of this work.
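
The sketch below is not the authors' implementation, but it conveys the idea behind an image-dependent multibitplane finite-context model: the probability of each bit is estimated from a context built from causal neighbors in the current bitplane plus the co-located bit of the bitplane encoded just before it, and those probabilities would normally drive an arithmetic coder. Here the sketch simply accumulates the ideal code length the probabilities imply, which is a lower bound on what such a coder would emit.

import math
import numpy as np

def context_bits_per_pixel(image, num_planes=16):
    """Ideal code length under a small multi-bitplane finite-context model."""
    total_bits = 0.0
    h, w = image.shape
    for b in range(num_planes - 1, -1, -1):        # most significant plane first
        plane = (image >> b) & 1
        counts = {}                                # context -> (zeros seen, ones seen)
        for y in range(h):
            for x in range(w):
                left = int(plane[y, x - 1]) if x > 0 else 0   # causal neighbours in this plane
                up = int(plane[y - 1, x]) if y > 0 else 0
                # co-located bit from the bitplane encoded just before this one
                above = int((image[y, x] >> (b + 1)) & 1) if b < num_planes - 1 else 0
                ctx = (left, up, above)
                c0, c1 = counts.get(ctx, (1, 1))              # Laplace-smoothed counts
                bit = int(plane[y, x])
                p = (c1 if bit else c0) / (c0 + c1)
                total_bits -= math.log2(p)                    # ideal arithmetic-code cost of this bit
                counts[ctx] = (c0 + (bit == 0), c1 + (bit == 1))
    return total_bits / image.size

# A smooth synthetic test image: structured bitplanes, so the model predicts well.
img = (np.arange(32)[:, None] * 2048 + np.arange(32)[None, :] * 64).astype(np.uint16)
print(f"{context_bits_per_pixel(img):.2f} bits per pixel (well below the 16 bpp raw cost)")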

How did you measure the performance of your method compared to existing methods?

We compared the compression efficiency of the proposed method with the most recent standard lossless compression algorithms and with the specialized methods in the literature. To do this, we used 32 microarray images from a publicly available database. For some images, we saved up to 50 percent of the storage space compared to keeping the images uncompressed.

More specifically, we measured the compression performance. The image without compression needs 16 bits per pixel; with our compression, we measured how many bits are needed to encode each pixel of the image. We compared that to the standard and specialized methods that have already been published.
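
In practice, that figure of merit is straightforward to compute; in the sketch below the image dimensions and compressed size are placeholders to be replaced with the real geometry and the output size of whichever lossless codec is being evaluated.

# Placeholder numbers; substitute the real image size and compressed-file size.
width, height = 1_000, 1_000
compressed_bytes = 1_100_000

bits_per_pixel = compressed_bytes * 8 / (width * height)
saving = 1 - bits_per_pixel / 16        # relative to the 16 bpp uncompressed image
print(f"{bits_per_pixel:.2f} bpp, {saving:.0%} smaller than the raw image")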

Our work started with an analysis of standard image-compression methods on this type of image, and we noticed that the JBIG method, which was first published in 1991, gives the best compression results for most of the images. That was the motivation for analyzing the images on a bitplane-by-bitplane basis.

Based on these results, we think our method is the best available in academic research, at least among the methods we have encountered in journal papers.

How could this method be developed further?

We are trying to improve this method, and we are also adopting another approach based on binary-tree decomposition for compressing these types of images. We are also starting to work on DNA-sequence compression; we have two or three published papers regarding that. And we are trying to work more closely with the bioinformaticians at our university.

Is there any route for commercializing your invention?

At the moment this has been only academic work. However, due to the good results obtained, we would like to go further in the development of this method and try to commercialize these algorithms so that they are available to all biology laboratories. So, we do intend to make this more widely available. Right now, we have the software and source code available and we are sharing it within the university.
