Last week, Beckman Coulter and Harvard Medical School-Partners Healthcare Center for Genetics and Genomics announced a cooperative purchase agreement for an automated microarray sample-preparation system for gene-expression and genotyping analysis for the Affymetrix GeneChip platform.
Beckman and HPCGG will collaborate to evaluate and refine an automation platform based on Beckman Coulter's Biomek FX technology.
The automation of microarray-analysis procedures has been an ongoing goal of the tools industry, and is seen as a necessary next step as this genomic technology and other 'omics tools move toward the clinical setting and the still-fuzzy vision of chip-based tests as a common part of medical care.
Where, in the past, microarrays were essentially handled one at a time, this system gears up to handle arrays in units of 96-well plates.
BioCommerce Week spoke with Vance Morgan, director of laboratory operations for Harvard Medical School-Partners Healthcare Center for Genetics and Genomics, to learn about the implementation of this new technology into his research core lab.
What is the center doing that requires so much throughput capacity?
We operate a series of core service labs, and we get a lot of throughput from the Partners community in Massachusetts, primarily Brigham and Women's Hospital and Mass General. To serve that community and the volume we get, we try to look at high-throughput technologies that would give high-quality data, repeatability, and accuracy, and do it in a reasonable turnaround time, at relatively low cost.
Does this new robotics platform bring an order of magnitude increase in your capacity?
We are looking at a couple of things here. One is that this particular robotics station can do full 96-well plates, but it is also configured so that it can do less than that if the demand is not for a full plate. It gives us the opportunity to batch several projects together onto a plate. We do about 1,000 [microarray] chips a year here, and we frequently get surges where we will have a large volume of chips. Manually, it takes a while to turn that around, and it takes more technician time. One of the advantages of automation is that it multiplies your technician force. Now you don't have to have someone completely tied up doing just that; they can do other tasks. So, I think this gives us an efficiency gain.
Do you see using this for all of your microarray sample prep?
I foresee that we will be using this for all the expression-profiling work that we do using the Affymetrix protocol. The other nice aspect of this platform is that it can do some other things when it's not doing microarrays — like pre-dispense master mix for pre-PCR reactions in some of the genotyping work we do. We are going to be processing some of the Affymetrix 100K SNP chips, and those definitely will be coming in 96-well plates.
I think it will multiply our efficiency and allow the staff to focus more on the front end. And, it will allow us to assure, from reaction to reaction, that we are doing exactly the same thing each time and minimizing the points of variability that can come in.
Have you talked to any other labs that are doing this?
I think we are one of the earliest labs to get this.
Would you discuss the sales process you went through to get this unit?
The discussions with Beckman have been, to a large degree, separate from Affymetrix. We know that Affymetrix has interacted with Beckman — Beckman is one of their preferred providers — and they have talked about automating these processes. Some of the development of this automated system arose out of those discussions. From our point of view, we already have one Beckman FX here that we use to support our genotyping and our high-throughput sequencing operation. Given that we see more genotyping work coming along, it made sense to expand our experience and bring in another one. Although this one is intended to support RNA preparations, being an FX platform it can also support other plate-replication capabilities.
Beckman Coulter was eager to work with us. We spent a lot of time talking about what we would want to do in terms of our ability to process microarrays and to have a platform that was flexible enough to do other things. We knew that other methods would need to be developed, and we wanted to use their expertise with automation, and blend that with our ability to test out these systems.
We first started talking specifically about this in September. October through December was when we really had the most dedicated interactions and got this agreement up and running. The machine is now in the last stages of installation; the engineer is here today installing and tuning up the last few parts. It came here the first week of January. It was a pretty fast turnaround.
How does this fit into your facility?
We didn't have to do a lot of modifications to the facility. One of the things we did do was decide to put this robot in a room more dedicated to pre-PCR and RNA-handling activities. So, it was separated to minimize the opportunities for any kind of contamination. The robot is fairly compact; it sits on an 8-foot table that can be moved and has plenty of room for the ancillary pieces of equipment.
Did it displace anything in the lab?
This was a general-purpose room, targeted for future development. We had that space, which certainly may not be the case in every lab. Empty space doesn't stay empty very long.
Will this change your pricing?
Over the next six months, it's my hope that the combination of the ability to accommodate higher throughput and more volume would help us drive some of those costs down, but we will have to test that model.
What will be the working hours for the machine?
The way the process is set up, the current steps in setup for microarrays are: once someone has total RNA ready to go, there is a cDNA synthesis process, and that takes eight hours. Then there is another transcription process to make labeled RNA, and the fragmentation of that so it is ready to go. So, total robot time consumes about eight hours, plus probably an overnight second run. Once that is done, you return to the manual process of loading that onto the chip. Unfortunately, the way the chips are manufactured, there is not an accessible automated solution for loading them. To date, all the sample prep has been done at the bench, with a technician sitting down with a pipette and manually manipulating the solutions back and forth. I think this is a very significant step forward to be able to put these onto an automated platform.
What kind of doubts do you have?
I think that one obvious question is how this fits our model in terms of the ability to sustain throughput. We figure that one tech should do 50 samples a week, so we want to try to push the envelope and see how this compares. We think that having this type of platform allows us some flexibility. We use one set of reagents right now, but if there were other kits amenable to automation, this platform would give us the flexibility to interchange them as needed. Without doing some of that, we don't know if that is possible. We use Affymetrix kits for labeling. There are some other kits to do the labeling, and we will be looking at those as they gain more presence in the marketplace.
Did you look at other automation platforms?
Tecan and Caliper have nice automation systems. But, this seemed like the system most ready for the market. That's nothing negative about the other systems. This just seemed most ready, and since we had another FX here, that allowed us to keep some standardization of equipment.
HPCGG has a new Linux-based server installed. Is there any connection between this installation and that one?
Indirectly, there is some connection. The HPCGG has an IT group, which a little more than a year ago entered into a five-year relationship with Hewlett-Packard. This is the first hardware installation that has come out of that. That resource is available to all of the Partners community. We hope that having all of that compute power will inspire folks to do more ambitious experiments.
What is next on your automation wish list?
We continue to talk to Affymetrix about their higher-throughput platforms. Whether that will be installed here is something that is still under discussion. We continue to look at technologies that offer high throughput on proteomics as well — things dealing with the handling of protein mixes and purification, gel-spot analysis, things that would power a protein/proteomics kind of group. We have also been looking at some of the nanoliter pipetting stations. We have a sequencing group that does high-volume sequencing, and we have been thinking about introducing smaller and smaller volumes to try to control the costs of what we charge people. We want to see how small a volume of sequencing mix we can get down to. For conventional robotic stations, accuracy seems to fall off as you get down to the 1- to 2-microliter range, so you need some different kinds of robotics. There are some nanoliter pipetting robotics stations, and we have been evaluating those.
You are associated with a clinical institution. Will you be evaluating how this might fit into that environment?
We do think about the clinical environment. We have a CLIA-certified lab that is in our lab space. So, one of the nice advantages of our center is that much of what we do on the research side we can coordinate with them, and actually help develop something that could be used as a clinical assay. Certainly as chip technology advances, and the FDA becomes more comfortable with having this used as a diagnostic platform, I would think that yes, the volume would go up, and the automated solutions would look more attractive in addressing those volume issues.