NCI's Haleem Issaq on Optimizing Separations Technologies for Proteomics Research

Haleem Issaq
Head of Separations Technologies Laboratory

At A Glance

Name: Haleem Issaq

Position: Head of Separations Technologies Laboratory, SAIC-Frederick, a division of the Science Applications International Corporation — the operations and technical support contractor for the National Cancer Institute, since 2002.

Background: Senior scientist, head of Chromatography/Spectroscopy Section, SAIC-Frederick, 1982-2002.

Scientist, NCI-Frederick Cancer Research Facility, 1973-1982.

PhD in analytical chemistry, Georgetown University, 1972.

Last week, ProteoMonitor reported on competing separations technologies that were presented by various vendors at the Pittcon convention in Orlando, Fla. (see ProteoMonitor 3/16/2006). This week, ProteoMonitor spoke with NCI's Haleem Issaq, who has been working in separations technologies for over 30 years, to find out more about how to optimize separations for proteomics research.

How did you get into separations technology, and applying that to study proteins?

Well, I started in 1972 when I was hired at the National Cancer Institute in Frederick, Md., doing thin layer chromatography. I built a nice lab with all kinds of equipment for thin layer chromatography. We were even one of the first [labs] to do online thin layer chromatography-mass spectrometry.

So we got into the thin layer chromatography, and we tried two things: one, to use computers to select the mobile phase for separations. Then, we also developed what we called multi-modal separations. That is, on the same thin layer chromatography plate, you can do two modes of separation — normal phase and reverse phase.

Then high performance liquid chromatography came by, and we did HPLC. And then I did multi-dimensional gas chromatography, where you use two different chemistry columns to separate complex mixtures. Then capillary electrophoresis came by about 15 years ago, and we got into CE. We were one of the first ones to do it after Jim Jorgenson developed the instrument. We bought a Beckman system, and we built our own, and we did CE.

Then I set up the Frederick conference on CE, and we did DNA fragment separations, and SNPs and things of that nature.

Then the NCI decided to create in our group a mass spectrometry center for analysis of proteins and peptides. And as head of the separations group, we were involved in that. We were to do the separations and fractionation of proteins or peptides, and then to give them to the mass spec for analysis. That was about three or four years ago.

Before that, what we had done was use multidimensional HPLC-CE for the separation of proteins and peptides. That was around 2000.

We had good experience, and that's why the director felt that they didn't need to hire anybody in separations. They used us for the mass spectrometry center, which is part of our group that's called the Laboratory of Proteomics and Analytical Technologies. So we are three groups — there's the mass spec center, the separations technologies, which I head, and an NMR group.

We got into developing HPLC, then we had to do nano-HPLC to be compatible with mass spec. We started packing our own columns, especially with C-18 particles.

Capillary columns are very expensive to buy. We buy a gram of [packing] material for, say, $400, and that gives us a lot of columns. For the price of one column, we can get 25, 30, or maybe 50 columns.

At last week's Pittcon conference, some HPLC vendors said that the most important thing for proteomics researchers is not speed, but rather peak capacity. Can you speak about what are some of the most important things in nano-HPLC for getting the best peak capacity and separations?

Well, with nano-LC, the lower the flow rate, the higher the sensitivity. That's one thing. The other thing people are talking about is that there are two different particles — either silica particles with no pores in them, or porous silica. Porous silica has a higher capacity than non-porous silica. And if you want to separate a complex mixture, you really need to go to the porous silica to do it.

However, porous silica is not as amenable for the separation of proteins, because the proteins are large, and they stick in the pores, and there will be a lot of diffusion. So there are advantages and disadvantages — it depends on if you're separating proteins or peptides.

If you're separating peptides, what you need is smaller particles, but smaller particles means the column has to be shorter because of the back-pressure. You don't want to have high back-pressure.
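The trade-off he describes can be made concrete: for a packed column, back-pressure scales roughly with column length and inversely with the square of the particle diameter (a Darcy-type relation). A minimal sketch, where the 15 cm x 5 µm reference column and its 3,000 psi back-pressure are illustrative assumptions, not figures from the interview:

```python
# Back-pressure of a packed column scales roughly as L / dp**2
# (Darcy-type relation), holding flow rate and mobile phase fixed.
# The reference column (15 cm, 5 um, 3,000 psi) is a hypothetical
# baseline chosen only to illustrate the scaling.
def relative_pressure(length_cm, dp_um, ref_length_cm=15.0,
                      ref_dp_um=5.0, ref_psi=3000.0):
    """Scale the reference back-pressure to a new column length
    and particle diameter."""
    return ref_psi * (length_cm / ref_length_cm) * (ref_dp_um / dp_um) ** 2

print(relative_pressure(15, 5.0))   # baseline: 3000.0 psi
print(relative_pressure(15, 2.5))   # halving dp quadruples the pressure
print(relative_pressure(7.5, 2.5))  # ...unless the column is also shortened
```

Halving the particle size at fixed length quadruples the back-pressure, which is why smaller particles force either a shorter column or a pump rated for much higher pressure.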

So it's a trade-off. If you do the serum proteome, you have about 20,000 proteins, and when you digest them, you could end up with between half a million and a million peptides. And those cannot be resolved by any system in one shot. So what people do is fractionate first by strong cation exchange. And what we do is we collect fractions, and we do HPLC using a reverse phase column with electrospray ionization into the mass spectrometer.

In order to get a very efficient system, you really have to do fractionation, if you are going the bottom-up approach. So that is what we do.
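The arithmetic behind that claim is easy to sketch. The 20,000-protein and half-million-to-million-peptide figures are from the interview; the 30-peptides-per-protein average, the 24-fraction first dimension, and the single-run peak capacity of 400 are hypothetical round numbers for illustration:

```python
# Why a one-shot separation cannot resolve a digested serum proteome.
proteins = 20_000                    # figure from the interview
peptides_per_protein = 30            # hypothetical average for a tryptic digest
total_peptides = proteins * peptides_per_protein   # lands inside the
                                                   # 0.5M-1M range quoted

scx_fractions = 24                   # a typical first-dimension count (assumption)
per_fraction = total_peptides // scx_fractions     # peptides per RP-MS run

rp_peak_capacity = 400               # rough peak capacity of one gradient (assumption)
coeluting = per_fraction / rp_peak_capacity        # average peptides per peak

print(total_peptides, per_fraction, coeluting)
```

Even after 24 fractions, dozens of peptides co-elute in every chromatographic peak on average, which is why the mass spectrometer has to supply the remaining resolving power.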

We have noticed that a lot of people do the strong cation exchange separation, but it's a lousy separation. In order to get the highest number of peptides identified, you have to optimize your system completely. You have to optimize the strong cation exchange, you have to optimize the gradient, and you have to optimize your spray generation for electrospray ionization.

At Pittcon, Agilent said that one of the important things for achieving maximum peak capacity is to minimize the amount of dead space through fittings, etc. That is especially important for multidimensional separations, they said, and they are pushing their chip technology as the best way to minimize dead space. What do you think of Agilent's chip technology?

Well, I haven't seen everybody running to buy it. I mean, maybe it's a very good technology. We haven't tested it, so I can't really say anything about it. I normally would talk about things that we have done.

You have to minimize the dead space, yes. That is known in chromatography. What is good in chromatography is to minimize the dead space so that you don't have diffusion. Once you have diffusion, then the peak will broaden, and you want to prevent peak broadening by minimizing diffusion.

So it is not only in their system. In any system, you have to have minimum dead volume.

The only thing that people are doing, and we have done this, is in place of doing strong cation exchange, we do reverse phase. We can do fractionation by reverse phase, and then run into the mass spec by CEMS — capillary electrophoresis mass spectrometry.

What we found is that if you collect fractions by strong cation exchange, and you collect fractions by reverse phase, you will find that reverse phase gives you higher resolution, better separation, and much less overlap between adjacent fractions.

Let's say you take fraction one, fraction two, and fraction three. Fraction two will have some of fraction one and fraction three in it. So you go to analyze, and it's a waste of time, because you are analyzing something that is already found in fraction three, for example. The overlap from one fraction to another, according to John Yates [of the Scripps Research Institute] and Steve Gygi [of Harvard Medical School], is at least 40 percent.

If we do the fractionation by reverse phase, then the overlap of peptides between fractions is less than 15 percent. And that was published last year in Analytical Chemistry in a manuscript we wrote on CEMS.
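A crude way to see what those overlap figures cost in instrument time. The 40 percent and 15 percent overlaps are the numbers quoted in the interview; the fraction count and peptides per fraction are made-up round numbers for illustration:

```python
def redundant_ids(peptides_per_fraction, n_fractions, adjacent_overlap):
    """Crude model: each fraction after the first re-contains
    `adjacent_overlap` of the previous fraction's peptides, so those
    identifications are re-acquired for no new information."""
    total = peptides_per_fraction * n_fractions
    redundant = peptides_per_fraction * adjacent_overlap * (n_fractions - 1)
    return total, redundant

# Overlap figures quoted in the interview: at least 40% for SCX, under 15% for RP.
scx_total, scx_redundant = redundant_ids(1000, 10, 0.40)  # ~3,600 of 10,000 redundant
rp_total, rp_redundant = redundant_ids(1000, 10, 0.15)    # ~1,350 of 10,000 redundant
print(scx_redundant, rp_redundant)
```

Under this toy model, the SCX workflow spends more than a third of its mass spec time re-identifying peptides it has already seen, versus roughly an eighth for reverse phase fractionation.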

We developed a new interface for the CE with the mass spectrometer which has zero dead space. The separation and the electrospray is on the same capillary, so that makes it much more efficient. This is not commercially available. We are in the process of applying for a patent for it. It can be used with any mass spectrometer from any company. It is not limited to any specific vendor, and it's very simple to do, and it's low cost. The whole thing would cost, say, $50. It's easy to manufacture.

Are you planning on trying to sell this to a mass spec company?

That will depend on what the government wants to do. We are a government facility, so whatever we develop is in the hands of the government.

What kind of commercial system do you use in your lab?

We have Agilent, the 1100. We don't have the 1200, and we haven't tried the chip.

Why did you choose Agilent over another vendor?

Well, because we've always used Agilent. We use it because it's good for us, and it's a workhorse. And they came with the nanosystem. We tried another company's nanosystem, and it didn't work well for us, so we went with Agilent.

It's reliable for us. We run it 24 hours per day. But as a government facility, we will not recommend any manufacturer over another or anything. I like Agilent's technologies and systems, so that's what we use. And it's worked well for us. I have no complaints about it.

Is your lab basically a support facility where you take samples from other people and process them?

We are a lab that helps all the scientists from the NCI and NIH facilities. So whenever they have a protein they want to identify, let's say from tissue, or membrane proteins, they come to us. We do the fractionation, give it to the mass spec center to identify the peptides and do the searches, and then we give the scientists the results. We are a core support facility.

You're not involved in the mass spec side?

No, we don't develop mass spectrometry. We just buy what's available that we think will meet our needs.

How do you think the nano-LC systems could be improved upon?

Well, with the column production, the particles are getting smaller, and the systems can handle higher pressures. I mean UPLC by Waters goes to probably 12,000 psi, and there is a new one, the 1200 by Agilent, that goes to about 600 bar, which is roughly 9,000 psi. And they call it ultra high pressure — that's what Waters calls it. But ultra-high pressure is really what Jim Jorgenson is developing — that's 100,000 psi.

But anyway, these systems are good enough to handle higher back-pressure, because we would like to use smaller particles. Smaller particles mean higher surface area, and better resolution, and higher efficiency of the column.

What about monolithic columns — are those advantageous?

Those are fine. They sometimes require high flow rates. What I understand from some people is that they are really hard to manufacture. What happens is they put this polymeric material inside a capillary, and then either they heat it, or they put it under UV light to polymerize it. And if there's a difference of, say, 0.1 pH unit, you will not get the results you hoped for when the material polymerizes. The same goes for temperature: any small change in the pH or the temperature will affect the results.

So although that technology has been out for a few years, it still needs some work to get things working reproducibly.

Why is it important to have a slow flow rate?

The lower the flow rate, the higher the sensitivity. That's one thing. The other thing is if there is a high rate, you are pushing things out of the column, and when you are doing a complex mixture, you get a lot of peptides coming out at the same time that the mass spec will not be able to handle.

I'm not a mass spectrometrist, but what I can say is that there are better mass spectrometers with a faster cycle time that can identify a mixture of peaks, and you don't have to do as good a separation if you have, let's say, an FTICR system as if you have a regular LTQ system, let's say from Finnigan.

Are you involved now in developing improved technology?

No, we don't develop technologies. We try to work on columns, and see how columns work. We are trying to do some monolithic columns.

What we do is we use existing technologies to get better separations. So I don't go and build an HPLC system, and I don't go and build a CE system. I use commercial equipment, and use it to an advantage. I ask, 'What can I connect from one system to another to get what I want?'
