NCGC's Jim Inglese on Small-Molecule Cell-Based Screening at the NIH

At A Glance

Name: Jim Inglese

Position: Director, Biomolecular Screening and Profiling, NIH Chemical Genomics Center (NCGC); Editor-in-Chief, Assay and Drug Development Technologies

Background: Senior research fellow, Merck Research Laboratories — 1999-2004; Senior research fellow, Pharmacopeia — 1995-1999; Postdoc, Howard Hughes Medical Institute, Duke University — 1989-1994; PhD, organic chemistry, Pennsylvania State University — 1989; BS, chemistry, Rensselaer Polytechnic Institute — 1984

When the NIH founded its Chemical Genomics Center in June of last year, it was entering somewhat uncharted territory — industrial-scale screening of small molecules, à la the pharmaceutical industry. Where better to pluck staff from than the ranks of the pharmaceutical industry itself? Jim Inglese was one of those hires, and he brought with him nearly 10 years of small-molecule screening experience at leading drug-discovery and pharmaceutical companies. He now serves as director of biomolecular screening and profiling at NCGC, and as such is responsible for helping build the organization's infrastructure and staff its laboratories. Inglese took some time last week to discuss with Inside Bioassays the evolution of his career, as well as the evolution of cell-based small-molecule screening.

How did you develop an interest in screening small-molecule biomodulators?

When I was younger, I always had a fascination with chemistry, and subsequently this was my major as an undergraduate at Rensselaer Polytechnic Institute. As a senior at RPI, I had taken whatever graduate-level courses in advanced synthetic organic chemistry were available. During this time, Professor Jim Coward, now at the University of Michigan, gave me the opportunity to do an undergraduate thesis project in his lab. The project was to synthesize an analog of methotrexate (MTX), a cancer chemotherapeutic, in which a gamma glutamyl hydrogen was replaced by fluorine. This subtle chemical alteration affected the electronic character of MTX's gamma carboxylate, thus altering its ability to be modified by an enzyme in the cell responsible for potentiating some of the toxic side effects of MTX. This began my interest in how molecules interact with biological systems, and so I initiated my PhD studies in bioorganic chemistry, studying both synthetic chemistry and enzymology in the laboratory of Professor Stephen Benkovic at Penn State University. After that I trained as a postdoctoral fellow in Professor Bob Lefkowitz's lab at Duke, where I developed an understanding of signal transduction and how to use the tools of molecular biology. Still retaining a passion for chemistry, I ventured off to a small Princeton, NJ-based biotech called Pharmacopeia, where I was able to work directly with chemists making large encoded combinatorial chemical libraries whose members had rule-of-five properties. I led a group that was responsible for screening these libraries against targets of interest to the company. It was at Pharmacopeia that I first began screening for small-molecule biomodulators.

Parallel to your career, small-molecule screening itself has seemingly evolved to the point where the NIH is now heavily involved. How did this develop?

Three factors led us here. First, the Human Genome Project has provided an abundance of targets to screen; the current estimate of the number of human genes is around 25,000, which can be equated to about one million human proteins once the gene products undergo post-transcriptional and post-translational modifications. Second, the advent of commercial compound suppliers and combinatorial chemistry has provided hundreds of thousands of high-quality compounds to screen. And third, advances in assay, screening, and robotics technologies have provided the capacity to screen those large numbers of compounds. From NHGRI's perspective, the development of small-molecule research tools is critical because they are able to alter function at the protein level, rather than at the mRNA level — as with siRNA, antisense, or overexpression — or at the gene and locus level, as with knockout mice. One of the lessons from the human and other genome sequences is that complexity is not conferred by simple gene number, but rather by the complexity of gene regulation, splicing, and protein functions — for example, multifunctional proteins, post-translational modifications, et cetera. To understand this complexity, research tools that perturb biology at the level of the effector of the phenotype — in other words, the protein — are needed.

The NCGC is expected to be at the cutting edge of small-molecule screening. Last year, it signed a large contract with Kalypsys for its screening technology. What made NCGC settle on this?

We needed very high capacity given our remit to do screens for the entire research community, and to cover as much of the genome as possible. Kalypsys offered this throughput, precision of liquid handling, miniaturization, and low operating costs.

Obviously there is still a lot of non-robotics work being done. What other major technologies is NCGC currently exploring?

We are positioning the center to screen assay formats and conduct follow-up studies not suitable for the Kalypsys system. To this end, we have invested in nanoliter liquid-handling technology from Aurora Discovery, which will allow us to generate compound titrations using minimal amounts of compound — again, to keep reagent costs low. To complement our plan to generate concentration-response curves for all compounds screened, we have obtained state-of-the-art [GeneData] software for analyzing and visualizing HTS data and large numbers of concentration-response curves in a way that will allow us to quickly decide how to proceed with the next step in our biomodulator discovery process. For capturing assay protocol development information, we have invested in another package, [Teranode Design Suite], which operates on a model-based paradigm that allows all of our assay optimization data to be stored in a database and then retrieved, analyzed, and used for subsequent optimization designs. In my previous environments these data went primarily into lab notebooks or Excel files scattered about a common hard drive, with no useful way to mine or search them — something I do not want to repeat at the NCGC.
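
To make the concentration-response analysis Inglese describes concrete, here is a minimal Python sketch that fits a four-parameter Hill model to a single compound titration. The data points, parameter seeds, and bounds are illustrative assumptions for this article, not NCGC data or GeneData functionality.

    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, bottom, top, ec50, slope):
        # Four-parameter logistic (Hill) model for a concentration-response curve.
        return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** slope)

    # Hypothetical eight-point titration (molar) with percent-activity responses.
    conc = np.array([1e-9, 5e-9, 2.5e-8, 1.25e-7, 6.3e-7, 3.1e-6, 1.6e-5, 7.8e-5])
    resp = np.array([2.0, 3.1, 8.5, 24.0, 55.0, 82.0, 94.0, 97.0])

    # Fit, seeding the EC50 near the middle of the titration and bounding the
    # parameters to keep the optimizer in a physically sensible range.
    (bottom, top, ec50, slope), _ = curve_fit(
        hill, conc, resp, p0=[0.0, 100.0, 1e-7, 1.0],
        bounds=([-20.0, 50.0, 1e-10, 0.1], [20.0, 150.0, 1e-3, 5.0]))
    print(f"EC50 = {ec50:.2e} M, Hill slope = {slope:.2f}")

In a screening setting, a fit like this would be run per compound across plates, with the fitted EC50 and slope driving the decision on which compounds advance.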

What types of assay or instrumentation technologies are on your wish list — whether they exist or not?

Subcellular imaging to enable phenotypic screening is an area we plan to pursue. Given our interest in exploring the “dark matter” of the genome — that is, the many gene products of unknown function — phenotypic assays present one avenue to study this area. However, I am waiting for the field to mature further before bringing in a microscope-based analyzer. Some of the capabilities I am interested in seeing evolve here include the ability of these systems to make rapid kinetic readings of a stimulus-driven cellular event, such as ion-channel activation. This would require the ability to perform rapid, low-volume liquid additions to the cell population during image acquisition. Further, the manipulation and data reduction of the image files these systems generate are not optimal, nor are the 1,536-well microtiter plates currently available.

To bridge the gap here, we have included a laser-scanning imager [from TTP Labtech] on our robotic platform that will permit population-distribution analysis of cells or particles in the well of a microtiter plate. You can think of this as a “static” fluorescence-activated cell sorting instrument, in contrast to the population-averaged output one obtains from standard plate readers.

Another technology we would like is ligand- or function-independent assay and detection systems that, for example, might permit detection of small molecule-protein binding or of thermal signatures of cellular metabolic or signaling states. Advanced low-volume calorimetry technologies are of great interest, too, but have not yet proven feasible to us. We also would like to see primary and mixed cell culture techniques applicable to high-throughput screening, to permit screening in systems that better approximate physiological settings and to potentially reduce variability in compound behavior due to subtle changes in assay conditions. Perhaps stem cells offer an opportunity here. And lastly, we need robust and flexible analysis tools for multiparameter assay readouts — for example, image-based screens.
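
To illustrate the contrast between a population-averaged readout and the per-cell distribution a laser-scanning imager provides, the following Python sketch simulates per-cell fluorescence in a single well; the cell counts, intensities, and threshold are hypothetical numbers chosen for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated per-cell fluorescence from one well: a mixture of a
    # non-responding and a responding subpopulation.
    cells = np.concatenate([
        rng.normal(100.0, 15.0, size=700),  # non-responding cells
        rng.normal(400.0, 60.0, size=300),  # responding cells
    ])

    # Plate-reader-style readout: one population-averaged number per well.
    well_mean = cells.mean()

    # "Static FACS"-style readout: the fraction of cells above a response
    # threshold, which the well average alone cannot reveal.
    responding = (cells > 250.0).mean()
    print(f"well average = {well_mean:.0f}; responding fraction = {responding:.0%}")

Two wells with very different subpopulation structure can produce the same well average; the distribution readout is what distinguishes them.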

How much of what NCGC does is cell-based?

Anywhere from 50 to 70 percent of our assays are expected to be based on a cellular system. Initially, this means reporter-gene assays and population analysis using laser-scanning imaging, which is essentially a means to obtain data similar to what one obtains from a FACS analysis.
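
As a sketch of how such reporter-gene screens are commonly read out, this Python example normalizes raw well signals to on-plate controls and flags putative actives; the plate layout, signal levels, and hit threshold are assumptions made for illustration.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical raw reporter signals from one plate: neutral (DMSO) and
    # full-activation control wells, plus compound wells with a few actives.
    dmso = rng.normal(1000.0, 80.0, size=32)
    full = rng.normal(5000.0, 300.0, size=32)
    wells = rng.normal(1000.0, 80.0, size=1400)
    wells[:20] += rng.uniform(1500.0, 4000.0, size=20)  # plant some actives

    # Express each compound well as percent activation between the controls.
    activation = 100.0 * (wells - dmso.mean()) / (full.mean() - dmso.mean())

    # Flag putative actives at an arbitrary threshold; a real workflow would
    # follow up with concentration-response curves as discussed earlier.
    n_hits = int((activation > 30.0).sum())
    print(f"{n_hits} putative actives out of {wells.size} wells")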

Doesn’t the term “chemical genomics” imply cellular analysis?

Not at all. Though the term chemical genomics is used in many different ways, we use it simply to describe the use of small molecules as research tools to understand basic biology. This can be done in cell-free systems, cellular reporter systems, cellular phenotypic systems, or even organismal systems — anything that will fit into a multiwell plate. Some use the term “chemical genomics” to describe the study of the effects of small molecules on genome-wide gene expression — for example, by microarray analysis of mRNA from treated cells. That may be what you're thinking of, but it is not how we use the term.

What are some other tangential interests of yours? You started a journal …

Yes, like you, I am interested in journalism. I founded and edit a journal called Assay and Drug Development Technologies, to report on integrated advances in science and engineering directed toward drug discovery. Progress in drug discovery and development is limited by a fluctuating gap that exists between science and technology, and I felt that those working in fields such as biology, chemistry, computer science, biophysics, and instrument engineering needed a platform that allowed cross-fertilization within these areas. The journal has been rather successful and appears to be well received.
