
Juan Enriquez on Agriculture and the Future of Microarray Policy


At A Glance

Founder and chairman of the board of Biotechonomy, based in Newton, Mass.

Background: Founder of Life Science Project, Harvard Business School.

MBA — Harvard University, 1986

BA — Government, Harvard University.

Juan Enriquez is a member of the US Department of Agriculture Secretary’s Advisory Committee on Biotechnology and 21st Century Agriculture, a group of 18 members organized in April with representatives from agriculture and biotechnology, as well as farmers and academics. The group is meeting over a two-year period as an advisory body on the future of biotechnology in agriculture in the US.

While the US Food and Drug Administration has commanded the microarray industry’s attention with its efforts to establish a framework to regulate the technology and the information it provides, there are two other federal agencies that also have regulatory jurisdiction over this technology — the USDA and the Environmental Protection Agency. [For more information on the USDA’s regulatory role in biotechnology, see this website].

Because of his position as a member of this committee, as well as his background as an expert on the economics and the political impact of discoveries in life sciences, BioArray News spoke to Enriquez to find out about how microarrays might fit into the agricultural future in the US, and how USDA might consider regulating the technology.

Would you describe what the Secretary’s Advisory Committee is covering?

Basically, what we are doing is creating a report for the Secretary of Agriculture that says ‘Here are some of the things that will be important and that you should be thinking about for agriculture in the 21st century.’ We are trying to get beyond this month’s crisis, or next month’s crisis, and trying to come up with the big forces, the big trends that will move the shape and form of agriculture. The committee tries to bring together people who are quite varied in their outlook. There are industry people, environmental people, consumer groups, and farmers.

Let’s look into the future for microarrays. What are the issues in the agricultural applications of this technology?

As you think about what you use microarrays for, particularly in agriculture, it really depends on how you are going to price the animal and the product coming out of the animal. There are certain tiers about how you can use them, and there are security issues, so that you can test to see that you are getting what you are promised. That becomes important with [genetically-modified organisms] traceability. And, as you get into tracing more and more food, which seems to be the way in which we are going, and tracing particular strains of animals or diseases, or diseases borne in animals, then there is a general application as tests. Every cow in Europe carries a passport and is traceable as to where it was born, to whom it was born, and where it traveled. A lot of that will be done with embedded chips.

The second tier is security applications, where you can test for things like either deliberate- or non-deliberate security issues. So, everything from SARS through deliberately targeted things like anthrax. You can have a series of tests to tell if this variety comes from that place. Those are high-value, relatively low-volume.

In agriculture, I can’t see the use of microarrays for testing every animal or every creature, particularly. Maybe every cow or every horse, and maybe pigs. But when you get to the level of chicken and rabbit, likely not. Microarrays would have to follow a cost curve that looks like a computer chip cost curve in terms of becoming half the price every 18 months in order to be effective and have really broad-scale applications across low-margin commodity markets.
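The halving argument above can be put in numbers. A minimal sketch, where the starting cost, target cost, and 18-month halving period are illustrative assumptions rather than figures from the interview:

```python
import math

def months_to_reach(cost_now, cost_target, halving_months=18):
    """Months needed for a cost that halves every `halving_months`
    to fall from cost_now to cost_target."""
    if cost_target >= cost_now:
        return 0.0
    halvings = math.log2(cost_now / cost_target)
    return halvings * halving_months

# Hypothetical: a $400 array test would need to fall to roughly $2
# before it pencils out in a 3-to-5-percent-margin commodity market.
print(round(months_to_reach(400, 2), 1))  # about 137.6 months, over 11 years
```

Even under a chip-like cost curve, the arithmetic shows why broad commodity use is a decade-scale prospect rather than a near-term one.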

So, since microarrays are a relatively costly technology, the economics of the agricultural industry are not favorable?

Biochips have been designed in such a way that they are operating in a market that has an average margin of 30 to 40 percent per product or higher. When you move into the commodity markets, you are operating at margins of 3 to 5 percent. So there is a huge disparity between the margins at which these things have been designed to work so far and the margins that would be sustainable in an agricultural system. That doesn’t mean they are not useful in labs; that doesn’t mean they are not useful in random tests, or to verify things; but I don’t see, outside the high value animals, like prize horses, a broad use of microarrays. The margins are so different between pharma and the ag business.

Where do you see applications of economic value?

As animals move from being feed and fiber and into being factories, then they become much higher-value-added animals. So if you engineer a pig to have humanized skin, or humanized heart valves, or you create a goat to generate spider silk in its milk, then you are talking about animals that could be worth tens of thousands of dollars or even hundreds of thousands of dollars. If an animal becomes a pharmaceutical factory, then you have an enormous investment there, and a real need for quality control, and that is a point where high-value expensive chips become essential on a constant monitoring basis. But we are not there yet. When things like xenotransplantation really start to take off, then monitoring for diseases on a constant basis to avoid cross-species disease transmission is going to become absolutely essential. Other applications might include monitoring herds of animals for compliance with treaties. Today, you are not supposed to traffic in certain types of animal meats and certain types of animals. Bioarrays might become one way to enforce environmental regulations and find out just what is selling in the Tokyo fish market. So, if you are selling whale meat, and you claim it is X species, and it turns out to be an endangered species of whale, there is a big deal there.

There is a third tier there too, a security aspect where you need quick field diagnostics across a series of diseases, and that could be a big deal. Then, there is the general preventive health issue, which is not tied to agriculture, but if you could put specific viral and bacterial signatures on chips, then when they put in that tongue depressor and ask you to say ahh, they could be collecting data. That could happen with animals too, but it’s harder to get them to say ahh. It’s not inconceivable to do that. You could go after a whole series of common diseases. And arrays could quickly become a rapid diagnostic tool for diseases that spread through plants. You could quickly decide which plants to take out before the symptoms are there so that you cull the disease, conceivably.

We have talked a lot about animals. Let’s turn to the plant world. What sort of applications might be right for this market?

You can put a whole series of things in plants that go beyond food, feed and fiber, the traditional uses of plants. Now you are beginning to get the ability to reprogram life forms in such a way that they execute different functions. To put that in plainer terms, you know you can generate energy out of biomass, [but] the problem with that is that it requires a significant subsidy, so it hasn’t taken off as a significant industry, because, if there is not a significant subsidy, it doesn’t work. That tells you that you are working on a matrix, and the left side of that matrix is the price of energy. As the price of that goes up, the availability and likelihood of biomass being used also goes up. The bottom part of that matrix is the energy conversion ratio. So, if you are able to tweak the genetics of plants in such a way that the energy absorbed from the sun is more efficiently transformed to an energy form, that makes the likelihood of biomass energy higher. If you get a combination of rising energy prices and more efficient biomass conversion, then an equation which for centuries has been, ‘Am I going to use this hectare of land for feed, or fiber, or food,’ becomes a fourth-tier equation of food, feed, fiber, or energy. And that puts a floor price on a bushel of corn or wheat, or something else, whatever you are using for biomass. As soon as you have a fourth option, if your food price goes down, the energy price keeps the price of that bushel up. And that starts to give the farmer a series of options to think about what his or her job is going to be, or what industry he or she is going to be tied to. Now all of a sudden you are going to face a series of questions that you haven’t had to face before: Am I going to be growing medicine in the animals that are in here? Am I going to be growing these medicines in animals or plants? Am I going to be growing energy?
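The floor-price logic described above reduces to a simple maximum: once a bushel can also be sold as energy feedstock, its effective price cannot fall below its energy-equivalent value. A minimal sketch, with all prices and conversion ratios hypothetical:

```python
def bushel_price(food_price, energy_price_per_unit, conversion_ratio):
    """Effective bushel price once energy is a fourth option.

    conversion_ratio: energy units recoverable per bushel; it rises
    as plants are engineered for more efficient biomass conversion.
    """
    energy_value = energy_price_per_unit * conversion_ratio
    return max(food_price, energy_value)

# Food market pays $3.00/bushel; energy pays $0.50/unit at 5 units/bushel,
# so the energy floor is $2.50 and the food market still sets the price.
print(bushel_price(3.00, 0.50, 5.0))  # 3.0
# Improve conversion to 8 units/bushel and energy sets a higher floor.
print(bushel_price(3.00, 0.50, 8.0))  # 4.0
```

This is the two-axis matrix in miniature: raise either the energy price or the conversion ratio and the floor under the bushel rises with it.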

What are the consequences of that?

Those are the kinds of things that lead to very large industrial upheavals. You are already beginning to see this with some very strange mergers. You saw that over the last five to eight years with most of the major seed companies being bought up by very strange players. It was not the big agribusiness players that bought the seed companies, but the chemical companies and the pharmaceutical companies. So, to the extent that they start thinking about seeds as diskettes that could execute different programs, then it becomes of much more interest for those players. The logical conclusion of this is: As seeds become carriers or executors of programs, then you are going to get information-driven companies buying seed companies. Companies that haven’t been traditional players in medicine will be going in and doing some of these purchases. To put it in more recent terms, the strangest, most interesting merger was GE buying Amersham. Everybody thinks of GE as a manufacturing company, but it’s not a manufacturing company, it’s a company that has been making most of its money off financing and services. Now it is moving to become a services information company, and, by the way, so is IBM, which is moving into providing the network of services. They aren’t going to make their money off making the chip, but providing the network and the services and the plumbing. As medicine becomes an information-driven industry, as you are able to recode information, then what you are going to get is a very different system. The players will be different. Plants and animals are going to be one way out of many to think about executing these new programs and applications.

So, where does that scenario leave arrays?

At that point, quality control, traceability, and disease management become absolutely essential. If you are going to start creating medicines in the field and in a barn, you had better be darn careful about things like retroviruses: You had better be able to find the signatures for retroviruses before you transplant anything. If you are going to be growing proteins in animals for human consumption through medicine, you had better be darn sure that batch A and batch C are identical. As we start making animals and plants [into] factories for things other than food, feed, fiber, it is going to be very important to be able to trace exactly what is the input and output of this stuff.

In a way, are you talking about a technology that is extended from microarrays, as we know them today, shifting instead to a biosensor technology?

As we get better at monitoring the environment and our own conditions in it, the whole notion of a biosensor will become a common notion. A lot of things that we do today, the nursing, the let-me-take-your-temperature-and-blood-pressure, will become quite primitive when you can have biosensors. The way in which we do a lot of diagnostics is utterly primitive. I think medicine will move into a more preventive structure, and I think biosensors will be at the forefront of that. The problem with that is that you have a very different paradigm. Instead of giving you, in most cases, a yes-no answer, it’s going to give you a probability answer, because, I suspect, for most things you find the presence of a series of things without necessarily having the symptoms or actually getting sick. Obviously, there are things where if you got it, you got it. But a lot of things can live in your body without the triggers to make you sick, and it’s going to require some really big databases and some really interesting cross-correlations to figure out what your probabilities of getting sick are. Biosensors may be a way of monitoring that. They are also a way of breaking out a typology of disease, the same way as we now understand plant species a lot better. I suspect we will be using bioarrays to understand microbe and germ breeding and speciation and outcomes in a similar way.

But, regulators are not there yet.

The dilemma is interesting. Will you allow stuff to come to market that is predictive as a probability, or as a certainty? And, how sure is sure? That’s part of the problem with regulating medicine today: There is no acceptance of risk. Any time you put something in your mouth, it carries a certain amount of risk. We are living in a society that seems to say that is not acceptable, or allowable. So, we are spending huge amounts of money on products that are me-too products. You end up putting things together that are not risky. So, you are not bringing malaria vaccines to market.

Today, we don’t know close to enough about how the body works and how cells work and how disease works in order to say, ‘If you get this pattern out of a microarray, this is what is going to happen.’ So you can take a microarray pattern and say: ‘You know, for women who have this particular pattern of cancer, about 5.5 percent are going to die if these are the genes that are lit. If you take these genes and light them up, then about 45 percent will die.’ That is a predictive mechanism, and that is a very different form of regulating medicine and thinking about medicine. If industry tries to produce microarrays as specific predictive models, to create certainty in treatment, then very few things, I think, are going to come to market, because we don’t know enough about how the system software of our bodies operates to be able to do that. The alternative is for the FDA to come to this thing with an open eye and say, given the current diagnostic methods, which are pretty darn primitive, this will be a complement to diagnostics, and it’s not certain, it’s not 100 percent, or even 90 percent, predictive. As you think about these things, you are going to have to go to the FDA and say: ‘Yes, we understand this is a flawed mirror, or a flawed diagnostic, but it’s better than the diagnostics that are out there.’ We have to create a legal protection system that says, given the current instruments, this improves the standard that we have today, but you can’t hold it to the current standard that you expect out of some medical compounds. That is going to be a tricky thing to do.

What are the regulatory issues that the USDA should consider for microarrays?

We haven’t addressed that question, but it is one that we should think about.
