‘Climate proofing’ the world’s food supply with edible microorganisms

Archaeological evidence suggests that what we think of as modern humans — sophisticated primates capable of abstract thought, communication through spoken language, planning, etc. — existed at least as early as 50,000 years ago. Yet until approximately 10,000 years ago, our ancestors subsisted as hunter-gatherers, with scant archaeological evidence of any serious attempts at agriculture. Then starting around 10,000 years ago, our ancestors successfully developed agriculture — not just once, but at least six times in different parts of the world independently of each other. And it all happened in the relatively short time span of 5,000 years.

Once established, agriculture enabled human societies to expand from small, nomadic or semi-nomadic groups into larger, permanent settlements — villages, towns and cities that eventually joined together into nation states. The higher efficiency of food production per person meant that a proportion of the population could now specialize in tasks beyond agriculture and become artisans, administrators, merchants, scientists, engineers, doctors and so on, all the way to today’s app developers, reality TV stars, lifestyle bloggers and Instagram influencers.

Why did it take us so long? In 2001, University of California researchers Peter Richerson, Robert Boyd and Robert Bettinger published a study hypothesizing that the climate before 10,000 years ago was not only too harsh for most forms of agriculture but also too variable. This was backed up by a second paper, published in 2007 by astrophysicists Joan Feynman and Alexander Ruzmaikin, which used additional ice core data to show that until 10,000 years ago, the global climate was simply too unstable from one century to the next to allow sufficient time for the domestication of animals and crops.

Since agriculture is pivotal to the development of a complex society, and climate stability is the key to successful agriculture, we have to answer an important question: Just how resilient are agricultural societies to climate variability? From archaeological evidence as well as sediment and ice core data, we suspect that some 4,200 years ago, a century-long and near-global drought referred to as the 4.2-kiloyear event caused societal collapses all over the planet. Repeated cycles of drought are thought to have caused the demise of the Classic lowland Maya civilization during 800-1000 AD. The so-called “little ice age” (c. 1300-1850 AD) has been put forward as one reason why the 500-year Norse settlement on Greenland finally came to an end during the 16th century. So it would seem that natural climate fluctuations can have devastating impacts on individual societies.

Fast forward to the present moment. We are quickly approaching a global population of 8 billion people and may reach as much as 12 billion by the end of this century. At the same time, we are running out of new land to farm while per capita consumption of carbon-intensive animal protein is increasing. And the global climate is changing rapidly because of human activity. Just how much the climate is changing and what the exact consequences will be are fiercely debated. But if things really went south, could our current globalized society survive if there were large-scale disruptions to food production caused by climate change?

The question becomes: how do we “climate-proof” global food production in a future worst-case scenario? Even with the latest molecular tools for breeding plants and animals, year-to-year swings between drought and heavy rainfall may become too extreme to ensure a stable level of food production. There is also serious concern that we may end up losing significant areas of cropland to rising sea levels and desertification. Some ocean scientists and fishery experts fear that melting glaciers at the poles could alter or even disrupt ocean currents, which could lead to a collapse in wild fish populations.

What we would need in such a worst-case scenario is a global food production system that is tolerant of unpredictable climate fluctuations while at the same time being less reliant on soil quality and fresh water. One suggestion that keeps coming up these days is to simply move agriculture indoors with vertical farming. On paper this looks like an attractive way to solve many of the issues that would face conventional agriculture in an extreme climate change scenario. However, as agricultural scientist Jonathan Foley has pointed out, the scale at which this technology would have to be deployed in order to feed billions of people is likely to make it prohibitively expensive.

What other options are there? Cultured meat produced in bioreactors would be another way to protect food production from a hostile external environment. However, in addition to the cost of building the thousands of large bioreactors needed for the cultivation of animal cells, cultured meat would still be dependent on an external source of “feed”, either from plant or animal sources. As I have pointed out previously, this is because all animal cells, cultured or not, require sugars, protein and fats for growth. So even if the cultured meat cells are safe and snug inside their bioreactors, you would still have to produce their food by conventional means, i.e. outdoors. Cultured meat would therefore still be vulnerable to extreme climate conditions through its absolute dependence on external sources of feed.

There is one other food production technology that is not dependent on external climate conditions and soil quality and requires less fresh water. Edible microorganisms (bacteria, microalgae, yeasts and filamentous fungi) seldom feature in the public debate about future food production, but they have many attractive features that would enable more predictable and resilient food production in a destabilized climate of the future. Microorganisms have good nutritional properties, and there are already some established microbial food products for human consumption, such as the mycoprotein ingredient in Quorn-brand products. Mycoprotein is made from a soil-living fungus, which is cultivated on sugar in a process that resembles beer brewing. The fungal mycelium is harvested and processed in a way that gives the final product a meat-like texture. Apart from the benefit of replacing conventional meat, the manufacturers of mycoprotein also claim a number of additional health benefits from mycoprotein consumption, including lowering blood cholesterol.

The key advantage of edible microorganisms over all other food production options is their ability to use simple organic compounds such as hydrocarbons, alcohols and organic acids as growth substrates. This is significant because such compounds (e.g. methanol, formic acid and acetic acid) can be chemically synthesized directly from carbon dioxide captured from the atmosphere. That makes it possible to produce edible microbial biomass in a way that is completely independent of photosynthesis, soil quality or local climate conditions. This means that you could produce food pretty much anywhere on the planet: in deserts, polar regions, in the middle of the ocean or even underground.
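To make that a little more concrete: the article does not specify a particular synthesis route, but one well-known example is the catalytic hydrogenation of captured carbon dioxide to methanol, with the necessary hydrogen supplied by, for instance, water electrolysis:

$$\mathrm{CO_2} + 3\,\mathrm{H_2} \longrightarrow \mathrm{CH_3OH} + \mathrm{H_2O}$$

The resulting methanol can then serve as the carbon and energy source for methanol-utilizing microorganisms such as the Pruteen bacterium described below.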

Large-scale production of edible microorganisms would still suffer from the same drawback as vertical farming and cultured meat, namely the substantial cost of building giant bioreactors for the controlled growth of microorganisms. The one crucial advantage of microorganisms over the two previous options is the speed at which they grow – a bacterial cell can divide as often as every 20 minutes under optimal conditions, while a yeast cell can manage to divide every 60 to 90 minutes. This means that a bioreactor of a certain size could produce edible biomass at a much higher rate than a vertical farm or a cultured-meat bioreactor of equal size.
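To get a feel for what those doubling times imply, here is a minimal back-of-envelope sketch in Python. The starting biomass and the time window are arbitrary assumptions chosen purely for illustration, not figures from this article, and real bioreactors are run as continuous cultures rather than letting cells double unchecked.

```python
# Back-of-envelope sketch: how doubling time translates into biomass accumulation.
# Assumptions for illustration only: 1 kg of starting biomass and a 6-hour window
# of unrestricted exponential growth.

def biomass_after(hours, doubling_time_minutes, start_kg=1.0):
    """Return biomass (kg) after `hours` of unrestricted exponential growth."""
    doublings = hours * 60 / doubling_time_minutes
    return start_kg * 2 ** doublings

for label, doubling_time in [("bacterium (20 min doubling)", 20),
                             ("yeast (90 min doubling)", 90)]:
    print(f"{label}: ~{biomass_after(6, doubling_time):,.0f} kg after 6 h")
```

Even this toy calculation makes the gap obvious: the 20-minute doubler fits eighteen doublings into those six hours, while the 90-minute doubler only manages four.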

Just how fast can edible microbial biomass be produced using a large bioreactor? During the early 1980s, the British company Imperial Chemical Industries produced an animal protein feed called Pruteen, which was made from a bacterium that could grow on methanol as its only source of metabolic carbon. The Pruteen bacterium was cultivated in a single, giant bioreactor that stood 60 meters (196 feet) tall and had the capacity to churn out 50,000 metric tons of dried bacterial biomass per year. To put that into perspective, to produce the equivalent amount of soybeans per year, you would need about 15,000 hectares (c. 37,500 acres) of arable land. In fact, if you wanted to replace the entire current soybean production capacity in the US (36 million hectares/90 million acres) with a Pruteen-like process, you would “only” need about 2,400 bioreactors.
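The arithmetic behind those figures is simple enough to check. The sketch below merely reproduces it in Python; the soybean yield of roughly 3.3 metric tons per hectare is implied by the numbers quoted above rather than taken from an independent source.

```python
# Reproducing the back-of-envelope comparison in the text.
# Inputs are taken from, or implied by, the figures quoted above.
reactor_output_t_per_year = 50_000          # dried bacterial biomass, t/year
soybean_yield_t_per_ha = 50_000 / 15_000    # ~3.3 t/ha, implied by the text
us_soybean_area_ha = 36_000_000             # ~36 million hectares

# Cropland needed to match one bioreactor's annual output with soybeans
land_equivalent_ha = reactor_output_t_per_year / soybean_yield_t_per_ha
print(f"Soybean land equivalent per reactor: {land_equivalent_ha:,.0f} ha")  # ~15,000 ha

# Number of Pruteen-style reactors needed to match the entire US soybean area
reactors_needed = us_soybean_area_ha / land_equivalent_ha
print(f"Reactors to replace US soybean area: {reactors_needed:,.0f}")        # ~2,400
```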

As alluded to above, edible microorganisms also do not have an absolute requirement for fresh water. Although most current production of microbial biomass for human or animal consumption uses fresh water, many of the established microbial species used for food and feed applications are actually quite salt-tolerant and can be cultivated in either mixtures of fresh and salt water or in salt water alone. The main concern here is not so much the ability of the microorganisms to grow but rather the wear and tear on bioreactors caused by the salt. However, this is a surmountable engineering problem.

So in the end, the good news is that we have at least one mature food production technology that can theoretically feed all of humanity in a future worst-case climate change scenario. But perhaps more significantly, if the production of edible microorganisms allows us to produce more food on less land, couldn’t we deploy this technology right now and perhaps help avert potentially catastrophic climate change to begin with? It is clear that substantial portions of agricultural land could be spared in favor of microbial alternatives. Restoring such agricultural land to natural habitat would then sequester carbon dioxide from the atmosphere as well as provide biodiversity sanctuaries. As long as more carbon can be sequestered by land-sparing than is emitted by building and running bioreactors for the production of edible microorganisms, it just might work.

Where would we start? Replacing meat with microbial meat analogs like mycoprotein would free up substantial land areas that are currently used either for pasture or to grow feed crops. Greater adoption of mycoprotein in place of animal meat would also help reduce methane emissions from beef cattle as well as emissions of nitrous oxide, a highly potent greenhouse gas, from animal manure. A recent study has suggested that a simple reduction in the price of mycoprotein would lead to an increase in consumption, which would then translate into environmental benefits. I can testify from personal experience that I would eat more mycoprotein if it were cheaper. For example, mince made from mycoprotein currently retails at nearly double the price of its beef equivalent where I live. Hopefully the emergence of new mycoprotein producers will help bring down retail prices.

The recent craze for the plant-based Impossible Burger in the fast food sector clearly shows that there is a lot of consumer interest in meat analogs, especially if they are marketed as better for the environment. I would love to see a marriage between Quorn Foods’ mycoprotein product and Impossible Foods’ heme “miracle” ingredient.

For those consumers who prefer “real” meat over meat analogs, it would still be possible to reduce the environmental costs of meat production by substituting conventional sources of animal feed (soy, rapeseed, fishmeal) with edible microorganisms. The Pruteen bacterium mentioned above could replace much of global soybean cultivation, which currently occupies an area of 125 million hectares (309 million acres), equivalent to the land area of South Africa. Rapeseed (36 million hectares/89 million acres globally) could be substituted with specialized oil-producing microorganisms. These same oil-producing microorganisms could also be used to replace fishmeal, which is used extensively in aquaculture and suffers from significant sustainability issues.

If edible microorganisms are to make a meaningful contribution to reducing the carbon footprint of global food production and help avert catastrophic climate change, the technology must be deployed on a global scale without delay. My sense is that current market forces are unlikely to drive such a massive transition on their own as long as there is little public awareness that these types of food products even exist. And without that awareness, I suspect policy makers will not take the initiative either. Edible microorganisms produced independently of photosynthesis are likely to become much more cost-competitive once climate change really starts to affect global agricultural yields and drive up food prices. However, by that time it will probably be too late.

Tomas Linder is an associate professor of microbiology based at the Swedish University of Agricultural Sciences in Uppsala, Sweden. He studies microbial metabolism and how it can be applied for food production, pest control and degradation of environmental pollutants. Follow him on Facebook
