
Viewpoint: Modern-day Luddites: How precautionary activism and reporting paint a misleading picture of biotechnology

We live in a precautionary era in which technological breakthroughs poised to dominate the coming decades—from artificial intelligence and nanotechnology to the biotechnology revolution in medicine and agriculture—are often cast in a dark shadow. Journalists are mirrors, reflecting the broader societal anxiety that we will not be able to rein in the seemingly runaway forces of technology. It is a cultural and economic war between pessimism and progress. Social media only amplifies the cacophony.

It is part of a historical pattern. The idealization of the past in the face of paradigm-shifting technology is not a new phenomenon. New technology is disruptive, which means that while the direction of change may be positive for society as a whole, there will be innocent losers as well as many winners. The Luddites of early 19th-century Britain have emerged as the historical symbol of technological rejectionism. This oath-based organization of rural fabric and button makers was horrified by the mechanization of their crafts, as textile mills began replacing their small-town shops. They fashioned themselves as the liberals of that era, chosen by God to protect the pastoral English life they were so used to and to protest the disruptions of industrialization sparked by the machine technology and coal mining revolutions (the ‘disruptive’ technologies of that time) (Sale 1996).

History abounds with examples of epic misjudgments rooted in pessimism about the promise of emerging disruptive technologies. Consider a Western Union internal memo, dated 1876: “This ‘telephone’ has too many shortcomings to be seriously considered as a means of communication [and] is inherently of no value to us” (Wadhwa 2014). Or a comment by a British Member of Parliament in 1903: “I do not believe the introduction of motor-cars will ever affect the riding of horses” (van Wulfen 2016). Or the infamously flip quip by an executive editor at Prentice-Hall in 1957: “I have talked with the best people and I can assure you that data processing is a fad that won’t last out the year” (Sherman 2012).

Credit: Chris Labrooy

Resurrecting these anti-innovation sentiments is insightful because we are in the early stages of a once-in-a-generation, and maybe once-in-a-century, innovation earthquake that is making food safer, more nutritious and more abundant, and helping us fight the scourge of climate change. New techniques of biotechnology—from genetic modification to CRISPR (clustered regularly interspaced short palindromic repeats) gene editing—are propelling dramatic change in food and farming. But many in the media mainstream, spurred in part by self-described “progressive environmentalists,” will have none of it, and their views have sowed doubt amongst the public at large.

Reporting on food is not like covering City Hall—its product—food—is visceral, deeply personal, and cultural. When it comes to applying technology to farming, everyone has an opinion, informed or not. President Dwight Eisenhower, who was raised in Kansas farm country, became skeptical of reporters and Washington bureaucrats who misunderstood the Green Revolution and the role of synthetic pesticides and fertilizers that revolutionized global farming beginning in the 1940s and 1950s. “Farming looks mighty easy when your plow is a pencil and you’re a thousand miles from the corn field,” he quipped in a speech at Bradley University in 1956. He called critics of modern agriculture ‘synthetic farmers’ (Smith 2009).

GMO Rejectionism

The targeted manipulation of genes, which began in the 1980s and 1990s and became known as genetically modified organism (GMO) technology, has long been received with a similar mixture of alarmism and misreporting. Although there are many examples of nuanced critiques of biotech-inspired farming practices, much of the media coverage has been shaped by environmentalists and advocacy groups who define themselves as “liberal” but have adopted a Luddite-like precautionary view of GMOs and transgenic plants and, more recently, of the advances ushered in by gene editing and other new breeding techniques.

No surprise that the 2000s are marked by dozens of scientifically challenged, best-selling books (e.g. Seeds of Deception by Jeffrey Smith, 2003; The Omnivore’s Dilemma by Michael Pollan, 2006; The Unhealthy Truth by Robyn O’Brien, 2009) and documentaries (e.g. The World According to Monsanto, Marie-Monique Robin, 2008; GMO OMG, Jeremy Seifert, 2013; Sustainable, Matt Wechsler and Annie Speicher, 2016) that lack supporting evidence and thereby promote a pessimistic view of agricultural technology.


An unflattering meme has emerged about conventional farming and the agro-businesses that support it. Books, movies, and thousands of newspaper articles and online stories generated by advocacy groups and journalists conclude, with little variation in subtlety, that the world food system is dominated by rapacious transnational corporations and that biotechnology is making farmers more vulnerable, endangering our collective health, and, in its most apocalyptic expression, threatening the sustainability of our planet.

Titular leaders of the movement, such as Vandana Shiva, an Indian philosopher described by supporters as the “rock star” of progressive environmentalism, go so far as to reject the Green Revolution as a vestige of corrupt global capitalism. Shiva dismisses it as a symbol of the failure of 20th-century science and technology and of the ‘rational’ Enlightenment agenda itself. She rejects the use of synthetic fertilizers and pesticides altogether, criticizes agricultural biotechnology as an “assault on nature,” and promotes a return to small-scale, early 20th-century farming, even if it means a radical reduction in yields and lower incomes for farmers (Genetic Literacy Project 2019).

By and large, the arguments these biotechnology critics advance bewilder many scientists, farmers, and independent journalists because they do not address scientific risk or compare costs and benefits, and because they deify a prosperous “pastoral” farming past that never existed. Subtlety and nuance are not the currency of modern science journalism and advocacy lobbying.

Vandana Shiva

An unwillingness to recognize, let alone embrace, what might be called “innovation with reasonable risk” is not a new phenomenon, as proponents of the telephone, automobile, and computer can attest. Past critics share the common mistake of exaggerating the disruptions that accompany all innovation and under-appreciating the prosperity often ushered in by disruptive, paradigm-shifting innovation (Juma 2016). Which brings us to today.

It is dispiriting enough to see simplistic criticisms associated with an influential environmental organization; what makes this kind of statement so telling is that its perspective is mainstream among many ‘progressive’ groups throughout Europe, North America, and elsewhere. This technological pessimism is reflected in the tone and substance of mainstream media reporting on modern agriculture.

Sustainability Factor

Biotechnology is shaping up as the fundamental building block of innovation in the 2020s. CRISPR and other biotechnology tools are poised to make a tremendous impact on medicine, with gene editing and gene therapy promoting the development of new treatments and cures. As with any new technology, scientists need to apply the technology to confirm its safe use, with regulatory scientists conducting risk assessments that confirm the resulting products are no riskier than existing products. But the most immediate impact of the gene editing revolution is on food and farming and it is already ushering in an era of more sustainable agriculture.


Challenging the popular narrative in journalism that has helped shape consumer beliefs, organic, agro-ecological, and regenerative farming techniques may not be the most sustainable way to feed an expanding global population with the smallest ecological footprint while addressing climate-related agricultural challenges.

“Contrary to widespread consumer belief,” writes plant pathologist Dr. Steve Savage, “organic farming is not the best way to farm from an environmental point of view. There are now several cutting-edge agricultural practices which are good for the environment, but difficult or impossible for organic farmers to implement within the constraints of their pre-scientific rules” (Savage 2013).

Among the environmentally beneficial innovations ushered in by new breeding technologies:

  • GMO crops designed to be grown without tilling, which dramatically limits the release of carbon from the soil (Entine and Randall 2017).
  • Genetically engineered insect- and disease-resistant crops, from cotton and soybeans to eggplant and papaya, repel pests using a natural bacterium, which has resulted in as much as a 90% reduction in chemical usage compared to standard practice when weighted by environmental impact (Perry et al. 2016).
  • GMO and gene edited plant-based foods, such as the Impossible Burger (also Impossible Pork, Fish, etc.), which use up to 87% less water and 96% less land, generate 89% fewer greenhouse gas emissions, and emit 92% less dead zone-creating nutrient pollution than ground beef from cows (Impossible Burger Impact Report 2019).
  • CRISPR-engineered plants with climate-adaptive traits, such as heat tolerance (Yu et al. 2019), drought tolerance (Shi et al. 2017), and salt tolerance (Farhat et al. 2019).
  • Gene-edited, hardier produce staples (Cremer 2019) that last longer on shelves and develop fewer pathogens (Chandrasekaran et al. 2016), so that more food makes it from farm to plate, limiting waste.
  • CRISPR-engineered staple crops that produce less methane, cattle feed that is easier to digest, and cereal crops that can fix nitrogen directly (Miller and Jameel 2020).
  • Gene edited plants that enhance nutrition, such as Calyxt soybeans that are engineered to produce a “high oleic” oil with no trans fats and less saturated fat (Calyxt 2020).

This is a non-exhaustive list of the myriad sustainability benefits ushered in by biotechnological innovation. But these ecologically advanced agricultural products are sparsely covered by the most influential media sources and face ideological attacks from many nominally mainstream environmental organizations, including Greenpeace, Friends of the Earth, ETC Group, Third World Network, Center for Food Safety, Organic Consumers Association, and the Environmental Working Group—all of which reject the scientific consensus that gene editing and transgenic breeding are both efficacious and safe.

Rather, these and similar nongovernmental organizations (NGOs) often focus their analysis on abstract and unlikely unintended consequences that these new technologies may (or may not) encourage, while ignoring the sustainability benefits that are already being delivered. This is more commonly known as speculative science, in which there is no agreed-upon theory and no corroborating data.


References
Calyxt. 2020. One oil for all of your formulation needs.

Chandrasekaran, J., M. Brumin, D. Wolf, D. Leibman, C. Klap, M. Pearlsman, A. Sherman, T. Arazi, and A. Gal‐On. 2016. Development of broad virus resistance in non‐transgenic cucumber using CRISPR/Cas9 technology. Mol Plant Pathol 17:1140–1153

Cremer, J. 2019. Can these apples change the GMO conversation? 15 April 2019

Entine, J. and R. Randall. 2017. GMO sustainability advantage? Glyphosate spurs no-till farming, preserving soil carbon.

Farhat, S., N. Jain, N. Singh, R. Sreevathsa, P. K. Dash, R. Rai, S. Yadav, P. Kumar, A. K. Sarkar, A. Jain, N. K. Singh, and V. Rai. 2019. CRISPR-Cas9 directed genome engineering for enhancing salt stress tolerance in rice. Semin Cell Dev Biol 96: 91–99

Genetic Literacy Project. 2019. Vandana Shiva: ‘Rock Star’ of GMO protest movement has antiscience history.

Impossible Foods. 2019. Impact Report 2019.

Juma, C. 2016. Innovation and Its Enemies: Why People Resist New Technologies. Oxford University Press, Oxford

Miller, L. and A. L. Jameel. 2020. Making real a biotechnology dream: nitrogen-fixing cereal crops. MIT News

O’Brien, R. 2009. The Unhealthy Truth. Penguin Random House, New York

Perry, E. D., F. Ciliberto, D. A. Hennessy, and G. C. Moschini. 2016. Genetically engineered crops and pesticide use in U.S. maize and soybeans. Science Advances 2 (8): e1600850, doi:10.1126/sciadv.1600850

Pollan, M. 2006. The Omnivore’s Dilemma: A Natural History of Four Meals. Penguin Press, London

Sale, K. 1996. Rebels Against the Future: The Luddites and Their War on the Industrial Revolution: Lessons for the Computer Age. Perseus Publishing, Cambridge, Massachusetts

Savage, S. 2013. Six reasons organic is NOT the most environmentally friendly way to farm.

Sherman, R. J. 2012. Supply Chain Transformation: Practical Roadmap to Best Practice Results. John Wiley & Sons, Hoboken, New Jersey

Shi, J., H. Gao, H. Wang, H. R. Lafitte, R. L. Archibald, M. Yang, S. M. Hakimi, H. Mo and J. E. Habben. 2017. ARGOS8 variants generated by CRISPR‐Cas9 improve maize grain yield under field drought stress conditions. Plant Biotechnol J 15 (2): 207–216

Smith J. 2003. Seeds of Deception: Exposing Industry and Government Lies About the Safety of the Genetically Engineered Foods You’re Eating. Yes! Books, Portland, Maine

Smith, C. 2009. Fifty-three-year-old quote still rings true today. Corn South.

van Wulfen, G. 2016. 10 great ideas that were originally rejected. Innovation Excellence

Wadhwa, V. 2014. Why we should believe the dreamers and not the experts. Washington Post, 31 July 2014.

Yu, W., L. Wang, R. Zhao, J. Sheng, S. Zhang, R. Li, and L. Shen. 2019. Knockout of SlMAPK3 enhances tolerance to heat stress involving ROS homeostasis in tomato plants. BMC Plant Biology 19 (354), doi:10.1186/s12870-019-1939-z

Jon Entine is founder and executive director of the Genetic Literacy Project. Jon is also known for his research and writings on corporate social responsibility and environmental sustainability, and was US editor for 15 years of the UK-based publication Ethical Corporation. Follow him on Twitter @JonEntine


In the midst of the coronavirus pandemic, Daniel Defoe’s account of London’s 1665 bubonic plague offers a shock of recognition

Pandemics have punctuated recorded history going back to ancient Greece and Egypt. However, the novel coronavirus pandemic is unfolding in a world that is qualitatively different due to densely-populated cities, long-distance air travel, and modern medicine and genomics. So it is understandable that we assume that our experience of a pandemic in the 21st century could have little in common with that of periods predating antibiotics, vaccines, and the germ theory of disease.

But it is hubris to think that material and technological progress makes our era totally discontinuous with the past, and that the experience of epidemics of plague, cholera, and other diseases that were a regular occurrence until recently has nothing in common with what we are experiencing.

We are in the midst of what Ed Yong of The Atlantic termed a “patchwork pandemic” – characterized by different geographic areas, different population groups, and different historical legacies. While the world awaits the development of an effective vaccine as well as treatments that can tamp down the ravages of a capricious virus, public health officials are exhorting us to rely on the most rudimentary, age-old tools for keeping the virus at bay – wearing a mask, hand-washing, and social distancing and lockdowns – in other words, treating people we don’t know as potential threats. Every virus and every bacterium has its distinct personality, and, yet, looking back at the history of pandemics, the ways in which human societies have responded to the upheaval and terror provoked by a poorly-understood microorganism have striking commonalities.  For this reason, chronicles of disease outbreaks from the past can provoke a shock of recognition.

Daniel Defoe, author of the classic Robinson Crusoe, is also renowned for his classic description of an epidemic, A Journal of the Plague Year. Defoe was five years old when the bubonic plague came to London in 1665. He must have heard stories of the plague as a child, and in 1722 he published a gripping account of life during the plague. His Journal is a sleight-of-hand. Though actually a work of fiction written more than fifty years after the events, it presents itself as a contemporaneous, first-hand, neighborhood-by-neighborhood, eye-witness narrative of what life was like during the “visitation” by the plague. For his chronicle Defoe drew on a small library of contemporaneous accounts.

Frontispiece of original edition of A Journal of the Plague Year, 1722.

Owing to the vividness and immediacy of the narrator’s description of the effects of the epidemic on ordinary people in this street or that neighborhood of the city, the Journal has become the most famous account of the Great Plague of London, displacing contemporaneous accounts by actual witnesses.

A failed businessman-turned-journalist, who started writing fiction later in life, Defoe originated a new style of writing that dispensed with aristocratic literary conventions, relying instead on empiricism and realism. His narrator tells us that his journal is based only on what he has observed directly in his walks about the city, what he has heard from credible persons, and the weekly “bills of mortality” published by the city of London. On occasion, he refers to events which he feels obliged to report but which he can’t vouch for.

As in Robinson Crusoe, the narrator of the Journal finds himself in a situation in which he must summon up all his wits and energy to survive an overpowering, incommensurate threat.  While concerned for his own safety and his business, his single-minded focus is on the impact of the plague on the city of London, which is his protagonist.  We are at his side as he describes what he sees as he moves about the city and provides his coordinates, which would have been familiar to any Londoner of the 18th century – “that is to say, in Bearbinder Lane, near Stocks Market.”  The city’s inhabitants are characterized only insofar as they are affected by the “distemper” and make decisions about how to respond to it.


At first the plague, which the narrator tells us has come to London from Holland, manifests itself by a few isolated cases in the winter of 1664-65. But in February it flares up in parishes to the west of the city and gradually makes its way eastward, methodically visiting formerly untouched parishes. In the course of a year, it has reached every corner of England. Early in the outbreak the wealthy flee the city with their servants to their country houses, and the narrator, who initially considers fleeing, remarks that there were no horses left in the city.

In spite of his belief in Providence, the narrator emphasizes that transmission of the infection requires close personal contact, often within families, or with contaminated belongings, food, or cargo.  Although the plague may have been sent by a Divine power to punish men for their sins, he makes clear that natural causes are entirely sufficient to account for the spread of the disease and its effects on its victims.

I must be allowed to believe that no one in this whole nation ever received the sickness or infection but who received it in the ordinary way of infection from somebody, or the clothes or touch or stench of somebody who was infected before.1

He describes the high transmissibility of the plague (which we now know to be caused by the bacterium Yersinia pestis, spread by fleas that fed on the black rat) and the unbearable sensitivity and pain caused by its pathognomonic feature – buboes – which, he tells us, drove sufferers to throw themselves out of windows or into the Thames. He also notes that asymptomatic cases could spread the infection and that the disease can manifest differently in different people. Houses where people took sick were shut up and padlocked by order of the magistrate, and watchmen were posted outside day and night to ensure – not always successfully – that the imprisoned could not escape. The narrator describes the pitiful cries that were heard from the street as family members discovered that a loved one had succumbed to the plague. Others, he tells us, died in the street. The bodies of the deceased were collected at night and taken to pits dug in churchyards or in open lots and buried en masse.

Defoe’s narrator describes the desperate condition of the poor, who, thrown out of work, could not buy food or other necessities for their families.  In this situation, he tells us, they had no choice but to perform the most dangerous jobs created by the epidemic – tending to the sick and collecting and burying the dead. He notes that “the plague, which raged in a dreadful manner from the middle of August to the middle of October, carried off in that time thirty or forty thousand of these very people.” Once the plague makes itself felt, the common people, who are keenly attuned to astrological signs and portents, are desperate to ward off its spread and chase after an abundant array of fake cures and elixirs:

[The common people] … were now led by their fright  to extremes of folly; … they ran to conjurors and witches, and all sorts of deceivers, to know what should become of them (who fed their fears, and kept them always alarmed and awake on purpose to delude them and pick their pockets) so they were as mad upon their running after quacks and mountebanks, and every practicing old woman, for medicines and remedies; storing themselves with such multitudes of pills, potions, and preservatives, as they were called, that they not only spent their money but even poisoned themselves beforehand for fear of the poison of the infection.2

Throughout his chronicle, the narrator anxiously scrutinizes the weekly “bills of mortality” published for each parish to gauge the progress of the infection.  He knows from observing what is going on around him that the numbers of deaths attributed to the plague are grievously under-reported due to relatives’ fear of being stigmatized and the authorities’ connivance.  He tries to judge the true magnitude of the deaths from the plague by examining increases in other causes of death and by comparing the overall death rate in a parish before the outbreak to the numbers when the plague was present all around him.

Bill of mortality for a week summarizing deaths from all London parishes.

Deaths increased at an extraordinary rate in July and August, reaching a peak in September when, we are told, each week there were 8,000 or 9,000 deaths from the plague, and this he considers an underestimate. Thereafter, the deaths began to decline precipitously, and the outbreak abated. According to the bills of mortality, 68,590 people died of the plague, while the narrator claims that the total reached 100,000. This amounts to twenty percent of the population.

Graph of weekly deaths from the plague, 1665-66, shows the peak occurring in September. From Samuel Pepys’s Diary.

Looking back at the events after a half-century, Defoe knew the outcome of the “visitation,” and his account has a taut unity of time and place. Here in the United States, five months into the SARS-CoV-2 pandemic, many observers are watching in horror and disbelief as the virus spirals out of control in states in the South and West that failed to take it seriously and refused to learn from the experience of other states, and countries, that succeeded in bringing their outbreaks under control. At the same time, some states and countries that successfully broke the chain of transmission are experiencing resurgences.


For all the differences between the London plague and our pandemic, there are striking commonalities. As in Defoe’s London, the poor and the weak are disproportionately exposed to the coronavirus in low-income neighborhoods, close living-quarters, and low-wage jobs.  As in Defoe’s London, many people refuse to follow common-sense precautions, instead falling for quackery and scientifically unsupported treatments. As in London, statistics regarding the number of cases and deaths from Covid-19 are manipulated and misinterpreted to suit the narrative of different parties.

On the most basic level, what we share with the London outbreak is the massive, sudden upsurge in morbidity and mortality, and the inescapable sense that a pathogen is beyond our control. In London of 1665 the bills of mortality were widely distributed so that residents could know the situation in their parish, and neighbors shared the latest news of who had fallen ill and died.  In 2020, normal life has been replaced by a profusion of images, charts, statistics, and stories, which have flooded the media since March.  These convey the fever chart of the epidemic in different places together with stories of intensive care units stretched to capacity and the bios of individuals who have been lost.  At the same time, we are subjected to a constant flow of interviews with health care workers, epidemiologists, and public health officials, who interpret the day-to-day trends in an effort to explain where things are headed.

Although we pride ourselves on being modern and are used to thinking that we have control over our lives, at the present moment we have to admit that we have no idea of how this “visitation” will play itself out.

  1. Daniel Defoe, A Journal of the Plague Year, Penguin edition, 1966, p. 206.
  2. A Journal, p. 30.

Geoffrey Kabat is an epidemiologist and the author, most recently of Getting Risk Right: Understanding the Science of Elusive Health Risks. Geoffrey can be found on Twitter @GeoKabat


Eerily similar? Examining fates of the rich and poor during COVID-19 and 14th century Black Death pandemics

The coronavirus can infect anyone, but recent reporting has shown your socioeconomic status can play a big role, with a combination of job security, access to health care and mobility widening the gap in infection and mortality rates between rich and poor.

The wealthy work remotely and flee to resorts or pastoral second homes, while the urban poor are packed into small apartments and compelled to keep showing up to work.

As a medievalist, I’ve seen a version of this story before.

Following the 1348 Black Death in Italy, the Italian writer Giovanni Boccaccio wrote a collection of 100 novellas titled “The Decameron.” These stories, though fictional, give us a window into medieval life during the Black Death – and how some of the same fissures opened up between the rich and the poor. Cultural historians today see “The Decameron” as an invaluable source of information on everyday life in 14th-century Italy.

Boccaccio was born in 1313 as the illegitimate son of a Florentine banker. A product of the middle class, he wrote, in “The Decameron,” stories about merchants and servants. This was unusual for his time, as medieval literature tended to focus on the lives of the nobility.

“The Decameron” begins with a gripping, graphic description of the Black Death, which was so virulent that a person who contracted it would die within four to seven days. Between 1347 and 1351, it killed between 40% and 50% of Europe’s population. Some of Boccaccio’s own family members died.

In this opening section, Boccaccio describes the rich secluding themselves at home, where they enjoy quality wines and provisions, music and other entertainment. The very wealthiest – whom Boccaccio describes as “ruthless” – deserted their neighborhoods altogether, retreating to comfortable estates in the countryside, “as though the plague was meant to harry only those remaining within their city walls.”

Meanwhile, the middle class or poor, forced to stay at home, “caught the plague by the thousand right there in their own neighborhood, day after day” and swiftly passed away. Servants dutifully attended to the sick in wealthy households, often succumbing to the illness themselves. Many, unable to leave Florence and convinced of their imminent death, decided to simply drink and party away their final days in nihilistic revelries, while in rural areas, laborers died “like brute beasts rather than human beings; night and day, with never a doctor to attend them.”

Josse Lieferinxe’s ‘Saint Sebastian Interceding for the Plague Stricken’ (c. 1498). Credit: Wikimedia Commons

After the bleak description of the plague, Boccaccio shifts to the 100 stories. They’re narrated by 10 nobles who have fled the pallor of death hanging over Florence to luxuriate in amply stocked country mansions. From there, they tell their tales.

One key issue in “The Decameron” is how wealth and advantage can impair people’s ability to empathize with the hardships of others. Boccaccio begins the foreword with the proverb, “It is inherently human to show pity to those who are afflicted.” Yet in many of the tales he goes on to present characters who are sharply indifferent to the pain of others, blinded by their own drives and ambition.

In one fantasy story, a dead man returns from hell every Friday and ritually slaughters the same woman who had rejected him when he was alive. In another, a widow fends off a leering priest by tricking him into sleeping with her maid. In a third, the narrator praises a character for his undying loyalty to his friend when, in fact, he has profoundly betrayed that friend over many years.

Humans, Boccaccio seems to be saying, can think of themselves as upstanding and moral – but unawares, they may show indifference to others. We see this in the 10 storytellers themselves: They make a pact to live virtuously in their well-appointed retreats. Yet while they pamper themselves, they indulge in some stories that illustrate brutality, betrayal and exploitation.

Boccaccio wanted to challenge his readers, and make them think about their responsibilities to others. “The Decameron” raises the questions: How do the rich relate to the poor during times of widespread suffering? What is the value of a life?

In our own pandemic, with millions unemployed due to a virus that has killed thousands, these issues are strikingly relevant.

Kathryn McKinley is a professor of English at the University of Maryland. Her research and teaching interests include Chaucer; Ovid, Boccaccio, and late medieval vernacularity; medieval visual literacy and material culture; and the history of later medieval European and English food culture, food scarcity, and famine. She has published in such journals as The Chaucer Review, Viator, and English Manuscript Studies 1100-1700. 

A version of this article was originally published at the Conversation and has been republished here with permission. The Conversation can be found on Twitter @ConversationUS


Disaster interrupted: Which farming system better preserves insect populations: Organic or conventional?

A three-year run of fragmentary Armageddon-like studies had primed the journalism pumps and settled the media framing about the future of the global insect population: modern agriculture was steering us toward catastrophe.

But scientists remained queasy about what they increasingly came to believe was a simplistic narrative. None of the studies reaching ‘disaster conclusions’ was comprehensive. All were steeped in assumptions that could radically skew the data. Most of the world’s insect population centers were not even studied. And the declines were far from uniform. In some localities, there were reports of increases in overall insect populations, and some types of insects are increasing in abundance across the world.

[Editor’s note: This is part two of a two-part series examining the “Insect Armageddon” narrative. Read part one, Are we facing an ‘Insect Apocalypse’ caused by ‘intensive, industrial’ farming and agricultural chemicals? The media say yes; Science says ‘no’]

Which brings us to the 2020 meta-study of 166 long-term surveys by Roel van Klink at the German Center for Integrative Biology and his team of 30 scientists. For the first time, scientists had a full platter of studies, covering much of the world. Here was data that might answer questions that by now had turned highly ideological.

Roel van Klink

The few journalists who picked up on the study’s release noted the finding that insect declines were far less than reported in the smaller-scale studies, and indeed, no catastrophe was imminent. In fact, the team found, freshwater insects like mayflies and dragonflies have actually increased over the years, and insect declines in the US, especially in Midwest agricultural areas, began leveling off at the turn of the century.

That doesn’t mean there isn’t a real and significant problem, as van Klink took pains to point out—he called the situation “awfully alarming.” But the difference between a “hair on fire” apocalypse and a serious problem is that there is time to get a better understanding of the causes and, hopefully, make rational decisions to constructively address them.

And it was precisely on the question of causation that the new study fundamentally challenged the “accepted narrative” that modern agriculture and the overuse of pesticides are driving the observed declines.

Effects of modern agriculture

Van Klink’s finding that “crop cover,” which is the phrase he uses to describe farmland, is correlated with increases in insect populations runs directly contrary to the speculations—more often than not presented as fact—that modern farming, especially the use of GMOs and pesticides, is the problem.

The second bugaboo, climate change, also didn’t appear on the suspect list; there was simply no correlation, positive or negative. The primary driver was urbanization, most likely due to the destruction of natural habitat as swamps are drained, rivers channelized, woodlands cleared and land is paved over for housing developments, roadways and shopping malls.

We found moderate evidence for a negative relationship between terrestrial insect abundance trends and landscape-scale urbanization, potentially explained by habitat loss and light and/or chemical pollution associated with urbanization. By contrast, insect abundance trends were positively associated with crop cover at the local (but not landscape) scale in both realms. Specifically, in the terrestrial realm, temporal trends became less negative with increasing crop cover …

Of course, the positive association between agriculture and insect population increases applies to existing fields, not forest or natural grassland cleared for cultivation. As van Klink has pointed out in interviews, the conversion of land to accommodate more farming would also destroy habitat.

But that is exactly the point if sustainability is the key: using technology to boost yields on existing cropland—growing more food on less land—is the most important action we can take to protect habitat and biodiversity.

And that’s what’s been happening. In a 2013 paper titled “Peak Farmland and the Prospect for Land Sparing,” three scholars at Rockefeller University calculated that global increases in crop yields as the result of advanced technologies, including genetic engineering, meant it took about one-third the amount of land in 2010 to grow the same amount of food as in 1961.

The graphs below, taken from the paper, highlight an event that has since been replicated around the world: after World War II total agricultural production, which until then had been largely circumscribed by the amount of land under cultivation, began a steep ascent as farming entered the modern era.


[To see this process unfold in time, check out the animated charts on crop yields at Our World in Data.]

The boom happened almost simultaneously across the world, from rice in China to wheat in France and Egypt.


The spur for these dramatic productivity gains is no mystery. After World War II, many of the key agricultural inputs—particularly modern pesticides, synthetic fertilizers, and advanced hybrid crops—came online in a major way. The rise accelerated with the advent of the Green Revolution in the early 1960s, and began to be widely dispersed around the world, rescuing many countries, such as India, from the brink of mass starvation.

It is this unprecedented historic decoupling of production from land—what has become known as intensive agriculture—that so many in the environmental movement demonize and seek to reverse. One of their central claims: intensive farming is the primary culprit driving biodiversity loss and insect declines.

Yet, a careful look at the data shows the narrative touting small-scale organic-focused farming as a necessary alternative is outdated, even reactionary, writes Ted Nordhaus at the Breakthrough Institute:

Low-productivity food systems have devastating impacts on the environment. As much as three-quarters of all deforestation globally occurred prior to the Industrial Revolution, almost entirely due to two related uses, clearing land for agriculture and using wood for energy.

… attempting to feed a world of seven-going-on-nine billion people with a preindustrial food system would almost certainly result in a massive expansion of human impacts through accelerated conversion of forests, grasslands, and other habitat to cropland and pasture.

… we need to accelerate the long-term processes of growing more food on less land. … raising yields while reducing environmental impacts will require that we farm with ever-greater precision. Raising yields through greater application of technology has often meant more pesticides, fertilizer, and water. But as technology has improved, these trends have begun to reverse.

The organic deficit

The charm of farmers markets, Nordhaus writes, is not reason enough to abandon a system that is limiting land use to counter the effects of urbanization and driving down chemical toxicity levels. It should be noted that organic farming yields on average 10-40 percent less than non-GMO conventional farming, which in turn is about 15 percent less productive than farms using advanced biotechnology. A recent study by the organic advocacy group IDDRI found that if Europe were to adopt agroecological food production practices, productivity would decrease by an average of 35 percent—meaning that roughly 50 percent more cultivated land would be needed to produce the same amount of food as produced conventionally, since the same output at 65 percent of the yield requires about 1.5 times the area.
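The land-use arithmetic behind yield deficits is worth making explicit, because the relationship between a yield drop and the extra land it implies is reciprocal, not linear. A minimal sketch (the 35 percent figure is the one quoted above; the function itself is illustrative, not taken from any cited study):

```python
def extra_land_needed(yield_deficit: float) -> float:
    """Fraction of additional land needed to produce the same total
    output when yields fall by `yield_deficit` (e.g. 0.35 for 35%)."""
    return 1.0 / (1.0 - yield_deficit) - 1.0

# A 35 percent yield decrease requires ~54 percent more land,
# not 35 percent more, because land scales as 1/(1 - deficit).
print(round(extra_land_needed(0.35), 2))  # 0.54
```

The same function shows why even modest yield deficits compound into large land footprints: a 25 percent deficit already implies about a third more land.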

Organic agriculture uses more land than no-till intensive agriculture, seen here. Credit: Shutterstock

The math of land saving through modern technologies is so compelling, and the yield deficits of organic production so thoroughly cataloged, that they can’t be gainsaid. Anti-technology advocates generally prefer to avoid the topic altogether, focusing instead on Goulson-style claims about the adverse effects of chemical pesticides. They also ignore organic farmers’ reliance on mechanical plowing with carbon-belching equipment as a form of weed control, a practice that is massively destructive to soil health and biodiversity and a major contributor to carbon pollution.

The major sustainability contribution of conventional agriculture is the advent of no-till farming, which began with the use of chemical herbicides like atrazine and accelerated with the debut in 1996 of herbicide-tolerant GMO crops tied to glyphosate. GMO no-till farming has resulted in a massive reduction in carbon release estimated at 37 percent by the Belgian research institute VIB.

The turn away from efficient, intensive agriculture to accommodate the ideological fashion of our times could be a disaster for the fragile insect population. Population growth and growing affluence in the developing world over coming decades will require a sharp increase in necessary food calories, which can only occur by expanding farmable acreage—or by increasing yields on currently available acres.

All of these facts make the German meta-study very uncomfortable for organic farming advocates. The finding that insect populations increase with crop cover challenges the widespread damage to biodiversity they have been claiming. That may be why most of the major media reporting on the study, such as the BBC, simply ignored the finding, while others—the Guardian, Reuters, Smithsonian—inserted swipes at pesticides that the study authors never raised, written in such a way that the average reader would assume they were backed by the research.

How fast is the decline? How real is the decline?

Trying to determine a global rate of decline, when the data is so uneven and, as the authors say, almost all effects are local and variations are so high even among adjacent sites, is fraught with difficulty. Nevertheless, the new study pins the rate of decline of land-based insects at just under one percent annually, which translates to an 8.3 percent decline per decade.

The study authors note some caveats about the scope of the global decline, explaining that it was heavily influenced by what they term “outlier” studies with anomalously high findings. If these outliers are excluded, they say, insect populations would decline by far less, about 15 percent over 25 years.
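The headline rates quoted above can be converted between time scales with simple compound arithmetic; the conversion is standard, and the specific rates are the ones reported in the article. A quick sketch:

```python
def decline_over_period(annual_rate: float, years: int) -> float:
    """Cumulative fractional decline over `years`, assuming a constant
    annual fractional decline of `annual_rate`."""
    return 1.0 - (1.0 - annual_rate) ** years

def annual_from_period(period_decline: float, years: int) -> float:
    """Constant annual decline implied by a total decline over `years`."""
    return 1.0 - (1.0 - period_decline) ** (1.0 / years)

# "Just under one percent" per year (~0.86%) compounds to ~8.3% per decade.
print(round(decline_over_period(0.0086, 10) * 100, 1))  # 8.3

# The outlier-excluded estimate of 15% over 25 years implies ~0.65% per year.
print(round(annual_from_period(0.15, 25) * 100, 2))  # 0.65
```

Note that compounding makes the per-decade figure slightly smaller than ten times the annual rate, which is why just under one percent a year yields 8.3 percent, not 10 percent, per decade.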


This too is not good, but it is not an apocalypse, and there is time to turn things around even if the estimated trends are accurate. That hopeful take is actually supported by another important, though largely ignored, finding in the study: terrestrial insect trends in North America were no longer negative after 2000, and freshwater insects increased dramatically.

The fact that North American trends began plateauing or improving around 20 years ago suggests we are headed in the right direction in what had been up until then, according to the authors, the worst performing part of the world. Statistically speaking, once North American data was excluded, the study states that there was only “weak evidence for a negative mean trend” in global populations.

Geography and models

We all naturally gravitate to the headline numbers coming out of these studies. They’re simple, easy to remember and give us a sense of concreteness. Unfortunately, they are probably the least reliable and meaningful findings of all. If the ongoing COVID-19 pandemic has taught us anything, it’s that we should understand complex statistical modeling for what it is: a hypothesis generator or a sophisticated “best guess,” given current knowledge, that may, as more facts come to light, prove to be anything from fairly close to wildly off the mark.

All one has to do is look at the maps of the geographic distribution of the studies included in van Klink’s analysis to realize just how problematic any conclusion about global trends is, considering the lack of data from most of the world. The vast majority of studies came from North America and Europe (by my count, almost two-thirds of all the studies).


There are only two studies from all of Africa, relatively few from Asia, and none at all from South Asia (India, Pakistan and Bangladesh). There is a single study from the Amazon, one of the richest sources of insect life on the planet.

These gaps are magnified by the fact that most of these studies concern only one specific order or family of insect, or some other sub-division (e.g. parasitoid wasps). But we know that different insect species vary enormously in their response to changes in climate, weather, disease, pollution and habitat destruction. It simply isn’t plausible that a model can compensate for what is, unfortunately, a massive quantity of unknowns, including, to borrow a phrase, many unknown unknowns, when it comes to insect population trends. Simply said, the likelihood of sampling error is immense.
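Why geographic gaps of this kind bias a global estimate can be seen with a toy calculation. In the sketch below, every number and region label is invented for illustration (none comes from the van Klink data): regions differ in their true abundance trend, but surveys are concentrated in two of them, so the naive average over surveys need not resemble the mean across regions.

```python
# Hypothetical true annual abundance trend per region (fraction per year).
true_trend = {"Europe": -0.010, "N. America": -0.005,
              "Africa": 0.002, "S. Asia": -0.020, "Amazon": -0.015}

# Hypothetical number of surveys per region, skewed toward two regions,
# mimicking the coverage imbalance described above.
surveys = {"Europe": 60, "N. America": 50,
           "Africa": 2, "S. Asia": 0, "Amazon": 1}

# Naive estimate: average the trend over surveys, so heavily sampled
# regions dominate the result.
total_surveys = sum(surveys.values())
naive = sum(true_trend[r] * n for r, n in surveys.items()) / total_surveys

# Reference point: the mean trend if every region counted equally.
equal_mean = sum(true_trend.values()) / len(true_trend)

print(f"survey-weighted estimate: {naive:.4f}")      # dominated by Europe/N. America
print(f"equal-region mean:        {equal_mean:.4f}")
```

With these made-up numbers the survey-weighted estimate comes out noticeably milder than the equal-region mean, simply because the unsampled regions happen to have the steepest declines; with different invented trends the bias could just as easily run the other way, which is the point about unknown unknowns.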

It should be emphasized that none of this is to take away from the prodigious work of the van Klink research team. Almost all the criticisms outlined here are acknowledged and discussed by the authors themselves.

One of the most refreshing aspects of this study, in fact, has been the humility with which this team, which has done some of the best and most thorough work yet trying to establish global insect trends, has presented their results. In an article accompanying the study in Science, addressed to researchers not associated with the project, the team points the way forward for others in this field, and indeed in any scientific endeavor.

Advances in our knowledge about ongoing biodiversity changes and ability to predict future ones will require the incorporation of layers of nuance in patterns of change and drivers of that change.

The temptation to draw overly simple and sensational conclusions is understandable, because it captures the attention of the public and can potentially catalyze much needed action in policy development and research arenas. However, fear-based messages often backfire. This strategy has the grave risk of undermining trust in science and can lead to denialism, fatigue, and apathy. Embracing nuance allows us to balance accurate reporting of worrying losses with hopeful examples of wins. Hope is a more powerful engine of change than fear.

Jon Entine is Executive Director of the Genetic Literacy Project and a life-long journalist with 20 major journalism awards. Follow him on Twitter @JonEntine