The first drug against HIV brought dying patients back from the brink. But as excited doctors raced to get the miracle drug to new patients, the miracle melted away. In each and every patient, the drug worked only for a while.
It turned out the drug was very good at killing the virus, but the virus was even better at evolving resistance to the drug. A spontaneous mutation in the virus’ genetic material prevented the drug from doing its work, and so the mutant viruses were able to replicate wildly despite the drug, making the patients sick again. It took another decade before scientists found evolution-proof therapies.
Could the same thing happen to a COVID-19 vaccine? Could a vaccine that is safe and effective in initial trials go on to fail because the virus evolves its way out of trouble? As evolutionary microbiologists who have studied a poultry virus that has evolved resistance to two different vaccines, we know such an outcome is possible. We also think we know what it takes to stop it. COVID-19 vaccines could fail – but if they have certain properties, they won’t.
History of vaccine resistance
For the most part, humanity has been lucky: Most human vaccines have not been undermined by microbial evolution.
For instance, the smallpox virus was eradicated because it never found a way to evolve around the smallpox vaccine, and no strain of the measles virus has ever arisen that can beat the immunity triggered by the measles vaccine.
But there is one exception. A bacterium that causes pneumonia managed to evolve resistance against a vaccine. Developing a replacement vaccine was expensive and time-consuming: seven years passed between the initial emergence of resistant strains and the licensing of the new vaccine.
If SARS-CoV-2 evolves in response to a COVID vaccine, there are several directions it could take. The most obvious is what happens with the flu virus. Immunity works when antibodies or immune cells bind to molecules on the surface of the virus. If mutations change those surface molecules, antibodies can’t grab on to them as tightly and the virus is able to escape. This process explains why the seasonal flu vaccine needs updating each year. If this happens, a COVID vaccine would need frequent updating too.
But evolution might head off in other directions. It would be better for human health, for example, if the virus evolves a stealth mode, perhaps by reproducing slowly or hiding in organs where immunity is less active. Many pathogens that cause barely noticeable chronic infections have taken this tack. They avoid detection because they do not cause acute disease.
A more dangerous avenue would be if the virus evolved to replicate faster than vaccine-generated immunity can suppress it. Another strategy would be for the virus to target the immune system itself and dampen vaccine-induced immunity.
Many microbes can survive inside the human body because of their exquisite ability to interfere with our immune systems. If SARS-CoV-2 has ways of even partially disabling human immunity, a COVID vaccine could favor mutants that do it even better.
Before COVID came along, the two of us compared vaccines that keep working with vaccines that have been undermined by pathogen evolution.
It turns out that truly evolution-proof vaccines have three features. First, they are highly effective at suppressing viral replication. This stops further transmission. No replication, no transmission, no evolution.
Second, evolution-proof vaccines induce immune responses that attack several different parts of the microbe at the same time. It is easy for a single part of the virus to mutate and escape being targeted. But if many sites are attacked at once, immune escape requires many separate escape mutations to occur simultaneously, which is almost impossible. This has already been shown in the laboratory for SARS-CoV-2: the virus rapidly evolved resistance to antibodies targeting a single site, but struggled to evolve resistance to a cocktail of antibodies targeting several different sites.
Third, evolution-proof vaccines protect against all circulating strains, so that no others can fill the vacuum when competitors are removed.
Will a COVID vaccine be evolution-proof?
Around 200 COVID vaccine candidates are at various stages of development. It is too soon to know how many of them have those evolution-proofing features.
Fortunately we don’t need to wait until a licensed vaccine fails to find out. A bit of extra effort during vaccine trials can go a long way to working out whether a vaccine will be evolution-proof. By swabbing people who have received the experimental vaccine, scientists can tell how far virus levels are suppressed. By analyzing the genome of any virus in vaccinated people, it might be possible to see evolutionary escape in action. And by taking blood from vaccinees, we can work out in the lab how many sites on the virus are being attacked by vaccine-induced immunity.
Clearly, the world needs COVID vaccines. We believe it is important to pursue those that will keep working. Likely, many candidates in the current portfolio will. Let’s work out which those are in clinical trials and go with them. Vaccines that provide only temporary relief leave people vulnerable and take time and money to swap out. They may also negate other vaccines should viruses evolve that are resistant to several vaccines at once.
Today, the world has insecticide-resistant mosquitoes and crop pests, herbicide-resistant weeds, and an antibiotic resistance crisis. No need for history to repeat itself.
Andrew Read is the Evan Pugh University Professor of Biology and Entomology and Director of the Huck Institutes of the Life Sciences at Penn State. His team works on virulence and infectiousness, adaptation to new hosts, vaccine failure, and drug and insecticide resistance.
David Kennedy is an Assistant Professor of Biology at Penn State. Research in the Kennedy lab focuses on gaining a mechanistic understanding of disease ecology that can be used to understand pathogen evolution. Find David on Twitter @dkenned11
A version of this article was originally posted at the Conversation and has been reposted here with permission. The Conversation can be found on Twitter @ConversationUS
Pandemics have punctuated recorded history going back to ancient Greece and Egypt. However, the novel coronavirus pandemic is unfolding in a world that is qualitatively different due to densely populated cities, long-distance air travel, and modern medicine and genomics. So it is understandable that we assume that our experience of a pandemic in the 21st century could have little in common with that of periods predating antibiotics, vaccines, and the germ theory of disease.
But it is hubris to think that material and technological progress makes our era totally discontinuous with the past, and that the experience of epidemics of plague, cholera, and other diseases that were a regular occurrence until recently has nothing in common with what we are experiencing.
We are in the midst of what Ed Yong of The Atlantic termed a “patchwork pandemic”, one that plays out differently across geographic areas, population groups, and historical legacies. While the world awaits the development of an effective vaccine as well as treatments that can tamp down the ravages of a capricious virus, public health officials are exhorting us to rely on the most rudimentary, age-old tools for keeping the virus at bay: wearing a mask, hand-washing, social distancing, and lockdowns. In other words, treating people we don’t know as potential threats. Every virus and every bacterium has its distinct personality, and yet, looking back at the history of pandemics, the ways in which human societies have responded to the upheaval and terror provoked by a poorly understood microorganism have striking commonalities. For this reason, chronicles of disease outbreaks from the past can provoke a shock of recognition.
Daniel Defoe, author of the classic Robinson Crusoe, is also renowned for his classic description of an epidemic, A Journal of the Plague Year. Defoe was five years old when the bubonic plague came to London in 1665. He must have heard stories of the plague as a child, and in 1722 he published a gripping account of life during the plague. His Journal is a sleight-of-hand. Though actually a work of fiction written more than fifty years after the events, it presents itself as a contemporaneous, first-hand, neighborhood-by-neighborhood, eye-witness narrative of what life was like during the “visitation” by the plague. For his chronicle Defoe drew on a small library of contemporaneous accounts.
Owing to the vividness and immediacy of the narrator’s description of the effects of the epidemic on ordinary people in this street or that neighborhood of the city, the Journal has become the most famous account of the Great Plague of London, displacing contemporaneous accounts by actual witnesses.
A failed businessman-turned-journalist, who started writing fiction later in life, Defoe originated a new style of writing that dispensed with aristocratic literary conventions, relying instead on empiricism and realism. His narrator tells us that his journal is based only on what he has observed directly in his walks about the city, what he has heard from credible persons, and the weekly “bills of mortality” published by the city of London. On occasion, he refers to events which he feels obliged to report but which he can’t vouch for.
As in Robinson Crusoe, the narrator of the Journal finds himself in a situation in which he must summon up all his wits and energy to survive an overpowering, incommensurate threat. While concerned for his own safety and his business, his single-minded focus is on the impact of the plague on the city of London, which is his protagonist. We are at his side as he describes what he sees as he moves about the city and provides his coordinates, which would have been familiar to any Londoner of the 18th century – “that is to say, in Bearbinder Lane, near Stocks Market.” The city’s inhabitants are characterized only insofar as they are affected by the “distemper” and make decisions about how to respond to it.
At first the plague, which the narrator tells us has come to London from Holland, manifests itself by a few isolated cases in the winter of 1664-65. But in February it flares up in parishes to the west of the city and gradually makes its way eastward, methodically visiting formerly untouched parishes. In the course of a year, it has reached every corner of England. Early in the outbreak the wealthy flee the city with their servants to their country houses, and the narrator, who initially considers fleeing, remarks that there were no horses left in the city.
In spite of his belief in Providence, the narrator emphasizes that transmission of the infection requires close personal contact, often within families, or with contaminated belongings, food, or cargo. Although the plague may have been sent by a Divine power to punish men for their sins, he makes clear that natural causes are entirely sufficient to account for the spread of the disease and its effects on its victims.
I must be allowed to believe that no one in this whole nation ever received the sickness or infection but who received it in the ordinary way of infection from somebody, or the clothes or touch or stench of somebody who was infected before.1
He describes the high transmissibility of the plague (which we now know to be caused by the bacterium Yersinia pestis, spread by fleas that fed on the black rat) and the unbearable sensitivity and pain caused by its pathognomonic feature – buboes – which, he tells us, drove sufferers to throw themselves out of windows or into the Thames. He also notes that asymptomatic cases could spread the infection and that the disease can manifest differently in different people. Houses where people took sick were shut up and padlocked by order of the magistrate, and watchmen were posted outside day and night to ensure – not always successfully – that the imprisoned could not escape. The narrator describes the pitiful cries that were heard from the street as family members discovered that a loved one had succumbed to the plague. Others, he tells us, died in the street. The bodies of the deceased were collected at night and taken to pits dug in churchyards or in open lots and buried en masse.
Defoe’s narrator describes the desperate condition of the poor, who, thrown out of work, could not buy food or other necessities for their families. In this situation, he tells us, they had no choice but to perform the most dangerous jobs created by the epidemic – tending to the sick and collecting and burying the dead. He notes that “the plague, which raged in a dreadful manner from the middle of August to the middle of October, carried off in that time thirty or forty thousand of these very people.” Once the plague makes itself felt, the common people, who are keenly attuned to astrological signs and portents, are desperate to ward off its spread and chase after an abundant array of fake cures and elixirs:
[The common people] … were now led by their fright to extremes of folly; … they ran to conjurors and witches, and all sorts of deceivers, to know what should become of them (who fed their fears, and kept them always alarmed and awake on purpose to delude them and pick their pockets) so they were as mad upon their running after quacks and mountebanks, and every practicing old woman, for medicines and remedies; storing themselves with such multitudes of pills, potions, and preservatives, as they were called, that they not only spent their money but even poisoned themselves beforehand for fear of the poison of the infection.2
Throughout his chronicle, the narrator anxiously scrutinizes the weekly “bills of mortality” published for each parish to gauge the progress of the infection. He knows from observing what is going on around him that the numbers of deaths attributed to the plague are grievously under-reported due to relatives’ fear of being stigmatized and the authorities’ connivance. He tries to judge the true magnitude of the deaths from the plague by examining increases in other causes of death and by comparing the overall death rate in a parish before the outbreak to the numbers when the plague was present all around him.
Deaths increased at an extraordinary rate in July and August, reaching a peak in September when, we are told, each week there were 8,000 or 9,000 deaths from the plague, and this he considers an under-estimate. Thereafter, the deaths began to decline precipitously, and the outbreak abated. According to the bills of mortality, 68,590 people died of the plague, while the narrator claims that the total reached 100,000. This amounts to twenty percent of the population.
Looking back at the events after a half-century, Defoe knew the outcome of the “visitation,” and his account has a taut unity of time and place. Here in the United States, five months into the SARS-CoV-2 pandemic, many observers are watching in horror and disbelief as the virus spirals out of control in states in the South and West that failed to take it seriously and refused to learn from the experience of other states and countries that succeeded in bringing their outbreaks under control. At the same time, some states and countries that successfully broke the chain of transmission are experiencing resurgences.
For all the differences between the London plague and our pandemic, there are striking commonalities. As in Defoe’s London, the poor and the weak are disproportionately exposed to the coronavirus in low-income neighborhoods, close living-quarters, and low-wage jobs. As in Defoe’s London, many people refuse to follow common-sense precautions, instead falling for quackery and scientifically unsupported treatments. As in London, statistics regarding the number of cases and deaths from Covid-19 are manipulated and misinterpreted to suit the narrative of different parties.
On the most basic level, what we share with the London outbreak is the massive, sudden upsurge in morbidity and mortality, and the inescapable sense that a pathogen is beyond our control. In London of 1665 the bills of mortality were widely distributed so that residents could know the situation in their parish, and neighbors shared the latest news of who had fallen ill and died. In 2020, normal life has been replaced by a profusion of images, charts, statistics, and stories, which have flooded the media since March. These convey the fever chart of the epidemic in different places together with stories of intensive care units stretched to capacity and the bios of individuals who have been lost. At the same time, we are subjected to a constant flow of interviews with health care workers, epidemiologists, and public health officials, who interpret the day-to-day trends in an effort to explain where things are headed.
Although we pride ourselves on being modern and are used to thinking that we have control over our lives, at the present moment we have to admit that we have no idea of how this “visitation” will play itself out.
Daniel Defoe, A Journal of the Plague Year, Penguin edition, 1966, p. 206.
The coronavirus can infect anyone, but recent reporting has shown your socioeconomic status can play a big role, with a combination of job security, access to health care and mobility widening the gap in infection and mortality rates between rich and poor.
Following the 1348 Black Death in Italy, the Italian writer Giovanni Boccaccio wrote a collection of 100 novellas titled, “The Decameron.” These stories, though fictional, give us a window into medieval life during the Black Death – and how some of the same fissures opened up between the rich and the poor. Cultural historians today see “The Decameron” as an invaluable source of information on everyday life in 14th-century Italy.
Boccaccio was born in 1313 as the illegitimate son of a Florentine banker. A product of the middle class, he wrote, in “The Decameron,” stories about merchants and servants. This was unusual for his time, as medieval literature tended to focus on the lives of the nobility.
“The Decameron” begins with a gripping, graphic description of the Black Death, which was so virulent that a person who contracted it would die within four to seven days. Between 1347 and 1351, it killed between 40% and 50% of Europe’s population. Some of Boccaccio’s own family members died.
In this opening section, Boccaccio describes the rich secluding themselves at home, where they enjoy quality wines and provisions, music and other entertainment. The very wealthiest – whom Boccaccio describes as “ruthless” – deserted their neighborhoods altogether, retreating to comfortable estates in the countryside, “as though the plague was meant to harry only those remaining within their city walls.”
Meanwhile, the middle class or poor, forced to stay at home, “caught the plague by the thousand right there in their own neighborhood, day after day” and swiftly passed away. Servants dutifully attended to the sick in wealthy households, often succumbing to the illness themselves. Many, unable to leave Florence and convinced of their imminent death, decided to simply drink and party away their final days in nihilistic revelries, while in rural areas, laborers died “like brute beasts rather than human beings; night and day, with never a doctor to attend them.”
After the bleak description of the plague, Boccaccio shifts to the 100 stories. They’re narrated by 10 nobles who have fled the pallor of death hanging over Florence to luxuriate in amply stocked country mansions. From there, they tell their tales.
One key issue in “The Decameron” is how wealth and advantage can impair people’s ability to empathize with the hardships of others. Boccaccio begins the foreword with the proverb, “It is inherently human to show pity to those who are afflicted.” Yet in many of the tales he goes on to present characters who are sharply indifferent to the pain of others, blinded by their own drives and ambition.
In one fantasy story, a dead man returns from hell every Friday and ritually slaughters the same woman who had rejected him when he was alive. In another, a widow fends off a leering priest by tricking him into sleeping with her maid. In a third, the narrator praises a character for his undying loyalty to his friend when, in fact, he has profoundly betrayed that friend over many years.
Humans, Boccaccio seems to be saying, can think of themselves as upstanding and moral – but unawares, they may show indifference to others. We see this in the 10 storytellers themselves: They make a pact to live virtuously in their well-appointed retreats. Yet while they pamper themselves, they indulge in some stories that illustrate brutality, betrayal and exploitation.
Boccaccio wanted to challenge his readers, and make them think about their responsibilities to others. “The Decameron” raises the questions: How do the rich relate to the poor during times of widespread suffering? What is the value of a life?
In our own pandemic, with millions unemployed due to a virus that has killed thousands, these issues are strikingly relevant.
Kathryn McKinley is a professor of English at the University of Maryland. Her research and teaching interests include Chaucer; Ovid, Boccaccio, and late medieval vernacularity; medieval visual literacy and material culture; and the history of later medieval European and English food culture, food scarcity, and famine. She has published in such journals as The Chaucer Review, Viator, and English Manuscript Studies 1100-1700.
A version of this article was originally published at the Conversation and has been republished here with permission. The Conversation can be found on Twitter @ConversationUS
A three-year run of fragmentary Armageddon-like studies had primed the journalism pumps and settled the media framing about the future of the global insect population: modern agriculture was steering us toward catastrophe.
But scientists remained queasy about what they increasingly came to believe was a simplistic narrative. None of the studies reaching ‘disaster conclusions’ was comprehensive. All were steeped in assumptions that could radically skew the data. Most of the world’s insect population centers were not even studied. And the declines were far from uniform. In some localities, there were reports of increases in overall insect population, and some types of insects are increasing in abundance across the world.
Which brings us to the 2020 meta-study of 166 long-term surveys by Roel van Klink at the German Center for Integrative Biology and his team of 30 scientists. For the first time, scientists had a full platter of studies, covering much of the world. Here was data that might answer questions that by now had turned highly ideological.
The few journalists who picked up on the study’s release noted the finding that insect declines were far less than reported in the smaller-scale studies, and indeed, no catastrophe was imminent. In fact, freshwater insects like mayflies and dragonflies actually have increased over the years, they found, and insect declines in the US, especially in Midwest agricultural areas, began leveling off at the turn of the century.
That doesn’t mean there isn’t a real and significant problem, as van Klink took pains to point out—he called the situation “awfully alarming.” But the difference between a “hair on fire” apocalypse and a serious problem is that there is time to get a better understanding of the causes and, hopefully, make rational decisions to constructively address them.
And it was precisely on the question of causation that the new study fundamentally challenged the “accepted narrative” that modern agriculture and the overuse of pesticides are driving the observed declines.
Effects of modern agriculture
Van Klink’s finding that “crop cover,” which is the phrase he uses to describe farmland, is correlated with increases in insect populations runs directly contrary to the speculations—more often than not presented as fact—that modern farming, especially the use of GMOs and pesticides, is the problem.
The second bugaboo, climate change, also didn’t appear on the suspect list; there was simply no correlation, positive or negative. The primary driver was urbanization, most likely due to the destruction of natural habitat as swamps are drained, rivers channelized, woodlands cleared and land is paved over for housing developments, roadways and shopping malls.
We found moderate evidence for a negative relationship between terrestrial insect abundance trends and landscape-scale urbanization, potentially explained by habitat loss and light and/or chemical pollution associated with urbanization. By contrast, insect abundance trends were positively associated with crop cover at the local (but not landscape) scale in both realms. Specifically, in the terrestrial realm, temporal trends became less negative with increasing crop cover …
Of course, the positive association between agriculture and insect population increases applies to existing fields, not forest or natural grassland cleared for cultivation. As van Klink has pointed out in interviews, the conversion of land to accommodate more farming would also destroy habitat.
But that is exactly the point if sustainability is the key: using technology to boost yields on existing cropland—growing more food on less land—is the most important action we can take to protect habitat and biodiversity.
And that’s what’s been happening. In a 2013 paper titled “Peak Farmland and the Prospect for Land Sparing,” three scholars at Rockefeller University calculated that global increases in crop yields as the result of advanced technologies, including genetic engineering, meant it took about one-third the amount of land in 2010 to grow the same amount of food as in 1961.
The graphs below, taken from the paper, highlight an event that has since been replicated around the world: after World War II total agricultural production, which until then had been largely circumscribed by the amount of land under cultivation, began a steep ascent as farming entered the modern era.
[To see this process unfold in time, check out the animated charts on crop yields at Our World in Data.]
The boom happened almost simultaneously across the world, from rice in China to wheat in France and Egypt.
The spur for these dramatic productivity gains is no mystery. After World War II, many of the key agricultural inputs—particularly modern pesticides, synthetic fertilizers, and advanced hybrid crops—came online in a major way. The rise accelerated with the advent of the Green Revolution in the early 1960s, and began to be widely dispersed around the world, rescuing many countries, such as India, from the brink of mass starvation.
It is this unprecedented historic decoupling of production from land—what has become known as intensive agriculture—that so many in the environmental movement demonize and seek to reverse. One of their central claims: intensive farming is the primary culprit driving biodiversity loss and insect declines.
Low-productivity food systems have devastating impacts on the environment. As much as three-quarters of all deforestation globally occurred prior to the Industrial Revolution, almost entirely due to two related uses, clearing land for agriculture and using wood for energy.
… attempting to feed a world of seven-going-on-nine billion people with a preindustrial food system would almost certainly result in a massive expansion of human impacts through accelerated conversion of forests, grasslands, and other habitat to cropland and pasture.
… we need to accelerate the long-term processes of growing more food on less land. … raising yields while reducing environmental impacts will require that we farm with ever-greater precision. Raising yields through greater application of technology has often meant more pesticides, fertilizer, and water. But as technology has improved, these trends have begun to reverse.
The organic deficit
The charm of farmers markets, Nordhaus writes, is not reason enough to abandon a system that limits land use to counter the effects of urbanization and drives down chemical toxicity levels. It should be noted that organic farming yields on average 10-40 percent less than non-GMO conventional farming, which in turn is about 15 percent less productive than farms using advanced biotechnology. A recent study by the organic advocacy group IDDRI found that if Europe were to adopt agroecological food production practices, productivity would decrease by an average of 35 percent, meaning roughly half again as much cultivated land would be needed to produce the same amount of food as is produced conventionally.
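The land-use implication of a yield deficit is worth making explicit: required land scales as 1/(1 − deficit), so a 35 percent drop in yields demands more than 35 percent extra land. A minimal sketch, where the function name is illustrative and the 35 percent figure is the IDDRI estimate quoted above:

```python
# Extra land required to hold food output constant when yields fall.
# The 0.35 yield deficit is the IDDRI estimate quoted in the text;
# the function name is illustrative, not from any cited source.

def extra_land_fraction(yield_deficit: float) -> float:
    """Fractional increase in land needed when yield falls by `yield_deficit`."""
    return 1.0 / (1.0 - yield_deficit) - 1.0

print(f"{extra_land_fraction(0.35):.2f}")  # -> 0.54, i.e. about 54% more land
```

The same relation explains why even modest yield gaps compound into large land footprints when scaled across a continent.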
The math of land saving through the use of modern technologies is so compelling, and the yield deficits of organic production so thoroughly cataloged, that they can’t be gainsaid. Anti-technology advocates generally prefer to avoid the topic altogether, focusing instead on Goulson-style claims about the adverse effects of chemical pesticides. They also ignore organic farmers’ reliance on mechanical plowing with carbon-belching equipment as a form of weed control, a practice that is massively destructive to soil health and biodiversity and a major contributor to carbon pollution.
The major sustainability contribution of conventional agriculture is the advent of no-till farming, which began with the use of chemical herbicides like atrazine and accelerated with the debut in 1996 of herbicide-tolerant GMO crops tied to glyphosate. GMO no-till farming has resulted in a massive reduction in carbon release estimated at 37 percent by the Belgian research institute VIB.
The turn away from efficient, intensive agriculture to accommodate the ideological fashion of our times could be a disaster for the fragile insect population. Population growth and growing affluence in the developing world over coming decades will require a sharp increase in necessary food calories, which can only occur by expanding farmable acreage—or by increasing yields on currently available acres.
All of these facts make the German meta-study very uncomfortable for organic farming advocates. The correlation between crop cover and increases in insect populations challenges the widespread damage to biodiversity they have been claiming. That may be why most of the major media outlets reporting on the study, such as the BBC, simply ignored the finding, while others – the Guardian, Reuters, Smithsonian – included swipes at pesticides not raised by the study authors, worded in such a way that the average reader would assume they were backed up by research.
How fast is the decline? How real is the decline?
Trying to determine a global rate of decline, when the data is so uneven and, as the authors say, almost all effects are local and variations are so high even among adjacent sites, is fraught with difficulty. Nevertheless, the new study pins the rate of decline of land-based insects at just under one percent annually, which translates to an 8.3 percent decline per decade.
The study authors note some questions about the scope of the global decline, explaining that it was heavily influenced by what they term “outlier” studies with anomalously high findings. If these outliers are excluded, they say, insect populations would decline by far less, about 15 percent over 25 years.
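The compounding behind these headline figures is straightforward to check. As a sketch, with rates chosen to match the numbers quoted above and illustrative function names:

```python
# Convert between an annual decline rate and its compounded multi-year total.
# The rates below are assumptions picked to match the figures quoted in the
# text; function names are illustrative.

def cumulative_decline(annual_rate: float, years: int) -> float:
    """Total fractional decline after `years` of compounding at `annual_rate`."""
    return 1.0 - (1.0 - annual_rate) ** years

def implied_annual_rate(total_decline: float, years: int) -> float:
    """Annual decline rate implied by `total_decline` spread over `years`."""
    return 1.0 - (1.0 - total_decline) ** (1.0 / years)

# "Just under one percent annually" compounds to roughly 8.3% per decade:
print(f"{cumulative_decline(0.00863, 10):.3f}")  # -> 0.083

# The outlier-excluded figure of ~15% over 25 years implies ~0.65% per year:
print(f"{implied_annual_rate(0.15, 25):.4f}")  # -> 0.0065
```

Note that because the decline compounds, the per-decade figure is slightly less than ten times the annual rate.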
This too is not good, but it’s not an apocalypse; and there is time to turn things around even if the estimated trends are accurate. That hopeful take is actually supported by another important, though largely ignored, finding in the study: terrestrial insect trends in North America were no longer negative after 2000, and freshwater insects increased dramatically.
The fact that North American trends began plateauing or improving around 20 years ago suggests we are headed in the right direction in what had been, up until then, according to the authors, the worst performing part of the world. Statistically speaking, once North American data was excluded, the study states that there was only “weak evidence for a negative mean trend” in global populations.
Geography and models
We all naturally gravitate to the headline numbers coming out of these studies. They’re simple, easy to remember and give us a sense of concreteness. Unfortunately, they are probably the least reliable and meaningful findings of all. If the ongoing COVID-19 pandemic has taught us anything, it’s that we should understand complex statistical modelling for what it is: a hypothesis generator or a sophisticated “best guess,” given current knowledge that may, as more facts come to light, prove to be anything from fairly close to wildly off the mark.
All one has to do is look at the maps of the geographic distribution of the studies included in van Klink’s analysis to realize just how problematic any conclusion about global trends is, considering the lack of data from most of the world. The vast majority of studies came from North America and Europe (by my count, almost 2/3 of all the studies).
There is a total of two studies from all of Africa, relatively few from Asia, and none at all from South Asia (India, Pakistan and Bangladesh). There is a single study from the Amazon, one of the richest sources of insect life on the planet.
These gaps are magnified by the fact that most of these studies concern only one specific order or family of insect, or some other sub-division (e.g. parasitoid wasps). But we know that different insect species vary enormously in their response to changes in climate, weather, disease, pollution and habitat destruction. It simply isn’t plausible that a model can compensate for what is, unfortunately, a massive quantity of unknowns, including, to borrow a phrase, many unknown unknowns, when it comes to insect population trends. Simply said, the likelihood of sampling error is immense.
It should be emphasized that none of this is to take away from the prodigious work of the van Klink research team. Almost all the criticisms outlined here are acknowledged and discussed by the authors themselves.
One of the most refreshing aspects of this study, in fact, has been the humility with which this team, which has done some of the best and most thorough work yet trying to establish global insect trends, has presented their results. In an article accompanying the study in Science, addressed to researchers not associated with the project, the team points the way forward for others in this field, and indeed in any scientific endeavor.
Advances in our knowledge about ongoing biodiversity changes and ability to predict future ones will require the incorporation of layers of nuance in patterns of change and drivers of that change.
The temptation to draw overly simple and sensational conclusions is understandable, because it captures the attention of the public and can potentially catalyze much needed action in policy development and research arenas. However, fear-based messages often backfire. This strategy has the grave risk of undermining trust in science and can lead to denialism, fatigue, and apathy. Embracing nuance allows us to balance accurate reporting of worrying losses with hopeful examples of wins. Hope is a more powerful engine of change than fear.