Viewpoint: Here’s how zealous environmental groups dupe politicians about the ‘widespread dangers’ of many safe and effective pesticides

Pesticides used in agriculture have long been the subject of controversy and misinformation – and the attention of regulators. Since before America’s first neonicotinoid (“neonic”) insecticide was registered in 1994, the Environmental Protection Agency (EPA) has been continually assessing the risks of this vital class of pesticides to plants, humans, animals, and the environment. EPA’s science-based risk assessments underpin neonics’ reputation in agriculture as among the safest and most environmentally friendly agrochemicals in use today.

But that hasn’t stopped state lawmakers from trying to pass anti-neonic legislation that is damaging to farmers, economies, and the environment. Already in 2023, at least six state legislatures, from Connecticut to California, have introduced “save-the-bees” legislation that would curtail or ban neonic use by growers and/or consumers inside state borders.

Why? Zealous environmental activist groups like the Natural Resources Defense Council have duped legislators and their constituents by convincing them that bees are dying in large numbers – the so-called “Bee-pocalypse” – and that neonicotinoids are to blame, that EPA is doing nothing about it, and that states must take the lead.

Most significant among these bills is New York’s Birds and Bees Protection Act, which would ban neonic-treated seeds for corn, soybean, and wheat growers, and also end neonic use by turfgrass professionals.

This regressive, anti-farming bill, currently on Gov. Kathy Hochul’s desk, ignores EPA’s findings that neonics are safe for plants, humans, animals, and the environment when used as directed. Instead, the state legislature appears to have relied on a Cornell University report that is replete with equivocation and ambiguity. This statement from the report is typical:

This risk assessment is intended to support evidence-based decisions, [but] we make no recommendations or policy prescriptions. Finding the ‘best policy’ or ‘best policies’ for neonicotinoid insecticides in New York will require thoughtful choices between competing priorities.

Moreover, much of the analysis doesn’t apply to New York agriculture, and it even downplays Cornell’s own research, which concluded from a large field study that up to 66% of a crop planted without neonic-treated seeds can suffer significant economic damage.

Neonics are a linchpin for agroecological practices on New York farms that use organic cattle manure for fertilizing soils and cover cropping, a sustainable practice that takes carbon out of the air, stores it in the soil, and reduces greenhouse gas emissions. These practices attract seed corn maggots, however, and neonics are the only viable option for controlling them. Even diamides, Cornell’s own highly touted neonic replacement for farmers, cost three times as much as neonics.

Follow the latest news and policy debates on sustainable agriculture, biomedicine, and other ‘disruptive’ innovations. Subscribe to our newsletter.

Another significant bill, passed by the California legislature, directs the California Department of Pesticide Regulation (CDPR) to reevaluate neonics with the intention of curtailing their lawn and garden use. With the EPA, the federal government’s pesticide regulator, finishing its latest 15-year neonic risk assessment, California’s legislation is, at best, redundant.

Under intense political pressure, CDPR has already issued costly new neonic regulations that will go into effect in 2024. For just six of California’s 400-plus crops, revenue losses to growers are estimated to be in the range of $12-13 million. Production costs will spike – as much as 186.6% for growers of tomatoes; 73.8% for grapes; and upwards of 66.6% for citrus.

Such restrictions stand to damage both farmers and the environment. Neonicotinoids have successfully lowered California’s dependence on less environmentally friendly insecticides: thanks to neonics, organophosphate use decreased by 41.5%, from 1,900 tons of active ingredient applied in 2007 to 1,100 tons in 2016. Organophosphate use will likely rise again if neonic use is restricted.

Nevada lawmakers have already enacted a bill that, in part, requires growers to get permits from the state’s Department of Agriculture before applying neonicotinoids, adding an unnecessary state-based regulatory hurdle for neonic use that is redundant with EPA oversight, and whose permissiveness may rise or fall based on politics.

Only the EPA has a legal mandate to regulate pesticides across the United States. One of the most important reasons is so that state legislators don’t inadvertently hurt growers, consumers, and the environment by making decisions based on emotion or local politics rather than science. Misguided efforts by legislators to redefine pesticide regulation are lowering farmers’ incomes, sowing regulatory uncertainty, raising food costs, and leaving growers no choice but to resort to less environmentally friendly practices.

If legislators cared to investigate, they would find that bees are not, in fact, in dire straits at all. Managed honeybee hive numbers around the world have risen by nearly 26% in the last decade, from 81 million to 102 million. They would also find that the scientific consensus holds that varroa mites and climate change, not neonics, are the greatest threats to bee health. And when neonics are used as directed by EPA, the levels in plant pollen and nectar are too low to harm bees.

Even some environmentalists are reversing their Bee-pocalypse rhetoric, because, in an unobvious way, it’s actually hurting pollinators. Thanks to years of activists’ scaremongering, too many people are starting hives around their homes to “save the bees,” and these domesticated bees are causing pollinator starvation by usurping pollen and nectar from wild bees, moths, and monarch butterflies.

Fortunately, governors such as Hochul can still heed the regulatory experts and use their power to veto their respective states’ misguided regulation. I’m not holding my breath.

Henry I. Miller, a physician and molecular biologist, is the Glenn Swogger Distinguished Fellow at the American Council on Science and Health. Follow Henry on X @henryimiller

A version of this article was originally posted at Henry Miller’s Blog and is reposted here with permission. Any reposting should credit both the GLP and original article.

How Spain emerged as the European epicenter for egg donations

Spain performs more than half of all egg donation treatments in Europe, making it the continent’s largest provider of donor eggs. And every year, thousands of international fertility patients travel to Spain to access treatments.

Spain is a popular choice for private fertility treatments because any woman or man, regardless of civil status, sexual orientation or age, can access them. Indeed, many come to Spain because of restrictions and bans in their home countries, particularly related to egg donation. Spain also tends to have very short waiting lists.

At the same time, egg banks in Spain have proliferated in recent years, shipping donor eggs around the world as part of this multi-million-pound industry.

Close to 15,000 women undergo egg extraction cycles every year in Spain. Many are financially motivated – donors in Spain receive one of the highest rates of financial compensation across Europe (around €1,100 (£945) for a successful cycle).

While research has previously shown that some women claim to give their eggs for altruistic reasons (often alongside financial motivation), very little is known about women’s experiences of egg donation. They are for the most part invisible. This is why I wanted to find out more about what it’s like being an egg donor in Spain.

As part of my research, I interviewed egg donors and doctors and also observed women in fertility clinics to get a better sense of what the process was actually like.

The reality of egg donation

Fertility clinics’ websites usually describe egg donation as a fast and easy procedure. But women wanting to become egg donors have to do a fair bit of preparation before the donation can actually take place.

First, a screening happens which includes health-related and psychological questionnaires as well as gynaecological and genetic tests. Potential donors will then be given hormone injections for about ten days.

After that comes egg retrieval: surgery under general anaesthetic in which the eggs are removed with a needle guided by a vaginal ultrasound probe. It’s a time-consuming, inconvenient and at times painful process.

It also carries medical risks such as ovarian hyperstimulation syndrome, which is when the ovaries become enlarged and can lead to several serious problems such as blood clots or bleeding.

Donors can also experience medication intolerance or side-effects alongside a risk of infection during surgery. But the longer-term risks of being an egg donor remain largely unknown, because of the limited amount of studies carried out in this area.

I also discovered that the reality of egg donation can mean busy schedules balanced between work, studies and personal life. To avoid losing any income, most of the women I spoke to didn’t usually stop any of their usual activities during the cycle. Instead, they tried to make it all fit into their busy lives – which sometimes created risks in terms of their health.

This was particularly obvious when it came to the egg extraction. The timing of the surgery depends on how quickly the hormonal drugs act in the body, so it cannot be scheduled around the donor’s convenience – when the eggs are ready to be collected, they are ready, and this differs from patient to patient. At that point the donor administers a final injection, which triggers ovulation, and the surgery takes place the next morning.

The result is that although clinics advise resting for 24-48 hours after the extraction, many egg donors work the next day, or even the same day, on their afternoon shifts.

Payments and compensation

Compensation for a cycle is usually paid in cash at the end of the process, and it is only secured after the extraction surgery if there are extractable eggs – donors are paid the same amount regardless of the number of eggs.

If the process has to be stopped before the extraction for reasons that aren’t the donor’s fault, such as the medication not producing the expected effects on egg production, most clinics do not offer compensation.

If donors undergo the extraction surgery but there are no extractable eggs, the matter is usually discussed among doctors, with significant differences across clinics. If there is an indication that the donor may not have administered the last injection of hormones, or that she did so at the wrong time, the clinic will usually not pay her at all.

If the clinic believes she followed the rules but that she ovulated earlier than expected, different clinics have different rules: some might give her the full amount, others only a partial payment and some nothing at all.

Donors may also be required to reimburse all expenses for the treatments if they decide to abandon the process halfway through for non-medical reasons – something most can’t afford to do.

In the event of side effects or complications following the extraction, donors are usually referred to the emergency room of public hospitals, as the donation contract does not include private health insurance.

oocyte with zona pellucida
Credit: Wikimedia Commons (CC BY-SA 2.0)

The few studies on egg donors’ experiences in Spain that do exist show problems and gaps in terms of the information donors are provided with and the conditions in which egg donation is undertaken.

Most of the women I spoke to didn’t know how many eggs were extracted, the number of women who might be treated with them, or whether the eggs would be used in the clinic, frozen and banked, or shipped abroad.

It is clear then that as the demand for egg donation increases, urgent action is needed to ensure that women in the global egg donation industry are properly informed, cared for and insured in case of complications and side-effects.

Anna Molas is a Research Fellow in Anthropology at the Autonomous University of Barcelona. Find Anna on X @Anna_Molas

A version of this article was originally posted at the Conversation and is reposted here with permission. Any reposting should credit both the GLP and original article. Find the Conversation on X @ConversationUS

Body odor fingerprint: How your unique smell could help reveal cancer or COVID infections

From the aroma of fresh-cut grass to the smell of a loved one, you encounter scents in every part of your life. Not only are you constantly surrounded by odor, you’re also producing it. And it is so distinctive that it can be used to tell you apart from everyone around you.

Your scent is a complex product influenced by many factors, including your genetics. Researchers believe that a particular group of genes, the major histocompatibility complex, plays a large role in scent production. These genes are involved in the body’s immune response and are believed to influence body odor by encoding the production of specific proteins and chemicals.

But your scent isn’t fixed once your body produces it. As sweat, oils and other secretions make it to the surface of your skin, microbes break down and transform these compounds, changing and adding to the odors that make up your scent. This scent medley emanates from your body and settles into the environments around you. And it can be used to track, locate or identify a particular person, as well as distinguish between healthy and unhealthy people.

We are researchers who specialize in studying human scent through the detection and characterization of gaseous chemicals called volatile organic compounds. These gases can relay an abundance of information for both forensic researchers and health care providers.

Human scent analysis breaks down body odor to its individual components.

Science of body odor

When you are near another person, you can feel their body heat without touching them. You may even be able to smell them without getting very close. The natural warmth of the human body creates a temperature differential with the air around it. You warm up the air nearest to you, while air that’s farther away remains cool, creating warm currents of air that surround your body.

Researchers believe that this plume of air helps disperse your scent by pushing the millions of skin cells you shed over the course of a day off your body and into the environment. These skin cells act as boats or rafts carrying glandular secretions and your resident microbes – a combination of ingredients that emit your scent – and depositing them in your surroundings.

Your scent is composed of the volatile organic compounds present in the gases emitted from your skin. These gases are the combination of sweat, oils and trace elements exuded from the glands in your skin. The primary components of your odor depend on internal factors such as your race, ethnicity, biological sex and other traits. Secondary components waver based on factors like stress, diet and illness. And tertiary components from external sources like perfumes and soaps build on top of your distinguishable odor profile.

Identity of scent

With so many factors influencing the scent of any given person, your body odor can be used as an identifying feature. Scent detection canines searching for a suspect can look past all the other odors they encounter to follow a scent trail left behind by the person they are pursuing. This practice relies on the assumption that each person’s scent is distinct enough that it can be distinguished from other people’s.

Researchers have been studying the discriminating potential of human scent for over three decades. A 1988 experiment demonstrated that a dog could distinguish identical twins living apart and exposed to different environmental conditions by their scent alone. This is a feat that could not be accomplished using DNA evidence, as identical twins share the same genetic code.

The field of human scent analysis has expanded over the years to further study the composition of human scent and how it can be used as a form of forensic evidence. Researchers have seen differences in human odor composition that can be classified based on sex, gender, race and ethnicity. Our research team’s 2017 study of 105 participants found that specific combinations of 15 volatile organic compounds collected from people’s hands could classify participants by race and ethnicity with accuracies of 72% for whites, 82% for East Asians and 67% for Hispanics. Based on a combination of 13 compounds, participants could be distinguished as male or female with 80% overall accuracy.

Researchers have trained dogs to sniff out COVID-19 infections.

Researchers are also producing models to predict the characteristics of a person based on their scent. From a sample pool of 30 women and 30 men, our team built a machine learning model that could predict a person’s biological sex with 96% accuracy based on hand odor.
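The classification approach described above can be sketched in miniature. The following Python example is purely illustrative: the compound abundances, labels and the simple nearest-centroid rule are invented for this sketch – the published studies used different compounds, real chromatography data and more sophisticated machine learning models.

```python
# Purely illustrative sketch: classifying a scent sample by a vector of
# volatile organic compound (VOC) abundances, loosely in the spirit of the
# hand-odor studies described above. All values and labels are invented.
import math

def centroid(rows):
    """Component-wise mean of a list of equal-length feature vectors."""
    return [sum(r[i] for r in rows) / len(rows) for i in range(len(rows[0]))]

def nearest_centroid_predict(sample, centroids):
    """Return the label whose centroid is closest (Euclidean) to the sample."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

# Invented training data: each row is the relative abundance of three VOCs.
train = {
    "female": [[0.80, 0.20, 0.10], [0.70, 0.30, 0.20], [0.90, 0.10, 0.15]],
    "male":   [[0.20, 0.70, 0.60], [0.30, 0.80, 0.50], [0.25, 0.60, 0.70]],
}
centroids = {label: centroid(rows) for label, rows in train.items()}

print(nearest_centroid_predict([0.75, 0.25, 0.12], centroids))  # prints female
```

A real pipeline would extract hundreds of compounds via gas chromatography–mass spectrometry and cross-validate a trained model on many donors; the accuracy figures quoted above came from such models, not from anything this simple.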

Scent of health

Odor research continues to provide insights into illnesses. Well-known examples of using scent in medical assessments include seizure and diabetic alert canines. These dogs can give their handlers time to prepare for an impending seizure or notify them when they need to adjust their blood glucose levels.

While these canines often work with a single patient known to have a condition that requires close monitoring, medical detection dogs can also indicate whether someone is ill. For example, researchers have shown that dogs can be trained to detect cancer in people. Canines have also been trained to detect COVID-19 infections at a 90% accuracy rate.

Similarly, our research team found that a laboratory analysis of hand odor samples could discriminate between people who are COVID-19 positive or negative with 75% accuracy.

Forensics of scent

Human scent offers a noninvasive method to collect samples. While direct contact with a surface like touching a doorknob or wearing a sweater provides a clear route for your scent to transfer to that surface, simply standing still will also transfer your odor into the surrounding area.

Although human scent has the potential to be a critical form of forensic evidence, it is still a developing field. Imagine a law enforcement officer collecting a scent sample from a crime scene in hopes that it may match with a suspect.

Further research into human scent analysis can help fill the gaps in our understanding of the individuality of human scent and how to apply this information in forensic and biomedical labs.

Chantrell Frazier is an Assistant Professor of Chemistry and Food Science at Framingham State University.

Kenneth G. Furton is a Professor of Chemistry and Biochemistry at Florida International University. 

Vidia A. Gokool is a Postdoctoral Researcher at Lawrence Livermore National Laboratory. 

A version of this article was originally posted at the Conversation and has been reposted here with permission. Any reposting should credit the GLP and original article. Find the Conversation on X @ConversationUS

Here’s the straight poop about fecal transplants

I was a little surprised to see “Ethical Issues in Fecal Microbiota Transplantation” festooning the cover of the May 2017 issue of The American Journal of Bioethics – not their typical topics of gene editing and testing, stem cells, and medical matters of life or death. But as fecal transplants become more medically accepted, questions of access and quality control are indeed arising. So here are a few scintillating facts about borrowing bowel microbiomes to combat “dysbiosis.”

1. The only sort-of approved use of “FMT” is for recurrent infection with Clostridium difficile, which causes severe diarrhea. The infection is usually acquired in a health care facility. A 2013 FDA “enforcement discretion” ruling allows doctors to provide FMT without filing an Investigational New Drug Application – but only to treat C. diff infection (CDI). It’s 90% effective! The procedure is in clinical trials for other indications, albeit against a backdrop of widespread DIY variations on the theme.

2. The procedure may become frontline treatment for CDI, not just a last resort when antibiotics have failed to control the terrifying torrents of watery stool. And it’s needed. Results of a study reported in the Annals of Internal Medicine found that cases of multiply recurrent CDI – more than two bouts in a short time period – are increasing at more than four times the rate of the infection in general. The numbers are daunting: up to half of the 500,000 people in the US who get CDI annually get it again at least once, for a total cost exceeding $5 billion a year. Drug resistant strains are arising and new drugs are coming on the market, but a fecal transplant may be the way to go from the get-go. CDI, say many who’ve had it, is far worse than overcoming the “ick factor” of receiving a bit of foreign poop.


3. Some medical organizations and insurers (including Medicare) cover fecal transplants for CDI.

4. What’s in a bowel movement? From 25 percent to 54 percent of the solid portion – after removing the 75 percent that’s water – consists of bacteria. The rest is undigested nutrients, electrolytes, and mucus, with color from bile pigments and odor from bacterial compounds (phenols, indole, skatole, ammonia, and hydrogen sulfide). But stool composition varies daily in individuals, which will complicate standardizing transplants. It also presents an obstacle to using a microbiome profile as a form of identification. One bioethicist mentioned checking stool at airports to see whether travelers have come from countries banned from immigration. Would a passport from a Swede be accepted if her feces harbor bacteria native to Somalia?

5. Fecal transplants may conjure mental images of turkey basters, but the material is delivered via enema, colonoscope, nasogastric tube (a nose hose), or capsule.

6. The technology is at least 1,700 years old. The first recorded use was in 4th century China by a physician, Ge Hong, to treat food poisoning and diarrhea. In various times and places, poop has been delivered as “yellow soup” to humans and other animals (especially cattle) and German soldiers reportedly infused camel feces to treat bacterial diarrhea during World War I.

7. Reductionists attempting to drill down to the good stuff in a turd and then recreate it note that just part of a microbiome need be transferred, akin to a keystone organism in an ecosystem. Seres Therapeutics’ SER-109 — a capsule that delivers “an ecology of bacterial spores enriched and purified from healthy, screened human donors” — is in phase 3 clinical trials to treat CDI. More mysterious is SER-262 — “the first synthetically-derived and designed microbiome therapeutic.” It fared well against placebo in a 24-week phase 1 randomized controlled clinical trial.

8. Researchers are hard at work describing the optimal feces donor. Most references cite the Amsterdam protocol in this regard. And the American Gastroenterological Association maintains a National FMT Registry to monitor adverse events and the details of donors. Will we one day have poop centers much like frozen yogurt shops where a hopeful recipient can order up a particular fecal microbiome? Or even mix flavors?

9. Altering the intestinal microbiome might treat autism, Parkinson’s disease, depression, and anxiety, perhaps by affecting serotonin levels, thanks to the “gut-brain axis.” In an intriguing experiment, stool from people with major depressive disorder had a different effect on “depression-like behaviors” when transplanted into germ-free mice compared to the rodents’ more spirited response to stool from happy humans.

10. Should people pay for poop, like they do for sperm? Should we patent exceptionally healing donations? Anyone remember The Repository for Germinal Choice, an ill-fated California sperm bank for Nobel-prizewinners?

11. Delivery. Once feces donations are standardized, how will they be prepared and shipped? Dried out like sea monkeys? Fedex? UPS? Amazon Prime?

12. Should informed consent for a recipient include knowing the donor’s diet? Would a transplant from a person who ate pork be like implanting pig heart valves into an orthodox Jewish person? Might a recipient request a vegan donor?

13. OpenBiome is a nonprofit stool bank that sends frozen matter to hospitals. Founded by a relative of someone who fought CDI futilely with seven rounds of vancomycin before a transplant helped, it pays donors $40 per donation, with donors giving several times a week for two months. Stool must pass two rounds of screening, and the original owner must be aged 18-50, have a BMI under 30, and live near Cambridge, MA, where donations are deposited. The homepage opens to an image of clean, white bottles — countering the ick factor is a big challenge for this emerging industry.

14. Fecal transplantation may have unexpected effects, especially since standardizing it as a medical substance is so challenging. The first noted was obesity, which is sort of obvious, but one man who had had alopecia since age 6 received a transplant to treat CDI and grew so much hair that he had to shave!

15. AdvancingBio serves private payers. Prep costs $115 and delivery depends on the route: “esophagogastroduodenoscopy” (down the hatch) is $307 and colonoscopy $341 to $591. The “become a donor” page shows 10 smiling people, most of them millennials. Those willing to sell their excrement must be between the ages of 18 and 65, have a BMI under 35, provide a medical history, and have a blood test for infections, including cholera, E. coli, plague, and foodborne Salmonella and Shigella, plus screening for various parasite eggs and larvae. Presumably the donation must score a healthy type 3 or 4 on the Bristol Stool Chart.

Stay tuned. Scatological jokes aside, fecal transplantation is a valid medical procedure that will likely continue to find new niches.

Bristol Stool Chart

Ricki Lewis has a PhD in genetics and is a genetics counselor, science writer and author of Human Genetics: The Basics. Follow her at her website or X @rickilewis.

Viewpoint: Rejecting simplistic organic farming — Instead of worrying about which farming system is more ‘natural’, which is impossible to define, focus on sustainability

Back in 2004, Mendocino County in California hit the headlines after becoming the first county in the United States to vote for a ban on growing genetically engineered crops, following a demonstration led by local organic wine producers claiming that GMOs could contaminate their crops and damage their ‘natural’ image.

The Mendocino vote to ban genetically engineered crops might suggest to the casual observer that we would all be better off by avoiding the application of new agricultural technologies, while embracing more ‘natural’ farming techniques.

But it got me thinking: what, exactly, do we mean by natural? And what would be the costs to society of abandoning new technology in agriculture?

Right now, the food available in our stores is cheaper, more plentiful and more nutritious than ever before in our history. Yet we worry about the way food is produced on farms and about the genetic makeup of the plants used by our farmers. “Are they using natural plants and farming the natural way?” we ask ourselves.

Perhaps it is time to kill off a few myths about farming. There is nothing natural about farming. An agricultural landscape may look attractive – a vineyard in the San Diego backcountry for example, or a sunflower field in full bloom in the Provence in France – but its creation required the complete destruction of the natural ecosystem and its replacement by an agricultural ecosystem.

Further, to grow so many of the same plants in one field while at the same time suppressing the growth of other plants – in this case, weeds – is not natural. This is true even if farmers practise crop rotation, or “inter-cropping,” the practice of growing two or three crops at the same time. Such an ecosystem is not what nature intended, and as a result we must continuously supply fertilisers, and apply weed control, disease control and insect control measures to keep that artificial ecosystem going. The most important question is not whether it is natural, but whether it is sustainable in the long run. Do our practices destroy the resource base, or do they maintain it for future generations?

And what about the plants? Are they natural? Well, our crop plants were domesticated 5,000 to 10,000 years ago, and in the process their genetic makeup was changed considerably and irreversibly. Changed so much, in fact, that crop plants generally cannot survive in nature. Although not all the plants in our canyons and mountains are native – there are many invaders – there are no runaway crop plants to be found. They simply can’t survive there.

Further, the genetic makeup of our crops keeps on changing. This is true whether a San Diego tomato farmer buys the latest hybrid seeds from a crop breeding company or whether a corn seed selector in Chiapas, Mexico, selects seeds from this year’s harvest for planting the next season.

In subsistence farming communities all over the world, seed selectors – usually women – carefully select seeds from the best plants and keep them for planting. This does not maintain the genetic “purity” of these land races but rather produces constant genetic change so that the crop remains adapted to its ever-changing environment.

In our society, ever since the 1900s, plant breeders have been making new gene combinations to produce the best planting materials. The so-called genetically manipulated or “GM crops,” sometimes referred to as “GMOs,” are simply the latest expression of plant breeders’ desires to produce the best crops for the farmers. In such GM crops, new genes are introduced by a combination of molecular techniques and traditional plant breeding.

Because molecular techniques are used at the start, the genes can come from any organism: another plant species, a microbe or even an animal. Animal genes will not be used to create new food plants but may be introduced to create plants that manufacture pharmaceuticals. The productivity of our agriculture, whether conventional or organic, can only be maintained by constant genetic improvement because disease organisms and crop pests keep on evolving.

Which brings me to the vote in Mendocino County to reject the growing of genetically manipulated crops. This was another battle pitting organic farmers against biotech companies. We love these David and Goliath stories.

The campaign and the vote were discussed in the local media under the headline “For Mendocino County, natural’s the only way to grow.” Without being explicit, the headline reinforced the popular belief – not based on scientific evidence – that some types of agriculture – in this case, organic – are somehow more natural than conventional methods.

The use of manure, that symbol of virtuous farming, does not make those practices any more natural. Instead of worrying about what is natural, which is impossible to define, we should worry about sustainability.

If certain farming practices are unsustainable – irrigation with groundwater that is not replenished, for example – they should be taxed rather than subsidised to make them less attractive to farmers. If certain new pesticides are less toxic to people and the environment than the traditional ones used by organic farmers, their use should not be stigmatised by those seeking economic advantage for their own farming practices. If certain GM crops make agriculture more sustainable because they permit fewer pesticides to be used or conserve water, they should certainly not be banned but embraced by society.

Rejecting modern technologies would be a disastrous development if we are to help feed the 9 billion people who soon will inhabit our planet. To achieve that goal, we must seek out the best agricultural practices and combine them with the best genetic crop varieties – whether produced by molecular and/or traditional means – so as to achieve food security for all, including the 800 million who are now without a secure food supply.

The organic farmers of Mendocino County and elsewhere are shrewd business people. By sticking to manure and certain older chemical fertilisers and pesticides, by banning newer ones and by banning GM crops, they have hoodwinked the public into believing they are “natural” farmers. The public is willing to pay a premium for their organic wines, and they are happy for anyone to spread their groundless message that they are farming in nature’s way and others are not.

Maarten J. Chrispeels is Distinguished Professor Emeritus at the Department of Cell and Developmental Biology at the University of California San Diego (UCSD). His active research career at UCSD spanned 42 years. For 10 years he served as the Director of the San Diego Center for Molecular Agriculture (SDCMA) on the UCSD campus. Professor Chrispeels was elected to membership in the US National Academy of Sciences in 1996. Follow Maarten on Linkedin

A version of this article was posted at Science for Sustainable Agriculture and has been reposted here with permission. Any reposting should credit the original author and provide links to both the GLP and the original article. Find Science for Sustainable Agriculture on X @SciSustAg

Viewpoint: Anti-glyphosate rabbit hole — Will the ethically-compromised International Agency for Research on Cancer (IARC) lead Europe to embrace a scientifically-challenged Green Deal?

What chemicals or environmental exposures are likely to cause cancer? That’s a complex question with a wide variance in views across the scientific community. 

First, one needs to determine whether a substance or situation is likely to ever cause cancer. For instance, we know that drinking very hot coffee, tea, or any other very hot beverage can increase the chance of esophageal cancer. In other words, such drinks present what scientists call a hazard.

Whether a hot coffee drinker is likely to get cancer — what scientists call risk — is a much different formulation. It depends on how many hot drinks a person consumes, every day, for many years. In other words, Risk = Hazard x Exposure. 
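The hazard/risk distinction can be made concrete with a toy calculation. The sketch below is illustrative only — the potency and exposure numbers are invented, not real dose-response data:

```python
# Toy illustration of Risk = Hazard x Exposure.
# "hazard" is a unitless potency score and "exposure" a cumulative
# dose index -- both values are invented purely for illustration.

def relative_risk(hazard: float, exposure: float) -> float:
    """Return a simple multiplicative risk score."""
    return hazard * exposure

# The same hazard (hot drinks) with very different exposures:
occasional = relative_risk(1.0, 0.1)   # a hot drink now and then
daily_heavy = relative_risk(1.0, 3.0)  # many hot drinks daily, for years

# Identical hazard, roughly 30x the risk -- exposure does the work.
print(daily_heavy / occasional)
```

The point of the sketch is only that a substance can carry the same hazard classification while posing wildly different risks depending on exposure — which is precisely the distinction IARC's classifications elide.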

That appears straightforward, but journalists and even some scientists botch that simple equation in their discussions of environmental exposures, particularly chemicals. That often leads to bizarre public policy debates over the potential danger of one chemical or another, which can lead to lousy regulatory decisions.

This problem is not an abstraction. It was only four years ago that Christopher Wild, then-head of a United Nations-affiliated research group known as the International Agency for Research on Cancer (IARC), declared that breakfast bacon and cold cuts are carcinogenic, and that hamburgers and other red meat are “probable” causes of cancer. The IARC classifications prompted some uncritical praise but mostly ridicule, and debunking by scores of scientists.


IARC also shook up the agricultural industry when it declared that the world’s most popular herbicide — considered for decades to be among the most effective and safest weedkillers in use — was “probably carcinogenic to humans,” putting it in the same ‘danger’ category, Group 2A, as red meat, hot drinks, working as a barber or hairdresser, or working the night shift — not very scary exposures.

While environmental activists, journalists and social media shrugged off or mocked IARC’s classification of lamb chops and a visit to the beauty parlor as “probably” dangerous, they exploded in horror when IARC issued a proclamation in 2015 that glyphosate — sold generically and by Monsanto (now Bayer) under the trademark name Roundup — posed an identical hazard. This headline is from the UK’s The Guardian:


The Group 2A classification ignited mass litigation in the United States against Monsanto that is still going on, and it could lead to a ban on glyphosate sales across the European Union and elsewhere.

How did the mainstream science community react? Initially with alarm, because virtually all the regulatory and risk studies to that time had concluded that glyphosate did not pose any substantial cancer risk at all. Did they all get it wrong? That set off a frenzied reassessment of glyphosate by 18 of the world’s premier regulatory and risk agencies.


The most recent, released last summer by the European Food Safety Authority, upheld its earlier conclusion: the EFSA “did not identify critical areas of concern,” finding the weedkiller “unlikely to be genotoxic or to pose a carcinogenic threat” to humans, animals and the environment. All told, there have been 24 risk assessments of glyphosate over the past decade, and not one concludes the weedkiller is a carcinogen.

As Health Canada wrote in two recent reviews conducted after IARC made its hazard declaration: “No pesticide regulatory authority in the world currently considers glyphosate to be a cancer risk to humans at the levels at which humans are currently exposed.”


IARC and its impact

IARC, if you are not familiar with it, has become one of the most controversial health agencies in the world over the past decade precisely because of such findings. It has a checkered history: it started as a research institute in the 1950s before affiliating with the World Health Organization. IARC was chartered to assess cancer hazards and has since passed judgement on 1,035 chemicals or exposures.

[Ironically, an investigation by Reuters after IARC’s controversial 2015 classification found that the agency was poised to designate glyphosate as “non-carcinogenic”— only the second of the 1036 substances it has reviewed over the decades to be so designated.  

Mysteriously and inexplicably, an 11th hour edit by the agency moved glyphosate into the “probably carcinogenic” category — much to the glee of anti-biotechnology advocacy groups and predatory tort lawyers who soon cashed in on IARC’s re-classification. 

Christopher Portier, a key member of IARC’s review board at the time, became a consultant to the US-based Church of Scientology tort firm Baum Hedlund, which served as point litigator in the first three glyphosate court cases filed against Monsanto, in a partnership with Robert F. Kennedy, Jr.]


IARC was again in the news in July when it announced that aspartame, found in many sugar-free chewing gums, Diet Coke and other products, is “possibly carcinogenic.” The claim was widely circulated, in many cases uncritically, by the global media, and it was hyped incessantly by environmental activist groups, such as the Environmental Working Group, with known ties to both IARC and Baum Hedlund.

Government researchers around the world reject IARC’s hazard finding as ludicrous, noting that an adult would have to consume between 12 and 36 cans of diet soda every day for the rest of her life to even face the long-shot possibility of getting cancer.

But unfortunately, IARC’s classifications are no longer a joke, as EU member states are fighting an extended battle over whether to renew glyphosate’s registration for another 10 years.

IARC has caused much harm by fueling eco-extremist organizations and activist scientists, whose pressures led to the collapse of governments and the devastation of thriving agricultural economies. Here I’d like to revisit the ongoing impact of IARC’s glyphosate finding so that history does not repeat itself in Europe.

When IARC released its “probably carcinogenic to humans” finding on glyphosate, the green lobby knew it had been handed a devastating weapon beyond its wildest dreams. For the first time, an official-sounding agency had classified glyphosate as potentially cancerous, even though every other regulatory agency in the world had concluded the opposite.

Environmentalists supporting the European Green Deal, which proposes to reduce toxins in Europe, began amplifying marginal studies that found tiny amounts of glyphosate – a few parts per billion – in wine. Those are levels far below what any science agency believes could be harmful. (Ironically, IARC has classified wine and other alcoholic drinks in Group 1, its most dangerous category of carcinogens humans can be exposed to, so focusing on a few parts per billion of safe glyphosate in ‘dangerous’ wine is scientifically ridiculous.)

Fear still grips the public and regulators, and sound toxicological science on glyphosate produced by EPA and other leading regulatory agencies has become a victim of activist efforts. A computer scientist turned amateur epidemiologist named Stephanie Seneff churned out “scientific” papers claiming that practically every noncommunicable disease could be related to glyphosate’s presence in the food chain. In one of her more ridiculous “studies,” Seneff and a colleague drew up laughable correlation figures blaming rising autism rates on the increased use of glyphosate.


Mocking her, scientists posted correlation charts showing the “link” between increased consumption of organic foods and autism.


Seneff’s “research” is promoted by Dr. Joseph Mercola and other international pseudoscience “gurus” with millions of social media followers. The glyphosate hysteria even tripped up the American Association for the Advancement of Science. The AAAS bestowed its 2019 Award for Scientific Freedom and Responsibility on two crank Sri Lankan scientists, Drs. Sarath Gunatilake and Channa Jayasumana, who authored two much-ridiculed papers linking glyphosate to chronic kidney disease. The announcement of the award touched off a backlash among AAAS members, and it was rescinded.


Public as victims and a new reality

The biggest victims of IARC’s glyphosate misinformation bombs have been the countries banning or limiting glyphosate use. Inspired by the eventually-rescinded AAAS award, and led by eco-extremist ideologues such as Vandana Shiva, activists duped Sri Lanka’s leaders into going “in sync with nature” by adopting an all-in organic policy. Then-president Gotabaya Rajapaksa banned 100% of agrochemicals, even mineral fertilizers. 

This pleased then-Prince Charles and other green ideologues at the 2021 Glasgow Climate Summit, but it nearly destroyed the tiny island nation. Yields plummeted in the once-flourishing agricultural sector, thanks to rampant weed infestations and a lack of nutrients. As the New York Times wrote, Sri Lanka took a “sudden and disastrous turn toward organic farming.”


Within a year, the country could not adequately feed itself. The cost of food exploded, hunger spread, and Rajapaksa had to flee his own nation after a farmers’ revolt grew into a full-scale mass uprising that demanded his exit.

EU leaders appear to be taking note of Sri Lanka’s lessons, a shift spurred also by the Ukraine war, which has transformed the views of many European politicians. In August, Frans Timmermans, the European Union commissioner responsible for the European Green Deal, resigned as a Member of the European Commission. The Dutch farmers’ party has taken control of the Dutch senate in response to Green Deal efforts to buy out livestock farmers. And President Macron’s team no longer talks of banning glyphosate.

Organic agriculture with its emphasis on composting, manual weeding and tilling of soils, and its rejection of genetic engineering techniques, cannot hope to address global farming challenges.

The impact of the Ukraine-Russia War has helped show that Europe’s food supply is fragile. Europe’s “Zeitenwende” or “epochal turn” from its “back to nature” dreams has started, but it’s incomplete unless it eschews agricultural policies rooted in anti-science hogwash. It must reject elitist and eco-extremist pseudo-solutions that have negatively influenced EU policymaking.

Chandre Dharma-wardana, a scientist with a Ph.D. from the University of Cambridge, currently works for the National Research Council of Canada and the Université de Montréal. 

Disaster interrupted: Which farming system better preserves insect populations: Organic or conventional?

A three-year run of fragmentary Armageddon-like studies had primed the journalism pumps and settled the media framing about the future of the global insect population: modern agriculture was steering us toward catastrophe.

But scientists remained queasy about what they increasingly came to believe was a simplistic narrative. None of the studies reaching ‘disaster conclusions’ was comprehensive. All were steeped in assumptions that could radically skew the data. Most of the world’s insect population centers were not even studied. And the declines were far from uniform. In some localities, there were reports of increases in overall insect population, and some types of insects are increasing in abundance across the world.

Which brings us to the 2020 meta-study of 166 long-term surveys by Roel van Klink at the German Center for Integrative Biology and his team of 30 scientists. For the first time, scientists had a full platter of studies, covering much of the world. Here was data that might answer questions that by now had turned highly ideological.

Roel van Klink

The few journalists who picked up on the study’s release noted the finding that insect declines were far less than reported in the smaller-scale studies, and indeed, no catastrophe was imminent. In fact, freshwater insects like mayflies and dragonflies actually have increased over the years, they found, and insect declines in the US, especially in Midwest agricultural areas, began leveling off at the turn of the century.

That doesn’t mean there isn’t a real and significant problem, as van Klink took pains to point out—he called the situation “awfully alarming.” But the difference between a “hair on fire” apocalypse and a serious problem is that there is time to get a better understanding of the causes and, hopefully, make rational decisions to constructively address them.

And it was precisely on the question of causation that the new study fundamentally challenged the “accepted narrative” that modern agriculture and the overuse of pesticides are driving the observed declines.

Effects of modern agriculture

Van Klink’s finding that “crop cover,” which is the phrase he uses to describe farmland, is correlated with increases in insect populations runs directly contrary to the speculations—more often than not presented as fact—that modern farming, especially the use of GMOs and pesticides, is the problem.

The second bugaboo, climate change, also didn’t appear on the suspect list; there was simply no correlation, positive or negative. The primary driver was urbanization, most likely due to the destruction of natural habitat as swamps are drained, rivers channelized, woodlands cleared and land is paved over for housing developments, roadways and shopping malls.

We found moderate evidence for a negative relationship between terrestrial insect abundance trends and landscape-scale urbanization, potentially explained by habitat loss and light and/or chemical pollution associated with urbanization. By contrast, insect abundance trends were positively associated with crop cover at the local (but not landscape) scale in both realms. Specifically, in the terrestrial realm, temporal trends became less negative with increasing crop cover …

Of course, the positive association between agriculture and insect population increases applies to existing fields, not forest or natural grassland cleared for cultivation. As van Klink has pointed out in interviews, the conversion of land to accommodate more farming would also destroy habitat.

But that is exactly the point if sustainability is the key: using technology to boost yields on existing cropland—growing more food on less land—is the most important action we can take to protect habitat and biodiversity.

And that’s what’s been happening. In a 2013 paper titled “Peak Farmland and the Prospect for Land Sparing,” three scholars at Rockefeller University calculated that global increases in crop yields as the result of advanced technologies, including genetic engineering, meant it took about one-third the amount of land in 2010 to grow the same amount of food as in 1961.

The graphs below, taken from the paper, highlight an event that has since been replicated around the world: after World War II total agricultural production, which until then had been largely circumscribed by the amount of land under cultivation, began a steep ascent as farming entered the modern era.


[To see this process unfold in time, check out the animated charts on crop yields at Our World in Data.]

The boom happened almost simultaneously across the world, from rice in China to wheat in France and Egypt.


The spur for these dramatic productivity gains is no mystery. After World War II, many of the key agricultural inputs—particularly modern pesticides, synthetic fertilizers, and advanced hybrid crops—came online in a major way. The rise accelerated with the advent of the Green Revolution in the early 1960s, whose advances were widely dispersed around the world, rescuing many countries, such as India, from the brink of mass starvation.

It is this unprecedented historic decoupling of production from land—what has become known as intensive agriculture—that so many in the environmental movement demonize and seek to reverse. One of their central claims: intensive farming is the primary culprit driving biodiversity loss and insect declines.

Yet, a careful look at the data shows the narrative touting small-scale organic-focused farming as a necessary alternative is outdated, even reactionary, writes Ted Nordhaus at the Breakthrough Institute:

Low-productivity food systems have devastating impacts on the environment. As much as three-quarters of all deforestation globally occurred prior to the Industrial Revolution, almost entirely due to two related uses, clearing land for agriculture and using wood for energy.

… attempting to feed a world of seven-going-on-nine billion people with a preindustrial food system would almost certainly result in a massive expansion of human impacts through accelerated conversion of forests, grasslands, and other habitat to cropland and pasture.

… we need to accelerate the long-term processes of growing more food on less land. … raising yields while reducing environmental impacts will require that we farm with ever-greater precision. Raising yields through greater application of technology has often meant more pesticides, fertilizer, and water. But as technology has improved, these trends have begun to reverse.

The organic deficit

The charm of farmers’ markets, Nordhaus writes, is not reason enough to abandon a system that is limiting land use, countering the effects of urbanization, and driving down chemical toxicity levels. It should be noted that organic farming yields on average 10-40 percent less than non-GMO conventional farming, which in turn is about 15 percent less productive than farms using advanced biotechnology. A recent study by the organic advocacy group IDDRI found that if Europe were to adopt agro-ecological food production practices, productivity would decrease by an average of 35 percent—meaning roughly 50 percent more organic-cultivated land would be needed to produce the same amount of food as is produced conventionally.
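As a back-of-the-envelope check on those yield figures — a sketch using only the percentages quoted above — the extra land implied by a given yield deficit is simply the reciprocal of relative productivity:

```python
# Land required to grow the same total output at reduced productivity.
# yield_ratio = output per hectare relative to the conventional baseline.

def land_multiplier(yield_ratio: float) -> float:
    """How many hectares are needed per baseline hectare of output."""
    return 1.0 / yield_ratio

# A 35% productivity drop (the IDDRI figure) implies ~1.54x the land,
# i.e. roughly 50% more acreage for the same food:
print(land_multiplier(0.65))

# The 10-40% organic yield deficit brackets the multiplier:
print(land_multiplier(0.90))   # ~1.11, about 11% more land
print(land_multiplier(0.60))   # ~1.67, about 67% more land
```

The nonlinearity is worth noting: each additional point of yield deficit costs progressively more land, which is why large deficits translate into outsized habitat conversion.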

Organic agriculture uses more land than no-till intensive agriculture, seen here. Credit: Shutterstock

The math of land saving through the use of modern technologies is so compelling and the yield deficits of organic production so thoroughly cataloged that they can’t be gainsaid. Anti-technology advocates generally prefer to avoid the topic altogether, focusing instead on Goulson-style claims about the adverse effects of chemical pesticides and ignoring organic farmers’ reliance on mechanical plowing using carbon-belching equipment as a form of weed control, which is massively destructive to soil health and biodiversity, and is a major contributor to carbon pollution.

The major sustainability contribution of conventional agriculture is the advent of no-till farming, which began with the use of chemical herbicides like atrazine and accelerated with the debut in 1996 of herbicide-tolerant GMO crops tied to glyphosate. GMO no-till farming has resulted in a massive reduction in carbon release estimated at 37 percent by the Belgian research institute VIB.

The turn away from efficient, intensive agriculture to accommodate the ideological fashion of our times could be a disaster for the fragile insect population. Population growth and growing affluence in the developing world over coming decades will require a sharp increase in necessary food calories, which can only occur by expanding farmable acreage—or by increasing yields on currently available acres.

All of these facts make the German meta-study very uncomfortable for organic farming advocates. The correlation between crop cover and increases in insect populations challenges the widespread damage to biodiversity they have been claiming. That may be why most of the major media reporting on the study, such as the BBC, simply ignored the finding, while others—the Guardian, Reuters, the Smithsonian—included swipes at pesticides not raised by the study authors, written in such a way that the average reader would assume they were backed up by research.

How fast is the decline? How real is the decline?

Trying to determine a global rate of decline, when the data is so uneven and, as the authors say, almost all effects are local and variations are so high even among adjacent sites, is fraught with difficulty. Nevertheless, the new study pins the rate of decline of land-based insects at just under one percent annually, which translates to an 8.3 percent decline per decade.

The study authors note some questions about the scope of the global decline, explaining that it was heavily influenced by what they term “outlier” studies with anomalously high findings. If these outliers are excluded, they say, insect populations would decline by far less, about 15 percent over 25 years.
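The per-decade and 25-year figures follow from compounding the annual rate; the arithmetic can be sketched as below (the study's own published numbers remain the authority — this is only a consistency check):

```python
# Converting between an annual decline rate and a compounded
# multi-year total decline.

def cumulative_decline(annual_rate: float, years: int) -> float:
    """Total fractional decline after compounding for `years`."""
    return 1.0 - (1.0 - annual_rate) ** years

def implied_annual_rate(total_decline: float, years: int) -> float:
    """Annual rate that compounds to `total_decline` over `years`."""
    return 1.0 - (1.0 - total_decline) ** (1.0 / years)

# An 8.3% decline per decade implies just under 1% per year:
print(implied_annual_rate(0.083, 10))   # ~0.0086

# 15% over 25 years (outliers excluded) implies ~0.65% per year:
print(implied_annual_rate(0.15, 25))    # ~0.0065
```

Note that compounding means the multi-decade figure is slightly less than the annual rate multiplied by the number of years, since each year's decline applies to an already-shrunken population.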


This too is not good, but it’s not an apocalypse; and there is time to turn things around even if the estimated trends are accurate. That hopeful take is actually supported by another important, though largely ignored, finding in the study: terrestrial insect trends in North America were no longer negative after 2000, and freshwater insects increased dramatically.

The fact that North American trends began plateauing or improving around 20 years ago suggests we are headed in the right direction in what had been up until then, according to the authors, the worst performing part of the world. Statistically speaking, once North American data was excluded, the study states that there was only “weak evidence for a negative mean trend” in global populations.

Geography and models

We all naturally gravitate to the headline numbers coming out of these studies. They’re simple, easy to remember and give us a sense of concreteness. Unfortunately, they are probably the least reliable and meaningful findings of all. If the ongoing COVID-19 pandemic has taught us anything, it’s that we should understand complex statistical modelling for what it is: a hypothesis generator or a sophisticated “best guess,” given current knowledge that may, as more facts come to light, prove to be anything from fairly close to wildly off the mark.

All one has to do is look at the maps of the geographic distribution of the studies included in van Klink’s analysis to realize just how problematic any conclusion about global trends is, considering the lack of data from most of the world. The vast majority of studies came from North America and Europe (by my count, almost 2/3 of all the studies).


There is a total of two studies from all of Africa, relatively few from Asia, and none at all from South Asia (India, Pakistan and Bangladesh). There is a single study from the Amazon, one of the richest sources of insect life on the planet.

These gaps are magnified by the fact that most of these studies concern only one specific order or family of insect, or some other sub-division (e.g. parasitoid wasps). But we know that different insect species vary enormously in their response to changes in climate, weather, disease, pollution and habitat destruction. It simply isn’t plausible that a model can compensate for what is, unfortunately, a massive quantity of unknowns, including, to borrow a phrase, many unknown unknowns, when it comes to insect population trends. Simply said, the likelihood of sampling error is immense.

It should be emphasized that none of this is to take away from the prodigious work of the van Klink research team. Almost all the criticisms outlined here are acknowledged and discussed by the authors themselves.

One of the most refreshing aspects of this study, in fact, has been the humility with which this team, which has done some of the best and most thorough work yet trying to establish global insect trends, has presented their results. In an article accompanying the study in Science, addressed to researchers not associated with the project, the team points the way forward for others in this field, and indeed in any scientific endeavor.

Advances in our knowledge about ongoing biodiversity changes and ability to predict future ones will require the incorporation of layers of nuance in patterns of change and drivers of that change.

The temptation to draw overly simple and sensational conclusions is understandable, because it captures the attention of the public and can potentially catalyze much needed action in policy development and research arenas. However, fear-based messages often backfire. This strategy has the grave risk of undermining trust in science and can lead to denialism, fatigue, and apathy. Embracing nuance allows us to balance accurate reporting of worrying losses with hopeful examples of wins. Hope is a more powerful engine of change than fear.

Jon Entine is the Executive Director of the Genetic Literacy Project and a life-long journalist with 20 major journalism awards. Follow him on Twitter @JonEntine

This article previously appeared on the GLP June 17, 2020.

Why it’s so critical to move beyond liberal rejectionism of human biodiversity

The way in which evolutionary explanations can be so readily applied to apparent differences in human psychology does highlight the glaring gap in the liberal consensus: if natural selection has produced the obvious physical differences in different human groups, could it not have done the same with cognition and behavior?

Cognitive scientist Steven Pinker has provided a measured critique of the Jewish IQ hypothesis—the claim that the higher average IQ of Ashkenazi Jews owes more to genetics than to culture. “The standard response to claims of genetic differences,” he has written, “has been to deny the existence of intelligence, to deny the existence of races and other genetic groupings, and to subject proponents to vilification, censorship, and at times physical intimidation.”

As Harvard geneticist David Reich argued in an oft-cited 2018 New York Times Magazine essay, the consensus response can be wrong-headed, short-sighted and counter-productive. Beginning in the early 1970s, based in part on ground-breaking research by geneticist Richard Lewontin, a consensus emerged that “there are no differences large enough to support the concept of ‘biological races.’” As Lewontin had contended, to the extent that there was variation among humans, most of it was because of “differences between individuals.” This was the accepted interpretation, canonized if you will by Jonathan Marks in 1995.

But Reich argued that this ‘simple logic’ defies science and common sense:

…this consensus has morphed, seemingly without questioning, into an orthodoxy. The orthodoxy maintains that the average genetic differences among people grouped according to today’s racial terms are so trivial when it comes to any meaningful biological traits that those differences can be ignored.

The orthodoxy goes further, holding that we should be anxious about any research into genetic differences among populations. The concern is that such research, no matter how well-intentioned, is located on a slippery slope that leads to the kinds of pseudoscientific arguments about biological difference that were used in the past to try to justify the slave trade, the eugenics movement and the Nazis’ murder of six million Jews.

I have deep sympathy for the concern that genetic discoveries could be misused to justify racism. But as a geneticist I also know that it is simply no longer possible to ignore average genetic differences among “races.”

Reich and other geneticists and social scientists have come to believe that creating and policing taboos on touchy topics vacates the high ground to racists and bigots. Otherwise vacuous ideas acquire a veneer of credibility when presented as hidden knowledge that the public is not meant to know. This tendency to deny and denounce, however well-intentioned, may also stymie research that brings practical benefit to the very marginalized or oppressed groups that progressives profess to champion.

The more we know about human behavioral traits, the better able we are to address their potential negative consequences. How we act is mediated through innumerable other genetic and environmental influences; for example, in affluent social environments risk-taking may be advantageous (think individual dynamism or entrepreneurialism); in economically-deprived situations, however, this same trait may instead result in drug and alcohol abuse, criminality or violence. Again, such conclusions do not contradict the progressive imperative to improve social environment to mitigate undesirable social outcomes.

The original basis for Cochran’s and Harpending’s evolutionary hypothesis was genetic research into debilitating brain disorders predominantly found in Ashkenazim. Cochran and Harpending then speculated that these disorders might be an indication of rapid, recent genetic change in response to new environmental conditions—what they called ‘positive selection’. Genes that cause diseases are usually phased out of the human genome, as its carriers die without passing on the killer mutation to future generations. But some negative mutations survive. Why?

Consider sickle cell disease. In the sickle cell case, an increased prevalence of malaria due to new agricultural practices is thought to have sparked a partially successful genetic response, with two copies of the sickle cell gene providing malarial resistance but a single copy causing anaemia. In other words, sickle cell disease has not disappeared because the mutation also confers, in some populations, a survival benefit.

According to Cochran and Harpending, similar trade-offs may have occurred in the Ashkenazim case, with the deleterious brain disorders the unfortunate consequence of genes that, in different combinations, enhance rather than impair cognitive function. Their speculation about Jewish employment in Medieval Christian Europe, therefore, was an attempt to describe a possible recent selective pressure on cognitive function—one that, they believed, neatly explained both the incidence of specific cognitive disorders and the perceived greater intelligence of a distinct ‘racial’ group, Ashkenazi Jews.

Follow the latest news and policy debates on sustainable agriculture, biomedicine, and other ‘disruptive’ innovations. Subscribe to our newsletter.

Facts before ideology

The work of political scientist James Flynn, who died in 2020, exemplifies how a commitment to scientific veracity on ‘taboo’ topics does not diminish liberal ideals. A lifelong social democrat, Flynn did not shy away from openly investigating recorded racial differences in IQ—how else, he reasoned, would we ever come to understand the determinants of intelligence and use this knowledge for social good? Yet if academic censorship had prevented his studies, Flynn might never have discovered some of the strongest evidence of environmental (rather than genetic) influences on intelligence—the so-called “Flynn effect” of rising intergenerational IQ. This effect, while still debated and not fully understood, demonstrates strong environmental (i.e., social and cultural) influences on human cognitive abilities.

Credit: Anders Sorman-Nilsson

Politically, the Flynn effect sustains one long-held leftist belief while discrediting another. For a start, this evidence of culturally-induced cognitive change backs up progressive demands to improve the social and educational environment of those failed by the existing system. It also indicates that the current low attainment of some groups relative to others—the kind emphasized by the likes of Rushton and seized upon so gleefully by racists—is not ineluctable or inevitable. Importantly, evidence such as the Flynn effect clearly shows that genes are not destiny.

It is not and never has been either nature or nurture, genes or environment. Genetically-mediated behaviors (such as risk-taking) have different outcomes in different social environments, while different social environments bring about different genetically-mediated behaviors (such as those associated with the Flynn effect’s rising IQ). Even the act of learning to read causes changes in how the human brain processes and perceives the world. 

As outlined in anthropologist Joseph Henrich’s The WEIRDest People in the World, cultural processes appear to have changed the way WEIRD (Western, educated, industrialized, rich, democratic) people see the world. For example, Henrich argues that specific cultural changes in Medieval Europe—the Catholic Church’s suppression of marriage to close relatives, say, or the Protestant emphasis on individual literacy—have recently and rapidly (in evolutionary terms) transformed WEIRD people’s cognitive behavior. For Henrich, this is a process of nature via nurture, in which culture changes behavior, which then feeds back into culture—what Jon Entine in Taboo called a “biocultural feedback loop”: nurture determines nature as much as genes determine environment.

What does this mean for the liberal “standard response” to the question of evolved human biodiversity? 

In short, Darwinian reasoning can help explain why the world is as it is, but it doesn’t tell us how it could or should be. At the same time, the more we understand about how both genes and culture make us what we are, the more knowledge we will have to change society in desirable ways. None of this conflicts with liberal political aspirations. To rephrase a famous ‘progressive’ slogan: evolutionists can interpret the world in various ways; it is still up to us to change it. 

Or to paraphrase James Flynn, who remained sanguine about the possibility of deeper genetic influences on intelligence, if everybody had a decent standard of living, fewer people would worry that there were more accountants or dentists of one race than another.

Disease proclivity, like sports ability and IQ, is the product of many genes, with environmental triggers influencing the “expression” of our base DNA. It’s further shaped throughout our lives by a ‘biosocial feedback loop’.

Why touch this third rail of human biodiversity? After all, as UCLA’s Jared Diamond has noted, “Even today, few scientists dare to study racial origins, lest they be branded racists just for being interested in the subject.”

Acknowledging the fact of evolved diversity—in our bodies and in our brains—isn’t racist. It will not perpetuate existing racial inequalities. Indeed, what will maintain the current status quo, and encourage the rants of the alt-right, is wilful denial of the complex environmental and genetic factors that underpin human physical well-being and social behavior. 

Over the past two decades, human genome research has moved from a study of human similarities to a focus on population-based differences. We readily accept that evolution has turned out Jews with a genetic predisposition to Tay-Sachs, Southeast Asians with a higher proclivity for beta-thalassemia and Africans who are susceptible to colorectal cancer and sickle cell disease. So why do we find it racist to suggest that Usain Bolt, in addition to incredible training commitment, can also thank his West African ancestry for the most critical part of his success—his biological make-up?

Genes influence human social outcomes; we have a moral responsibility to accept this and to use that knowledge to improve people’s and peoples’ lives.

Jon Entine is the founding executive director of the Genetic Literacy Project, and winner of 19 major journalism awards. He has written extensively in the popular and academic press on agricultural and population genetics. You can follow him on X @JonEntine

Patrick Whittle has a PhD in philosophy and is a freelance writer with a particular interest in the social and political implications of modern biological science. Follow him on his website patrickmichaelwhittle.com or on X @WhittlePM

On the anniversary of Kristallnacht, as the Israel-Hamas War rages, a DNA data leak of Jewish 23andMe customers raises fears of modern-day Jewish yellow badges


Tonight is the 85th anniversary of Kristallnacht, “The Night of Broken Glass.” On November 9 and 10, 1938, Storm Troopers, Hitler Youth and civilians rampaged throughout the German Reich, shattering the windows of more than 1400 synagogues and more than 7,500 businesses and Jewish homes. More than 400 people were murdered. 

Destroyed Bamberger & Hertz Department Store, Leipzig (Jewish Museum of Berlin)

In the following days, in what the Nazi government called the Judenaktion (Jewish Action), the Gestapo arrested more than 30,000 Jews, herding them through the streets and then transporting them to concentration camps.

Three days later, in yet another gruesome twist, Hermann Göring, designated as Hitler’s successor, decided not only to fully “eliminate Jews from German economic life,” but to make the victims collectively pay 1 billion Reichsmarks for the damage inflicted on them. 

Nazi officials implemented the Star of David yellow badge to brand Jews as their master plan of elimination accelerated. Many date these horrific events as the formal start of the Holocaust.


The Holocaust that followed led to the coining of the phrase “Never again,” a commitment of Jewish resolve to never again allow the branding of Jews as “the other”. 

That resolve is now being tested. As the Israeli-Hamas war rages on, advances in genetics are raising perilous questions about whether a new form of anti-Jewish branding could emerge.


DNA, ancestry and Jews

After World War II, antisemitism at times faded to a simmer only to resurge with targeted attacks, never really vanishing. But even with a surge of attacks on synagogues around the world since 2016, anti-Jewish hate seemed a fringe phenomenon. When I read an article in Wired, “23andMe User Data Stolen in Targeted Attack on Ashkenazi Jews,” I skimmed it, not too concerned. It was Friday, October 6. 


If anyone wanted to see my personal info at 23andMe — the map of eastern Europe and my 99.7% Ashkenazi Jewish ancestry or that I can taste asparagus, carry a tune and have more Neanderthal DNA than most customers — so be it. 

But I felt differently the next day, Saturday October 7. In the immediate wake of Hamas’ attack on innocents in Israel, being identifiable as Jewish had become a risk. Were my consumer DNA test results akin to the yellow stars that Jewish people were forced to wear in Nazi-occupied countries during the Second World War?  

DNA profiles of Ashkenazim are indeed strong identifiers, for we descend from Jews who trace their ancestry to the Levant (Egypt, Cyprus, Iraq, Palestine, Syria, Jordan, Lebanon and Turkey) during Biblical times. Many thousands migrated to Italy and then to Central Europe, with the men often taking on non-Jewish wives. 

After an unending series of small population bottlenecks during the first millennium serially strangled the diversity of our gene pool, the majority of surviving Jews settled in Eastern and Central Europe. Known as Ashkenazim, they went through at least one severe “genetic bottleneck” by the 14th century, which whittled the population to only a few hundred individuals. It then grew slowly for centuries and exponentially from the 18th century onward.

All present-day Ashkenazi Jews can be traced back to this small population. Over the centuries, Jews have periodically blended in with local populations, co-existed peacefully alongside them, or been viewed as a separate ‘race’ and an object of hate. We are now seeing the flames of antisemitism reignite at universities and public protests across the US, Europe and elsewhere, with Jews globally viewed as proxies for Israel.

Courtesy: Austrian Union of Jewish Students

It’s uncomfortable to acknowledge this in the current atmosphere, but Ashkenazi Jews, although not a ‘race’, are identifiable by genetic signatures and so-called Jewish diseases, the legacy of intermarriage among our clan dating to the medieval bottleneck. I can’t imagine exactly how consumer DNA data from Jewish people might be misused, but being so identifiable is concerning. And while recent legislation oversees the use of consumer DNA data in law enforcement and in life, long-term care, and disability insurance, it doesn’t go far enough to address the intersection of genetic privacy and hate crimes.

Attack and hack? 23andMe alerts consumers

On October 12, a generic email from 23andMe came: 

Dear Ricki,

We recently learned that certain profile information — which a customer creates and chooses to share with their genetic relatives in the DNA Relatives feature — was accessed from individual 23andMe.com accounts. This was done without the account users’ authorization. We do not have any indication at this time that there has been a data security incident within our systems, or that 23andMe was the source of the account credentials used in these attacks.

This missive went on to explain that customers reusing the same password across sites could have facilitated the breach, so we were admonished to change our passwords and enable multi-factor authentication. “If we learn that your data has been accessed without your authorization, we will contact you separately with more information,” the email said.

In the next few days, as details emerged of the massacre at the Nova music festival, 23andMe’s initial email that “certain profile information” had been breached became more specific: the targeted customers were of Ashkenazi Jewish heritage.

Me. 

I got no further information, even at 23andMe’s blog. Its most recent news release pitches the new “23andMe Health Membership“ of “customized action plans, from recommended lifestyle changes to clinician-ordered lab tests.” Christmas and Hanukkah are nearing, and kits must be sold; the website is already festooned with ads for the holidays.


Was the timing of the 23andMe hack targeting private genetic data of Jews, just prior to the two-years-in-the-making October 7 attack by Hamas, a coincidence? Perhaps not. 

The Wired report had details:

Hackers posted an initial data sample on the platform BreachForums earlier this week, claiming that it contained 1 million data points exclusively about Ashkenazi Jews …. On Wednesday, the actor began selling what it claims are 23andMe profiles for between $1 and $10 per account, depending on the scale of the purchase. The data includes things like a display name, sex, birth year, and some details about genetic ancestry results … like that someone is of ‘broadly European’ or ‘broadly Arabian’ descent. The information does not appear to include actual, raw genetic data.

Even raw genetic data would be meaningless. It’s just a collection of spots —“data points” — in a genome where the DNA base varies in populations, not sequenced genes or anything about traits or health. The colored maps from 23andMe and Ancestry come from a million or so points (SNPs, for single nucleotide polymorphisms) and match the patterns to those from the ancestral peoples of nearly 2,000 populations. 
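To make the idea concrete, here is a toy sketch with entirely made-up allele frequencies (this is not 23andMe’s or Ancestry’s actual method or data) showing how genotypes at a handful of SNPs can be scored against reference-population allele frequencies to find the best-fitting ancestry:

```python
import math

# Hypothetical allele frequencies for one allele at three SNPs,
# in two made-up reference populations.
ref_freqs = {
    "pop_A": [0.10, 0.80, 0.30],
    "pop_B": [0.60, 0.20, 0.70],
}

# A customer's genotype: how many copies (0, 1, or 2) of that allele
# the person carries at each SNP.
genotype = [1, 2, 0]

def log_likelihood(genotype, freqs):
    """Log-likelihood of the genotype under Hardy-Weinberg proportions."""
    ll = 0.0
    for g, p in zip(genotype, freqs):
        if g == 2:
            prob = p * p              # two copies of the allele
        elif g == 1:
            prob = 2 * p * (1 - p)    # one copy
        else:
            prob = (1 - p) * (1 - p)  # zero copies
        ll += math.log(prob)
    return ll

# Pick the reference population that makes the observed genotype most likely.
best = max(ref_freqs, key=lambda pop: log_likelihood(genotype, ref_freqs[pop]))
print(best)  # pop_A is the better fit for this toy genotype
```

Real ancestry reports scale this kind of comparison up to roughly a million SNPs and thousands of reference populations, and estimate mixture proportions rather than a single best match.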


Most of us Ashkenazim are probably sixth cousins or closer. Our data are boring! We’re most recently from Ukraine, Poland with its fluid borders, Lithuania, and elsewhere in eastern Europe. Comedian Modi Rosenfeld summed up his disappointing experience with consumer DNA testing:

I got back the results, and they have these amazing pie charts. One-eighth Navajo Indian! A quarter Iberian! Celtic, Norwegian, African, I was so excited for mine! I opened mine up: 99.8% Jewish. That was it. No Vikings, no Navajo, no nothing. The pie chart was a matzoh ball with a toothpick in it.


But it is that sameness, the near 100% identity on a genetic level, that’s terrifying, that evokes the image of yellow stars used to identify Jews in Nazi Germany.

Broaden DNA privacy laws? 

Ironically, on October 31, The Journal of the American Medical Association published a viewpoint, “The Genetic Information Privacy Act: Drawbacks and Limitations.” Anya E. R. Prince, JD, from the College of Law, University of Iowa, Iowa City, discussed the Genetic Information Privacy Act bills under consideration in 26 states over the past three years. The bills’ aim was to protect genetic privacy for consumer DNA testing results used by law enforcement and insurers.

Eleven states have adopted the laws, with bipartisan support. “However, the public should not be fooled. Even though the bills do offer sensible and important protections, they miss the mark at fully addressing many genetic privacy concerns held by the public and many in the medical and research fields,” Prince writes. 

One concern the laws clearly miss is the misuse of genetic information to identify members of a group in order to perpetrate hate crimes.

Given the schedule of journal publishing, Prince likely wrote her Viewpoint well before the events of October 7, and so it doesn’t cover this new kind of nefarious use of DNA data.

Existing protection against use and misuse of genetic test results, from the Genetic Information Nondiscrimination Act of 2008, covers clinical tests. GINA prohibits employers from using genetic information to hire, fire, or promote an employee or to require genetic testing, and health insurers can’t require genetic tests or use results to deny coverage. The law clearly defines genetic tests – DNA, RNA, chromosomes, proteins, and metabolites – and genetic information – genetic test results and family history of a genetic condition.

But the year of GINA’s debut is critical: 2008. That’s when consumer DNA testing began. I was at the human genetics meeting where it was introduced to an initially highly skeptical audience of scientists and physicians. I don’t think they, or the founders of the companies, imagined how eager people would be to spit into tubes or swab their cheek linings to learn about their DNA.

In contrast, the new Genetic Information Privacy Act covers direct-to-consumer DNA testing and emphasizes the consent of the individuals who provide DNA samples over how their data are used. It cites two applications: law enforcement and insurance.

According to the new laws, testing companies must follow “valid legal processes,” including obtaining consent, when providing DNA data to law enforcement agencies. However, Prince points out that these “processes” are not defined. Moreover, some consumer DNA testing companies already require such consent, so merely codifying it clearly isn’t sufficient.

At issue are familial DNA matches: crime-scene DNA that partially matches a profile in a genetic database can point investigators to a suspect among that profile owner’s relatives. That’s how the Golden State Killer was identified.
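The underlying comparison can be sketched in a few lines. This toy example uses invented genotypes and a crude per-site match count, not the segment-based statistics real searches rely on, but it shows why a relative’s profile stands out against an unrelated one:

```python
# Toy sketch of the idea behind familial matching (not any agency's
# actual algorithm): a profile sharing an unusually high fraction of
# markers with crime-scene DNA suggests the suspect is a relative of
# that profile's owner.

def shared_fraction(profile_a, profile_b):
    """Fraction of SNP sites where two genotype profiles carry the
    same allele count (a crude identity-by-state measure)."""
    matches = sum(1 for a, b in zip(profile_a, profile_b) if a == b)
    return matches / len(profile_a)

# Hypothetical allele counts (0, 1, or 2) at eight SNP sites.
database_profile = [0, 1, 2, 1, 0, 2, 1, 1]
crime_scene      = [0, 1, 2, 1, 1, 2, 1, 0]   # hypothetical relative
unrelated        = [2, 0, 1, 0, 2, 0, 0, 2]   # hypothetical stranger

print(shared_fraction(database_profile, crime_scene))  # 0.75
print(shared_fraction(database_profile, unrelated))    # 0.0
```

Actual familial searching relies on long shared chromosome segments (identity by descent) across hundreds of thousands of markers, which separates relatives from strangers far more reliably than this per-site tally.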

Credit: Sacramento Police Department

On the insurance front, since 2021, 29 bills in 16 states have proposed requiring consent before consumer DNA test results can be used in underwriting life, long-term care, and disability insurance.

Prince’s conclusion is both prescient and chilling:

Although a groundswell of states passing a Genetic Information Privacy Act seems like a consumer win, the sweeping name should not obscure the fact that most of the laws passed at the state level only add minimal protections focused on consumer consent, transparency, and internal data handling. The enacted legislation does not robustly address the fact that third parties, particularly law enforcement and insurers, can still access and use consumer genetic data.

Coda

Being Jewish this time of year, especially if you live in a nearly-100-percent Christian town like I do, can be tough. That you celebrate Christmas is assumed. Years ago, well-meaning people would ask my kids what Santa was bringing them, and when they answered nothing, were lectured for “being bad.” Public school had Christmas gift exchanges; school photos were taken on Yom Kippur. Today, I leave dance classes when the Christmas music is unending. 

But being marked as Jewish by my DNA is different. Sometimes it’s better being invisible.

Even if nothing comes of the 23andMe DNA data breach, the possibility reawakens the never-very-far-away collective familial memory of what happened to our ancestors in Eastern Europe and of the lucky ones who escaped the mass annihilation of the Pogroms leading up to the Holocaust. For we always know that it can happen again. 

Ricki Lewis is the GLP’s senior contributing writer focusing on gene therapy and gene editing. She has a PhD in genetics and is a genetic counselor, science writer and author of The Forever Fix: Gene Therapy and the Boy Who Saved It, the only popular book about gene therapy. BIO. Follow her at her website or X @rickilewis

GLP podcast/video: BBC corrects botched organic farming report; Happy 41st birthday, GMO insulin! Scientific American a ‘scientific sewer’?

Facing intense criticism from experts, the BBC was pushed to correct a deeply misleading story about the benefits of organic farming. Genetically engineered insulin turned 41 this year. Let’s take a look back at the importance of this groundbreaking drug. According to one biologist, Scientific American has become a “scientific sewer.”

Podcast:

Video:

Join hosts Dr. Liza Dunn and GLP contributor Cameron English on episode 242 of Science Facts and Fallacies as they break down these latest news stories:

In an error-ridden report on the benefits of organic farming, the BBC alleged that conventional agriculture poses a serious risk to biodiversity and human health, owing to its use of “synthetic” pesticides and other chemicals that can increase pollution. Science for Sustainable Agriculture, self-described as a “pro-science think-tank,” lodged a complaint with the news network, urging it to remove the misleading claims from its BBC Bitesize website, which is geared toward school students. After reviewing the matter, BBC editors updated the website and issued a correction. It’s a great example of experts holding the media accountable, and of reporters taking steps to ensure the accuracy of their coverage.

We used to retrieve insulin from the pancreases of slaughtered cows and pigs to treat diabetics, who don’t produce enough of the hormone naturally to maintain their blood sugar within healthy limits. That all changed in the 1970s with the advent of genetic engineering. Using this recombinant DNA technology, scientists coaxed GE bacteria into producing virtually unlimited quantities of human insulin, which proved to be safer and more effective than its counterpart derived from animals. Former FDA official Dr. Henry Miller, who oversaw the agency’s medical review of the novel drug, recounts how significant this development was for public health, and what regulators today should learn from the story.

Once a respected publication with a massive audience, Scientific American has fallen from its position of prominence, says University of Chicago evolutionary biologist Jerry Coyne. In fact, Coyne argues, the publication has become a “scientific sewer,” more concerned with promoting fashionable but demonstrably false ideas about biological sex than explaining science to a broad audience.

Dr. Liza Dunn is a medical toxicologist and the medical affairs lead at Bayer Crop Science. Follow her on X @DrLizaMD

Cameron J. English is the director of bio-sciences at the American Council on Science and Health. Visit his website and follow him on X @camjenglish

Viewpoint: Activists falsely claim Bill Gates orchestrated flare-up in malaria cases so he can ‘cash in’ on eradicating it

The eight cases of locally transmitted malaria recently reported in the U.S. – the first in 20 years – have elicited loony conspiracy theories about the cause. They’re bunk.

The CDC has reported that at least eight malaria infections in Florida and Texas have been detected during the past two months. These cases represent the first time locally acquired mosquito-borne malaria has occurred in this country since 2003.

Malaria is a serious and sometimes fatal disease marked by high fevers, shaking chills, and flu-like illness. It is caused by Plasmodium parasites, which infect certain mosquitoes that can then transmit the organism to humans when they bite. The life cycle of human infection, depicted below, is exceedingly complex.

Malaria life cycle
Courtesy CDC

About 2,000 cases of malaria are diagnosed in the United States each year, almost all in travelers and immigrants returning from countries where malaria transmission occurs, mostly sub-Saharan Africa and South Asia. Between the eradication of malaria in the U.S. in 1970 and the cases in Florida in 2003, the CDC reported 11 outbreaks involving only 20 instances of what was thought to be locally acquired mosquito-transmitted malaria.

Worldwide, however, malaria is a huge burden. The World Health Organization estimates that in 2020, 241 million clinical cases of malaria occurred, with 627,000 deaths, most of them children in Africa.


Lately, it seems that any infectious disease outbreak brings the conspiracy-theory crazies out of the woodwork, and this one is no exception. “So let me get this straight,” one person tweeted. “Bill Gates has been releasing GMO mosquitoes in Florida and Texas, and now Florida and Texas, for the first time in 20 years, have mosquitoes that give people malaria??? Is that right?”

Another tweeted:

It must be a coincidence that from 2003-2023 there wasn’t one case of Malaria spread by mosquitos … and along comes a company funded by Bill Gates … to solve a problem that didn’t exist … and suddenly in the exact places where he releases mosquitos … there’s an outbreak of Malaria?

The Gates Foundation did give a grant to Oxitec, a British company that has developed genetically engineered mosquitoes to suppress the mosquitoes that transmit certain viral diseases, including Zika, dengue fever, chikungunya, West Nile, and yellow fever — but not malaria. These mosquitoes were released in California and Florida’s Monroe County, but not Sarasota County or Texas, where malaria was contracted. Further, the Oxitec mosquitoes are the species Aedes aegypti, while the ones that transmit malaria are Anopheles. Also, the Oxitec mosquitoes are males, which don’t bite. (They reduce Aedes aegypti populations by passing a lethal mutation to their offspring.)

To be clear, Oxitec’s mosquitoes were not released where malaria was caught, are not of the species that transmit malaria, and are not even of the sex (female) that bites humans.

Researchers at Imperial College London have engineered mosquitoes that slow the growth of malaria-causing parasites in their guts and prevent transmission of the disease to humans. The mosquitoes carry a genetic modification that causes them, after a blood meal, to produce two antimicrobial peptides in the gut that inhibit the malaria parasite’s development. In a lab setting, the strategy works well to reduce the possibility of malaria spreading, but to my knowledge, there have not yet been any field trials of those mosquitoes, and certainly not in the U.S.

The Gates Foundation is also involved in this project, developing a model that can assess the impact of such modifications if used in various African settings. It predicts that the Imperial College approach could be a powerful tool for reducing the number of cases of malaria even where transmission is high. There is no mention of genetically engineered mosquitoes in the Gates Foundation’s extensive discussion of its many malaria-related projects.

Disease-carrying mosquitoes do not observe national borders. They can enter the U.S. on ships and planes, as well as in cars and trucks. We don’t need nutty conspiracy theories to explain the presence of locally acquired malaria cases in the U.S.

Henry I. Miller, a physician and molecular biologist, is the Glenn Swogger Distinguished Fellow at the American Council on Science and Health. He was the founding director of the FDA’s Office of Biotechnology. Find him on X @henryimiller

A version of this article was originally posted at the American Council on Science and Health and has been reposted here with permission. Any reposting should credit both the GLP and original article. Find the ACSH on X @ACSHorg

Twitter/X’s race to the disinformation bottom: Are we losing a valuable forum for rational discussion?


While many users are fleeing Twitter/X in disgust at the turn it has taken toward encouraging the spread of conspiracy theories, antisemitism and hate under Elon Musk’s leadership, a cadre of established users are remaining to fight against these very trends.

My corner of Twitter

I joined Twitter in January 2011, mainly to follow discussions in my field of epidemiology, including discussions of topics concerning potential health risks, such as cell phones, the chemical BPA, the weedkiller glyphosate, and vaccines. 


Gradually, I added people who had knowledge in areas that interested me and who wrote with verve and authority. Over time, my curated feed grew to about six hundred people in areas including medicine, public health, biology, biotech, history, literature, politics, journalism, psychology and more. 

Twitter put me in touch with people who had vital information, as well as insight, to share about subjects I was interested in or writing about. This became a matter of life and death during the pandemic, when in-person contact was curtailed. Having access to the latest findings posted by people like Eric Topol, Peter Hotez, Laurie Garrett, Michael Mina and many others made it possible to follow the changes in SARS-CoV-2 over time as well as the effectiveness of vaccines and other public health measures.

Although I was aware that there were many more worthwhile people I could follow, the accounts I chose were all I could manage in the half hour or so I spent on the site each day. I was content to get a sampling of what seemed to be highly informed and interesting views on topics that interested me. Occasionally, I would “unfollow” an account if I felt that the writer had descended into polemics or flagrant disinformation.

On this basis, my Twitter feed — my corner of Twitter — always provided plenty of thoughtful commentary on the news, politics, health and other topics.

Toward the end of 2022, I became aware that several people I was following and admired were abandoning the site in disgust, with some of them switching to other social media sites. I regretted losing these voices, but I was only vaguely aware of what was behind these defections. I don’t recall seeing any detailed explanation of anyone’s reasons for leaving. Still, most of the people I was following stayed on the site, even though some expressed ambivalence and seemed to be considering leaving. 

Now, even for someone with my blinkered view, it has become impossible to ignore how the Twitter that I only used in a limited, highly curated, way has been turned into a mechanism for amplifying conflict and distrust as a means of boosting engagement and increasing revenue. 


The case against Twitter/X

In early October, Bloomberg technology columnist Dave Lee announced that he had decided to step away from Twitter/X as an engaged user, although he would continue to report on it. Lee wrote: 

One thing the prior Twitter management didn’t do is actively make things worse. When Musk introduced creator payments in July, he splashed rocket fuel over the darkest elements of the platform. These kinds of posts always existed, in no small number, but are now the despicable main event. There’s money to be made. X’s new incentive structure has turned the site into a hive of so-called engagement farming — posts designed with the sole intent to elicit literally any kind of response: laughter, sadness, fear. Or the best one: hate. Hate is what truly juices the numbers.

Lee concluded: 

X is now an app that forcibly puts abhorrent content into users’ feeds and then rewards the people who were the most successful in producing it, egging them on to do it again and again and to make it part of their living. Know this: As the scramble for attention increases, the content will need to become more violent, more tragic and more divisive to stand out. More car crashes, high school fights and public humiliation.

Lee was hardly alone in his distress over the changes made under Musk. Several people I follow, each in a different discipline, have recently posted reflections on Twitter/X making the same point.

Three weeks ago, speaking on the PBS NewsHour, information warfare researcher Emerson Brooking succinctly analyzed the impact of Musk’s refashioning of the site to “tear down all gatekeepers, even when gatekeepers are performing an invaluable function.”

The first big thing he’s done is essentially remove the ability to identify and verify credible accounts, journalists or other people who have been vetted and trusted. And second, he’s introduced a for-profit motive which didn’t previously exist on the platform, which incentivizes accounts, often under specious identities, to spread false content as quickly as possible in order to maximize their own revenue.

And all this matters because so many policy makers, so many journalists, so many people still use this platform to understand what’s happening. Now it’s become much harder.

Any event can provide the grist for engagement, whether questioning how deadly the novel coronavirus is or documenting Hamas’ attacks on civilians in southern Israel and the Israeli response. However, in wartime, misinformation and unverified reports take on a qualitatively different character, with the power to ignite responses from local populations, neighboring militias and powerful nation-states, as we are seeing unfold in the eastern Mediterranean. 

CNN has reported that some of the biggest spreaders of false information about the Israel-Hamas war on Twitter/X “are premium, so-called ‘verified’ accounts that pay the social media company formerly known as Twitter to promote their posts to boost visibility.”

CNN cited a report from NewsGuard, an information analysis company, which identified seven accounts that qualified as “misinformation superspreaders,” sharing widely discredited claims about recent events in Israel and Gaza accompanied by videos from earlier, unrelated conflicts and even video games. NewsGuard has launched what it calls an Israel-Hamas War Tracking Center to monitor the tsunami of disinformation.

image

It also analyzed 250 of the most highly circulated posts from the first week of the conflict containing some of the most common false or unsubstantiated claims about events. These posts were viewed more than 100 million times in just one week. The company said that 186 of the 250 posts, or 74 percent, originated from premium X accounts.

In this perverse way, Musk’s X incentivizes spreaders of misinformation by promoting their “verified” accounts in exchange for an $8 monthly fee, and by enabling them to receive payments from the site.

Two Twitters

There are two conflicting Twitters. One is the company that has mostly been in the red since its founding in 2006 and that Musk is attempting to put on a sound financial footing. To achieve this aim, he is reshaping X to mirror his worldview. According to Axios, X is already evolving into another company altogether, “a Musk-run social universe that pulls together 24 years of his ideas and wildest fantasies.” In pursuit of his vision, Musk is doing everything he can to engage users by promoting misinformation and allowing antisemitism and conspiracy theories to flourish.

The second Twitter, which I’ve been tuned in to for more than a decade, provides valuable information and commentary posted by people with no agenda other than to parse the facts relevant to a question of interest.

These two Twitters represent different principles and models of social engagement — between thinking fast and slow, between the reptilian brain and the frontal cortex, between the mob and the individual. Currently, it comes down to the difference between ensnaring followers in conspiracy theories versus trying to identify new insights to improve one’s understanding of some phenomenon. 

Health Nerd on ivermectin  

Amidst the noise and toxicity, to come across a carefully-explained, fact-based treatment of a question or the dissection of a false notion that has gained currency inspires gratitude.

Here is one example from many I could have chosen. 

Early in the coronavirus pandemic, when patients were dying on ventilators in hospitals and before there was a vaccine or effective treatment, the anti-malaria drug hydroxychloroquine and the antiparasitic drug ivermectin were touted to provide dramatic protection. Sales of both drugs skyrocketed and store shelves emptied. 

Gideon Meyerowitz-Katz, aka Health Nerd, an Australian epidemiologist, was appalled that so many politicians (including President Trump), self-appointed experts and disinformation entrepreneurs promoted these drugs. He was moved to respond on Twitter. In a 25-Tweet thread, he led the reader through an early paper purporting to provide evidence of the effectiveness of ivermectin in preventing Covid-19 — and why it was problematic.

image

image

Even a cursory look at the paper revealed red flags — conflicting numbers of study subjects in different parts of the paper; its appearance in a highly questionable journal; and the fact that the drug appeared to be close to 100 percent effective, something unheard of in medicine. 

In retrospect, it turns out that this and several other papers that were heralded as game-changers were fraudulent. But, in an atmosphere of panic and terror, people were desperate for an effective treatment, and charlatans and some clueless physicians latched onto the “evidence,” based on hearsay and apparently without giving it even the most cursory look. In fact, few people are in a position to actually read and dissect papers claiming stunning results. 

As Meyerowitz-Katz later explained in a superb essay in Medical News Today published in December 2021, what was lost sight of is something that every health scientist knows, or should know: much of what is published is either false or overstated. In epidemiology, no matter how bad a paper is and no matter how many times it is rejected, it can end up getting published somewhere.

When we most needed effective fact-checking, our grand institutions of scientific research instead reviewed studies in a matter of days, if not hours, and posted fraudulent studies online to be shared across the world.

It’s tempting to say that research into ivermectin is uniquely flawed, but that’s clearly not true — realistically, it would be remarkable if a broken system produced only one failure.

Going against the current

Over the past dozen years, Twitter has provided an invaluable forum for making breaking news accessible in real time, exchanging ideas, circulating information (including scholarly papers), and critiquing false narratives that have gained currency.

Despite the atmosphere of self-indulgence and narcissism that dominates so much of Twitter/X under Elon Musk, there are still ongoing, disciplined efforts, by people like Meyerowitz-Katz, to leverage social media to bring knowledge to bear on important questions and to challenge misinformation — not for attention or monetary gain but to contribute to the common good. 

One hopes that these dedicated people will be able to continue to provide an alternative to the direction Twitter/X appears to be headed in — and to champion a model of rational exchange and discourse.

Geoffrey Kabat is a cancer epidemiologist and the author of Hyping Health Risks: Environmental Hazards in Daily Life and the Science of Epidemiology and Getting Risk Right: Understanding the Science of Elusive Health Risks. Find Geoffrey on X @GeoKabat

Blame human evolution for corporate jargon and thick academic prose

[Image: corporate jargon]
For anyone who’s ever worked in a large organization, this kind of message will be depressingly familiar: “Do you have capacity to cascade the following information to your team? Please escalate inquiries through line management channels.”

In plain English, this (real-life example) simply means: “Pass this on. Any questions, just ask.”

So why not simply say so? And what on earth has this to do with genetics?

The late philosopher Denis Dutton provides a potential answer to both questions. In addition to writing The Art Instinct, a Darwinian analysis of aesthetics, Dutton also ran a Bad Writing Contest that lampooned “the most egregious examples of awkward, jargon-clogged academic prose” produced in any particular year. (Sadly, this competition — to borrow a phrase from one of its winners — now has an “absentation of actuality”. That is, it no longer exists.)

And it is Dutton’s explanation for why so many academics indulge in incomprehensibly obscure writing that is of relevance here: that is, that it’s a smokescreen, deliberately designed to hide trite or lightweight arguments within profound-sounding text. As Dutton says of one year’s 94-word single sentence winning entry, “To ask what [it] means is to miss the point. [It] beats readers into submission and instructs them that they are in the presence of a great and deep mind. Actual communication has nothing to do with it.”

Or, more succinctly, it’s simply “showing off”.

Much the same could be said of the corporate communication-speak with which we began: obscuring simple messages with unnecessary jargon is merely an attempt to make mundane messages sound more important. In effect, such language use is merely a display, and it’s with the concept of ‘display’ that biology and genes come into play.

The classic example of biological display is, of course, the gorgeous tail of a peacock; something that, surprisingly, the father of evolutionary biology, Charles Darwin, found particularly unpleasant: “The sight of a feather in a peacock’s tail,” he noted shortly after the publication of the Origin of Species, “makes me sick!”

[Image: peacock eye feathers. Credit: Flickr/ozgurmulazimoglu]

Actually, Darwin’s reaction here was far from surprising; after all, the peacock’s flashy appendage seemed impossible to explain through his freshly formulated theory of natural selection. How on earth could such a cumbersome and impractical adornment, surely a handicap in evading predators, have survived natural selection’s remorseless ‘struggle for existence’?

Darwin’s ingenious solution to this dilemma was his theory of sexual selection (1871); briefly, that elaborate traits such as the peacock’s tail (or its incredible spider equivalent) could arise, even at the expense of individual survival, if the traits ultimately increased an individual’s reproductive success.

Put simply, the peacock’s tail is a reproductive organ (albeit, a particularly large and protruding one) that functions merely to attract females. And as peahens prefer to mate with males that possess the most impressive tails, these males end up with more offspring — offspring that, moreover, tend to be similarly well-endowed and similarly willing to advertise the fact. The antlers of a stag are another example of such (equally large and protruding) reproductive organs; although here, these are sexually selected weapons as much as ornaments, used to fight off other males or to protect the territory and resources needed to attract females. (In humans, women’s breasts and men’s V-shaped torsos are thought to be sexually selected traits, designed more to attract mates than to aid survival.)

Follow the latest news and policy debates on sustainable agriculture, biomedicine, and other ‘disruptive’ innovations. Subscribe to our newsletter.

A modern refinement to Darwin’s original theory is that sexually selected traits are also signals of ‘good genes’. (Though see Richard Prum’s recent Pulitzer-nominated The Evolution of Beauty for a critique of this idea.) In the case of peacocks, for example, a glorious tail indicates that its owner has the wherewithal not only to escape enemies, but also the genetic fitness to overcome the environmental stresses (such as disease, parasites or shortage of food) that might otherwise interfere with the tail’s development. If the ‘good genes’ theory is correct, then peahens select their preferred partners less on explicit aesthetic grounds and more on implicit genetic ones.

All well and good, but what has this to do with the egregious examples of language use mentioned above?

According to evolutionary psychologist Geoffrey Miller, many aspects of human language, including storytelling and singing, or wordplay and humor, are likely the result of sexual selection, with linguistic competence functioning as a signal of the cognitive abilities of the speaker. A large vocabulary, for example, indicates intelligence and education (plus the resources to pay for it), while quick-witted repartee similarly shows an active, fully functioning brain, along with an engaging, entertaining personality. And such cunning linguistics, in Miller’s view, is ultimately aimed at one thing only: reproductive success.

Indeed, in his seminal The Mating Mind (2000), Miller argues that the human brain itself is primarily a product of sexual rather than natural selection: “an entertainment system” designed principally to stimulate and attract other brains; in other words, the idea that our incredible cognitive abilities have evolved, “like the peacock’s tail, for courtship and mating”.

At first glance, Miller’s claim that our brains (and our intelligence) are largely geared towards sexual display runs contrary to a far more intuitive notion: that possessing such brains/intelligence provides a purely functional advantage in the struggle for life (building tools, planning hunting, outwitting rival humans, and the like).

Yet Miller’s argument makes sense of why the human brain, like the peacock’s overly elaborate tail, appears unnecessarily complex. After all, a brain capable of appreciating art or music or math (or even dad jokes) seems somewhat excessive for the Pleistocene environment in which it evolved. Why would naked, bipedal apes like us possess brains that (to steal from the title of a recent text on human evolution) can understand the universe?

Sexual selection also helps explain otherwise puzzling aspects of modern human social behaviour. In a later book, Miller amusingly illustrates this by asking: “Why would the world’s most intelligent primate buy a Hummer H1 Alpha sport utility vehicle”? Darwinian sexual selection provides a good answer, he argues: “Humans evolved in small social groups in which image and status were all-important, not only for survival but for attracting mates, impressing friends and rearing children.”

In our ancestral environment, possessing a bearskin or a stone tool (or, more especially, the ability to obtain the former or craft the latter) would have afforded such all-important status; in the modern consumer world, a Hummer (or an equally expensive equivalent) does much the same job.

But even today, it’s not just material possessions that enhance image. As already suggested, humor or storytelling or singing abilities are also attractive to others. (And, of course, such attraction can be for both material assets and individual abilities: for instance, the sexual allure of Rolling Stones crooner Mick Jagger, whose youngest child was born well after he had become a great-grandfather, is perhaps now due as much to his fame, fortune and followers as to his wonderful voice. And his rugged good looks, of course.)

And this is where we can return, at long last, to Denis Dutton. In The Art Instinct, Dutton suggests that artistic expression (much like humor or Hummer-buying) is a sexually selected human characteristic; in brief, that, in evolutionary terms, an artist’s explicit display of virtuosity is also an implicit signal of the quality of his or her genes. Given the ‘nature red in tooth and claw’ view of natural selection, something as inessential as art, a feature of all human societies, is difficult to explain; unless, of course, it enhanced our ancestors’ individual reproductive success.

[Image: Titian, Sacra Conversazione: The Madonna and Child with Saints Luke and Catherine of Alexandria]

Dutton’s is an intriguing idea of the origins of art (one that’s at odds with the cultural explanations of prevailing art theory). But it also provides a satisfying answer to an otherwise puzzling aspect of art appreciation: the strong negative reactions people have on discovering that a supposed Old Master, say, is actually a cunningly executed fake. If the original and the imitation are virtually indistinguishable, what real difference does it make?

According to Dutton’s Darwinian account, such fakery would make a great deal of difference if our focus is aimed not solely on aesthetic value but rather on the genetic worth of the artist. We cannot help but feel cheated by forgeries because, in our ancestral past, falling for a con artist rather than a real artist could have carried a reproductive cost (i.e., mating with an inferior partner).

The annoyance caused by corporate-speak, therefore, likely has the same origin: the underlying feeling that it, too, is false, an attempt to disguise something of little worth as something of value. Unfortunately, given that we have an evolved susceptibility for showy language (which, in the past, would have been a good signal of a good brain, and of the good genes that built it), simple plain English can sound, well, simply too plain. Whether we like it or not, we’re often impressed (or tricked) by others’ language use; think politicians, preachers and marketeers.

To call it linguistic peacocking makes instant sense: behavior aimed at making an impression, winning friends and influencing people. At a deeper level, hidden from our conscious minds, it’s motivated by the same impulses (and indeed the same hormones) that cause the real peacock to strut its stuff, with the ultimate aim (you guessed it) of simple reproductive success. And deeper still, of course, are the underlying genes that direct the behavior that’s given the world corporate-speak, country music and cosmetics, plus all the other wonders of human art and science.

Darwin’s theory of sexual selection explains so many otherwise baffling aspects of human behavior, including this overly elaborate essay on his ingenious idea.

Patrick Whittle has a PhD in philosophy and is a freelance writer with a particular interest in the social and political implications of modern biological science. Follow him on his website patrickmichaelwhittle.com or on X @WhittlePM

This article previously appeared on the GLP July 14, 2020. 

Anti-chemical activists up in arms as European Union leans toward reauthorizing use of weedkiller glyphosate. Here’s the science EU should consider

image

Does the controversial weedkiller Roundup, made by Bayer, whose active ingredient glyphosate is marketed in generic form by more than 30 companies, pose a cancer risk to humans?

Protestors in Europe, stirred by inflammatory campaigns launched by environmental activist groups, insist that it does.

Is their opposition to the herbicide grounded in science or ideology? There are some facts that all sides agree on: as the most used weedkiller in the world, glyphosate is ubiquitous. Gardeners and other applicators have been spraying the herbicide for decades, and the general population is exposed to it through micro-traces in our food. But does that exposure pose harm to human health or the environment?

image
Credit: University of Sydney

What do environmental groups maintain?

Prominent activist groups on the environmental left have coalesced around the belief that there is overwhelming proof that glyphosate is carcinogenic and a killer, posing dangers to humans and the environment. Here are a few recent statements by high-profile anti-glyphosate campaigners:

Natural Resources Defense Council:

Glyphosate has been linked with an increased risk of non-Hodgkin’s lymphoma, a cancer that afflicts parts of the immune system.

Greenpeace:

Monsanto, perhaps the world’s most reviled environmental villain, has finally been busted for selling its poisonous products around the world.

US Right to Know:

Out of 2,310 urine samples taken from Americans intended to be representative of the population, CDC found that 1,885 contained detectable levels of glyphosate.

Environmental Working Group:

Children in the US are regularly exposed to this cancer-causing weedkiller through the food they eat virtually every day.

image

Reading these proclamations without much background in science, it’s understandable that most people, and many journalists, believe that the scientific consensus confirms that glyphosate can cause cancer. The debate is particularly raucous in Europe, where the EU is considering reauthorizing glyphosate’s use for ten years, over the howls of environmental groups.

screenshot am
Environmental activist groups trying to leverage inconclusive urine studies

What does the research say? 

Yes, glyphosate is found in many people’s urine samples. Is that a reason for concern? In and of itself, no. 

Government researchers have found more than 3,000 chemical compounds regularly show up in urine tests in the US. The data are listed in the Urine Metabolome Database. Almost none of these identified chemicals in our urine are harmful.

“The fact that so many compounds seem to be unique to urine likely has to do with the fact that the kidneys do an extraordinary job of concentrating certain metabolites from the blood,” the researchers said. Glyphosate, like thousands of other metabolites, is safely and regularly excreted from the body.


But what about the four court cases in which plaintiffs won sizable victories based on claims that glyphosate caused their non-Hodgkin’s lymphomas? And Bayer (which acquired Monsanto in 2018) — the maker of the patented formulation of glyphosate known as Roundup — has already paid out an additional $10.9 billion to settle thousands of suits, with many more individual cases pending. Isn’t that proof enough?

No. Jury decisions do not follow scientific standards of proof. In civil litigation, jurors are told to weigh the evidence differently than would a scientist. Jurors don’t even have to decide “beyond a reasonable doubt” as they would in a criminal case. Instead, they are asked to reach a verdict based on the balance of probability. 

In the words of one claimant’s lawyer, Brent Wisner, managing partner of the Church of Scientology law firm Wisner Baum (formerly Baum Hedlund), “Did the exposure cause the plaintiff’s cancer? ‘I’m not sure, but I think so’” is enough to decide in favor of the claimant. Wisner Baum has teamed with presidential aspirant Robert F. Kennedy, Jr. in litigating a number of these cases.

image

But what about the definitive statements from advocacy groups claiming, as the Center for Food Safety (CFS) has written, that “science has prevailed, and today it is accepted that glyphosate causes cancer”?

The formal and proper scientific term to describe that claim is “rubbish”. Science indeed has prevailed; every major independent oversight and regulatory agency in the world, bar none, has concluded the very opposite of what CFS and other activist NGOs regularly claim: glyphosate does not cause cancer as used. 

As Health Canada concluded in two independent reviews of glyphosate:

image

But what about the International Agency for Research on Cancer, the only organization cited by environmental groups as a definitive source? In fact, IARC doesn’t even assess “cancer risks”; rather, it evaluates “cancer hazards”. What’s the difference? The risk of cancer depends upon exposure: “The dose makes the poison,” a pithy summary of the science credited to Paracelsus.

IARC has reviewed more than one thousand substances and found that all but one pose a cancer hazard. In its review of existing glyphosate studies, IARC (which does no original research) did not find sufficient evidence to link dietary exposure to cancer. As for the data on applicators, the studies IARC selected for review showed almost as much evidence that glyphosate prevented cancer as that it caused it.

image

IARC has developed a notorious reputation over the past decade. Its conclusions are often perplexing, in contradiction to those of global risk agencies, and at odds with mainstream science. This past summer, IARC issued a monograph claiming that the artificial sweetener aspartame was cancer-causing — a finding not supported by any risk agency in the world.

Among the chemicals or activities that IARC has determined pose more of a cancer hazard than glyphosate: getting a suntan; drinking wine or beer; eating salami; consuming Chinese-style salted fish; and taking oral contraceptives, used safely by tens of millions of women every year.

Stated simply, IARC’s controversial cancer warning about glyphosate means little when set against thousands of studies that conclude the weedkiller poses minimal health and environmental risk. Yet it’s the 2015 IARC hazard conclusion, and only that limited review, that activists cite.

The global science community has looked in aggregate at thousands of studies and unanimously concluded that glyphosate does not pose a cancer risk when used as directed. The infographic below (downloadable in pdf form) summarizes 24 studies, almost all released after IARC’s 2015 report. The most recent assessment was released this past summer. For the second time in eight years, the European Food Safety Authority concluded:

The assessment of the impact of glyphosate on the health of humans, animals and the environment did not identify critical areas of concern…. It is the most comprehensive and transparent assessment of a pesticide that the EFSA and the EU Member States have ever carried out.

[Infographic: glyphosate dangers. Credit: Genetic Literacy Project]

To accept the validity of ideology-tainted claims that glyphosate causes cancer, one would have to believe that two dozen independent agencies in the US, Canada, Europe, Asia, South America, Africa, Australia, and New Zealand, and three separate divisions of WHO, are coordinating in a scheme to suppress evidence of glyphosate’s cancer-causing dangers. 

No, there is not a coordinated worldwide conspiracy. It’s just straightforward science: glyphosate is safe as used and there is no serious scientific debate.

Jon Entine is the founding executive director of the Genetic Literacy Project, and winner of 19 major journalism awards. He has written extensively in the popular and academic press on agricultural and population genetics. You can follow him on X @JonEntine

Viewpoint: Vaccine-rejectionist Jessica Biel’s foray into selling children’s medicine reinforces why she should stick to acting

The use of the term “natural” for product promotion continues unabated. But perhaps it’s getting stale, because KinderFarms, Jessica Biel’s company, is selling products like Tylenol and Benadryl with the promise of avoiding “artificial petrochemicals,” ignoring the fact that these drugs are all made from just that. Nope, no kindness and no farms. Just another misleading ad campaign.

I just got the following email from KinderFarms, a “natural” kid’s medicine company that objects to all the “unnatural” things found in children’s OTC medications. Damn, they picked the wrong person to spam with this.

Hi Josh,

It’s hard to believe it’s already almost back to school season! Our client, KinderFarms, is more than prepared with back to school essentials that every family needs this school year! 

KinderFarms, co-founded by Jessica Biel, is committed to transforming the family health products industry by offering clean, organic options that eliminate anything artificial. Here are some great products KinderFarms has to offer for your kids return to school.

Biel isn’t the first to try fishy advertising to sell kids’ medicines. Back in 2021, I wrote about a seemingly disingenuous ad campaign by a new company named Genexa, whose motto is “the first clean medicine company,” whatever that means.

At the very least Genexa, by choosing this slogan, implied that the acetaminophen (aka Tylenol) in their products was “clean” while the other 250 billion pills that are produced annually by other companies must be “dirty” by comparison.

screenshot at am

From the company’s website:

We use the same active ingredients as the name brands on shelf today, which have gone through years of scientific testing to make sure they’re safe for you and the ones you love.

Let me pose a challenge. Locate one person in the solar system who “loves” acetaminophen, a largely ineffective and potentially deadly drug, which I reviewed in 2017 and 2023.

And then there’s this:

Our products are Certified Gluten-Free, Non-GMO Verified, and free of common allergens. It’s what people deserve.

It is indisputable that the most dangerous ingredient in this “clean” acetaminophen is acetaminophen itself, a known liver toxin, which sent more than 78,000 people in the US to emergency departments during the two years from January 2006 through December 2007. The term “If you polish a turd, it’s still a turd” comes to mind.

Update: At that time, I gave Genexa the benefit of the doubt. Was it possible that its founders were really concerned (although misguided) about “dirty Tylenol” and were making an honest effort to improve children’s medicine? That benefit of the doubt is now rescinded. Here’s why (Figure 1).

screenshot at pm
Figure 1. Genexa’s homeopathic Arnica “pain relief” product is a bunch of clean nothing. The company is selling sugar-coated witchcraft. I wrote an article in early 2023 explaining these mysterious terms like 6X and 30X. To varying degrees, these mean that there are essentially none of the ingredients (which are worthless in their own right) in the pill.
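The arithmetic behind those dilution labels makes the point concrete. As a rough illustration (assuming the standard homeopathic convention that each “X” denotes one tenfold dilution), even a full mole of starting material is diluted past the point where a single molecule is expected to remain:

```python
# Rough illustration of homeopathic "X" dilution labels.
# Assumption: each "X" step is one tenfold (1:10) dilution, the standard convention.

AVOGADRO = 6.022e23  # molecules in one mole of starting material


def molecules_remaining(x_steps: int, starting_moles: float = 1.0) -> float:
    """Expected number of molecules left after x_steps tenfold dilutions."""
    return starting_moles * AVOGADRO / (10 ** x_steps)


print(molecules_remaining(6))   # 6X: about 6.0e17 molecules remain
print(molecules_remaining(30))  # 30X: about 6.0e-7, i.e. effectively zero
```

At 6X, plenty of molecules survive (though the ingredient is worthless anyway); at 30X, the expected count is far below one molecule, which is why such a pill contains essentially none of its “active” ingredient.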

Arnica is a “homeopathic medicine”; it’s not a medicine at all. Rather, it’s a bottle of sweet nothings that cannot possibly be useful for anything except Genexa’s bottom line. Sorry guys, your credibility is shot. But take heart. Homeopathic “remedies,” whether water or sugar, could be quite clean…

…but not kind: Biel’s KinderFarms® gets in on the action

Apparently not content to be left out of the anthropomorphism of OTC drugs, in 2022, the actress Jessica Biel and her business partner Jeremy Adams began selling a line of “benevolent” children’s medication called KinderFarms®, as if the drugs had the capacity for kindness or were somehow grown on a farm. The company’s website tells its story. Here is part of it.

As they peeled back the labels of the common children’s over-the-counter medicines, they were surprised to find artificial ingredients, petrochemicals, and fillers they didn’t feel comfortable giving their children, especially when they were sick.

Which raises a few questions…

  1. Why would these “artificial petrochemicals” be given to children when they were healthy?
  2. Is “didn’t feel comfortable” a scientifically rigorous assessment of the risks and benefits of a drug, especially to Biel, who in 2019 lobbied with RFK Jr. on vaccine policy? (1)
  3. Is there really anything natural about these medicines?
screenshot at am
No unicorns, just a chemical named para-acetamidophenol, aka acetaminophen

No, not even close

The “farm” that the company’s name suggests doesn’t look much like a place where zucchinis grow everywhere, or an idyllic setting filled with cute little lambs.

screenshot at pm
Image: Wallpaperflare

No, it looks like Figure 2. This is how “kinder,” “natural,” “petrochemical-free” acetaminophen (Tylenol) is actually made.

screenshot at pm
Figure 2. (Left) Oil distillation towers are loaded with crude oil (from the ground) and gradually heated to separate its components into different fractions, which are defined by the boiling point at which they distill. Photo: Wallpaperflare. (Center) A depiction of a distillation tower shows the products of fractional distillation of crude oil and the temperature range at which they distill. Phenol distills between 170–230°C (similar to kerosene) and is collected, further purified, and transported to a chemical plant (right), where acetaminophen is manufactured by the synthetic scheme in Figure 3. Photos: Wikimedia Commons.
screenshot at pm
Figure 3. The industrial manufacturing of acetaminophen. (It is not kind.) Following fractional distillation (Figure 2), phenol (a carcinogen) is reacted with nitric acid (A) to give a mixture of ortho- and para-nitrophenol. The two are separated. The para-nitrophenol is reduced using hydrogen with a nickel catalyst (B), which converts it into para-aminophenol (another carcinogen). The final step is a simple acylation using acetic anhydride.

Is there anything “natural” about any of this? The answer is technically yes, provided that you acknowledge that petroleum-based chemicals themselves are natural since petroleum itself is natural. Which is not such a bad assumption, since crude oil is the product of plant and animal life sitting in the ground for millions of years. But I doubt this nuance is built into Genexa or KinderFarms ads. (2)

Is chemical kindness possible?

Chemicals themselves cannot be kind, unkind, or experience any emotion whatsoever. But kindness is possible among the workers at chemical plants. Perhaps this is the basis for the name of the company.

Two chemical plant workers, both named Joe, display exceptional kindness to each other. Photo credit: PEO ACWA on Flickr.

Notes: 

(1) RFK Jr.’s Instagram post about the collaboration: “Please say thank you to the courageous @jessicabiel for a busy and productive day at the California State House.” Uh-huh.

(2) KinderFarms also sells Benadryl. (Here is a YouTube video (for masochists only) showing its synthesis.) Although it is only 3:44 in length, it is unlikely you’ll make it through. Good luck understanding a single word of it. Other products the company sells are KinderMed™ Kids’ Cough & Congestion (guaifenesin and dextromethorphan, both synthetic), KinderMed™ Kids’ Nighttime Cold & Cough (Benadryl and phenylephrine, both synthetic). I could go on…

Dr. Josh Bloom is Executive Vice President of the American Council on Science and Health. He has published more than 60 op-eds in numerous periodicals, including The Wall Street Journal, Forbes, and New Scientist. Follow him on X @JoshBloomACSH

A version of this article was originally posted at the American Council on Science and Health and is reposted here with permission. The American Council on Science and Health can be found on X @ACSHorg

CRISPR-based mosquito suppression system could reduce child mortality and aid economic development in Africa

Malaria remains one of the world’s deadliest diseases. Each year malaria infections result in hundreds of thousands of deaths, with the majority of fatalities occurring in children under five. The Centers for Disease Control and Prevention recently announced that five cases of mosquito-borne malaria were detected in the United States, the first reported spread in the country in two decades.

Ifegenia technology study first author Andrea Smidler (left) and co-first author Reema Apte.

Fortunately, scientists are developing safe technologies to stop the transmission of malaria by genetically editing mosquitoes that spread the parasite that causes the disease. Researchers at the University of California San Diego led by Professor Omar Akbari’s laboratory have engineered a new way to genetically suppress populations of Anopheles gambiae, the mosquitoes that primarily spread malaria in Africa and contribute to economic poverty in affected regions. The new system targets and kills females of the A. gambiae population since they bite and spread the disease.

In findings published July 5 in the journal Science Advances, first author Andrea Smidler, a postdoctoral scholar in the UC San Diego School of Biological Sciences, along with former master’s students and co-first authors James Pai and Reema Apte, describes a system called Ifegenia, an acronym for “inherited female elimination by genetically encoded nucleases to interrupt alleles.” The technique leverages CRISPR technology to disrupt a gene known as femaleless (fle) that controls sexual development in A. gambiae mosquitoes.

Scientists at UC Berkeley and the California Institute of Technology contributed to the research effort.

An artist’s depiction of Ifegenia, a new technology developed at UC San Diego that uses CRISPR genetic editing to disrupt a gene that controls sexual development in the larva of African mosquitoes. Credit: Reema Apte

Ifegenia works by genetically encoding the two main elements of CRISPR within African mosquitoes: a Cas9 nuclease, the molecular “scissors” that make the cuts, and a guide RNA that directs the system to its target, using a technique developed in these mosquitoes in Akbari’s lab. The researchers genetically modified two mosquito families to separately express Cas9 and the fle-targeting guide RNA.

“We crossed them together, and in the offspring it killed all the female mosquitoes,” said Smidler. “It was extraordinary.” Meanwhile, A. gambiae male mosquitoes inherit Ifegenia, but the genetic edit doesn’t impact their reproduction. They remain reproductively fit to mate and spread Ifegenia. Parasite spread is eventually halted since females are removed and the population reaches a reproductive dead end. The new system, the authors note, circumvents certain genetic resistance roadblocks and control issues faced by other systems, such as gene drives, since the Cas9 and guide RNA components are kept separate until the population is ready to be suppressed.

“We show that Ifegenia males remain reproductively viable, and can load both fle mutations and CRISPR machinery to induce fle mutations in subsequent generations, resulting in sustained population suppression,” the authors note in the paper. “Through modeling, we demonstrate that iterative releases of non-biting Ifegenia males can act as an effective, confinable, controllable and safe population suppression and elimination system.”
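The suppression dynamics described above can be illustrated with a toy discrete-generation model. To be clear, this is a hypothetical sketch, not the authors’ modeling, and every parameter value is invented: females mated to Ifegenia-carrying males produce no surviving daughters, while their sons inherit the system and remain fertile.

```python
# Toy model of Ifegenia-style population suppression (illustrative only;
# not the model from the paper, and all parameters are hypothetical).

def simulate(females=1000.0, wild_males=1000.0, carrier_males=0.0,
             release=2000.0, brood=4, generations=12):
    """Return the number of adult females in each generation."""
    history = [females]
    for _ in range(generations):
        carriers = carrier_males + release      # inherited + newly released
        males = wild_males + carriers
        if males == 0 or females == 0:
            females = 0.0
            history.append(females)
            continue
        p_carrier = carriers / males            # chance of mating a carrier
        # Daughters sired by carriers die (fle disruption is female-lethal);
        # sons of carriers inherit Ifegenia and remain reproductively fit.
        daughters = females * (1 - p_carrier) * brood / 2
        wild_sons = females * (1 - p_carrier) * brood / 2
        carrier_sons = females * p_carrier * brood / 2
        females, wild_males, carrier_males = daughters, wild_sons, carrier_sons
        history.append(females)
    return history

print([round(f) for f in simulate()])
```

With these made-up numbers, the female count collapses toward zero within a handful of generations once carrier males dominate the mating pool, while setting `release=0` lets the population grow unchecked, which is the qualitative behavior the iterative-release strategy relies on.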

Larvae of Anopheles gambiae mosquitoes were injected with CRISPR-based genetic editing tools in a new population suppression system.

Traditional methods of combating malaria, such as bed nets and insecticides, have increasingly proven ineffective at stopping the disease’s spread. Insecticides are still heavily used across the globe, primarily against malaria, adding health and ecological risks in parts of Africa and Asia.

Smidler, who earned a PhD in the biological sciences of public health from Harvard University before joining UC San Diego in 2019, is applying her expertise in genetic technology development to address the spread of the disease and the economic harm that comes with it. Once she and her colleagues developed Ifegenia, she was surprised by how effectively it worked as a suppression system.

“This technology has the potential to be the safe, controllable and scalable solution the world urgently needs to eliminate malaria once and for all,” said Akbari, a professor in the Department of Cell and Developmental Biology. “Now we need to transition our efforts to seek social acceptance, regulatory use authorizations and funding opportunities to put this system to its ultimate test of suppressing wild malaria-transmitting mosquito populations. We are on the cusp of making a major impact in the world and won’t stop until that’s achieved.”

The researchers note that the technology behind Ifegenia could be adapted to other species that spread deadly diseases, such as mosquitoes known to transmit dengue (break-bone fever), chikungunya and yellow fever viruses.

Mario Aguilera is the Director of Communications for Biological Sciences at UC San Diego. Find Mario on X @mario_maguilera

A version of this article was originally posted at the University of California San Diego and has been reposted here with permission. Any reposting should credit the original author and provide links to both the GLP and the original article.

Serotonin-boosting foods and fatty acids that can lift your mood


Why do we care about serotonin?

One in four Americans currently suffers from anxiety or depression, conditions that have been linked to low serotonin levels. Normal serotonin levels support your emotional state as well as digestion, sleep, wound healing, sexual desire, and bone density. However, the most common issues associated with low serotonin levels involve mental health.

Serotonin is a neurotransmitter known as the “happy hormone.” It is vital in managing stress, supporting mental well-being, enhancing social interactions, promoting better sleep, and improving cognitive function and emotional resilience.

And its benefits don’t stop there. In bones, serotonin regulates bone density and remodeling, with high levels linked to increased bone density and a reduction in potential risk of osteoporosis, while promoting bone formation. Serotonin also plays a role in wound healing by aiding in blood clotting through platelet release and influencing immune response and tissue repair processes.

How does it actually do all of that? It plays a crucial role in the central nervous system as it acts as a neurotransmitter. It carries messages between the nerve cells in the brain and throughout the body.

Follow the latest news and policy debates on sustainable agriculture, biomedicine, and other ‘disruptive’ innovations. Subscribe to our newsletter.

Gut-brain axis support network

The gut-brain axis refers to the bidirectional communication between the gut (gastrointestinal tract) and the brain. It involves complex interactions between the central nervous system (CNS) and the enteric nervous system (ENS), which is often referred to as the “second brain” of the body due to its extensive network of neurons in the gut.

Serotonin plays a critical role in this communication system, serving as a messenger molecule that helps regulate various physiological processes and behaviors. The majority of serotonin in the body is found in the gut, serving multiple functions:

Changes in gut serotonin levels can have major impacts on many bodily functions. Having balanced serotonin levels in the gut helps normalize various gastrointestinal functions, including bowel movements and intestinal motility. Imbalances in gut serotonin levels have been linked to conditions like irritable bowel syndrome (IBS).

It can also affect feelings of satiety and control eating behavior, while also playing a role in the gut’s immune response, helping regulate inflammation and immune cell activity.

The gut-brain axis is a fascinating area of research that highlights the intricate connections between various bodily systems. Serotonin’s influence on the gut and brain underscores its role as a key mediator in the body’s communication network.

Serotonin-boosting foods

Okay, so now I know it can boost not only my mood, but fortify my immune system, help me regulate my hunger, positively impact my digestion and decrease inflammation, but should I take a pill? Is there a pill?

Here at Dirt to Dinner, after much research, we have concluded that it is always better to seek nutrients through whole foods. Not only is the supplement industry unregulated, which makes it hard to know what you are taking, but most of the time nutrients are more bioavailable to the body in their whole-food form.

Incorporating serotonin-boosting foods into your diet is a natural and accessible way to promote emotional and physical health and the many other benefits of serotonin.

Nutrients in foods such as complex carbohydrates, vitamin B6, omega 3s, and tryptophan all work together to do just that! For instance, a meal of salmon, quinoa, and spinach with sliced bananas for dessert will work well together to produce the serotonin you need!

Tryptophan-rich foods

Tryptophan is an essential amino acid that our bodies can’t produce on their own. Consuming foods high in tryptophan can increase serotonin levels in our gut and brain, as the amino acid is converted into serotonin in the body.

Good news: most people already consume more than double the recommended amount, typically 900-1,000 milligrams daily, as part of their regular diets. Some tryptophan-dense foods are cod, spirulina, nuts and seeds, and legumes.

Here’s a fun fact to share…

Most people think turkey has the most tryptophan, but take a look at the chart on the left! 

Complex carbohydrates

Consuming complex carbohydrates can also boost serotonin production. These carbohydrates increase insulin levels, which aids in the absorption of amino acids, including tryptophan, into the brain. Some excellent sources of complex carbohydrates include whole grains (like oats, quinoa, farro, and brown rice), sweet potatoes, and legumes (including beans, lentils, and peas).

Not sure how to tell the difference between a complex carb and a simple carb? Here’s a good trick: most whole, unprocessed foods contain complex carbs. Avoid processed foods and “white” foods, which are mostly composed of simple carbs.

When you eat a meal rich in carbohydrates from whole grains, insulin stimulates the uptake of other amino acids into cells, leaving tryptophan with relatively fewer competitors. As a result, more tryptophan can be converted into serotonin, contributing to a more balanced and positive mood.

Complex carbohydrates provide a slow and steady release of energy compared to simple carbohydrates. This sustained energy release helps stabilize blood sugar levels, preventing rapid spikes and crashes. Fluctuations in blood sugar levels can affect mood and energy levels, and stable blood sugar can reduce emotional ups and downs.

Vitamin B6 & serotonin conversion

Vitamin B6 helps the body convert tryptophan into serotonin. Including foods high in vitamin B6 can enhance this serotonin synthesis.

Some notable sources of vitamin B6 are fish (like tuna, salmon, and trout), poultry, and bananas. B6 is critical in allowing the body to utilize serotonin to assist with our cognitive and emotional functioning.

Curious about other B6-rich foods? Print out this handy chart and stick it on your fridge!

Omega-3 fatty acids

The relationship between omega-3 fatty acids and serotonin involves multiple interconnected mechanisms that can impact mood and emotional well-being. Omega-3 fatty acids are essential for brain health and function, particularly eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA).

These fatty acids are incorporated into cell membranes, influencing membrane fluidity and receptor activity. By regulating the cell membrane, omega 3s can enhance the function of serotonin receptors, making them more responsive to serotonin.

Studies suggest that omega-3 fatty acids, when consumed in sufficient amounts (at least 200 mg a day), may contribute to maintaining healthy serotonin levels.

Which foods are excellent sources of omega 3s? At the top of the list are fatty fish (tuna, salmon, trout, herring, anchovies), chia seeds, and flaxseeds.

What else can we do?

Want to boost the effects of these foods? Get good sleep. Serotonin is the precursor of melatonin, a hormone we produce that regulates sleep-wake cycles. Making enough serotonin can support healthy sleep patterns and improve sleep quality, leading to better overall health and productivity.

The bottom line

Incorporating foods that boost serotonin production in your diet can be a natural and effective way to enhance mood and overall well-being. Tryptophan-rich foods, complex carbohydrates, vitamin B6 sources, and foods rich in omega-3 fatty acids can all contribute to increased serotonin levels in the brain and an overall healthier you!

Hayley N. Phillip is a graduate of the University of California Santa Barbara with degrees in Sociology and Marketing.  Hayley leads the Dirt to Dinner team in debunking popular fad diets, fast-nutrition, and myths about ‘quick’ dietary fixes. Hayley also researches and writes about the intersectionality of regeneration and sustainable growing methods that will safely produce enough food for future generations. 

A version of this article was originally posted at Dirt To Dinner and has been reposted here with permission. Any reposting should credit the original author and provide links to both the GLP and the original article. Find Dirt To Dinner on Twitter @Dirt_To_Dinner

Biotechnology timeline: Humans have manipulated genes since the ‘dawn of civilization’

Historically, biotech has been primarily associated with food, addressing such issues as malnutrition and famine.

Today, biotechnology is most often associated with the development of drugs. But drugs are hardly the future of biotech. We’ve entered the Fourth Industrial Revolution, and genetics are on a new level. Biotech is paving a way for a future open to imagination, and that’s kind of scary.

The next ten years will surely prove exciting as artificial intelligence and biotechnology merge man and machine…

The history of biotechnology can be divided into three distinct phases:

  1. Ancient Biotechnology
  2. Classical Biotechnology
  3. Modern Biotechnology

1. Ancient Biotechnology (Pre-1800)

Most of the biotech developments before the year 1800 can be termed ‘discoveries’ or ‘developments.’ Nearly all of them were based on common observations about nature.

Humans have used biotechnology since the dawn of civilization.

After domesticating food crops (corn, wheat) and wild animals, humans moved on to new discoveries such as cheese and curd. Cheese can be considered one of the first direct products (or by-products) of biotechnology, because it was prepared by adding rennet (an enzyme found in the stomach of calves) to sour milk.

Yeast is one of the oldest microbes that humans have exploited for their benefit. The oldest fermentation was used to make beer in Sumeria and Babylonia as early as 7,000 BCE.

By 4,000 BCE, Egyptians used yeasts to bake leavened bread.

Credit: Topsimages

Another ancient product of fermentation was wine, made in Assyria as early as 3,500 BCE.

The Chinese developed fermentation techniques for brewing and cheese making.

500 BCE: In China, the first antibiotic, moldy soybean curds, is put to use to treat boils.

Hippocrates treated patients with vinegar in 400 BCE.

In 100 BCE, Rome had over 250 bakeries making leavened bread.

A.D. 100: The first insecticide is produced in China from powdered chrysanthemums.

The use of molds to saccharify rice in the koji process dates back to at least A.D. 700.

13th century: The Aztecs used Spirulina algae to make cakes.

One of the oldest examples of crossbreeding for human benefit is the mule, the offspring of a male donkey and a female horse. People used mules for transportation, carrying loads, and farming before there were tractors or trucks.

Credit: Berkshire Archaeological Society

By the 14th century AD, the distillation of alcoholic spirits was common in many parts of the world.

Vinegar manufacture began in France at the end of the 14th century.

1663: Cells are first described by Hooke.

1673-1723: In the seventeenth century, Antonie van Leeuwenhoek discovered microorganisms by examining scrapings from his teeth under a microscope.

1675: Leeuwenhoek discovers protozoa and bacteria.

1796: English surgeon Edward Jenner pioneers vaccination, inoculating a child with a viral smallpox vaccine.

2. Classical Biotechnology (1800-1945)

The Hungarian engineer Károly Ereky coined the word “biotechnology” in 1919 to describe a technology based on converting raw materials into more useful products. In a book entitled Biotechnologie, Ereky further developed a theme that would be reiterated through the 20th century: biotechnology could provide solutions to societal crises, such as food and energy shortages.

1831: Robert Brown (1773-1858) discovers the nucleus in cells.

1802: The word “biology” first appears.


1885: Louis Pasteur (1822-1895) develops a rabies vaccine, extending the principle of vaccination that Edward Jenner pioneered against smallpox.

In 1850, Casimir Davaine detected rod-shaped objects in the blood of anthrax-infected sheep and was able to produce the disease in healthy sheep by inoculation of such blood.

1885: The Escherichia coli bacterium is discovered by Theodor Escherich. It later becomes a major research, development, and production tool for biotechnology.

In 1868, Friedrich Miescher reported nuclein, a compound consisting of nucleic acid that he extracted from white blood cells.

1870: Breeders crossbreed cotton, developing hundreds of varieties with superior qualities.

Credit: WEF

1870: The first experimental corn hybrid is produced in a laboratory.

By 1875, Pasteur of France and John Tyndall of Britain finally demolished the concept of spontaneous generation and proved that existing microbial life came from preexisting life.

1876: Koch’s work led to the acceptance of the idea that specific diseases were caused by specific organisms, each of which had a specific form and function.

In 1881, Robert Koch, a German physician, described bacterial colonies growing on potato slices, the first solid culture medium.

In 1888, Heinrich Wilhelm Gottfried von Waldeyer-Hartz, a German scientist, coined the term ‘chromosome.’

In 1909, Wilhelm Johannsen (1857-1927) coined the term ‘gene,’ describing it as the carrier of heredity. Johannsen also coined the terms ‘genotype’ and ‘phenotype.’

1909: Genes are linked with hereditary disorders.

1911: American pathologist Peyton Rous discovers the first cancer-causing virus.

1915: Phages, or bacterial viruses, are discovered.

Credit: SciTechDaily

1919: The word “biotechnology” is first used by a Hungarian agricultural engineer.

Pfizer, which had made a fortune using fermentation processes to produce citric acid in the 1920s, turned its attention to penicillin. The mass production of penicillin was a major factor in the Allied victory in WWII.

1924: Start of the eugenics movement in the US.

The principles of genetic inheritance were refined by T.H. Morgan, who demonstrated the role of chromosomes in heredity using fruit flies. This landmark work, The Theory of the Gene, was published in 1926.

Alexander Fleming discovered penicillin, an antibacterial compound from the mold Penicillium notatum that could be used against many infectious diseases. Fleming wrote, “When I woke up just after dawn on September 28, 1928, I certainly didn’t plan to revolutionize all medicine by discovering the world’s first antibiotic, or bacteria killer.”

Sir Alexander Fleming, (6 August 1881 – 11 March 1955) was a Scottish biologist, pharmacologist and botanist who discovered Penicillin. Credit: Universal History Archive/UIG

1933: Hybrid corn is commercialized.

In 1940, a team of researchers at Oxford University found a way to purify penicillin and keep it stable.

1941: The term “genetic engineering” is first used by a Danish microbiologist.

1942: The electron microscope is used to identify and characterize a bacteriophage, a virus that infects bacteria.

1942: Penicillin is mass-produced in microbes for the first time.

3. Modern Biotechnology (1945-present)

The Second World War was a major impediment to scientific discovery. After the war ended, some very crucial discoveries were reported, paving the path for modern biotechnology.

The origins of biotechnology culminate in the birth of genetic engineering. Two key events have come to be seen as the scientific breakthroughs beginning the era that would unite genetics with biotechnology: one was the 1953 discovery of the structure of DNA by Watson and Crick, and the other was the 1973 demonstration by Cohen and Boyer of a recombinant DNA technique by which a section of DNA was cut from the plasmid of an E. coli bacterium and transferred into the DNA of another. Popularly referred to as “genetic engineering,” it came to be defined as the basis of the new biotechnology.

In Britain, Chaim Weizmann (1874-1952) developed bacterial fermentation processes for producing organic chemicals such as acetone, used in cordite propellants. During WWII, he worked on synthetic rubber and high-octane fuel.

1950s: The first synthetic antibiotic is created.

1951: Artificial insemination of livestock is accomplished using frozen semen.

In 1953, James Watson and Francis Crick cleared up the mystery of DNA as the genetic material by proposing their structural model, popularly known as the ‘Double Helix Model of DNA.’

1954: Dr. Joseph Murray performs the first kidney transplant between identical twins.

1955: An enzyme, DNA polymerase, involved in the synthesis of a nucleic acid, is isolated for the first time.

1955: Dr. Jonas Salk develops the first polio vaccine. The development marks the first use of mammalian cells (monkey kidney cells) and the first application of cell culture technology to generate a vaccine.

Jonas Salk administers polio vaccine.

1957: Scientists prove that sickle-cell anemia is caused by a change in a single amino acid in hemoglobin.

1958: Dr. Arthur Kornberg of Washington University in St. Louis makes DNA in a test tube for the first time.

Edward Tatum (1909–1975) and Joshua Lederberg (1925–2008) shared the 1958 Nobel Prize for showing that genes regulate the metabolism by producing specific enzymes.

1960: French scientists discover messenger RNA (mRNA).

1961: Scientists understand genetic code for the first time.

1962: Dr. Osamu Shimomura discovers the green fluorescent protein in the jellyfish Aequorea victoria. He later develops it into a tool for observing previously invisible cellular processes.

1963: Dr. Samuel Katz and Dr. John F. Enders develop the first vaccine for measles.

1964: The existence of reverse transcriptase is predicted.

At a conference in 1964, Tatum laid out his vision of “new” biotechnology: “Biological engineering seems to fall naturally into three primary categories of means to modify organisms. These are:

  1. The recombination of existing genes, or eugenics.
  2. The production of new genes by a process of directed mutation, or genetic engineering.
  3. Modification or control of gene expression, or to adopt Lederberg’s suggested terminology, euphenic engineering.”

1967: The first automatic protein sequencer is perfected.

1967: Dr. Maurice Hilleman develops the first American vaccine for mumps.

Maurice Hilleman, the scientist credited with saving millions of lives through the development of vaccines. Credit: Montana State University

1969: An enzyme is synthesized in vitro for the first time.

1969: The first vaccine for rubella is developed.

1970: Restriction enzymes are discovered.

1971: The combined measles/mumps/rubella (MMR) vaccine is introduced.

1972: DNA ligase, which links DNA fragments together, is used for the first time.

1973: Cohen and Boyer perform the first successful recombinant DNA experiment, using bacterial genes.

In 1974, Stanley Cohen and Herbert Boyer developed a technique for splicing together strands of DNA from more than one organism. The product of this transformation is called recombinant DNA (rDNA).

Köhler and Milstein in 1975 developed the hybridoma technique and produced the first-ever monoclonal antibodies, which revolutionized diagnostics.


1975: Colony hybridization and Southern blotting are developed for detecting specific DNA sequences.

1976: Molecular hybridization is used for the prenatal diagnosis of alpha thalassemia.

1978: Recombinant human insulin is produced for the first time.

Credit: National Museum of American History

1978: With the development of synthetic human insulin, the biotechnology industry grows rapidly.

1979: Human growth hormone is synthesized for the first time.

In the 1970s-80s, the path of biotechnology became intertwined with that of genetics.

By the 1980s, biotechnology grew into a promising real industry.

1980: Smallpox is globally eradicated following a 20-year mass vaccination effort.

In 1980, The U.S. Supreme Court (SCOTUS), in Diamond v. Chakrabarty, approved the principle of patenting genetically engineered life forms.

Ananda Chakrabarty, biotech pioneer.

1981: Scientists at Ohio University produce the first transgenic animals by transferring genes from other animals into mice.

1981: The first gene-synthesizing machines are developed.

1981: The first genetically engineered plant is reported.

1982: The first recombinant DNA vaccine for livestock is developed.

1982: The first biotech drug, human insulin produced in genetically modified bacteria, is approved by FDA. Genentech and Eli Lilly developed the product. This is followed by many new drugs based on biotechnologies.

1983: HIV, the virus that causes AIDS, is identified; the ensuing research effort drives improvements in many tools used by life scientists.

In 1983, Kary Mullis developed polymerase chain reaction (PCR), which allows a piece of DNA to be replicated over and over again. PCR, which uses heat and enzymes to make unlimited copies of genes and gene fragments, later becomes a major tool in biotech research and product development worldwide.

Credit: Wladimir_Bulgar

1983: The first artificial chromosome is synthesized.

In 1983, the first genetic markers for specific inherited diseases were found.

1983: The first genetic transformation of plant cells by TI plasmids is performed.

In 1984, the DNA fingerprinting technique was developed.

1985: Genetic markers are found for kidney disease and cystic fibrosis.

1986: The first recombinant vaccine for humans, a vaccine for hepatitis B, is approved.

1986: Interferon becomes the first anticancer drug produced through biotech.

1986: University of California, Berkeley, chemist Dr. Peter Schultz describes how to combine antibodies and enzymes (abzymes) to create therapeutics.

1988: The first pest-resistant corn, Bt corn, is produced.

1988: Congress funds the Human Genome Project, a massive effort to map and sequence the human genetic code as well as the genomes of other species.

In 1988, chymosin (also known as rennin) became the first enzyme produced from a genetically modified source (yeast) to be approved for use in food.

In 1988, only five proteins from genetically engineered cells had been approved as drugs by the United States Food and Drug Administration (FDA), but this number would skyrocket to over 125 by the end of the 1990s.

In 1989, microorganisms were used to clean up the Exxon Valdez oil spill.

1990: The first successful gene therapy is performed on a 4-year-old girl suffering from an immune disorder.

In 1993, The U.S. Food and Drug Administration (FDA) declared that genetically modified (GM) foods are “not inherently dangerous” and do not require special regulation.

1993: Chiron’s Betaseron is approved as the first treatment for multiple sclerosis in 20 years.

1994: The first breast cancer gene is discovered.

1995: Gene therapy, immune-system modulation and recombinantly produced antibodies enter the clinic in the war against cancer.

1995: The first baboon-to-human bone marrow transplant is performed on an AIDS patient.

1995: The first vaccine for Hepatitis A is developed.

1996: A gene associated with Parkinson’s disease is discovered.

1996: The first genetically engineered crop is commercialized.

1997: Ian Wilmut, an English embryologist working in Scotland, succeeds in cloning an adult animal, using a sheep as the model and naming the clone ‘Dolly.’

Professor Ian Wilmut stands beside Dolly the Sheep, who he helped clone. Credit: The Roslin Institute

1997: The first human artificial chromosome is created.

1998: A rough draft of the human genome map is produced, showing the locations of more than 30,000 genes.

1998: Human skin is produced for the first time in the lab.

1999: A diagnostic test allows quick identification of Bovine Spongiform Encephalopathy (BSE, also known as “mad cow” disease) and Creutzfeldt-Jakob Disease (CJD).

1999: The complete genetic code of a human chromosome (chromosome 22) is deciphered for the first time.

2000: Kenya field-tests its first biotech crop, virus-resistant sweet potato.

2000: Craig Venter’s Celera Genomics, in parallel with the public Human Genome Project, completes a draft sequence of the human genome.

2001: The sequence of the human genome is published in Science and Nature, making it possible for researchers all over the world to begin developing treatments.

2001: FDA approves Gleevec® (imatinib) for patients with chronic myeloid leukemia – the first gene-targeted drug to receive FDA approval.

2002: EPA approves the first transgenic rootworm-resistant corn.

2002: The banteng, an endangered species, is cloned for the first time.

Credit: World Association of Zoos and Aquariums

2003: China grants the world’s first regulatory approval of a gene therapy product, Gendicine (Shenzhen SiBiono GenTech), which delivers the p53 gene as a therapy for squamous cell head and neck cancer.

In 2003, TK-1 (GloFish) went on sale in Taiwan as the first genetically modified pet.

2003: The Human Genome Project completes sequencing of the human genome.

2004: UN Food and Agriculture Organization endorses biotech crops, stating biotechnology is a complementary tool to traditional farming methods that can help poor farmers and consumers in developing nations.

2004: FDA approves the first antiangiogenic drug for cancer, Avastin®.

2005: The Energy Policy Act is passed and signed into law, authorizing numerous incentives for bioethanol development.

2006: FDA approves the recombinant vaccine Gardasil®, the first vaccine developed against human papillomavirus (HPV), an infection implicated in cervical and throat cancers, and the first preventative cancer vaccine.

Credit: Merck

2006: USDA grants Dow AgroSciences the first regulatory approval for a plant-made vaccine.

2006: The National Institutes of Health begins a 10-year, 10,000-patient study using a genetic test that predicts breast-cancer recurrence and guides treatment.

In 2006, the artist Stelarc had an ear grown in a vat and grafted onto his arm.

2007: FDA approves the H5N1 vaccine, the first vaccine approved for avian flu.

2007: Scientists discover how to reprogram human skin cells into embryonic-like stem cells (induced pluripotent stem cells).

2008: Chemists in Japan create the first DNA molecule made almost entirely of artificial parts.

2009: Global biotech crop acreage reaches 330 million acres.

In 2009, Sasaki and Okano produced transgenic marmosets that glow green in ultraviolet light (and pass the trait to their offspring).

2009: FDA approves the first genetically engineered animal for production of a recombinant form of human antithrombin.

2010: Dr. J. Craig Venter announces the completion of “synthetic life,” transplanting a synthetic genome capable of autonomous self-replication into a recipient bacterial cell.

2010: Harvard researchers report building “lung-on-a-chip” technology.

In 2010, scientists created malaria-resistant mosquitoes.

2011: A trachea derived from stem cells is transplanted into a human recipient.

Credit: ABC

2011: Advances in 3-D printing technology lead to “skin-printing.”

2012: For the last three billion years, life on Earth has relied on two information-storing molecules, DNA and RNA. Now there’s a third: XNA, a polymer synthesized by molecular biologists Vitor Pinheiro and Philipp Holliger of the Medical Research Council in the United Kingdom. Just like DNA, XNA is capable of storing genetic information and then evolving through natural selection. Unlike DNA, it can be carefully manipulated.

2012: Researchers at the University of Washington in Seattle announced the successful sequencing of a complete fetal genome using nothing more than snippets of DNA floating in its mother’s blood.

2013: Two research teams announced a fast and precise new method for editing snippets of the genetic code. The so-called CRISPR system takes advantage of a defense strategy used by bacteria.

Jennifer Doudna (top left) at the University of California, Berkeley, USA, and Feng Zhang (top right) at the Broad Institute of the Massachusetts Institute of Technology (MIT) and Harvard University, have each undertaken pioneering work in relation to CRISPR-Cas9. They and others are currently embroiled in a legal firestorm over who owns commercial or IP rights in the technology. Credit: Keegan Houser/UC Berkeley/Justin Knight Photography

2013: Researchers in Japan developed functional human liver tissue from reprogrammed skin cells.

2013: Researchers published the results of the first successful human-to-human brain interface.

2013: Doctors announced that a baby born with HIV had been cured of the disease.

2014: Researchers showed that blood from a young mouse can rejuvenate an old mouse’s muscles and brain.

2014: Researchers figured out how to turn human stem cells into functional pancreatic β cells—the same cells that are destroyed by the body’s own immune system in type 1 diabetes patients.

2014: All life on Earth as we know it encodes genetic information using four DNA letters: A, T, G, and C. Not anymore! In 2014, researchers created new DNA bases in the lab, expanding life’s genetic code and opening the door to creating new kinds of microbes.

2014: For the first time ever, a woman gave birth to a baby after receiving a womb transplant.

In 2014, a team of scientists reconstructed a synthetic and fully functional yeast chromosome. A breakthrough seven years in the making, the remarkable advance could eventually lead to custom-built organisms (human organisms included).

2014 & Ebola: Until this year, Ebola was merely an interesting footnote for anyone studying tropical diseases. Now it’s a global health disaster. But the epidemic started at a single point with one human-animal interaction — an interaction which has now been pinpointed using genetic research. A total of 50 authors contributed to the paper announcing the discovery, including five who died of the disease before it could be published.

Ebola virus. Credit: CDC

2014: Doctors discovered a vaccine that completely blocks infection in the monkey equivalent of the disease, a breakthrough now being studied to see if it works in humans.

2015: Scientists from Singapore’s Institute of Bioengineering and Nanotechnology designed short strings of peptides that self-assemble into a fibrous gel when water is added for use as a healing nanogel.

2015 & CRISPR: scientists hit a number of breakthroughs using the gene-editing technology CRISPR. Researchers in China reported modifying the DNA of a nonviable human embryo, a controversial move. Researchers at Harvard University inserted genes from a long-extinct woolly mammoth into the living cells — in a petri dish — of a modern elephant. Elsewhere, scientists reported using CRISPR to potentially modify pig organs for human transplant and modify mosquitoes to eradicate malaria.

2015: Researchers in Sweden developed a blood test that can detect cancer at an early stage from a single drop of blood.

2015: Scientists discovered a new antibiotic, the first in nearly 30 years, that may pave the way for a new generation of antibiotics and help fight growing drug resistance. The antibiotic, teixobactin, showed promise against many common bacterial infections, such as tuberculosis, septicaemia, and C. diff.

2015: A team of geneticists finished building the most comprehensive map of the human epigenome, a culmination of almost a decade of research. The team was able to map more than 100 types of human cells, which will help researchers better understand the complex links between DNA and diseases.

2015: Stanford University scientists revealed a method that may be able to force malicious leukemia cells to change into harmless immune cells, called macrophages.

2015: Using cells from human donors, doctors, for the first time, built a set of vocal cords from scratch. The cells were urged to form a tissue that mimics vocal fold mucosa – vibrating flaps in the larynx that create the sounds of the human voice.

2016: A little-known virus first identified in Uganda in 1947—Zika—exploded onto the international stage when the mosquito-borne illness began spreading rapidly throughout Latin America. Researchers successfully isolated a human antibody that “markedly reduces” infection from the Zika virus.

Geovane Silva holds his son Gustavo Henrique, who has microcephaly, at the Oswaldo Cruz Hospital in Recife, Brazil. Credit: Ueslei Marcelino / Reuters

2016: CRISPR, the revolutionary gene-editing tool that promises to cure illnesses and solve environmental calamities, took a major step forward this year when a team of Chinese scientists used it to treat a human patient for the very first time.

2016: Researchers found that an ancient molecule, GK-PID, is the reason single-celled organisms started to evolve into multicellular organisms approximately 800 million years ago.

2016: Stem cells injected into stroke patients re-enable patients to walk.

2016: Cloning does not cause long-term health issues, study finds.

2016: For the first time, bioengineers created a completely 3D-printed ‘heart on a chip.’

Credit: Elveflow

2017: Researchers at the National Institutes of Health discovered a new molecular mechanism that might be the cause of premenstrual dysphoric disorder (PMDD), a severe form of premenstrual syndrome.

2017: Scientists at the Salk Institute in La Jolla, CA, said they’re one step closer to being able to grow human organs inside pigs. In their latest research they were able to grow human cells inside pig embryos, a small but promising step toward organ growth.

2017: First step taken toward epigenetically modified cotton.

2017: Research reveals different aspects of DNA demethylation involved in tomato ripening process.

2017: Sequencing of green alga genome provides blueprint to advance clean energy, bioproducts.

2017: Fine-tuning ‘dosage’ of mutant genes unleashes long-trapped yield potential in tomato plants.

2017: Scientists engineer disease-resistant rice without sacrificing yield.

2017: Blood stem cells grown in lab for the first time.

2017: Researchers at Sahlgrenska Academy – part of the University of Gothenburg, Sweden – generated cartilage tissue by printing stem cells using a 3D-bioprinter.

2017: Two-way communication in brain-machine interface achieved for the first time.

Today, biotechnology is being used in countless areas including agriculture, bioremediation and forensics, where DNA fingerprinting is a common practice. Industry and medicine alike use the techniques of PCR, immunoassays and recombinant DNA.

Genetic manipulation has been the primary reason that biology is now seen as the science of the future and biotechnology as one of the leading industries.

A version of this article was originally published on Brian Colwell’s website as “A Giant-Sized History of Biotechnology” and has been republished here with permission from the author.

Brian Colwell is a technology futurist with an investment thesis focused on disruptions in the next Industrial Revolution. His research areas include agriculture, biotechnology and artificial intelligence. Follow @BrianDColwell on Twitter and at his website.

This article first appeared on the GLP on September 8, 2020.

Are most GMO safety studies funded by industry?

Anti-GMO activists, many of them financed by organic food companies, claim that the GMO safety consensus is based on biotech industry-funded studies and thus cannot be trusted. Gary Ruskin, co-founder of the organic food advocacy group U.S. Right to Know (USRTK), argues:

The agrichemical companies are unlikely to support research that may undermine their financial interests. Meanwhile, there is a declining amount of public funds available for agricultural research. That means less funding for independent studies to assess health and environmental risks of genetically engineered food and crops.

According to Michael Hansen, a critic of GM foods with Consumers Union, “Look at what the FDA says when they approve a food: ‘It is our understanding that Monsanto has concluded this is safe.’ They just rubber-stamp it.”

The assertion that biotech companies do the research and the government just signs off on it is inaccurate. In practice, companies finance and execute voluntary testing, as that’s the way the US approval process was set up in the 1980s. But absolutely every biotech firm “volunteers.” That’s because the FDA can stop any GMO crop from going to market. Moreover, regulatory review by the USDA and the EPA is mandatory in every sense.

This shared regulatory responsibility is divided up based on each agency’s expertise. The FDA evaluates all foods grown from genetically modified seeds to confirm they are “substantially equivalent” to their conventional counterparts, ensuring that the new foods are nontoxic and nonallergenic. The USDA evaluates GMO crops to see if they will pose a plant pest risk once released into the environment. And as a final layer of regulatory oversight, the EPA evaluates insect- and virus-resistant GMO crops, “plants that are pesticidal in nature,” the agency says, to ensure they won’t pose a threat to the environment or human health.

International standards for industry-funded research are similarly rigorous. The European Food Safety Authority (EFSA) mandates that biotech companies demonstrate their products are substantially equivalent to foods already available in EU supermarkets, before the new items can be sold. Food safety rules established by the UN’s Food and Agriculture Organization (FAO) likewise declare that studies must identify any possible allergen or toxin that may be present in GMO crop varieties before they can be commercially grown. No other foods, including organic products, intended for human consumption face such extensive safety evaluations.

Furthermore, this entire process is subject to extensive peer input and criticism in the form of public comments from independent medical and scientific experts. This virtually eliminates the possibility of “powerful corporations” buying science that favors their economic interests, a practice USRTK’s Ruskin argues is widespread. The Science Media Centre, a UK-based science outreach group, adds:

[T]here are … mechanisms within science to protect experiments from [industry] influence. Experimental design and the peer review system should protect research from bias and, on top of that, [most research] institutes have contracts with industry which allow researchers freedom to publish the facts as they are discovered.

After the initial round of approval studies, independent researchers often do their own analyses, which typically confirm the results of industry studies. For example:

In a meta-review published in 2013 in the peer-reviewed, high-impact-factor journal Critical Reviews in Biotechnology, the authors evaluated 1,783 research papers, reviews, relevant opinions, and reports published between 2002 and 2012, a comprehensive process that took over 12 months to complete. The review covered all aspects of GM crop safety, from how the crops interact with the environment to how they could potentially affect the humans and animals who consume them.

Many of those studies were independent, including more than one hundred funded and overseen by the European Commission over more than two decades. The results:

The main conclusion to be drawn from the efforts of more than 130 research projects, covering a period of more than 25 years of research, and involving more than 500 independent research groups, is that biotechnology, and in particular GMOs, are not per se more risky than e.g. conventional plant breeding technologies.

A February 2015 study published in the journal Nature Biotechnology also challenged the view that biotech firms control the safety evaluations of their products. For the study, Miguel Sanchez, a scientist with the biotech firm ChileBio, evaluated the funding sources of 698 studies published between 1993 and 2014 that looked at the environmental and human health impacts of GMOs. Sanchez found that nearly 60 percent of the papers had no financial or professional conflicts of interest with biotech companies. Cornell University’s Alliance for Science summarized the study’s conclusions:

58.3% of published papers ‘have no financial or professional COIs, as the authors were not affiliated with companies that develop GM crops and also declared that the funding sources did not come from those companies,’ Sanchez reports.
