
Why a coronavirus vaccine ‘October Surprise’ could be an October disaster

There is widespread anticipation of the availability of vaccines to prevent COVID-19 infections so that Americans can get their lives back to some semblance of normal. Some four dozen vaccines, made with a variety of technology platforms, are now in clinical trials; nine are in large-scale safety/efficacy testing. Several of the more promising development programs have been accelerated by a White House crash program, “Operation Warp Speed,” which was launched in May.

It was no secret that there would be intense pressure on the FDA from a White House desperate for good news to provide an “October Surprise” in the form of a vaccine approval before the November 3rd election, even if that approval was premature. When I wrote about this subject only last month, I described the wall that the head of the FDA, Dr. Stephen Hahn, and his colleagues had constructed in order to resist that pressure. Well, watch out for falling debris, because the wall is crumbling.


The first brick in the wall was an FDA policy statement, “Development and Licensure of Vaccines to Prevent COVID-19: Guidance for Industry,” published on June 30th. It specifies in great detail the criteria for FDA approval of coronavirus vaccines, but the overarching principle is simple: “the goal of development programs should be to pursue traditional approval via direct evidence of vaccine efficacy” in protecting humans from COVID-19—through clinical trials—and a vaccine must be at least 50% more effective than a placebo in preventing the disease. The clinical trials would also need to demonstrate that the vaccine is safe, of course.
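For context, the 50% figure refers to vaccine efficacy as conventionally calculated from a trial: one minus the ratio of attack rates in the vaccine and placebo arms. Here is a minimal sketch of that arithmetic in Python, using entirely hypothetical case counts rather than data from any actual COVID-19 trial:

```python
# Standard vaccine-efficacy formula: VE = 1 - (attack rate, vaccine arm) /
# (attack rate, placebo arm). All numbers below are hypothetical.

def vaccine_efficacy(cases_vax, n_vax, cases_placebo, n_placebo):
    """Return vaccine efficacy as a fraction (0.5 means 50%)."""
    attack_rate_vax = cases_vax / n_vax
    attack_rate_placebo = cases_placebo / n_placebo
    return 1 - attack_rate_vax / attack_rate_placebo

# Hypothetical trial: 50 cases among 15,000 vaccinees vs. 125 cases among
# 15,000 placebo recipients gives VE = 1 - (50/15000)/(125/15000) = 0.60.
print(f"{vaccine_efficacy(50, 15_000, 125, 15_000):.0%}")  # prints 60%
```

A candidate with those hypothetical results would clear the guidance's 50% bar, though the FDA evaluates the full trial data, not just this point estimate.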

Those criteria are extremely important because they emphasize that regulators do not intend to cut corners via “accelerated approval” based on “surrogate endpoints”—such as a vaccine’s ability to elicit antibodies to the virus—that fall short of demonstrating actual prevention of infection. The guidance enables FDA Commissioner Hahn to fall back on that policy if he is pressured by his bosses to adopt a lower standard. He went out of his way to emphasize the FDA’s independence and integrity on July 21st, tweeting, “Americans should know that we are steadfast in maintaining our regulatory independence & ensuring our decisions for treatments & vaccines for #COVID19 are based on science & data. This is a commitment that the American public can have confidence that I will continue to uphold.”

And, in a podcast interview with the editor of JAMA, Dr. Hahn repeated that theme: “Americans’ and the world’s public trust in the FDA is really important … People depend upon us every day of their lives, and we cannot do anything that would break that trust. That’s a solemn promise.” Part of that promise was that the Agency’s vaccines advisory committee, which is composed of outside experts, would review vaccine candidates prior to approval.

In an August 7th article in JAMA, Dr. Hahn and two senior colleagues beat the drum yet again, promising “unequivocally” that “candidate COVID-19 vaccines will be reviewed according to the established legal and regulatory standards for medical products.”


They added:

While Operation Warp Speed is an important initiative and FDA has lent technical expertise around end point selection and safety considerations to this public-private partnership for vaccine development, there is a line separating the government’s efforts to focus resources and funding to scale vaccine development from FDA’s review processes, which are rooted in federal statute and established FDA regulations.

Dr. Peter Marks, a senior civil servant who heads the FDA organization that evaluates vaccines, has also made his feelings on the subject known. In August, Marks told Reuters that the FDA’s evaluations would be guided by science alone and that if he were subjected to political pressure for a premature approval, “I could not stand by and see something that was unsafe or ineffective that was being put through.” He added, “You have to decide where your red line is, and that’s my red line. I would feel obligated [to resign] because in doing so, I would indicate to the American public that there’s something wrong.”


In sum, the message from Drs. Hahn and Marks to several audiences—their bosses at the Department of Health and Human Services and the White House, the public, and the vaccine industry—seemed to be clear: although regulators will streamline regulation and facilitate the development of COVID-19 vaccines, they won’t be stampeded into exposing Americans to inadequately tested, potentially dangerous products.

One might think that would put the matter to rest, and that the American public need not worry about undue political influence on what are essentially scientific and medical decisions. However, recent actions by the FDA and its sister agency, the Centers for Disease Control and Prevention (CDC), have raised widespread concerns:

  • the FDA’s issuance of an Emergency Use Authorization (EUA) for convalescent plasma, an antibody-rich blood product obtained from patients who have recovered from COVID-19. In theory, infusing a sick patient with the antibodies would neutralize the virus and spur recovery, but many in the medical community – including senior NIH and FDA scientists – felt the EUA was predicated on insufficient evidence, and it has made the completion of rigorous clinical trials difficult or impossible. Notably, the EUA came on a Sunday, a day after President Trump accused “deep state” bureaucrats at the FDA of trying to delay a COVID-19 vaccine until after the fall election.
  • the CDC’s amended guidance on testing for the SARS-CoV-2 virus, released on Monday, August 24th, which recommends that people who have been exposed to the virus, typically defined as being within six feet of an infected person for at least 15 minutes, “do not necessarily need a test” if they do not have symptoms. This flies in the face of evidence that virus shedding and infectivity peak in the days shortly before symptoms emerge, and it would seem to represent an abandonment of any attempt at contact tracing and isolation of infected persons.

The New York Times reported that, according to two government officials, the CDC changed its guidelines on instructions from higher up the food chain: “One official said the directive came from the top down. Another said the guidelines were not written by the CDC but were imposed.” The change sounds suspiciously like, “If we do less testing, we have fewer cases.”

These are, at best, dubious decisions, especially the completely inexplicable CDC volte-face on testing, which, if implemented, could significantly set back efforts to suppress COVID-19 infections.

I am usually suspicious of slippery slope arguments, but I do believe in precedents, and the above two examples are credible precedents for a possible, far more damaging action – a premature Emergency Use Authorization for a COVID-19 vaccine that has not been adequately tested for safety. There is a suggestion that FDA Commissioner Hahn may be moving toward that: In an interview published by the Financial Times on August 30, he said his agency was prepared to authorize a vaccine before Phase 3 clinical trials were complete, if regulators became convinced that the benefits outweighed the risks.

Another indication of pressure to issue such authorizations is an unprecedented move by four pharmaceutical companies developing COVID-19 vaccines – Moderna, Pfizer, Johnson & Johnson, and Sanofi – which are about to issue a statement describing their commitment to prioritize safety over speed by waiting to seek authorization for their vaccines until human trials show “substantial evidence of safety and efficacy,” and to adhere to the highest standards in clinical trials and manufacturing. The logic is so unassailable and obvious that one wonders why it even needs to be publicly articulated – unless the companies are being pressured to submit applications to the FDA prematurely.

But far more worrisome is the existence of a loophole, or workaround, in federal law (U.S. Code, Title 21, Chapter 9), that could be used unilaterally by Hahn’s boss, the Secretary of Health and Human Services – Alex Azar, a political appointee and lawyer who has hardly been a paragon of independence or competence over the course of the pandemic.

The relevant section of the law, “Authorization for medical products for use in emergencies,” specifies that

The Secretary may issue an authorization under this section with respect to the emergency use of a product only if, after consultation with the Assistant Secretary for Preparedness and Response, the Director of the National Institutes of Health, and the Director of the Centers for Disease Control and Prevention… the Secretary concludes (1) that an agent referred to in a declaration under subsection (b) can cause a serious or life-threatening disease or condition; (2) that, based on the totality of scientific evidence available to the Secretary, including data from adequate and well-controlled clinical trials, if available, it is reasonable to believe that (A) the product may be effective in diagnosing, treating, or preventing (i) such disease or condition.

Note that the Secretary is only required to consult with, but not obtain agreement from, the three subordinates specified in the law.  Moreover, none of the officials entrusted with making the decision appear to have any experience with the non-clinical aspects of vaccine production. Ongoing controls on and consistency in manufacturing are essential to ensuring safety and efficacy going forward.  For example, can the manufacturer ensure that every batch meets the standards for purity, potency, and sterility?  Are there Standard Operating Procedures (SOPs) for those involved in production of the vaccine?  Has the facility passed inspection?

It would surprise me not at all if Secretary Azar were already having and documenting conversations with the three specified subordinates, and staffers in the White House and the Department of HHS were already preparing the necessary decision documents for an emergency authorization for one or more of the COVID-19 vaccine candidates.


Lending credence to that scenario are reports that top administration officials told congressional leaders in July that they were likely to issue emergency authorization for a vaccine before the end of Phase 3 clinical trials in the United States.

For several reasons, that would be unwise.  The perception that the authorization was rushed and issued via an unorthodox pathway would fan the passions of the anti-vaccine movement and undermine public confidence in COVID-19 vaccines; but, more important, the short-circuiting of the usual evaluation mechanisms could endanger the vaccine recipients.

There is a reason that vaccines intended to be administered to hundreds of millions of healthy people are extensively tested and the results carefully evaluated. This is a time when science, meticulously applied, and experience with vaccine evaluation must prevail.

The last word goes to Professor and physician Joel Tepper, of the University of North Carolina Lineberger Comprehensive Cancer Center:

If a substandard vaccine is released, it will hinder, in a major way, efforts to develop a vaccine that is actually safe and effective.  The short- and long-term implications of a poor decision will be enormous.

Henry I. Miller, a physician and molecular biologist, was a research associate at the National Institute of Child Health and Human Development, the founding director of the FDA’s Office of Biotechnology, and the co-discoverer of a critical enzyme in the influenza virus. Find Henry on Twitter @henryimiller


Skeletons provide tell-tale glimpses into past mass infections and pandemics

The previous pandemics to which people often compare COVID-19 – the influenza pandemic of 1918, the Black Death bubonic plague (1346-1353), the Justinian plague (541-542) – don’t seem that long ago to archaeologists. We’re used to thinking about people who lived many centuries or even millennia ago. Evidence found directly on skeletons shows that infectious diseases have been with us since our beginnings as a species.

Bioarchaeologists like us analyze skeletons to reveal more about how infectious diseases originated and spread in ancient times.

How did aspects of early people’s social behavior allow diseases to flourish? How did people try to care for the sick? How did individuals and entire societies modify behaviors to protect themselves and others?

Knowing these things might help scientists understand why COVID-19 has wreaked such global devastation and what needs to be put in place before the next pandemic.

These round lesions are pathognomonic signs of syphilis. Credit: Charlotte Roberts

Clues about illnesses long ago

How can bioarchaeologists possibly know these things, especially for early cultures that left no written record? Even in literate societies, poorer and marginalized segments were rarely written about.

In most archaeological settings, all that remains of our ancestors is the skeleton.

Tuberculosis leaves telltale markings in the spine. Credit: Charlotte Roberts

For some infectious diseases, like syphilis, tuberculosis and leprosy, the location, characteristics and distribution of marks on a skeleton’s bones can serve as distinctive “pathognomonic” indicators of the infection.

Most skeletal signs of disease are non-specific, though, meaning bioarchaeologists today can tell an individual was sick, but not with what disease. Some diseases never affect the skeleton at all, including plague and viral infections like HIV and COVID-19. And diseases that kill quickly don’t have enough time to leave a mark on victims’ bones.

To uncover evidence of specific diseases beyond obvious bone changes, bioarchaeologists use a variety of methods, often with the help of other specialists, like geneticists or parasitologists. For instance, analyzing soil collected in a grave from around a person’s pelvis can reveal the remains of intestinal parasites, such as tapeworms and round worms. Genetic analyses can also identify the DNA of infectious pathogens still clinging to ancient bones and teeth.

Bioarchaeologists can also estimate age at death based on how developed a youngster’s teeth and bones are, or how much an adult’s skeleton has degenerated over its lifespan. Then demographers help us draw age profiles for populations that died in epidemics. Most infectious diseases disproportionately affect those with the weakest immune systems, usually the very young and very old.

For instance, the Black Death was indiscriminate; 14th-century burial pits contain the typical age distributions found in cemeteries we know were not for Black Death victims. In contrast, the 1918 flu pandemic was unusual in that it hit hardest those with the most robust immune systems, that is, healthy young adults. COVID-19 today is also leaving a recognizable profile of those most likely to die from the disease, targeting older and vulnerable people and particular ethnic groups.

Ground penetrating radar shows mass graves from the small Aboriginal settlement of Cherbourg in Australia, where 490 out of 500 people were struck down by the 1918-1919 influenza pandemic, with about 90 deaths. Credit: Kelsey Lowe

We can find out what infections were around in the past through our ancestors’ remains, but what does this tell us about the bigger picture of the origin and evolution of infections? Archaeological clues can help researchers reconstruct aspects of socioeconomic organization, environment and technology. And we can study how variations in these risk factors caused diseases to vary across time, in different areas of the world and even among people living in the same societies.

How infectious disease got its first foothold

Human biology affects culture in complex ways. Culture influences biology, too, although it can be hard for our bodies to keep up with rapid cultural changes. For example, in the 20th century, highly processed fast food replaced a more balanced and healthy diet for many. Because the human body evolved and was designed for a different world, this dietary switch resulted in a rise in diseases like diabetes, heart disease and obesity.


From a paleoepidemiological perspective, the most significant event in our species’ history was the adoption of farming. Agriculture arose independently in several places around the world beginning around 12,000 years ago.

Prior to this change, people lived as hunter-gatherers, with dogs as their only animal companions. They were very active and had a well balanced, varied diet that was high in protein and fiber and low in calories and fat. These small groups experienced parasites, bacterial infections and injuries while hunting wild animals and occasionally fighting with one another. They also had to deal with dental problems, including extreme wear, plaque and periodontal disease.

A healed fracture of the lower leg bones from a person buried in Roman Winchester, England. Credit: Charlotte Roberts

One thing hunter-gatherers didn’t need to worry much about, however, was virulent infectious diseases that could move quickly from person to person throughout a large geographic region. Pathogens like the influenza virus were not able to effectively spread or even be maintained by small, mobile, and socially isolated populations.

The advent of agriculture resulted in larger, sedentary populations of people living in close proximity. New diseases could flourish in this new environment. The transition to agriculture was characterized by high childhood mortality: approximately 30% or more of children died before the age of 5.

And for the first time in an evolutionary history spanning millions of years, different species of mammals and birds became intimate neighbors. Once people began to live with newly domesticated animals, they were brought into the life cycle of a new group of diseases – called zoonoses – that previously had been limited to wild animals but could now jump into human beings.

Add to all this the stresses of poor sanitation and a deficient diet, as well as increased connections between distant communities through migration and trade, especially between urban communities, and epidemics of infectious disease were able to take hold for the first time.

Globalization of disease

Later events in human history also resulted in major epidemiological transitions related to disease.

For more than 10,000 years, the people of Europe, the Middle East and Asia evolved along with particular zoonoses in their local environments. The animals people were in contact with varied from place to place. As people lived alongside particular animal species over long periods of time, a symbiosis could develop – as well as immune resistance to local zoonoses.

At the beginning of modern history, people from European empires began traveling across the globe, taking with them a suite of “Old World” diseases that were devastating for groups who hadn’t evolved alongside them. Indigenous populations in Australia, the Pacific and the Americas had no biological familiarity with these new pathogens. Without immunity, one epidemic after another ravaged these groups. Mortality estimates range between 60% and 90%.

This skull of a person who lived more than 2,600 years ago in Peru shows evidence of a surgery, maybe to treat a head wound.

The study of disease in skeletons, mummies and other remains of past people has played a critical role in reconstructing the origin and evolution of pandemics, but this work also provides evidence of compassion and care, including medical interventions such as trepanation, dentistry, amputation and prostheses, herbal remedies and surgical instruments.

Other evidence shows that people have often done their best to protect others, as well as themselves, from disease. Perhaps one of the most famous examples is the English village of Eyam, which made a self-sacrificing decision to isolate itself to prevent further spread of a plague from London in 1665.

A tuberculosis sanatorium in São Paulo, Brazil, in the late 1800s. Credit: Wellcome Collection

In other eras, people with tuberculosis were placed in sanatoria, people with leprosy were admitted to specialized hospitals or segregated on islands or into remote areas, and urban dwellers fled cities when plagues came.

As the world faces yet another pandemic, the archaeological and historical record are reminders that people have lived with infectious disease for millennia. Pathogens have helped shape civilization, and humans have been resilient in the face of such crises.

Charlotte Roberts is a Professor of Archaeology at Durham University. Charlotte is a bioarchaeologist, and has a background in archaeology, environmental archaeology and human bioarchaeology. 

Gabriel D. Wrobel is an Associate Professor of Anthropology at Michigan State University, a bioarchaeologist, and the director of the Central Belize Archaeological Survey (CBAS) Project. Gabriel’s research focuses primarily on investigating mortuary and biological variability in ancient Maya individuals interred in caves and rockshelters in central Belize.

Michael Westaway is a biological anthropologist and archaeologist and has a strong interest in human evolution in Australia and South East Asia and zooarchaeology in Australia. 

This article was originally published at the Conversation and has been republished here with permission. Follow the Conversation on Twitter @ConversationUS


‘Challenge studies’: Should we be testing COVID vaccines by intentionally infecting volunteers?

To those who’ve never thought about volunteering to be intentionally infected to test a vaccine, the idea may at first seem a bit bonkers. But such “challenge” studies have a rich history, and nearly 40,000 people have already checked the box “I am interested in being exposed to the coronavirus to speed up vaccine development” at 1daysooner, a website and non-profit organization that launched in April.


Challenge studies go by other names: “controlled human infection models,” “human viral challenge,” and “purposeful infection.” Dripping virus-tainted saltwater into a volunteer’s nostrils enables researchers to track infection, and the immune system’s response to it, right from the start. The approach complements phase 3 clinical “field” trials of efficacy that await natural infection in the community.

A bioethical quandary

A challenge study can speed discovery of whether or not a vaccine works. That’s important because a vaccine hastens the herd immunity that builds from natural infection. But the risk of possible harm to a recipient of an experimental vaccine may be greater with intentional infection than with community exposure, simply because it’s more likely to happen.

Use of a placebo group is another matter.

A placebo is the gold standard in a conventional clinical trial collecting initial data. But it isn’t necessarily needed for a challenge trial when phase 3 results are forthcoming, as is the case for COVID-19.

Exposure to SARS-CoV-2 after taking an experimental vaccine is scary enough — it’s riskier if the “vaccine” is really a placebo. Philosopher Kent A. Peacock and psychologist John R. Vokey, both of the University of Lethbridge, argued against placebos in COVID challenge trials in STAT News.

Seema Shah, JD, professor of Medical Ethics at the Northwestern University Feinberg School of Medicine, explained the nuances of using placebos:

“If you are testing whether vaccines or treatments work, a placebo is usually necessary. Challenge studies can create a reliable model of infection where researchers learn what dose is needed to infect all volunteers but not make them too sick, and then you wouldn’t use a placebo. Models are also used to investigate transmission and learn about the early stages of infection, and those don’t require placebo.”

Several prominent bioethicists who’ve given the matter of COVID challenge trials much thought support the idea.

Bioethicists Stanley A. Plotkin from the University of Pennsylvania and Arthur Caplan from New York University call developing and distributing an efficacious COVID-19 vaccine a “moral imperative for the world” in the journal Vaccine, urging immediate discussion of beginning challenge studies. Those conversations are now well underway.

A challenge study isn’t a new idea

Considering the design of COVID challenge trials might benefit from a look back at the history of protection of research subjects that led to the founding of the field of bioethics in 1970.

The National Research Act of 1974 inspired the Belmont Report of 1979 that established three requirements for people who volunteer for experiments, including clinical trials: respect for autonomy (ability to make decisions); beneficence (benefit); and justice (anyone who meets criteria can volunteer).

More recent renditions of participant protection expanded the definition of ethical research to embrace non-maleficence (“do no harm”) and, vital for vaccine testing, utilitarianism, which justifies actions if they benefit a majority. Even more on target, in 2014 an international group of ethicists interpreted utilitarianism to include “maximization of public health.”

Should we potentially harm a few to possibly save many? That’s the risk of challenge trials. With thousands still dying of COVID-19 every week, demonstrating efficacy another way, to complement the phase 3 trials, would save lives. Overlapping phases and early production of vaccine candidates will surely speed the conventional trajectory, but it isn’t enough.

Even challenge trials with coronaviruses aren’t novel. In 1967 researchers reported in the British Medical Journal experiments that dripped a new respiratory virus “surrounded by a fringe of club-shaped projections” collected from six students with colds into the noses of 26 healthy volunteers. Would they come down with the sniffles? Counting the number of handkerchiefs soiled daily revealed that half of them did. The culprit? Seasonal coronavirus 229-E.

Challenge studies sped development of an improved cholera vaccine in Baltimore from 1977 through 1995, using volunteers and strict quarantine protocols in a hospital to manage symptoms. Most participants who became ill suffered only mild, brief fever and diarrhea.

Child receiving oral cholera vaccine.

Over the years challenge studies have been deployed against a list of horrors: dengue, shigella, typhoid fever, giardia, tuberculosis, rhinovirus, norovirus, and most commonly, malaria and influenza. A challenge trial was considered for Zika virus infection but never conducted because the epidemic abated; the approach also helped speed approval of an Ebola vaccine in 2019, in a mere 10 months.

In December 2019, Ricardo Palacios from the University of São Paulo in Brazil and Shah wrote a prescient commentary in the journal Trials: “When could human challenge trials be deployed to combat emerging infectious diseases? Lessons from the case of a Zika virus human challenge trial.”

They couldn’t have known a novel virus was about to unfurl on the world.

Volunteering, not coercion

Challenge studies have a checkered past. Doctors in Nazi Germany did it, but without consent.

So did Werner Henle, a virologist at the University of Pennsylvania, who tested an influenza vaccine on prisoners and intellectually disabled children in a state facility. Jonas Salk, of polio vaccine fame, tested flu vaccines on mental patients and prisoners in Michigan.

Jonas Salk. Credit: Yousuf Karsh

The huge difference between then and now is the word “forced.” Participants in challenge studies for COVID-19 vaccines will follow an informed consent process that ensures that they understand and accept the risks — including the unknown and the possibility of receiving a placebo, if that’s part of a particular plan.

It’s especially important for potential participants to read the consent forms carefully, because regulations have loosened a bit, in a way that may maximize the information gleaned from a challenge trial. Health and Human Services (HHS) updated the “Common Rule,” which dates from 1981 and stated that benefits to the participant and to society must outweigh risks to the subject. The 2018 amendment stated that risks to the subject might be “reasonable,” not zero.

That regulatory flexibility is appreciated now. Jerry Menikoff, MD, JD, Director of the Office for Human Research Protections at HHS, points out that unlike challenge studies for malaria, cholera, and influenza, COVID-19 is riskier because we know less about it. Plus, there’s no “rescue” treatment if an experimental treatment harms someone.

The flip side of potential risk is potential benefit. In a challenge study, investigators can track the nuances of the immune response and degree of viral shedding from the precise start of infection, something not possible in the community, where conventional vaccines are tested. What we can learn from challenge studies may outweigh the concerns. Protocols can be designed to compare vaccines using the fewest volunteers possible, such as by using one placebo group for multiple vaccine candidates.

Bioethicist Nir Eyal, PhD, and epidemiologists Marc Lipsitch, PhD, and Peter G. Smith, DSc, added perspective in an article in The Journal of Infectious Diseases. They compare the sacrifice of a challenge study participant to those of volunteer firefighters, participants in drug trials, members of the military, and living organ donors.

How challenge studies will unfold

In June, the Advisory Group on Human Challenge Studies from the World Health Organization (WHO) released a draft of an 81-page “technically valid roadmap” to guide discussion among bioethicists, public health experts, epidemiologists, and physicians. It’s currently open for public comment. The report is more specific than a similar guide published in 2001 by NIH researchers.


The WHO document begins with “factors that warrant special caution” when conducting a challenge study for COVID-19: severity, high transmissibility, deaths of young healthy individuals, activity of the virus on surfaces for hours, lack of rescue treatment, and the surprises that the evolving pandemic brings. These are the factors that give pause to thoughts of including placebo arms. Then the document lists steps.

First is creation of “challenge strains” of the virus for the volunteers. Four strains are being considered because they represent the distribution of SARS-CoV-2 around the world.

Genetic modification of the challenge strains can insert telltale DNA “tags” that enable tracking of distinct viruses, and other tweaks can render the viruses milder than the predominant natural strains. If a vaccine fails to protect, a person wouldn’t get too sick.

The WHO group decided that participants should be between ages 18 and 25, an age group less likely to develop severe symptoms, if any.

Next, the test virus, or placebo, is placed in the nostrils using a pipette, rather than a nasal spray that can shoot virus deep into the lungs. Experiments have shown that three doses are needed to infect most, if not all, participants.

The volunteers then will spend up to 3 weeks in single rooms on isolation units, in facilities that offer constant monitoring from a nursing station, the presence of an expert physician, availability of ICU equipment, and existing treatments like remdesivir and steroids, until PCR tests are negative. The protocol also includes mental health screening for ability to tolerate the isolation. Some of these facilities were developed for influenza studies.


Researchers can analyze mounting viral load as well as the unfolding immune response in real time in participants who’ve been infected, categorizing T cells, identifying the viral antigens that different antibodies attack, and charting inflammatory markers in blood. They’ll also be on the lookout for “disease enhancement,” which is when a vaccine worsens an infection.

No challenge trial for a COVID-19 vaccine has yet begun, and details are still being worked out. The 19 members of the Advisory Group agreed on using young people, four viral strains, and existing treatments, but were split on three other issues: whether studies should proceed even if no better rescue treatments come along; whether a vaccine that protects young people will work on older people or those with pre-existing conditions; and whether the study should or will accelerate regulatory approval, including emergency use authorization. Perhaps comments from the public will help to flesh out these issues.

Where are we now?

Interest in challenge trials for COVID-19 has been building.

Seema Shah and her colleagues published a framework and analysis in Science in May that included “developing a challenge strain, drafting consensus protocols that address ethical concerns, and engaging stakeholders to enhance their social value, minimize risks, and build public trust.”

By midsummer, Adrian Hill, MD, PhD, director of Oxford University’s Jenner Institute, made the media rounds to discuss their candidate vaccine, the adenovirus-based ChAdOx1 nCoV-19, noting:

We’re hoping to be doing challenge trials by the end of the year. This might be in parallel or might be after the phase three trial is completed. They’re not competing options, they’re complementary.

Phase 3 trials of the vaccine are ongoing or about to begin in the UK, Brazil, South Africa, Japan, Russia, and the US, but were temporarily paused on September 8 after a participant became ill.

Even if challenge trials start soon, making sense of the findings will take time. Meagan E. Deming, MD, PhD, of the Center for Vaccine Development and Global Health, University of Maryland, and colleagues cautioned in the September 3 New England Journal of Medicine that developing “a robust challenge model for testing SARS-CoV-2 vaccines” may take a year or two. “Investigators at potential sites should begin soon to engage stakeholders in the scientific, regulatory, public health, and local communities.”

Bioethicists Plotkin and Caplan eloquently sum up the challenge of challenge studies:

“Deliberately causing disease in humans is normally abhorrent, but asking volunteers to take risks without pressure or coercion is not exploitation but benefitting from altruism. As Shakespeare put it, ‘Desperate diseases by desperate measures are relieved.’”

Those who volunteer to receive an experimental vaccine and then be infected with a potentially lethal viral pathogen exemplify selflessness. They are the polar opposite of the people who flagrantly ignore public health recommendations to prevent spread of COVID-19.

Ricki Lewis has a PhD in genetics and is a genetics counselor, science writer and author of Human Genetics: The Basics. Follow her at her website or Twitter @rickilewis


Do you have food allergies? Manipulating the gut microbiome might treat them

As a child, Cathryn Nagler broke out in hives when she ate eggs. She reacted to penicillin. Working in labs after college, she developed a severe allergy to mice that caused wheezing, swelling and trouble breathing — twice landing her in the emergency room.

Today, Nagler is an immunologist at the University of Chicago and is helping to pioneer an emerging research field: studying how bacteria in the gut can be harnessed to help people with food allergies.

It wasn’t personal experience with allergies that inspired her interest. Rather, it was an odd observation she made as a doctoral student in the 1980s. She was studying mice whose immune systems go haywire and attack the collagen protein inside their joints, causing severe arthritis. Scientists could jump-start the disease by administering a shot of collagen under the skin. But, curiously, when Nagler later fed the creatures collagen using a tube that snaked down into their stomachs, it had the opposite effect: The mice got better.

Decades on, this concept, called oral immunotherapy, has come into use as a treatment for food allergies, which affect an estimated 32 million people in the United States, including about two schoolchildren per classroom. Over the last ten years or so, some allergists have begun treating food allergy patients with small, regular doses of the offending food (or products made from it) to calm allergic responses. The approach stands to grow in popularity with the approval in January of a standardized version — a set of daily capsules to treat peanut allergy — by the US Food and Drug Administration.

But oral immunotherapy has downsides. The regimen can be nerve-racking, since it involves daily consumption of food that could kill. It doesn’t work for everyone and does little to fix the underlying disease. Success mostly means gaining the ability to safely eat several peanuts, for example, rather than reacting to a speck of peanut flour.

For some families, this modest gain is life-altering. Still, it is precarious: Patients must consume a bit of the food every day, or a few times a week, for the rest of their lives — or they could lose the protection.

So Nagler and several other researchers are working to find ways to treat food allergies more easily and durably. They’re targeting what they believe is a root cause — imbalances in the community of beneficial bacteria, or microbiome, that lives in our guts — in the hopes of resetting the immune system.

Cathryn Nagler discusses experiments with former research technician Elliot Culleen. Credit: Polsky Center/University of Chicago

Producing a microbiome-based treatment will be challenging, with many details to hash out, such as which microbes to provide and how best to deliver them. But the approach is gaining momentum. Last year, Nagler’s team and another group in Boston reported an important step forward: They prevented severe allergic responses in allergy-prone mice by supplying gut microbes from healthy, non-allergic human babies. “The data are sound, and they are very encouraging,” says pediatric allergist Jaclyn Bjelac of the Cleveland Clinic.

And in March, scientists reported finding large amounts of antibodies against peanut allergens in the stomach and gut of allergic patients, further supporting the idea that the gastrointestinal tract is a hotspot for food allergy regulation and treatment. Already, companies are testing several strategies.

It has long been a puzzle why one person tolerates a food while another is allergic, but as outlined in an article she coauthored in the Annual Review of Immunology, Nagler is convinced that the microbiome is key.

Birth of a hypothesis

Four years after finishing her graduate work, Nagler started running a lab at Harvard Medical School. She was studying inflammatory bowel disease, not food allergies, back then. But as research in the 1990s showed that inflammatory bowel disease was primarily caused by immune reactions against gut bacteria, she shifted her attention to the microbiome.

Then, in 2000, she came across an intriguing publication. It described a mouse model for peanut allergy that mimics key symptoms experienced by people. The mice scratch relentlessly. Their eyes and mouths get puffy. Some struggle to breathe — a life-threatening allergic response called anaphylaxis.

All of this happens after researchers feed the mice peanut powder. “That caught my eye,” Nagler says. It ran counter to her earlier findings with the arthritic mice, where feeding collagen calmed the immune reaction. Why the difference?

The peanut-allergy mice, another report showed, had a genetic glitch that damages a receptor called TLR4 that sits in the membranes of immune cells and recognizes microbes. It looked as though the peanut-allergy mice lacked the normal cross talk that takes place between gut microbes and immune cells.

“That was my lightbulb moment,” Nagler says. Perhaps the trillions of microbes that live in us suppress immune responses to food by stimulating the TLR4 receptor. And perhaps perturbations in that teeming microbiome alter the suppression and cause a rise in allergies.

The idea meshes with historical trends. As societies modernized, people moved to urban areas, had more babies by cesarean section, took more antibiotics and ate more processed, low-fiber foods — all of which shake up microbiomes. The timing of these lifestyle shifts parallels the observed increase in food and other types of allergies, whose steep rise over a generation points to some environmental cause.

Rates of food allergies have risen steeply over the last few decades, in line with a variety of changes in the way we live. A lot of these aspects of a modern lifestyle — such as fatty, low-fiber diets and the use of antibiotics — have potential to affect the types of microbes that live in our guts. Some scientists think that such microbiome shifts are making allergies more likely.

In 2004, Nagler and her coworkers published a report showing that peanuts provoked anaphylaxis only in mice with a mutated TLR4 receptor, not in genetically related strains with a normal TLR4. The difference disappeared when the scientists wiped out populations of gut bacteria with antibiotics. Then, even normal mice became susceptible to food allergies, implying that bacteria are at the heart of the protection.

Nagler’s lab has been working ever since to identify which bacteria are helpful, and to understand how they regulate allergic responses.

Early effects

In their work, Nagler’s team focused on Clostridia and Bacteroides — two major groups of bacteria in the human gut. Working with mice bred in a germ-free environment and thus without any microbiome at all, the team found that Clostridia, but not Bacteroides, prevented food-allergic responses when introduced into the guts of the squeaky-clean mice.

There’s a potential explanation: Mice colonized with Clostridia bacteria had more regulatory T cells, a type of cell that dampens immune responses. The Clostridia mice also produced more of a molecule called IL-22 that strengthens the intestinal lining. A new theory began to emerge: If protective microbes are missing, the gut barrier weakens, allowing food proteins to seep into the bloodstream and potentially trigger allergic responses.

This color-enhanced scanning electron micrograph shows bacteria of a type called Clostridia, which live in mammalian guts. Certain Clostridia bacteria have been linked to protection against food allergies. Credit: David M. Phillips/Science Source

This reasoning jibes well with the curious observation that top food allergens (certain proteins found in milk, eggs, peanuts, tree nuts, soy, wheat, fish and shellfish) bear little biochemical resemblance to each other. What they do have in common is the ability to remain intact in the digestive tract, which normally breaks food into small pieces that the body absorbs as nutrients. “That seems to be what makes peanut the champion — its ability to resist degradation in the gut,” Nagler says.

Studies have further solidified the link between gut bacteria and food allergies and suggest that the microbiome’s impact comes early in life. Analyzing feces of healthy babies and those with egg or milk allergies, researchers showed that allergic and nonallergic infants had different communities of gut bacteria.


Another study tracked 226 children with milk allergy from infancy to age 8. The scientists found that certain bacteria, including Clostridia, were enriched in stool samples from 3- to 6-month-old infants who eventually outgrew their allergy, compared to those who remained allergic. The scientists didn’t see the same difference between these groups in older babies, suggesting that allergy-protective microbes may only act early in life.

“All of this points to the concept of a window of opportunity in terms of prevention,” says study leader Supinda Bunyavanich, a pediatric allergist at the Icahn School of Medicine at Mount Sinai in New York City.

Causal evidence

From birth, our immune systems get schooled in life-or-death choices. They learn to kill germs, tumors and dying cells. Much else in their surroundings they must learn to leave alone — nerve fibers, bone tissue, proteins from milk and cookies consumed at snack time. Mouse studies published in 2019 by Nagler’s lab and another team argue convincingly that gut microbes cultivate this critical immune decision-making.

In one of the studies, Nagler and coworkers collected gut bacteria from the feces of healthy and milk-allergic babies and put those collections of microbes into the digestive tracts of germ-free mice. They found that gut bacteria from healthy babies protected mice against allergic responses to milk, whereas microbes from allergic infants didn’t.

Using mathematical and computer science techniques to analyze the results, the team identified bacterial strains that were present in healthy but not allergic babies. They also examined gene activity in cells lining the intestines — certain gene patterns are characteristic of a healthy gut barrier — and looked for microbes whose presence correlated with a healthy barrier.
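The article doesn’t reproduce the team’s actual pipeline, but the core comparison it describes can be sketched simply: call a strain a candidate “protective” microbe if it is detected in most healthy-donor samples and in few or none of the allergic-donor samples. A toy illustration in Python, with invented strain names, read counts, and thresholds:

```python
# Toy version of the healthy-vs-allergic strain comparison described above.
# Strain names, read counts, and thresholds are all hypothetical.
healthy = {
    "Strain A": [310, 280, 405, 290],  # sequencing reads, one value per baby
    "Strain B": [120, 95, 140, 110],
    "Strain C": [0, 15, 0, 8],
}
allergic = {
    "Strain A": [0, 0, 12, 0],
    "Strain B": [130, 88, 150, 101],
    "Strain C": [60, 75, 55, 80],
}

DETECTION_THRESHOLD = 20   # minimum reads to call a strain "present"
MIN_FRACTION = 0.75        # fraction of samples that must show the strain

def present_in_most(read_counts):
    """True if the strain is detected in at least MIN_FRACTION of samples."""
    detected = sum(reads >= DETECTION_THRESHOLD for reads in read_counts)
    return detected / len(read_counts) >= MIN_FRACTION

# Candidate protective strains: present in most healthy babies but absent in
# most allergic babies. In this toy data, only "Strain A" qualifies.
candidates = [s for s in healthy
              if present_in_most(healthy[s]) and not present_in_most(allergic[s])]
print(candidates)  # ['Strain A']
```

In the real study, a presence/absence signal of this kind was combined with the gut-barrier gene-activity correlations mentioned above, which is how Anaerostipes caccae emerged from both analyses.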

To understand how gut bacteria influence food allergies, researchers collected feces from four healthy babies and four babies allergic to cow’s milk and transferred these microbe-rich samples into germ-free mice. Later, they exposed the mice to cow’s milk allergen and measured allergic responses in the animals. The team then analyzed gene activity in cells lining the animals’ small intestines — which interact heavily with gut microbes. They saw notable differences between the differently treated mice. In the original chart, color and intensity of the boxes indicate higher (red) or lower (blue) activity of genes. Some genes were more heavily active in mice that got microbes from healthy babies (“Up in healthy”), while other genes showed more activity in mice that got microbes from allergic babies (“Up in CMA”). By mapping and comparing these effects, the researchers deduced which microbes seemed associated with protection against milk allergy.

One Clostridia species, Anaerostipes caccae, popped out of both analyses. When the scientists transferred A. caccae alone into germ-free mice, it seemed to mimic the protection imparted by a full, healthy microbiome.

The other team, led by Rima Rachid and Talal Chatila at Boston Children’s Hospital, took a similar approach using hyper-allergic mice, finding that the single species Subdoligranulum variabile and a set of Clostridia species prevented allergic responses. Regulatory T cells were key to the response and were spurred into action by the microbes.

These and other studies clearly show that the microbiome is important for preventing food allergies and inducing tolerance, says Carina Venter, a research dietician at the University of Colorado in Denver who is studying links between maternal diet during pregnancy, microbiomes of infants and risk for eczema and allergies. But, she says, “how that microbiome should look in terms of diversity and in terms of specific strains, we just don’t know.”

Trials and questions

The many unknowns leave a quandary for researchers hoping to develop better treatments for food allergies: Is it better to supply a full, healthy microbiome, or to replenish just a few helpful microbes? “I scratch my head every day thinking about this,” Rachid says.

Certain foods are far more likely than others to cause allergies: eggs, shellfish, nuts, peanuts, soy, fish, milk and wheat. These foods, though very different, all contain proteins that are resistant to digestion. Credit: Science Photo Library/Science Source

She’s leading a clinical study to test the first possibility. In this small trial, adults with peanut allergies will swallow pills containing a full slate of gut bacteria from healthy donors pre-screened for safety by the nonprofit stool bank OpenBiome. The approach, known as fecal transplantation, is not FDA-approved but is increasingly used to treat severe intestinal disorders with the aim of fixing diseased microbiomes by infusing healthy, balanced ones.

Other trials are also underway. Using the protective strains identified by the Boston team, Pareto Bio of La Jolla, California, is developing a live microbial product to treat food allergies. Another company, Vedanta Biosciences of Cambridge, Massachusetts, is developing a probiotic capsule that contains a mix of Clostridia strains selected for their ability to induce regulatory T cells. Vedanta is testing the capsules as an add-on to oral immunotherapy in adults with peanut allergies.

A third company, Prota Therapeutics of Melbourne, Australia, is commercializing a similar strategy combining peanut oral immunotherapy with a probiotic — in their case, a Lactobacillus strain commonly prescribed for gastrointestinal problems.

Administering whole microbiomes from donors is not without risk: Four patients have been hospitalized, and one died, from serious infections linked to stool transplants. So some researchers think it may be better to use precisely defined species. Though this risks weakening the benefit, “you’re less likely to induce unanticipated problems,” says Wayne Shreffler, who directs the food allergy center at Massachusetts General Hospital in Boston and is leading the Vedanta study.

But there’s one challenge shared by all microbiome-modulating approaches: getting new microbes established when someone already has a microbiome in place, even an unhealthy one. Traditionally, patients receive antibiotics to help new bacteria gain a foothold. But maybe there’s another way. A start-up that Nagler cofounded with University of Chicago biomolecular engineer Jeff Hubbell — ClostraBio — is developing a therapy that combines live bacteria with a key microbial metabolite, butyrate.

The chemical is known to enhance gut barrier function and may also have antimicrobial effects, which could help create a niche for the added microbes. ClostraBio plans to launch its first human trial by 2021, Nagler says.

Over the next few years, researchers will learn more about harnessing the microbiome to fight food allergies. It won’t be easy. Genetics, diet, environmental exposures: All influence allergy risk. “It’s a big puzzle,” says Bunyavanich. The microbiome is only one piece of it — but she, Nagler and others are betting it will turn out to be a big one.

A former lab rat, Esther Landhuis is a California-based freelance journalist who writes about biomedicine and STEM diversity. Her stories have also appeared in Science News, Scientific American, NPR, Nature, Chemical & Engineering News and Undark. Follow her on Twitter @elandhuis

This article was originally published at Knowable Magazine and has been republished here with permission. Follow Knowable on Twitter @KnowableMag


Podcast: Where do babies come from? How developmental genetics revealed the secrets of life’s earliest stages

In the latest episode of the Genetics Society’s ‘Genetics Unzipped’ podcast, geneticist Dr Kat Arney goes back to the very beginning, telling the stories of the midwives of the field of developmental genetics: two talented researchers whose work helped to reveal the secrets of life in its very earliest stages.

 

The field of genetics began to emerge with the rediscovery of Mendel’s laws of inheritance around the turn of the 20th century, with the founding of The Genetics Society by William Bateson and Edith Rebecca Saunders following in 1919. But around the same time, another new field of biology was emerging: embryology. From the 1880s, scientists began asking how organisms developed – life unfolding from a single cell to many, with cells dividing, dying and specializing from one stage to the next. Or, to put it less scientifically, how are babies made?

In the early 1900s, embryology was considered a completely separate field from heredity or genetics. But over the next century, scientists would reveal the interplay between the two, and the exquisite links between genetics and development. The new field of developmental genetics was born, and its midwives included several remarkable women, two of whom we’re going to take a closer look at: Hilde Mangold and Salome Gluecksohn-Waelsch.

The tale of developmental genetics is a thrilling one, with everything you need for a good story. There’s politics, drama, upheaval, prejudice, and even a suspicious death. So hold on tight, this is a good one.

Full show notes, transcript, music credits and references online at GeneticsUnzipped.com.

Genetics Unzipped is the podcast from the UK Genetics Society, presented by award-winning science communicator and biologist Kat Arney and produced by First Create the Media.  Follow Kat on Twitter @Kat_Arney, Genetics Unzipped @geneticsunzip, and the Genetics Society at @GenSocUK

Subscribe from Apple podcasts, Spotify, or wherever you get your podcasts.


Podcast: Neuralink brain chips; Flu vaccines during COVID; US farm system unraveling?

Elon Musk’s company Neuralink recently debuted its brain implant in pigs, pushing us a little closer to integrating humans and computers. As the COVID-19 pandemic continues, getting a flu shot this fall could be the difference between life and death, say some experts. Critics fear America’s farming system is about to unravel in the face of climate change-fueled water shortages and unsustainable growing practices that jeopardize soil health. How serious is this threat to our food supply?

Join geneticist Kevin Folta and GLP editor Cameron English on this episode of Science Facts and Fallacies as they break down these latest news stories:

Brain implants that read and write neuron signals are one step closer to widespread use after Neuralink demonstrated how the devices work in pigs. The initial goal is to use these tiny computer chips to bypass spinal cord injuries and restore movement for people who are paralyzed, a reasonable goal since similar devices have already been employed.

More distant applications may include restoring sight to people with eye injuries, minimizing pain and even recording memories. As the technology continues to develop, though, some scientists warn that Neuralink has to carefully consider the risk of serious brain injury, particularly brain bleeding.


It’s important to get a flu shot every year, but it could be essential in 2020, say some infectious disease experts. With the deadly SARS-COV-2 virus already circulating, a bad flu season may lead to many more illnesses and deaths, the results of an overburdened health care system unable to treat people suffering from one or both infections. Conversely, social distancing measures and masking used to stem COVID-19 transmission have apparently mitigated this year’s flu season in the southern hemisphere, and could have the same effect in the US. Whatever the case, the flu vaccine is a wise insurance policy in the face of uncertain risk.

America’s agricultural system produces an abundance of affordable food, but unsustainable practices employed on many conventional farms have locked that system “in a state of slow-motion ecological unraveling,” writes Guardian contributor Tom Philpott. As climate change accelerates, the problem can only get worse—unless we right the ship and rethink how we produce food.

But the question remains: how do we do that? Do we solve this problem with “a direct political challenge to big agribusiness” firms that profit from conventional farming, as Philpott maintains, or a greater reliance on technology that reduces land use while increasing crop yields?

Subscribe to the Science Facts and Fallacies Podcast on iTunes and Spotify.

Kevin M. Folta is a professor in the Horticultural Sciences Department at the University of Florida. Follow Professor Folta on Twitter @kevinfolta

Cameron J. English is the GLP’s managing editor. Follow him on Twitter @camjenglish


Resurrection of phrenology? AI’s quest to link facial features and criminality has a shady Victorian legacy

‘Phrenology’ has an old-fashioned ring to it. It sounds like it belongs in a history book, filed somewhere between bloodletting and velocipedes. We’d like to think that judging people’s worth based on the size and shape of their skull is a practice that’s well behind us. However, phrenology is once again rearing its lumpy head.

In recent years, machine-learning algorithms have promised governments and private companies the power to glean all sorts of information from people’s appearance. Several startups now claim to be able to use artificial intelligence (AI) to help employers detect the personality traits of job candidates based on their facial expressions. In China, the government has pioneered the use of surveillance cameras that identify and track ethnic minorities. Meanwhile, reports have emerged of schools installing camera systems that automatically sanction children for not paying attention, based on facial movements and microexpressions such as eyebrow twitches.

Perhaps most notoriously, a few years ago, AI researchers Xiaolin Wu and Xi Zhang claimed to have trained an algorithm to identify criminals based on the shape of their faces, with an accuracy of 89.5 per cent. They didn’t go so far as to endorse some of the ideas about physiognomy and character that circulated in the 19th century, notably from the work of the Italian criminologist Cesare Lombroso: that criminals are underevolved, subhuman beasts, recognisable from their sloping foreheads and hawk-like noses.


However, the recent study’s seemingly high-tech attempt to pick out facial features associated with criminality borrows directly from the ‘photographic composite method’ developed by the Victorian jack-of-all-trades Francis Galton – which involved overlaying the faces of multiple people in a certain category to find the features indicative of qualities like health, disease, beauty and criminality.

Technology commentators have panned these facial-recognition technologies as ‘literal phrenology’; they’ve also linked them to eugenics, the pseudoscience of improving the human race by encouraging people deemed the fittest to reproduce. (Galton himself coined the term ‘eugenics’, describing it in 1883 as ‘all influences that tend in however remote a degree to give to the more suitable races or strains of blood a better chance of prevailing speedily over the less suitable than they otherwise would have had’.)

In some cases, the explicit goal of these technologies is to deny opportunities to those deemed unfit; in others, it might not be the goal, but it’s a predictable result. Yet when we dismiss algorithms by labelling them as phrenology, what exactly is the problem we’re trying to point out? Are we saying that these methods are scientifically flawed and that they don’t really work – or are we saying that it’s morally wrong to use them regardless?

There is a long and tangled history to the way ‘phrenology’ has been used as a withering insult. Philosophical and scientific criticisms of the endeavour have always been intertwined, though their entanglement has changed over time. In the 19th century, phrenology’s detractors objected to the fact that phrenology attempted to pinpoint the location of different mental functions in different parts of the brain – a move that was seen as heretical, since it called into question Christian ideas about the unity of the soul.


Interestingly, though, trying to discover a person’s character and intellect based on the size and shape of their head wasn’t perceived as a serious moral issue. Today, by contrast, the idea of localising mental functions is fairly uncontroversial. Scientists might no longer think that destructiveness is seated above the right ear, but the notion that cognitive functions can be localised in particular brain circuits is a standard assumption in mainstream neuroscience.

Phrenology had its share of empirical criticism in the 19th century, too. Debates raged about which functions resided where, and whether skull measurements were a reliable way of determining what’s going on in the brain. The most influential empirical criticism of old phrenology, though, came from the French physician Jean-Pierre Flourens’s studies based on damaging the brains of rabbits and pigeons – from which he concluded that mental functions are distributed, rather than localised. (These results were later discredited.) The fact that phrenology was rejected for reasons that most contemporary observers would no longer accept makes it only more difficult to figure out what we’re targeting when we use ‘phrenology’ as a slur today.

Both ‘old’ and ‘new’ phrenology have been critiqued for their sloppy methods. In the recent AI study of criminality, the data were taken from two very different sources: mugshots of convicts, versus pictures from work websites for nonconvicts. That fact alone could account for the algorithm’s ability to detect a difference between the groups. In a new preface to the paper, the researchers also admitted that taking court convictions as synonymous with criminality was a ‘serious oversight’. Yet equating convictions with criminality seems to register with the authors mainly as an empirical flaw: using mugshots of convicted criminals, but not of the ones who got away, introduces a statistical bias. They said they were ‘deeply baffled’ at the public outrage in reaction to a paper that was intended ‘for pure academic discussions’.

Credit: Wu and Zhang (2016)

Notably, the researchers don’t comment on the fact that conviction itself depends on the impressions that police, judges and juries form of the suspect – making a person’s ‘criminal’ appearance a confounding variable. They also fail to mention how the intense policing of particular communities, and inequality of access to legal representation, skews the dataset. In their response to criticism, the authors don’t back down on the assumption that ‘being a criminal requires a host of abnormal (outlier) personal traits’. Indeed, their framing suggests that criminality is an innate characteristic, rather than a response to social conditions such as poverty or abuse. Part of what makes their dataset questionable on empirical grounds is that who gets labelled ‘criminal’ is hardly value-neutral.

One of the strongest moral objections to using facial recognition to detect criminality is that it stigmatises people who are already overpoliced. The authors say that their tool should not be used in law enforcement, but cite only statistical arguments about why it ought not to be deployed. They note that the false-positive rate (50 per cent) would be very high, but take no notice of what that means in human terms. Those false positives would be individuals whose faces resemble people who have been convicted in the past. Given the racial and other biases that exist in the criminal justice system, such algorithms would end up overestimating criminality among marginalised communities.
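To put rough numbers on that, here is a minimal back-of-the-envelope sketch. The screening population and base rate below are hypothetical assumptions; only the 89.5 per cent accuracy figure (read loosely as a true-positive rate) and the 50 per cent false-positive rate come from the paper.

```python
# Back-of-the-envelope illustration of what a 50% false-positive rate means.
# Population size and base rate are hypothetical assumptions; the accuracy
# and false-positive figures are the ones reported for the algorithm.

population = 100_000         # hypothetical number of people screened
base_rate = 0.01             # assumed share with prior convictions
true_positive_rate = 0.895   # the paper's claimed accuracy, read as sensitivity
false_positive_rate = 0.50   # the false-positive rate the authors concede

convicted = population * base_rate
unconvicted = population - convicted

correctly_flagged = convicted * true_positive_rate
falsely_flagged = unconvicted * false_positive_rate

print(f"Correctly flagged: {correctly_flagged:,.0f}")   # ~900
print(f"Falsely flagged:   {falsely_flagged:,.0f}")     # ~49,500
# Under these assumptions, innocent people flagged outnumber 'hits' roughly
# 55 to 1, and the false flags land on whoever resembles past convicts.
```

However the assumed numbers are varied, the imbalance persists whenever the underlying base rate is low; that is the human meaning of a 50 per cent false-positive rate.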

The most contentious question seems to be whether reinventing physiognomy is fair game for the purposes of ‘pure academic discussion’. One could object on empirical grounds: eugenicists of the past such as Galton and Lombroso ultimately failed to find facial features that predisposed a person to criminality. That’s because there are no such connections to be found. Likewise, psychologists studying the heritability of intelligence, such as Cyril Burt and Philippe Rushton, had to play fast and loose with their data to manufacture correlations between skull size, race and IQ. If there were anything to discover, presumably the many people who have tried over the years wouldn’t have come up dry.


The problem with reinventing physiognomy is not merely that it has been tried without success before. Researchers who persist in looking for cold fusion after the scientific consensus has moved on also face criticism for chasing unicorns – but disapproval of cold fusion falls far short of opprobrium. At worst, they are seen as wasting their time. The difference is that the potential harms of cold fusion research are much more limited. In contrast, some commentators argue that facial recognition should be regulated as tightly as plutonium, because it has so few nonharmful uses. When the dead-end project you want to resurrect was invented for the purpose of propping up colonial and class structures – and when the only thing it’s capable of measuring is the racism inherent in those structures – it’s hard to justify trying it one more time, just for curiosity’s sake.

However, calling facial-recognition research ‘phrenology’ without explaining what is at stake probably isn’t the most effective strategy for communicating the force of the complaint. For scientists to take their moral responsibilities seriously, they need to be aware of the harms that might result from their research. Spelling out more clearly what’s wrong with the work labelled ‘phrenology’ will hopefully have more of an impact than simply throwing the name around as an insult.

Catherine Stinson is a postdoctoral fellow in philosophy and ethics of artificial intelligence at the Center for Science and Thought at the University of Bonn in Germany, and at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge. Follow her on Twitter @nerd_sighted

 This article was originally published at Aeon and has been republished here with permission. Follow Aeon on Twitter @aeonmag


Podcast: Ag Secretary Sonny Perdue interviews GLP’s Jon Entine on feeding the world sustainably through biotech innovation and challenging the ‘myth of organics’

Secretary of Agriculture Sonny Perdue wants to welcome you to his very own podcast – “The Sonnyside of the Farm.” Born and raised on a family farm in middle Georgia, Secretary of Agriculture Perdue is an agriculturalist through and through – having worked as a veterinarian, owning his own grain business, serving as Governor of Georgia and now serving as a member of President Trump’s cabinet as U.S. Secretary of Agriculture.


On this month’s episode of The Sonnyside of the Farm, Agriculture Secretary Sonny Perdue sits down with Jon Entine, Founder and Executive Director of the Genetic Literacy Project, to talk about innovation in agriculture.

If we as a society are going to continue doing our part to feed the growing world population, innovation in agriculture must be able to adapt to a world where resources are waning and climate is increasingly unpredictable. We need to produce more with fewer ‘inputs’. That means agriculture should be based on ‘best practices’; we can’t depend on farming grounded in part in ideology—movements like organics and agroecology—especially across the African continent and developing countries where there is a critical need to dramatically increase food output while limiting our ecological footprint.

Advanced genetics and biotechnology, including new advances in gene editing, are essential to produce food more sustainably. Climate change and carbon pollution are growing threats to producing food around the world. If organic farming—which produces 40% less than conventional farms on the same amount of land—were to emerge as a dominant model, clear-cutting of forests would ensue and carbon would be released at alarming rates.

We need to refocus agriculture on our environmental future. While America’s farmers and ranchers are up to the challenge, there is an emerging narrative in affluent countries that’s anti-science and hostile towards agricultural innovation, prompting many consumers to fear their food. Chemicals are caricatured wholesale as dangerous. But crop protection is a key part of agricultural production, in organic and conventional farming.

The goal should be reducing the impact of chemical protectants, not demonizing them or substituting ‘natural’ chemicals in cases where they are far less effective or can cause more harm than the synthetic chemicals they might replace. Technology and advanced crop protection methods that carefully weigh benefits with environmental costs are essential. This episode will help educate and inform consumers that science and innovation in agriculture are safe and vital.

Jon Entine is founder and executive director of the Genetic Literacy Project. Follow Jon on Twitter @JonEntine

Follow Sonny Perdue on Twitter @SonnyPerdue

This episode of The Sonnyside of the Farm appeared on the USDA website here: The Sonnyside of the Farm. It has been republished with permission. Subscribe to the podcast on Apple Podcasts.


Podcast: Rare genetic disorders and pregnancy—Navigating an ’emotionally challenging’ journey

Geneticist Dr Kat Arney takes a look at the progress that’s been made in tackling rare genetic disorders, and the challenges that remain. We hear from a prenatal genetic counselor about how new tests are helping people carrying genetic variations make decisions about starting a family, in the latest episode of the Genetics Society’s ‘Genetics Unzipped’ podcast.

When it comes to rare diseases, the clue is in the name—it’s a term usually used to refer to conditions that affect fewer than one in two thousand people. In many cases these disorders are caused by changes in single genes, but in other cases it’s a bit more complicated. But although each one may be rare in itself, it all adds up. There are somewhere between 6,000 and 8,000 rare diseases known, with a couple of hundred more being described every year as our ability to delve into the genome grows. In fact, a rough estimate suggests that around one in 15 people worldwide is affected by one of these rare conditions. So, maybe not so rare after all. But what do we do about them?

Dr Ron Jortner, or Roni to his friends, is raising awareness of rare diseases—and what needs to happen to understand and treat them better. Jortner is the founder and CEO of Cambridge-based Masthead Biosciences and a trustee of the Cambridge Rare Disease Network, a charity dedicated to making a difference in the lives of those affected by rare genetic conditions:

One issue with rare diseases is delayed diagnosis. The average time to diagnosis is 4.8 years, but I’ve seen delays of 20, even 26 years, and this is unacceptable. This is improving as awareness is increasing, and because tools are improving, but it’s still unacceptable that diagnosis takes so long. Awareness of these diseases is much better than it used to be, but it’s still not high enough.

When it comes to treatment, gene therapy is still very new. It’s really been around only for a few years and still is facing a lot of hurdles, but I think in the end, this is indeed the dream. This is indeed the method to approach these diseases at the heart of the problem.

At the same time, as we’re understanding more about the variations and faults in DNA that cause rare genetic conditions in order to treat them, there’s an increasingly sophisticated range of tests designed to help people who have these variations make decisions about whether and how they want to have children—or what to do about a pregnancy when a genetic abnormality has been detected in the fetus.


This is an emotionally challenging journey to navigate, but there are people like New York-based prenatal genetic counselor and DNA Today podcast host Kira Dineen to help guide the way:

There are so many tests now, and we can learn so much about the genetics of a child. We’re coming from a place where we used to offer invasive procedures, and that is something that we still continue to do, but [fewer] people are going through these procedures because of the new technology we have.

But with so much information it’s really advantageous to meet with a genetic counselor, to go over all this and see how it applies to a person specifically. In genetic counseling, we’re able to sit and really talk through … how this affects the individuals in front of us.

Full transcript, links and references available online at GeneticsUnzipped.com

Genetics Unzipped is the podcast from the UK Genetics Society, presented by award-winning science communicator and biologist Kat Arney and produced by First Create the Media.  Follow Kat on Twitter @Kat_Arney, Genetics Unzipped @geneticsunzip, and the Genetics Society at @GenSocUK

Subscribe from Apple podcasts, Spotify, or wherever you get your podcasts.



Viewpoint: ‘Superfood’—a lucrative marketing term with no scientific basis

Walking through the grocery aisle, there is an overwhelming number of new superfoods to choose from. Hemp hearts are full of alpha-linolenic acid, an anti-inflammatory fatty acid that can reduce the risk of heart disease and lower cholesterol. You can run for miles fueled only on chia seeds, which are also rich in antioxidants, fiber, iron, and calcium. Acai and goji berries are high in amino acids, antioxidants, and vitamins C, A, B1, B2, and E, all of which neutralize free radicals, boost your energy and support overall immunity. So I dutifully add all of these to my morning oatmeal and I feel energetic and ready to tackle the day!

But, with all this effort, I still don’t really know what a superfood is…

“Superfood” is not defined

The term “superfood” is not regulated by the FDA. While these foods are thought to be exceptionally dense in nutrition, they do not actually have their own food group. They are called ‘super’ because they contain superior nutritional benefits for the number of calories they contain. Basically, more bang for your buck. But there is more to the story.

The American Heart Association defines superfoods as “nutritious foods that, when added to an already balanced diet, can bring health benefits.” They reference beans and legumes, berries, dark leafy greens, nuts and seeds, oats, pumpkin, salmon, skinless poultry, and yogurt. Sounds a lot like the makings of the Mediterranean or MIND diet to me.


One thing the AHA states right off the bat, even before addressing specific foods, is that superfoods alone will not make you healthier.

Superfoods alone will not make you healthier? I thought that was the point of superfoods – they could do it all!

Unfortunately, no. So don’t throw out your groceries and stock the fridge only with hemp hearts, beans, and berries.

While they won’t turn you into a superhero, so-called superfoods are packed with nutrients with protective, disease-fighting properties. What has become evident is that the foods labeled as superfoods are the ones that have ‘more’ nutrition. For instance, 2 tablespoons of hemp hearts have a bit more protein than an egg. Blueberries and blackberries have more antioxidants than pineapples and may help ward off cancer. Salmon has more omega-3 healthy fats and can help prevent heart disease. And, yes, dark leafy greens are healthier than iceberg lettuce. But that’s not all that’s happening here.

“Superfood” as a lucrative marketing term

The term “superfood” is an attractive word, no doubt an eye-catching phrase in your Google search.

Ultimately, these super-terminologies really just mean super-sales. Marketing companies have taken note and capitalized on the viral effects of such catchphrases. According to a Nielsen survey, consumers are willing to pay more for foods perceived as healthy, and health claims on labels seem to help. Unsurprisingly, foods that already enjoy a “healthy” perception and carry beneficial claims on their labels have shown the greatest sales.

The incentive to market superfoods as such has not been missed by the food industry. They know the term has no concrete meaning, but they know it will boost sales. According to Mintel‘s research, there has been a 36% increase in the number of foods and beverages marketed with the “superfood”, “super-grain” or “superfruit” label since 2015. The U.S. was the leader in these product launches.


Beneath the comforting concept lies a disappointing reality of industry bias

Dr. Marion Nestle, nutrition and public health professor emerita at New York University, details the gimmick in her new book, “Unsavory Truth”. She uncovers the role of marketing and how highlighting special health benefits makes the products more appealing to customers.

“When marketing imperatives are at work, sellers want research to claim that their products are ‘superfoods,’ a nutritionally meaningless term,” she wrote.

“One of the things I noticed was that there were [studies on] all these foods that are demonstrably healthy. Why would you need to do research to prove that blueberries or raspberries or pomegranates or grapes are healthy? Of course they’re healthy. So the only reason they are doing it is because they’re trying to increase market share.”

– Dr. Marion Nestle

She calls out the fact that the Dietary Guidelines for Americans, issued by the U.S. Department of Health and Human Services and the USDA, do not recommend focusing on a singular food or food group for better health, but instead call for a variety of “healthy eating patterns” built from various fruits, vegetables, grains and more. This is the inverse of how singular “superfoods” are marketed.

What we are suggesting is that the term works better as a sales driver than as an identifier of health. We would simply warn that it can blind consumers to equally nutritious options that are not as hyped up, depriving us of other nutritious choices.

How do we determine truth from hype?

The answer is in the whole picture! We are all fairly well acquainted with blueberries as a popular superfood. They are high in antioxidants, specifically anthocyanins, that have been reported to inhibit the growth of cancerous human colon cells, and they aid in protecting the body from free radicals.


But the human body is complex. To truly examine the effect a food has on our body, we must consider not only our diet but our genetics, our lifestyle, our activity level—things that vary greatly from person to person. What might have super-effects on you might have inverse effects on me. Not from the food alone necessarily, but from the combination of our genes and other lifestyle factors such as sleep, stress, and love.

What’s a person to do about this super-vague label?

Each day, eat 5 to 7 servings of vegetables and 3 to 5 servings of fruit – whether they are ‘super’ or not. We need to ensure we have a balanced diet. And that means increasing our range of nutritious foods in our diets, rather than focusing on a handful of foods that claim to be ‘better’.

Carrots, apples, and onions, for example, have not yet been touted as “superfoods,” but they contain beta-carotene, flavonoids, and fiber that we need. Whole grains found in cereals, bread, rice, and pasta are also high in fiber and fortified with vitamins and minerals, making it easy for many people to achieve the recommended daily intake.

The bottom line

When it comes to searching for foods with beneficial health effects, strive for a well-balanced diet – not a handful of claimed “superfoods”. These foods just don’t exist. Beneath these flashy health claims, the nutritional advice remains the same: eat a variety of fruits, vegetables, and legumes every single day.

Hayley N. Philip is a writer and researcher for Dirt to Dinner with a focus in health and nutrition.

This article was originally published at Dirt To Dinner and has been republished here with permission. Follow Dirt To Dinner on Twitter @Dirt_To_Dinner


Stigmatization faced by people who underwent intersex surgeries to correct ambiguous genitalia

Eugene Robinson recovered from his double mastectomy on a hospital porch in Durham, North Carolina. It was August 1956, and as a Black child in the Jim Crow South, Robinson wasn’t allowed to heal next to White patients.

Sarah Robinson, Eugene’s mother, brought a daughter to the hospital. She returned home with a son. It was his third of four surgeries. Two of his nine siblings had undergone similar operations, but his relatives never talked about the fact that androgen insensitivity syndrome, a genetic intersex condition, ran in the family.

Nearly 65 years later, Sean Saifa Wall, 41, sifts through Robinson’s medical records, looking for answers about his uncle’s story that might shed light on his own. Wall, like Robinson, is intersex.

Intersex is an umbrella term for people with variations in sex characteristics that don’t fit neatly in the binary of male or female. Some intersex people are born with varying reproductive anatomy or sex traits — some develop them later in life. About 1.7 percent of people are born intersex, according to a 2000 report by Dr. Anne Fausto-Sterling.

Since the 1960s, medical convention has been that intersex variations should be “corrected,” often through a combination of painful surgeries and hormone therapy starting from infancy or before a child can consent. But on July 28, the Ann and Robert H. Lurie Children’s Hospital of Chicago became the first hospital in the United States to suspend the operations. The news comes after a three-year campaign against the hospital led by Wall and Pidgeon Pagonis, co-founders of the Intersex Justice Project.

Activists have been protesting intersex surgeries since 1996, when a group demonstrated outside the American Academy of Pediatrics’ convention in Boston. Since then, the UN has condemned the surgeries — which remain legal in almost every country in the world — as “irreversible” and unnecessary procedures that can cause “permanent infertility and lifelong pain, incontinence, loss of sexual sensation, and mental suffering.”

Wall knows that pain intimately.

Wall came out as gay at age 14. Then, he came out as transgender. In both cases, his mom “lost it,” he said. “She was like, ‘why do you want to wear men’s clothes, men’s underwear?’”

For almost two decades, Sean Saifa Wall has worked as an intersex rights activist, fighting to end medically unnecessary surgeries. Credit: Sean Saifa M. Wall

But Wall’s oldest aunt reminded his mom about his intersex uncle, now deceased. His aunt said, “Do you not remember playing with Queen Esther as a child?”

“And my mom was like, ‘Who’s that?’ And she’s like, ‘That’s Gene.’”

Wall says the memory “blew my mom’s mind” — for seven years she had a sister. Looking back, she did remember Esther.

Eight of his family members were intersex, Wall says. The more that Wall started to talk about himself, the more his family opened up about their own histories.

Up until the time he was 13, Wall’s mom resisted doctors’ insistence that he have surgery to remove undescended testes, he says. She saw his older intersex siblings suffer through their own operations and thought they were unnecessary.

“They told my mom that the testes were cancerous,” Wall said. So his mom agreed to the surgery. Wall never had cancer.

He had spent two years under the care of a doctor who he says studied him, asking him questions about whether or not hormones made him less gay. Still, it wasn’t until college, while doing a Yahoo internet search, that Wall pieced together that he is intersex.

“I was so angry,” he said. “I was like, ‘Oh, this is not fair. It’s not right.’ I didn’t talk about it for a while. I would tell people here and there, but I didn’t talk about it publicly because I had so much shame.”

When he was 25, he started taking testosterone, something he wanted to do as a trans person to confirm his gender. But he wasn’t metabolizing the testosterone the way most people on the hormone do.

“I think I felt really suicidal,” he said, referring to people constantly misgendering him. “But I knew that if I took my own life, that no one would ever know what happened to me, and no one would ever know my side of the story.”

That’s when Wall decided to start organizing for intersex rights.

For 19 years, Lurie patient Pidgeon Pagonis also believed they had survived ovarian cancer. The surgeries and exams started before Pagonis could remember, at 6 months old. They had another operation when they were 3 or 4 years old, and another when they were 10.

Intersex advocate Pidgeon Pagonis. Credit: M. Spencer Green/AP

“Since I was like 11 they would always just lift my shirt off, touch my chest and then pull my pants down and look at my vulva area,” Pagonis recalls. “And then they’d ask me questions like, ‘How are you? How are your grades?’”

Pagonis thought that because of the cancer, they would never be able to have a baby. In truth, Pagonis never had cancer. Years of intersex surgeries to make their body conform to the idea of the female sex had left them unable to feel most sexual sensation.

They spent 18 years in and out of Lurie for surgeries, hormones and exams. Doctors would ask Pagonis if they had questions. Pagonis wanted to know why they were experiencing puberty differently than other kids.

“I didn’t know I had a vaginoplasty, and I didn’t know I was intersex,” Pagonis said. “I did not know I had a castration, and I did not know I had a clitorectomy at that point. I thought I survived cancer.”

Pagonis attended college practically in the shadow of the hospital at DePaul University, watching doctors come and go as they studied for finals. It wasn’t until they learned about intersex issues at DePaul that they realized that all those visits to Lurie hadn’t been about cancer at all.

“I just thought these were my doctors that I had to go to because I had cancer when I was a kid,” Pagonis said. “And also, I was so unlucky that I had this ‘urethra problem.’”

No other major U.S. hospital has ever stated that it doesn’t perform intersex surgeries, so Lurie was far from the only institution performing such procedures. However, Lurie has enjoyed a sterling reputation among LGBTQ+ people since 2013, when it opened one of the first pediatric gender clinics in the nation under the leadership of Dr. Robert Garofalo, a nationally renowned expert in transgender health. Under Garofalo’s leadership in the Gender & Sex Development Program, Lurie became the first hospital in the United States to adopt a trans-inclusive policy for its young patients.

That prestige made Lurie a prime target for a campaign to end intersex surgeries. Intersex activists have long pointed to a disconnect between the gender-affirming care for trans and non-binary youth at the hospital and surgeries done on intersex children without their knowledge or consent.

“The truth of the matter is they are very distinct and separate populations in many ways,” said Garofalo. “But there are areas where there are some overlaps.”

And those overlaps cast a pall on the gender clinic as calls to end the surgeries overwhelmed its social media channels.

The Intersex Justice Project — Pagonis and Wall’s organization of intersex activists of color — led its first protests against Lurie in 2017 and again in 2018, when the Androgen Insensitivity Syndrome-Differences of Sex Development Support Group held its conference in Chicago. About 70 people showed up to protest outside Lurie. Since that time, Lurie has been the target of a relentless campaign to end the surgeries, and protests outside the hospital have only grown.

In July, “Pose” star Indya Moore excoriated the hospital for using their image to promote LGBTQ+ inclusion. “You cannot stand W/ trans ppl & step ON intersex ppl!” Moore wrote on Twitter. The tweet set off a firestorm of bad press for the hospital as an old petition against the surgeries at Lurie racked up 45,000 signatures.


Garofalo said the hospital has long been revising its policies on intersex care, but it had never apologized for the harm those surgeries had caused.

“I mean, the truth of the matter is that it has been uncomfortable for me at times,” conceded Garofalo, who does not oversee intersex care at the hospital.

On July 28, the same day the hospital announced it was suspending the surgeries, the hospital apologized.

“We empathize with intersex individuals who were harmed by the treatment that they received according to the historic standard of care and we apologize and are truly sorry,” the hospital stated in a letter signed by President and CEO Dr. Thomas Shanley. “When it comes to surgery, we are committed to reexamining our approach.”

A number of staffers within Lurie pushed for an end to the surgeries, most notably transgender research coordinator Dr. Ellie Kim, who publicly criticized the practice.

“I really owe Ellie a debt of gratitude for really stepping forward and not being shy about her thoughts on the matter,” Garofalo said. “And to that extent, I’m really proud to be where I’m at.”

Lurie’s end to intersex surgeries marks a watershed moment for intersex rights. Lurie is ranked among the top pediatric hospitals in the nation, and intersex rights activists hope that other hospitals follow suit.

But for advocates like Wall, the campaign has also taken a deep toll. Pagonis and Wall garnered support and educated the public by sharing intimate personal stories. It’s largely considered disrespectful for reporters to ask transgender people about their surgeries or genitalia. Intersex activists don’t have that luxury yet, says Hans Lindahl, director of communications for youth intersex organization InterAct.

“Something that we say a lot is that we have not yet had our Laverne Cox moment,” said Lindahl. “We’re still so under the purview of being medicalized that I think there’s a pressure that we almost have to tell these stories at this point in our movement in order to get people to listen.”

For Pagonis and Wall, that has meant revealing details about their own traumas, sexual experiences, anatomy and family histories.

And largely lost in this moment is the history of intersex surgery itself. Intersex operations were born out of gynecology, a practice developed by James Marion Sims, who performed brutal experiments on enslaved Black women without anesthesia. Although intersex surgeries were popularized in the 1960s, doctors had been doing them for years before, as Wall’s family history shows.

Wall says his family was already harassed as a Black family in the segregated South. But a Black family with three kids whose sex characteristics varied meant they were tormented endlessly.

“So for me, my intersex story comes out of this legacy that’s rooted in the South, that’s rooted in North Carolina,” Wall said. “By the time this intersex variation appeared in my family, there was knowledge and awareness of it, but people didn’t talk about it, because there was shame and stigma and secrecy.”

Kate Sosin focuses on transgender rights, incarceration, politics and public policy. Kate has conducted deep-dive investigations into transgender prison abuse and homicides for NBC News. They previously worked at Logo TV, INTO and Windy City Times. Find Kate on Twitter @shoeleatherkate

A version of this article was originally published at The 19th and has been republished here with permission. The 19th can be found on Twitter @19thNews


Chile poised to tackle food shortages and climate change with ‘Golden Apple’ and other CRISPR-edited crops

Chile’s intense political unrest, exacerbated by months of COVID-19 quarantine, has temporarily overshadowed a relentless environmental and farming crisis: an intense drought—the worst in the country’s history—now moving into its tenth year. The last few months have offered a temporary respite, with rains reaching average levels. But Chile is in desperate need of longer-term responses to worsening climatic conditions that threaten to intensify existing food shortages and jeopardize the nation’s vital agriculture industry.

Biochemist and president of the Chilean Society of Plant Biology Dr. Claudia Stange believes she is part of the solution. Climate change is here to stay, she believes, so it’s time to mobilize genetic technology and adapt. Stange and her colleagues at the University of Chile are gene editing new varieties of apple, kiwi and tomato to improve their nutritional content and resistance to drought and saline soils.

Golden Apple’s development hits a GMO wall

Claudia Stange

This is not the first major effort to harness CRISPR and transgenics (GMOs) to improve the environmental hardiness or nutritional content of crops. Golden Rice, recently approved for rollout in the Philippines after more than two decades of stops and starts, is a humanitarian project initiated by university scientists to generate a GMO rice high in beta carotene, a precursor to vitamin A. Vitamin A is largely absent from the diets of millions of people in southeast Asia. This deficiency is to blame for 250,000 to 500,000 cases of childhood blindness every year, with half of them ending in death, according to the World Health Organization (WHO).

Dr. Stange’s first major project, launched in 2011 and financed with public funds, had a similar goal to that of Golden Rice: to develop an apple genetically modified to synthesize carotenoids. Due to technical constraints and an inability to get similar results using conventional plant breeding methods, genetic engineering was recognized as the best tool for producing what came to be known as the ‘Golden Apple’.

If the project is successful, it could be a major economic and health coup for Chile. It is the world’s fourth-largest exporter of apples, so improving the nutritional profile of exported varieties would boost its apple industry and benefit consumers worldwide, Stange told me:

Today consumers are looking for foods that are functional, that is, with a higher content of antioxidants, vitamins, etc. Those characteristics would be fulfilled by our apples with the highest content of carotenoids (which are provitamin A molecules) and antioxidants that counteract various diseases and aging.

The Golden Apple project successfully developed transgenic lines of biofortified apple seedlings years ago, but commercializing it was another matter. Although the development and cultivation of GMO crops like corn, soybean and canola is routine in South American countries including Brazil and Argentina, progress in Chile has been slowed by regulatory obstacles and political opposition to recombinant DNA technology.

Although the country imports large quantities of grain harvested from GMO plants in other countries, Chile’s biotech regulations would have prohibited the commercialization of home-grown Golden Apples. Chile currently exports locally cultivated GMO corn, soybean and canola seeds, mostly to the United States, Canada and South Africa. Facing regulatory obstacles, financing for the Golden Apple project dried up by 2014, bringing the research to an unceremonious end.

Apple trees transformed with beta-carotene-producing genes. Credit: Dr. Stange

But the Golden Apple was recently given a faint breath of life. Stange was blocked from bringing her biofortified apple to market because it was transgenic—it involved the transfer of genes from one species to another. But with advances in gene editing, specifically CRISPR, a technique that has fueled development of a new generation of improved crops, an apple with similar traits could be developed without the use of ‘foreign’ genes. Chile is now growing gene-edited cereal, vegetable and fruit crops in field trials, although there is as yet no path to commercialization.

There are crucial differences between gene editing and older genetic modification technology, Stange explained:

In GMOs, one or more genes from another plant or organism are inserted into a plant of interest so that gene, when expressed, gives it beneficial traits the original plant didn’t have—for example, the production of provitamin-A, resistance to drought or pathogens. 

In gene editing, molecular biology strategies are also used, but in this case it’s to prevent a specific gene from being expressed in the plant of interest. By specifically editing or mutating that gene, the plant presents positive traits that it didn’t previously have….They [genetic modification and gene editing] are two strategies that seek the same end. Only in the latter there is no exogenous DNA material. For this reason, it’s more easily accepted in countries where GMOs aren’t.

After Argentina became the first country to green-light gene editing research for agricultural purposes in 2015, Chile followed in 2018 with a similar rule that allows the techniques to be used as long as no transgenes are added to the target plant. Brazil, the United States, Australia, Canada, Colombia, Israel, Japan and other countries subsequently enacted their own gene editing regulations.

CRISPR and Golden Apple 2.0

Building on their earlier research, Stange and her team expanded work on Golden Apples in 2018, but this time with CRISPR. These next-generation apples will not only provide high levels of vitamin A and more antioxidants, they will resist browning, which reduces food waste—the same effect achieved by the Arctic apple, developed using a different genetic engineering technique in Canada.

To date we are selecting apple seedlings that have the desired traits: that is, seedlings in which the genes of interest have been edited, which produce less browning and a higher carotenoid content, and which are not GMOs. At the end of the year we will be able to have the first seedlings to be transferred to Los Olmos nursery, where they will continue the evaluation in the greenhouse and field.

“In the meantime, our team will continue to generate and select more lines so as to have a large number of plants that allows us to choose the best ones when they produce fruit,” Stange adds about the project financed by CORFO and carried out in association with the Biofrutales Consortium and Vivero Los Olmos.

These apples won’t reach our tables for a while, however. Stange estimated that it will take five years to select the best genotypes of edited apple trees, before taking them to field production.

PASSA Project and climate change

In March 2020, Stange’s laboratory launched another effort designed to address the impact of climate change on regional agricultural production: Proyecto Anillo (Ring Project) Plant Abiotic Stress for a Sustainable Agriculture (PASSA), financed by ANID. The project was developed with the help of Drs. Michael Handford and Lorena Norambuena at the University of Chile’s Center for Molecular Biology, in association with Dr. Juan Pablo Martínez from the Institute of Agricultural Research (INIA) and Dr. Ricardo Tejos from Arturo Prat University.


PASSA aims to develop drought- and salt-tolerant tomato and kiwi rootstocks with CRISPR, directly addressing the water emergency situation that is gradually worsening in Chile. Tomatoes are the most consumed vegetable globally and in Chile. The South American nation is also the third largest kiwi exporter after New Zealand and Italy. Shielding these crops from increasing water scarcity and desertification is therefore an essential objective.

According to Stange:

Tomato and kiwi crops are very relevant to the country’s economy. In the case of tomato, we’ll study the traits of the ‘Poncho negro,’ a Chilean variety originating in the Azapa Valley (Arica) that has a high salinity tolerance and whose genetic breeding would increase the productivity of tomato 7742 (Seminis), the most produced and marketed variety in Chile. [I]t can be grafted onto Poncho Negro.

Regarding kiwis, we will seek to increase salinity and drought tolerance of varieties used as rootstocks, to improve the productivity of Hayward commercial kiwi plants.

In vitro culture process for gene editing of kiwi, from callus to seedling. Credit: Dr. Stange.

While the Golden Apple project requires established, fruit-producing trees, fruits are not needed to grow young kiwi and tomato plants; they can be evaluated at the laboratory and greenhouse level under conditions of drought and salinity.

Unfortunately, quarantine has forced the researchers to prioritize bioinformatic activities over laboratory experiments, delaying the project for up to six months. With this setback, it could take three years to edit the plants and evaluate them in field trials.

There is another technical obstacle that must be surmounted as well: adapting crops originating in other parts of the world to local conditions:

Currently the new varieties are acquired by paying royalties to foreign companies.

This implies bringing those varieties [to Chile] and waiting a few seasons until they adapt to our edaphoclimatic conditions, with the expectation that they will produce the fruits as they are produced where they were generated. This is a risk. In our case, they are varieties already produced and marketed in Chile to which we will add these new traits.

Future challenges and public perception

Stange believes that the future for genetically engineered fruit, vegetables and grains will be brighter than the recent past. GMO crops are gradually being embraced and there is a growing global trend outside of precautionary-obsessed Europe towards relatively lax regulatory oversight of CRISPR gene editing. That could allow for the commercialization of new consumer-focused crop varieties, she says.

The benefit that these biofortified plants bring will overcome the conceptual reluctance toward GMOs, especially in countries that appreciate the health value that these types of improved products give them. Necessity will push countries to adopt GM and gene-edited crops.

Daniel Norero is a science communications consultant and fellow at the Cornell Alliance for Science. He studied biochemistry at the Catholic University of Chile. Follow him on Twitter @DanielNorero


How do you make meat without animals? The 5-step ‘recipe’ for a lab-grown, cell-based burger

Cultured meat, lab grown meat, cell based meat, clean meat, and cultivated meat are all terms used to describe meat grown using animal cells. Wait, what does that mean…

Cell based meat is meat that is grown from a tissue biopsy of cells taken from an animal.

This technology enables the production and consumption of meat and fish without the need for any animals to be slaughtered. Cultured meat is also considered to be an environmentally friendly alternative to factory farming, which is a major producer of greenhouse gas emissions and nutrient pollution.


The exact environmental reduction metrics are still undetermined as there are currently no cultured meat manufacturing facilities operating at scale.

What We Do Know:

According to the Food and Agriculture Organization of the United Nations, emissions from livestock production account for about 14.5% of total greenhouse gas emissions globally. The primary activities responsible for the GHG emissions are feed production and processing and enteric fermentation (methane released by ruminant digestion), representing 45 and 39 percent of the sector’s emissions, respectively.
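As a quick arithmetic check, here is a minimal sketch that converts those two activity shares into shares of global emissions, using only the FAO percentages quoted above:

```python
# Convert the FAO activity shares (fractions of livestock-sector emissions)
# into shares of total global greenhouse gas emissions.

livestock_share_of_global = 0.145  # livestock is about 14.5% of global GHG
feed_share = 0.45                  # feed production and processing
enteric_share = 0.39               # enteric fermentation

feed_global = livestock_share_of_global * feed_share
enteric_global = livestock_share_of_global * enteric_share

print(f"Feed production/processing: {feed_global:.1%} of global emissions")
print(f"Enteric fermentation:       {enteric_global:.1%} of global emissions")
# Roughly 6.5% and 5.7% respectively, or about 12% of all global emissions
# between the two activities.
```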

In order to accurately compare lab grown meat production processes with traditional animal farming it is necessary to understand the land, energy and water use necessary to produce cell based meat at scale. We’re not there yet – however, the simple fact that animals won’t need to be bred, fed, and slaughtered leads scientists to believe lab meat at scale is the most environmentally appropriate way to feed a growing population.

Cell based meat is cultivated in a controlled environment that mimics the environment of an animal.

There are four main inputs to cultivate meat:

  • Cell source
  • Cell culture media
  • Bioreactor
  • Scaffold

Each input may have its own unique supply chain and production process — the cell culture media, bioreactor, and scaffolding may be dependent on the specific cell type and cell species being cultured.


Cells are sourced from a tissue biopsy or in select cases may be sourced from a feather or hair follicle.


Cells are placed in a controlled and sterile environment beginning in a small flask and scaling up to a bioreactor.

The exact rate at which cells expand, and the process for scaling up from flask to bioreactor, depend on each company’s specific process and proprietary technology.
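To get a feel for what scaling up means, here is a rough doubling calculation. Every number in it (biopsy size, cell mass, doubling time) is an illustrative assumption, not any company’s actual figure:

```python
# How many population doublings does a burger's worth of cells require?
# All inputs are illustrative assumptions.
import math

starting_cells = 1e6    # assumed viable cells recovered from a biopsy
cell_mass_g = 3e-9      # assumed mass of a single muscle cell (~3 ng)
target_mass_g = 113.0   # a quarter-pound patty

target_cells = target_mass_g / cell_mass_g
doublings = math.log2(target_cells / starting_cells)

print(f"Cells needed:       {target_cells:.1e}")   # ~3.8e10
print(f"Doublings required: {doublings:.0f}")      # ~15
# At an assumed 24 hours per doubling, that is roughly two weeks of
# proliferation, which is why bioreactor capacity and media cost dominate.
```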


Cells are given a nutrient rich cell culture medium (aka food), which enables cells to grow and form into meat.

This cell culture medium contains many critical components including amino acids, vitamins, glucose, inorganic salts, and growth factors. The growth factors are by far the most complex and expensive component of cell culture media. The exact composition of the cell culture medium is dependent on a given cell line and cell species, meaning there is no one-size-fits-all approach for the most efficient media solution.


Cells are cultivated in a bioreactor until there are enough of them to form a suitable amount of meat.

Cells have two main jobs once they make it to the bioreactor: 1. Proliferation (division to generate a large number of cells) and 2. Differentiation (cells become the desired and final cell type suitable for consumption). How the cells accomplish these tasks is dependent upon the aforementioned cell culture media, bioreactor design, and in some scenarios a supportive structure for cells to adhere to called a scaffold.

Scaffolds can be considered materials (hydrogel, collagen, mycelium) and tissue construction techniques (such as 3D printing/additive manufacturing, electro-spinning, or electrical stimulation) that are used to turn a slurry of cells into meat products.

The main challenges:

  • Cell Line Engineering: developing a cell line suitable for long term replication.
  • Cell Culture Media and Growth Factors: developing and manufacturing a formulation that is low cost with optimized cell proliferation.
  • Bioreactor Design: creating a bioreactor that is large enough for adequate yield without harming the cells. As the cell density increases there must be mechanisms in place for nutrients and oxygen to reach all cells within the closed system. Also, a bioreactor system must be sterilized to meet food grade standards.
  • Efficient Recycling of Cell Toxins: as cells proliferate they release substances like ammonia that can lead to cell death. It is important to consider interventions that enable the removal or re-uptake of these potentially toxic byproducts.

Recap:

Lab grown meat is molecularly identical to meat and is NOT the same as plant based meat options like Impossible Burger or Beyond Burger.

No animals are killed in the process of producing cell based meat, but the end product is identical to real meat.


What’s the potential?

The processes for making lab grown meat are still under development, but potential benefits include higher yields, lower consumer costs, higher quality, and lower environmental impact. For space enthusiasts: consider the potential of feeding a space station or a Mars colony.

Lab grown meat companies

There are a number of startups globally that are developing cell based meat and foods, with global investments around $1.2 billion. These companies are currently focused on taste, cell culture media, and being able to mass-produce at a competitive market price.

Where can I buy it? 

Lab grown meat is not yet available for purchase anywhere in the world. There are some non-meat lab based ingredients available, like lab grown heme (soy leghemoglobin produced in engineered yeast), which is found in the plant-based Impossible Burger. Lab meat is expected to be available for purchase as early as 2021. However, we are likely about five years away from seeing cell based meat in grocery stores, sold at an affordable price.

Brooke Sunness has an M.A. in Food Systems from New York University and is the Managing Director at Cell Based Tech. Brooke is a thought leader and influencer in the food and biotech space working closely with private investors to evaluate emerging science, technology and companies. Follow Cell Based Tech on Twitter @cellbasedtech


Viewpoint: Ideology, politics pollute the debate over health risks of red meat

For decades there has been a statistical controversy about meat. By statistical I mean it was never a real health issue. Instead, though we clearly evolved to eat it, epidemiologists statistically correlated meat to dying and said therefore we shouldn’t eat it. Though such studies noted down at the bottom that the relationship was not causal, they wanted the public to believe it was, so they highlighted the causal inference in press releases, and media rushed to claim that meat causes heart attacks.

A few years ago, epidemiologists at France’s International Agency for Research on Cancer (IARC) joined in, using their own meta-analyses to declare that meat was just as hazardous to health as plutonium. And smoking. And mustard gas.

Their methodology was as ridiculous as the result but media outlets, who seem to think IARC is a Supreme Court over scientists, declared the science settled yet again. But as we have seen in the last month, even Lancet and NEJM peer review of epidemiology is suspect. Which means peer review of epidemiology in Environmental Health Perspectives is more like astrologers peer-reviewing each other and declaring astrology is science.

The public is unsure what to trust, because bizarre epidemiological claims have been treated like science fact if they match the political proclivities of journalists. Does hydroxychloroquine work for COVID-19 or does it cause heart attacks? Peer-reviewed epidemiologists claimed both. Journals rushed both to print. Media rushed to endorse both. The only reason they were debunked was because outsiders criticized the work but with so many food frequency questionnaire and chemophobia claims produced each month, there is no time to debunk them all.

Credit: USA Today

Here is what journalists and the public need to know to ground epidemiology and mouse-study claims. They can exclude benefit or harm, but never prove it. They can find a statistical link that might merit follow-up, if it is scientifically plausible and not a ‘Kennedy had a secretary named Lincoln’ coincidence. Somewhere along the cultural way journalists stopped understanding what “exploratory” means, and that has meant the public has been bounced all over the place with fat-free diets, low-carb diets, Blood Type Diets, and now Blood Type COVID-19 effects.

A new controlled-feeding study rebuts claims that meat increases the risk of heart attacks, adding to one from last year that so enraged the anti-meat academic group True Health Initiative, they tried to get the Philadelphia district attorney to sue Annals of Internal Medicine for publishing it. So a small randomized trial is offsetting a statistical claim.

Which do you believe? Neither yet, that is the whole point. While the meat industry will cheer, and journalists will rush to churnalism up the press release, you as readers should be more critical. The design was fine but the sample size was in the tiny range. Not as small as the papers that set off the gluten-free and vaccines-cause-autism nonsense, but in a post-coronavirus world people should expect more, the same way they stopped buying green alternatives that pretended they were not chemicals and started buying products that work.

The study used 33 middle-aged obese people, and a randomized, crossover, controlled-feeding trial is great, but two months is not long enough. The metrics for insulin sensitivity were fine. They also used blood pressure, which seems silly since blood pressure is only a risk factor for a risk factor for heart disease, but that is what a whole lot of epidemiology papers do, so it makes sense in an ‘apples to apples’ way.
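For a sense of what that sample size means statistically, here is a quick power calculation. The effect size is an assumption on our part, not a figure from the study; the sketch only shows why n = 33 struggles to detect modest effects:

```python
# Rough power check for a paired/crossover design with 33 subjects,
# analyzed as a one-sample t-test on within-subject differences.
# The effect size (Cohen's d = 0.3) is an assumed, modest diet effect.
from statsmodels.stats.power import TTestPower

power = TTestPower().power(effect_size=0.3, nobs=33, alpha=0.05)
print(f"Power to detect d = 0.3 with n = 33: {power:.0%}")
# About 38% under these assumptions: such a trial is more likely to miss
# a real effect of that size than to find it.
```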

The result was that the changes in risk factors claimed to be linked to a greater risk of heart attacks were not much different between those who ate meat and those who ate none.

The reason to be skeptical is the same reason to be critical of most food claims: the study is underpowered. Instead of attacking the methodology, critics will allege that because a beef organization provided funding the results are tainted, but that is Naomi Oreskes-type conspiracy theory, not rational criticism. The authors have also received funding from Big Almond, Big Avocado, and Big Cereal, but it would be ridiculous to assume any of those want you to eat more steak. And where does the funding conspiracy end? Should Republicans not trust science funded during the Obama administration? Do Democrats not trust the FDA now? That sounds silly, but it is no more silly than claiming that a beef group is telling researchers they’ll get grants only if they produce a positive result.

I have never had a donor tell me I will only get money if I agree to do something. I have had PR groups ask if I might be interested in writing about the science of a product they represent but they have never gotten anywhere because they can’t pass the science test. That is what matters.

Yet the anti-meat side is throwing money everywhere.


There is an entire industry built up around demonizing meat, just like there is around demonizing science itself. The anti-meat academic group True Health Initiative not only funds detractors, they email-bombed the Annals of Internal Medicine editor-in-chief to try to get her to block publication of a study showing meat was not harmful, and epidemiologists Walter Willett and Frank Hu claimed their detractors were colluding with the beef industry. Using sexist dog whistling, they suggested all the female co-authors on the paper were under the Svengali spell of a male who had gotten funding from a food group to study…sugar.

Who rushed to agree with their misogynist rhetoric? A lot of prominent academic women who have made their careers alleging Big Food conspiracies. True Health Initiative even lobbied the Philadelphia district attorney to launch an investigation into the journal, all because inconvenient science threatened their income stream.

People inside the anti-meat industry are the first to charge that everyone else is on the take. But if you look at their organizations and see their corporate sponsors, the reason they believe academics are bought off is simple: they are getting a lot of corporate money to promote the products of their sponsors, and they assume everyone else must be as well.

NOTES:

(1) On this site, I have even documented how activists game the system by recruiting high-caliber scientists to be lead authors to get into prominent journals before the paper is even written, and then having a cabal of lower-tier scientists and activists standing by to promote the manufactured result when it gets published.

Hank Campbell is the founder of Science 2.0 and co-author of the book Science Left Behind. Follow him on Twitter @HankCampbell

This article originally ran at Science 2.0 and has been republished here with permission.


Debating group differences in intelligence: A conversation with philosopher Nathan Cofnas

Nathan Cofnas is an American philosopher and a philosophy PhD candidate at Oxford University. He is known for his work on the evolution of morality; his debate with Kevin B. MacDonald about Jewish ethnic interests; and his paper titled “Research on group differences in intelligence: A defense of free inquiry.”

The following interview is part of a series of conversations between independent scholar Grégoire Canlorbe and natural and social scientists. (In addition to his scientific interviews, Canlorbe has also interviewed a variety of renowned cultural and political figures, such as Greenpeace co-founder and former president Patrick Moore and Hollywood stars’ trainer Kamel Krifa.)

Nathan Cofnas

Canlorbe has critically studied Kevin B. MacDonald’s thesis on Jewish ethnic interests—namely, that Jews are genetically and culturally predisposed to a combination of high collectivism and high out-group hostility, and over the last two centuries have been serving their perceived ethnic interests by promoting left-wing doctrines like anti-racism.

A retired psychology professor at California State University, MacDonald is widely described as an anti-Semitic theorist; his controversial and often derided thesis has nevertheless gained traction in some quarters. This critical interest led Canlorbe to a conversation with Kevin B. MacDonald in March 2019, and then to the following interview, conducted in May 2020, with one of MacDonald’s most renowned intellectual critics, Nathan Cofnas, who is generally supportive of the theory of genetically based group differences in intelligence.

Kevin MacDonald

Grégoire Canlorbe: It is not uncommon to hear that IQ tests do not measure intelligence stricto sensu, but only success in passing IQ tests. Hence so many people supposedly gifted with a high IQ turn out to be complete morons in real life… lacking subtlety, depth, hindsight, creativeness, polyvalence, humility, alertness, and a critical and independent mindset. As a defender of research on group differences in intelligence, do you contest such a claim?

Nathan Cofnas: The claim that IQ tests only measure the ability to take IQ tests is a common critique, but not among those who are familiar with the relevant evidence. IQ is highly correlated with a range of real-life outcomes both inside and outside the classroom: educational attainment, job performance, health, even your chance of getting into a car crash. This is not surprising when you consider that, as Robert Gordon put it, “everyday life [is] an intelligence test.”

Nonacademic tasks like planning and following a healthy diet, preventing or treating diseases, reading a bus schedule, making a budget, avoiding accidents, or setting up household appliances involve problems that have the same basic form as IQ test questions. People with higher IQs tend to do these things better and more reliably than those with lower IQs.

That being said, the ability that IQ tests purport to measure—so-called “general intelligence”—is not well understood in any detail, and “intelligence” certainly has other dimensions. Success at any given activity requires a constellation of abilities and dispositions. It’s pretty much always an advantage to have more general intelligence, but the people with the highest IQs are not necessarily the most successful or the “smartest” in a colloquial sense. The traits you mention—subtlety, creativity, critical thinking, etc.—are to some extent independent of general intelligence, and can be just as essential.

As readers may or may not know, there are nontrivial differences in the distribution of IQ among racial groups, and these differences go a long way toward explaining racial disparities in socioeconomic status. There is a debate about the role played by genes vs. environment in producing race differences in IQ. We know that environmental factors can influence IQ: better nutrition/healthcare as well as familiarity with abstract, scientific thinking both increase IQ up to a point.

But race differences persist even when environments become as equal as we know how to make them. The 15-point IQ gap between Blacks and Whites in the US has been stable for decades, and has resisted extreme interventions including cross-racial adoption. I have argued that it’s time to start thinking about what the political and ethical implications would be if these differences are influenced by genes.

[Editor’s note: For a historical discussion on Black-White differences in culture and genes, read Taboo: Why Black Athletes Dominate Sports and Why We are Afraid to Talk About It, by Genetic Literacy Project’s Jon Entine]

Grégoire Canlorbe: In contrast to the view that the evolution of moral and juridical norms is best explained by the psychological forces operating within individuals (and facing the trial of natural selection), you argue that the success of an established norm is most often imputable to the magnitude of the power backing the latter. How do you sum up your argument? Does your thesis apply to the transition of Ancient Judaism to Talmudism—a renovated practice of Judaism in which kings and priests would be left behind for the benefit of the masters of exegesis?

Nathan Cofnas: An influential approach in cultural evolutionary theory assumes that beliefs/ideas/practices spread as a result of individuals’ learning biases, natural selection, and random forces. People have learning biases to, for example, conform to the majority or adopt practices that seem useful. Then natural selection favors individuals and groups with adaptive beliefs and practices.

William Durham, Joseph Fracchia, and Richard Lewontin raised the objection that this ignores the role of power in cultural evolution. Maybe cultural evolution is not driven by the aggregate of the individual decisions of agents in a population but by the whim of the powerful. If so, the learning biases that feature in some cultural evolutionary models of the evolution of morality would be largely irrelevant in practice.

Drawing on work by Christopher Boehm, I argued that the evolution of morality probably was driven largely by the exercise of power, in ways that undermine cultural evolutionary models that emphasize individual learning biases. Hunter-gatherers in the Pleistocene did not choose what moral rules to follow based on learning biases. Instead, rules were imposed by coalitions of the majority to advance their explicitly represented collective interests. Rule-violators were subject to fitness-reducing punishments. This created selection pressures to internalize group norms and, I argue, to be innately receptive to certain rules that were widely enforced across groups.

This is not to deny that we have the learning biases identified by cultural evolutionary theorists. We really are disposed to, for example, conform to the majority and copy prestigious individuals. But these are not always decisive forces in cultural evolution. In regard to morality, the ultimate source of many of our moral values is powerful individuals and coalitions who managed to enforce values that serve their interests. Once a norm becomes culturally entrenched, people conform to it without being aware of its origin. The idea that power influences morality in this way might seem like common sense to many people, but it hasn’t been incorporated into mainstream cultural evolutionary theory because it doesn’t fit with the standard models.

Regarding the transition to Talmudic Judaism, there wasn’t really an option to continue with the old system. The Temple in Jerusalem was destroyed in the year 70, and the Bar Kokhba revolt was put down by the Romans in 135. So there was no Temple for priests to operate, and no country for a king to rule. In the absence of strong central authorities, individual choice might have been more important than usual in driving cultural evolution.


Grégoire Canlorbe: An eminent assertion in the field of evolutionary psychology has been that human individuals are born with an innate capacity for language, which is unique to our species and which emerged as a tool to solve the specific problem of communication among hunter-gatherers. Do you judge this view to be substantially corroborated?

Nathan Cofnas: I don’t know enough about this subject to have an opinion.

Grégoire Canlorbe: Challenging psychologist Kevin MacDonald’s thesis on Jewish ethnocentrism and the “culture of critique,” you make the case that Ashkenazi intellectual brilliance simply leads to Jewish overrepresentation in all intellectual movements (instead of the Jewish perception of their ethnic interests leading them to destabilize their host societies for genetic and cultural reasons). It seems quite reasonable to hypothesize that those Jews who preach cosmopolitanism—and who self-identify as Jews in the process—are indeed acting (at least in part) on behalf of a certain perception of their ethnic interests; but that the aforesaid perception, far from being stipulated in the Torah and genetically influenced, is really contingent: only one of the perceptions possible in the Jewish mindset. It also seems quite reasonable to hypothesize that the Jewish perception of their ethnic interests—just like the Jewish actualization of their messianism—is actually molded by the Western intellectual climate, and not the other way around.

Nathan Cofnas: It would not be surprising if some cosmopolitan Jews have acted “in part” to advance “a certain perception of their ethnic interests.” But MacDonald makes a much stronger claim, which is that modern liberalism is a Jewish intellectual movement designed (consciously or unconsciously) to promote Jewish ethnic interests. He says explicitly that Jews’ pursuit of their ethnic interests was a “necessary condition for the triumph of the intellectual left in late twentieth-century Western societies.”

This can be broken down into claims about the motivation of Jewish liberals (i.e., ethnocentrism) and the influence they had (i.e., without Jewish activism the intellectual left as we know it would not have triumphed). I can find no compelling evidence that the leading Jewish intellectuals discussed in MacDonald’s book were particularly concerned with Jewish interests. Many of them in fact opposed Jewish interests as conceived by MacDonald (e.g., they promoted multiculturalism for Jews and multiracial immigration to Israel). And the West was on a liberal trajectory long before Jews became influential at all, and liberalism has triumphed in a number of societies where Jews had virtually no influence.

[Editor’s note: For a historical discussion on genes and Jewish identity, including an interview with Kevin MacDonald, read Abraham’s Children: Race, Identity and the DNA of the Chosen People by Genetic Literacy Project’s Jon Entine]

Grégoire Canlorbe: MacDonald also deals with the National Socialist movement in Germany, claiming Nazism to have been a group evolutionary strategy mimicking (what MacDonald believes to be) the very principles of Judaism—outgroup hostility combined with within-group collectivism—as a response to alleged Jewish parasitism. What are your thoughts about it?

Nathan Cofnas: MacDonald never clearly defines what he means by “group evolutionary strategy.” Sometimes he implies that strategies are shaped by group selection, sometimes that they were (or are) consciously designed. In any case, if National Socialism was a “group evolutionary strategy” it wasn’t a very successful one. Twelve years of National Socialism led to several million German deaths, and the survivors were subject to the largest mass rape in history. The political movements that MacDonald sees as opposed to white interests were largely a backlash against National Socialism, so it indirectly led to multiculturalism and mass immigration to Germany.

Grégoire Canlorbe: When it comes to explaining the “cross-cultural convergence on liberalism,” an occasionally proposed narrative is that people came to acknowledge the objective, universal truth of liberalism—what is plausibly a laicization of the Biblical faith in the march of humanity towards the acceptance of Yahweh and His objective law. Another occasionally invoked factor lies in the extension of peace and the increasingly intricate interdependence of humans within the worldwide division of labor. As the proponent of a “debunking explanation for moral progress,” how do you assess those perspectives?

Nathan Cofnas: I do not believe that there are objective, mind-independent moral truths. We may have the intuition that morality is objectively real, but this is an illusion that can be explained by non-moral-truth-tracking forces such as natural selection. If we find that the cause of our belief that p doesn’t track the truth about p, then the belief loses its justification. Since (in my view) our moral beliefs are satisfactorily explained by naturalistic processes, there is no reason to postulate moral truth.

Some moral realists, however, have argued that cross-cultural convergence on liberalism does not have a naturalistic explanation, so (they say) the best explanation for this phenomenon is that societies are independently discovering the mind-independent moral truth. I have argued that there are good naturalistic explanations for why societies tend to gravitate toward liberalism as they become more prosperous and adept at keeping the peace. Peace makes people more sensitive and averse to violence, and prosperity (and everything that goes along with it) removes many of the incentives for illiberal practices like oppression and fighting.

Grégoire Canlorbe: Besides familiar considerations about the allegedly dysgenic trends of miscegenation, the proliferation of spiteful mutants, and the higher fertility of low-IQ people, the marginalization of war has been claimed to be one of the most psychologically detrimental features of our bourgeois industrial era. Robert Ardrey’s remark on this point deserves to be recalled: “We face in the elimination of war this most fundamental of psychological problems. For almost as long as civilization has been with us, war has represented our most satisfactory means of at once escaping anonymity and boredom while preserving or gaining a measure of security.” Fifty years later, is The Territorial Imperative still relevant?

Nathan Cofnas: I think the reduction in war is an overwhelmingly positive development, but it may have some negative side effects. Our innate psychology is adapted to conditions where war and violence were much more common. The desire to bond with groups to fight an enemy used to be adaptive, but may now lead to pathologies.

Grégoire Canlorbe: Thank you.

Grégoire Canlorbe has authored a variety of philosophical and metapolitical articles, and proposed a renovation of Platonic metaphysics as well as a new approach to the influence of Judaism on the “Aryan” mentality. Visit his website or email him: [email protected]

Find Nathan Cofnas on Twitter @nathancofnas


Viewpoint: Farm to Fork failure—How Europe’s ‘obsession’ with organics undermines the global sustainable farming movement

Europe’s quest to confront climate change and achieve carbon neutrality is being undermined by “Big Ag”? That’s not my claim. It’s the latest in a series of attacks on those who question whether the European Union’s Farm to Fork policy recommendations, cobbled together mostly with input from green activist groups, have any chance of achieving their sustainability goals.

The latest attack on conventional agriculture and its embrace of cutting-edge biotechnology comes in a scathing piece last week by openDemocracy headlined: “How the agricultural lobby is sabotaging Europe’s Green Deal”.


Its basic premise: “Big Farming” is forging nefarious alliances to block agriculture’s necessary role in ‘transforming Europe’ into a ‘climate neutral’ economic bloc by 2050. These are serious, sweeping charges…and clearly not true.

Most politicians on the Continent embrace the goal of dramatically reducing greenhouse gas emissions over the next three decades. Agriculture can play a key role. But thoughtful questions have been raised about how to achieve the broad sustainability goals outlined in the F2F policy, as it calls for dramatically increasing food production while scaling up organic farming and slashing synthetic pesticide use, all without any clear plan as to how to address agricultural pests and productivity challenges. The gap between aspiration and action appears huge.

Getting this right is critical, as Europe’s global policy influence is huge. Too much is at stake to turn this serious challenge into a political football. Instead of the bromides and broadsides offered by critics of conventional agriculture, we would all be better served by science rather than innuendo and hyperbole.


Addressing food insecurity

Reading the F2F document, I was struck by one insight. Although we occasionally see scenes on the news of malnourished children in distant countries, most people in Europe and the wealthier parts of the world believe that we are well on our way to solving what has for most of human history been life’s primary challenge: producing enough food for a growing global population.

We are told that we already grow enough food to feed everyone, but much of it is wasted—88 million tons of food annually in Europe alone. So, increasing production, food activists say, is itself wasteful. Rather than increasing food production, activists claim, we should create a “sustainable agricultural system.” That claim would be true if people could eat statistics. Green advocates offer no concrete plan as to how we can transport food scraps from western households, restaurants and grocery stores to under-developed countries.

Credit: Reuters

In the real world, capping food production at current levels—which is what would happen with the spread of organic farming—would work only if crops were never lost to pests in the field or spoiled in storage before they got to market, if the massive global challenges of transportation and distribution just magically disappeared, and if we assigned a food monitor to every home, farm and restaurant to collect the world’s scraps after we ate our assigned calorie allotment for the day.

Here’s a wake-up call. Food security is emerging as the number one issue of our time. F2F’s central premise is the need to steer farming in Europe and the world away from conventional methods that rely on high-technology tools such as pesticides and genetic engineering, key elements of precision farming.

Yet many people who embrace the same sustainability goals say these recommendations, taken as a whole, are a prescription for disaster. They will not only increase hunger, they will undermine climate and environmental goals as well.

It’s time we got a reality check on food insecurity, and on how we’ve managed to reduce world hunger over the last 90 years. It wasn’t until after World War II, with the widespread adoption of agricultural technologies—including the genetic manipulation of plants to create advanced hybrid crops, modern chemical pesticides, and synthetic fertilizer—that everyone in Europe, not just the upper classes, had enough to eat. Cultural memories of hunger grow fainter after a few generations, but as late as the early 20th century, malnutrition was still widespread in Europe.

The UN estimates that 821 million people are suffering from hunger. The number was rising before COVID, but the pandemic is making it even worse. An additional 10,000 children every month are expected to die from malnutrition as farms are cut off from markets and food aid no longer reaches hungry populations.

Malnutrition in India. Credit: 2020 Global Nutrition Report

Reckoning on farming and food

The novel coronavirus may seem like a once-in-a-century disaster, but plenty of food and farm crises roil the world. Most frighteningly, the Middle East, much of India and East Africa are being ravaged by a Biblical-scale, crop-destroying locust plague. It threatens some 22 million people in Africa alone with starvation, and only the widespread deployment of insecticides has been able to bring it under control.

That’s just one of the many scourges threatening agriculture and biodiversity. Myriad other plant pests, viral, bacterial, and fungal crop diseases, droughts and other weather events threaten agricultural production. And don’t assume this is just happening in Africa and Asia. The Varroa parasite that sickens and kills honeybees is an invasive species that arrived in Europe only in the 1960s before spreading to the United States in the late 1980s. The massively destructive fall armyworm, which jumped across the Atlantic to Africa a few years ago, could reach Europe at any moment. So could the locusts.


Add to this the fact that climate change could drastically alter growing conditions, increasing drought and other destructive weather patterns, and Black Swan events like COVID become inevitable. Consider these threats in light of the fact that we will need to increase food production by between 70 and 100 percent by 2050. This is not a prediction; this is a fact.

We face two challenges: a fast-growing population, and growing demand for more and higher-calorie food in developing countries in Asia and Africa, where many people now subsist on 1,000 calories a day and won’t be satisfied with their meagre bowls of rice. In other words, they’ll be eating more like Americans and Europeans.

Certainly, we can and should cut food waste. But that’s not a game changer when it comes to making food and farming both more sustainable and more productive. We need to decide whether we are going to address the issue seriously, or if we’re just going to pretend that this perfectly predictable crisis, like the next pandemic, isn’t going to happen.

Which brings us back to Farm to Fork. The broad goal, according to F2F, is to “reduce the environmental and climate footprint of the EU food system in the face of climate change and biodiversity loss.”

It’s an impressive manifesto. As we’d say in America, that broad goal is an ‘apple pie’ aspiration; everyone embraces it. But how to achieve it is where F2F careens off course. As you drill down into the details, examining them with the eyes of someone who has struggled with sustainability challenges for upwards of 30 years, the strategy is deeply disappointing. It is predicated, in the end, on the idea that we can address food security with agricultural strategies that have already come up short—despite their faddish popularity. Most eyebrow-raising: the primary tool to transform European farming is the embrace of organic farming and food.

Most significantly, F2F does not provide for the embrace of advanced farming and food technologies, such as transgenic GMOs and CRISPR gene editing of seeds, which offer the only suite of tools proven to increase food production while decreasing the use of unnecessary chemicals.

It even advocates for a labeling system for foodstuffs such as the Nutri-Score system that France is promoting. This kind of “traffic light” labeling scheme purports to regulate Europeans’ plates and is based on a controversial algorithm that denigrates some kinds of foods as unhealthy—slapping a red label on them—while giving others the green light. A number of nutritionists allege that Nutri-Score gives an advantage to some categories of foodstuffs over others—for example, French industrial foods over products such as olive oil, one of the building blocks of the healthy Mediterranean diet.

It’s filled with ‘solutions’ that sound great on paper but defy definition, things like promoting a “circular bio-based economy” and developing an “integrated nutrient management action plan.” It’s mostly aspiration and verbiage, demonizing agricultural technology when it should be science-based. Environmental activists say farmers globally should expand the model pioneered in Europe, where organic farming is almost a religion. But as in many cases, below the surface of environmental platitudes the reality is complicated.


In fact, the Netherlands (24), Belgium (28), Ireland (29), Italy (31), Portugal (36), Switzerland (41), Germany (44) and France (47)—indeed, almost every country in Europe—use far more toxic pesticides per hectare of available cropland than the US, which ranks 59th.

Those statistics are shocking to many, as there is a widespread misconception that Europe is on the cutting edge of sustainable farming, when the opposite is the case. Let’s explore why that is so.

Synthetic chemical myths

As part of this new sustainability equation, there are calls to cut conventional pesticide use by 50 percent, regardless of effectiveness or toxicity. Why? That’s never addressed scientifically, and it can’t be over concerns about health or environmental impacts. Many people, including, it appears, the drafters of F2F, do not even realize that organic farming uses dozens of approved synthetic chemicals and hundreds of natural chemicals.

But aren’t synthetic pesticides, which are most commonly used by conventional farmers, more harmful than natural ones? Many people believe that, and environmental advocacy groups base their fundraising almost entirely on convincing people they should be ‘scared to death’ of chemicals. But the scientific answer is ‘no’. The most toxic chemicals in the world are natural, and more than 99% of the pesticides we eat are produced naturally.

Science has come a long way since synthetic agricultural chemicals were first introduced in the middle of the last century. Early, crude chemicals have been phased out. The newer ones are targeted, designed to prevent specific plant diseases, kill weeds, and kill or repel harmful insects without harming beneficial ones—and overwhelmingly they do just that.


The overall toxicity of synthetic pesticides has decreased steadily over the decades as technology has improved. Although epidemiological studies have found that some pesticides have deleterious effects, in almost all cases those findings are based on levels of exposure that we just don’t encounter in the real world.

Overall per-acre toxicity levels on US farms began declining dramatically in the 1960s, and dropped again with the introduction of genetically engineered crops in the 1990s, even though the volume of chemical usage has stayed about the same—primarily because of the introduction of low-toxicity pesticides, such as glyphosate.

Another key driver has been the introduction of crops engineered to express natural insecticides. Insecticide use on American farms has dropped more than 90% since the mid-1990s, spurred by the introduction of GMO corn, soybeans and cotton that produce insecticidal proteins from the natural soil bacterium Bacillus thuringiensis (Bt).


The sustainable GMO technology is spreading to the developing world. Bangladesh eggplant farmers have cut insecticide use by more than 75% with the introduction of Bt brinjal, and India has become a world leader in the production of cotton. The transition from old-line farming techniques to bioengineered seeds has dramatically improved the health of tens of thousands of women and children who do much of the farming.

It’s all part of a global move away from toxic chemical usage spurred by biotechnology innovation that is expected to accelerate dramatically with advances in gene editing that could eliminate some harmful chemicals altogether.

Meanwhile, the organic movement remains wedded to the past. It is addicted to ‘technology’ that is a century old or even older, even when the health and environmental consequences can be catastrophic. Consider copper sulfate, used by organic farmers—particularly in the wine industry—and by some conventional farmers to limit fungus on wine grapes. It’s highly toxic.

Unfortunately, it also kills beneficial insects and is a human carcinogen. Only strong lobbying by Europe’s organic industry, which has helped shape the Farm to Fork strategy, has prevented copper sulfate from being banned by the European Union, despite its “particular concern to public health or the environment,” in the words of the European Food Safety Authority. So much for Europe’s model organic farming practices.

Copper sulfate is also far more toxic than the herbicide glyphosate whose use has set off paroxysms of hysteria across Europe. Glyphosate is less toxic than salt and has been found safe by 18 major global health and environmental safety organizations, including four in Europe.


Although glyphosate accounts for about one quarter of herbicides applied by weight to corn, it accounts for only one tenth of one percent of the chronic toxicity hazard associated with weed control in corn. Put another way: the other roughly three quarters of herbicides account for 99.9% of the chronic toxicity hazard. Or to put it yet another way, taking glyphosate out of the picture could raise the toxicity hazard by 26% in corn, 43% in soybeans, and 45% in cotton. Yet green groups want to ban it, which directly contradicts the science goals of F2F and the Green Deal.
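To see how a large share by weight can coexist with a negligible share of hazard, here is a minimal sketch: each herbicide’s applied amount is divided by its chronic reference dose (a standard regulatory toxicity benchmark), and the results are normalized. The amounts and doses below are rough, illustrative placeholders, not data from the analysis cited above.

```python
# Illustrative only: hazard share = (amount applied / chronic reference dose),
# normalized across herbicides. Figures are placeholders chosen to mimic the
# pattern described in the text, not values from any particular study.
herbicides = {
    # name: (pounds applied per acre, chronic reference dose in mg/kg/day;
    #        a HIGHER reference dose means LOWER chronic toxicity)
    "glyphosate": (1.00, 1.75),   # heavy use, very low chronic toxicity
    "atrazine":   (0.80, 0.035),  # lighter use, far lower reference dose
    "mesotrione": (0.10, 0.007),
}

weights = {name: amt for name, (amt, _) in herbicides.items()}
hazards = {name: amt / rfd for name, (amt, rfd) in herbicides.items()}
total_weight, total_hazard = sum(weights.values()), sum(hazards.values())

for name in herbicides:
    print(f"{name:12s} weight share {weights[name]/total_weight:6.1%}  "
          f"hazard share {hazards[name]/total_hazard:6.1%}")
```

In this toy example glyphosate is more than half the weight applied but under two percent of the hazard, because its tolerated dose is orders of magnitude higher than the alternatives’: the compound applied in the largest volume contributes the least to toxicity.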

How to achieve sustainable farming

F2F gets sustainability backwards. Rather than set a goal—sustainable agriculture that results in increased food production while moderating inputs—and figure out which tools best serve it, F2F elevates facile proposals that only appear to support what it seeks to achieve. Organic agriculture is held up both as a European goal—F2F proposes to more than triple its implementation in ten years—and as a global model, but it’s bereft of actionable, toxicity-reducing specifics.

Which brings us to the most egregious problem with the Farm to Fork fantasyland. What would happen if a country—say the United Kingdom—fully embraced organic farming, the ultimate goal of Green Deal backers? As there is almost no arable land left in the world, the move to organic would shift production to the developing world, leading to the clear-cutting of forests to create more farmland. In essence, the EU would be exporting its environmental “externalities,” as economists call them, to the poorest regions of the world, all because of its organic fixation.


That’s exactly the question asked and answered by researchers in a state-of-the-art study published last year in the prestigious journal Nature Communications, comparing conventional and organic agriculture and their impact on carbon emissions. They found—as the organic industry itself acknowledges—that organic farming is as much as 40 percent less productive than conventional farming, and that transitioning from conventional farming to organic would pump somewhere between 20 and 70 percent more greenhouse gases into the atmosphere.

Just to meet the current demand for food (which is actually expected to increase steadily in the years ahead) and make up this 40 percent shortfall, the independent research team found, the UK would have to dramatically increase its imports of food.

“This has an associated impact on the environment, adding potentially unnecessary food miles and greenhouse gas emissions to our food systems,” said Philip Jones, from the University of Reading, one of the authors of the groundbreaking study.

According to a BBC analysis, “due to significantly lower productivity in other countries, this would require five times the amount of land that is currently used for food in England and Wales, consuming 6 million more hectares of land.”

Organic production and greenhouse gases

The questions surrounding F2F multiply exponentially when you consider greenhouse gas emissions. Growing concerns about climate change—and estimates that one third of greenhouse gas emissions come from agriculture—have helped fuel the market for organic foods, which are perceived as reducing environmental impacts. Many scientists contest those claims.

One of the great early advances of organic farming was the use of compost to promote soil health. But there are sustainability trade-offs. During composting, methane is emitted—a greenhouse gas roughly 30 times more potent than carbon dioxide. Methane is also released in catastrophic amounts by flatulent cows, the primary generators of the organic waste used as fertilizer on organic farms. Cattle are already blamed for generating nearly 20% more greenhouse gases, in carbon-equivalent terms, than driving automobiles. The use of organic fertilizer also often results in the release of nitrous oxide, a highly potent greenhouse gas.

Organic farmers also rely on tillage far more than their conventional counterparts. Many conventional farmers have switched to no-till, ridge-till, and mulch-till (reduced plowing) practices, facilitated by the use of GMO crops, because tillage contributes to soil erosion and the release of greenhouse gases. No-till practices allow the soil structure to stay intact, protecting beneficial microorganisms, fungi and bacteria. They also conserve water, reduce erosion, and cut the unnecessary labor of riding the carbon-belching machinery so common in large-scale organic farming. No-tillage farming has grown sharply over the last two decades in the US, in step with the growth in GMO farming, and now accounts for more than 35 percent of cropland.


One study estimates that using glyphosate herbicide in conjunction with GMO glyphosate-resistant corn and soybeans prevented 41 billion pounds of CO2 from being released into the atmosphere between 1996 and 2013. A 2016 study by Purdue University researchers found that agricultural greenhouse gas emissions would increase by nearly 14 percent if there were a ban on GMOs in the countries now using them. These figures help explain why the US is so far ahead of Europe in toxic pesticide reduction.

Beyond Farm to Fork: How do we put agricultural sustainability ahead of ideology?

If the supporters of the Farm to Fork strategy take seriously their desire to ‘export’ the organic agricultural model to ‘the rest of the world’, they have to soberly reassess the impact of their carbon-increasing strategy. Boutique ideas like urban farming and local production, or reverting world agriculture to more “natural” low-yield, land-intensive and disease-vulnerable farming methods, are the fantasies of an affluent society. Organic farming is like an impulse buy, and such thinly supported decision-making has no place in a document that purports to seriously address the enormous challenges facing the world.

Here is my disappointment with the notions promoted by F2F: they don’t address the real complexity of food and farming; they are bereft of nuance and of a science-based understanding of environmental and economic tradeoffs. Synthetic chemicals are only part of the sustainability equation. Eco-responsibility means different things to different experts. Greenhouse gas emissions? Productivity per acre? Land usage? Labor-intensive vs. mechanized agriculture? These and other factors should be part of a complex, value-based assessment of what constitutes agricultural sustainability.

We could actually begin solving many challenges if we stopped choosing methods based on superficial notions of sustainability and instead looked to outputs and goals. Do we want to feel virtuous, or actually solve real-life problems? Modern technology offers solutions, first and foremost gene editing, which can make plants more resistant to disease, drought, and pests; more nitrogen-efficient (meaning they would need less or no chemical fertilizer); safer (peanuts without the harmful proteins that can kill; wheat without the gluten that is deadly to people with celiac disease); and healthier (crops with heart-healthy omega-3s). The advantages are endless—if we don’t regulate this promising technology to death.

It may not be fashionable to say this, particularly in Europe, but we will continue to need targeted chemical pesticides—a lot of them—complemented by a new suite of genetically engineered products based on synthetic biology with little to no toxic footprint. The toxicity of modern pesticides has dropped 98% since the 1960s and is being reduced every year; organic pesticide toxicity has dropped zero percent since 1960. Should we be judicious and careful going forward? Yes. But let’s listen to the science, not to chemophobic scaremongering, when it comes to setting farm production policy.

We need a food system that is efficient, productive, and environmentally sustainable, and that can provide nutritious food with an ever-smaller environmental footprint. That can only happen if it is based in reality, not wishful thinking.

Jon Entine is founder and executive director of the Genetic Literacy Project. Jon is also known for his research and writings on corporate social responsibility and environmental sustainability, and was US editor for 15 years of the UK-based publication Ethical Corporation. Find Jon on Twitter @JonEntine

A version of this article was originally published at the European Scientist.


Anti-GMO movement merging with anti-vaccine groups, escalating threat to global coronavirus response

As scientists around the world work at an unprecedented pace to develop a vaccine for COVID-19, anti-vaccine proponents are planting seeds of doubt about its safety and effectiveness.

Surveys show that up to nine percent of British people, 18 percent of Austrians and 20 percent of Swiss are wary of or outright opposed to being immunized. The skepticism, unfortunately, is even higher in the US, where a recent survey indicated that just 50 percent of those asked said they would get vaccinated; 31 percent said they were not sure and the remainder said they would not get vaccinated.

The anti-vaccine forces are finding allies in both the developing and developed nations among those who are suspicious of genetic engineering, as any approved vaccine is likely to be a product of biotechnology. Their combined efforts stand to prolong the pandemic by drowning out responsible expert voices that are trying to inform a confused and fearful public.

Ronnie Cummins (right). Credit: Organic Consumers Association

Led by the Organic Consumers Association (OCA), many organic food activists, who are predisposed against GE crops, have been particularly outspoken about the dangers of vaccines. Ronnie Cummins, the organization’s international director, has said vaccines “are dangerous and that’s why I didn’t vaccinate my kids.” He has peddled the falsehood that the novel coronavirus was leaked from a genetic engineering lab—a leak, he claims, covered up by a cabal of the US and Chinese governments, Big Pharma and global scientists.

Cummins’ comments reflect OCA’s general position on vaccination. On its website, the organization cites an obscure Polish study that purports to prove that vaccines have no historical benefits. And in an article titled “How Mainstream Media Insults the Public’s Intelligence on Vaccines,” OCA accuses journalists of displaying “zero tolerance for critical debate about vaccine safety.” The group also strongly opposed California bill SB277, which eliminated all non-health-related exemptions for vaccinations.

The OCA has specifically targeted immigrant groups in spreading its anti-biotechnology and anti-vaccine propaganda. It was among the activist groups that organized an anti-vaccine meeting in Minneapolis in 2017 that attracted many Somali-Americans. Their anti-vaccine message is blamed for an outbreak of measles in the Somali-American community.

Somali-American Suaado Salah comforts her 3-year-old son, who got measles during an outbreak in Minneapolis. Salah had previously refused the MMR vaccine for them because of false rumors that it caused autism. Credit: Courtney Perry/Washington Post

Kris Ehresmann, infectious disease division director at the Minnesota Department of Health, called the outbreak a “public health nightmare” and indicated she was beyond frustrated with the disinformation campaign by anti-vaccine advocates, who have been working against efforts to contain the outbreak.

Alternative health proponents spread disinformation

Fear of the genetic engineering used to produce some vaccines, such as those that fight hepatitis B, rotavirus, Ebola and human papillomavirus (HPV), is a central trope of anti-biotech campaigners. Natural-product peddler and major OCA donor Dr. Joe Mercola links GMOs in food to vaccines to hype the scare factor. He runs what he calls the National Vaccine Information Center (NVIC).

In a presentation on NVIC’s website called “Are you Concerned over GM Vaccines?” Mercola laid out his speculative case against biotech vaccines:

If you’ve ever had qualms about eating genetically modified (GM) foods, you’d likely be deeply concerned about receiving a GM vaccine as well .… We don’t know what portion of the GM DNA can be incorporated into our own genome, we don’t know what portion could be inheritable to our children, we also don’t know what happens when the immune system is exposed to DNA that has been recombined in lots of ways that the human body, through the course of time, has never had any exposure to….

Fringe “naturalists” have driven down support for vaccines in the US. A recent Gallup poll indicated that 84 percent of Americans believe it is extremely or very important that parents get their children vaccinated, a significant drop from 94 percent in 2001.

The alternative health movement is now specifically targeting a COVID-19 vaccine with claims that can only be described as bizarre. In a widely shared YouTube video, Dr. Andrew Kaufman, a natural healing consultant, alleged that a vaccine would provide a vessel to “inject genes” into humans by a procedure known as “electroporation.” During this process, an electric current creates little holes in our cells that allow DNA to enter, followed by the insertion of foreign proteins that supposedly generate immunity. As a result, according to Kaufman, the vaccine will make humans “genetically modified organisms.”

These spurious claims about the dangers of vaccines are often paired with political rhetoric designed to stifle policies that promote immunization. “Though a Covid-19 vaccine is likely still more than a year away, according to experts, concerns over mandatory vaccinations have spread throughout the anti-vaxxer community,” wrote Texas Monthly in a recent article. A woman named Jacqueline Belowsky told the publication she was not concerned about the coronavirus and would treat it like she does any other illness, “naturally and not in a panic.” She added that she “will never accept any vaccine no matter how scary the government makes the situation seem. I will refuse no matter what.”

“The anti-vaccine community at large believes vaccines are a tool of government control that make big pharmaceutical companies rich and have side effects that can cause lasting damage,” Texas Monthly noted.

That’s the line pushed by Children’s Health Defense, a non-profit founded by activist-attorney Robert F. Kennedy Jr, a notorious anti-vaxxer. Kennedy said the rush to find a COVID-19 vaccination instead of focusing on treatments is driven by profit, because “fast-tracked vaccines were a sweetheart deal for both biopharma and government.”

While already exerting troubling influence on public opinion, antivax conspiracies could gain political traction in the 2020 elections. Jo Rae Perkins, who won about 50 percent of the primary vote to become the Republican candidate for the Senate in Oregon, indicated she would not get any vaccine developed in response to the coronavirus. “I don’t know what they are pumping me full of,” she complained. “I don’t want that crap.”

Among the most popular sources of COVID-19 conspiracy thinking was the May 2020 documentary Plandemic. Before Facebook and YouTube banned the incendiary film, it attracted roughly eight million viewers. It alleges that a “shadowy cabal of elites was using the virus and a potential vaccine to profit and gain power,” relying heavily on discredited scientist Judy Mikovits, who claims her research on “contaminated” vaccines has been “buried.” A just-released sequel, Plandemic: InDoctorNation, was preemptively banned by Twitter and Facebook in hopes of preventing a repeat of Plandemic’s viral success.


This onslaught of disinformation is nothing new, vaccine advocates have said. “I’m seeing a very similar pattern that I see when outbreaks of measles happen,” noted Karen Ernst, executive director of the parent-led Voices for Vaccines, in an interview with Undark:

These are people who make this part of their self-identity: I’m a mother, I have a natural lifestyle, I refuse vaccines. It’s important to deny things in order for that identity to be protected.

“It’s not surprising a significant percentage of Americans are not going to take the vaccine because of the terrible messaging we’ve had, the absence of a communication plan around the vaccine and this very aggressive anti-vaccine movement,” added Peter Hotez, Dean of the National School of Tropical Medicine at Baylor College of Medicine, which is developing a vaccine for COVID-19.

International opposition

But opposition to a COVID vaccine is not a uniquely American phenomenon. In Africa, for instance, there have been demonstrations against a vaccine as misinformation about its possible side effects and effectiveness infiltrates the continent. Protesters at a vaccine trial in South Africa carried placards that read, “No safe vaccine.” Seth Berkley, CEO of the GAVI vaccine alliance, told the African Union vaccine conference that anti-vaccine sentiment in Africa is “the worst I’ve ever seen.” In reference to COVID-19, he said, “the rumor mill has been dramatic.”

“Trust in vaccines is generally higher in the developing world where the impact of infectious diseases is more obvious,” said Heidi Larson, Director of the Vaccine Confidence Project. “But here too there could be resistance, particularly if people suspect they are being used as guinea pigs.”

Anti-vaxxers have made headway in Eastern Europe, too. A recent survey conducted in Romania indicated that one third of those surveyed said they would refuse to take a COVID-19 vaccine “under any circumstance.” The poll was taken as the nation’s Parliament debated a bill to make vaccines mandatory. According to Balkan Insight, “The results of the survey come against the backdrop of the growing visibility of the anti-vaccine movement in the country …. Anti-vaccine groups [have] picketed in Bucharest and other cities to denounce what they see as a health hazard and a state abuse of personal freedoms.”


The refusal by a significant minority of the population to be vaccinated could dramatically impact the course of the pandemic. The fewer people who receive a vaccine, the higher the risk that SARS-CoV-2 will continue spreading. In the case of measles, for comparison, herd immunity is achieved when 93-95 percent of the population is vaccinated. For the coronavirus, herd immunity has been estimated at between 60 percent and 80 percent.
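Those percentages track the textbook rule of thumb that the herd immunity threshold is 1 − 1/R0, where R0 is the number of people one infected person infects in a fully susceptible population. The sketch below applies that rule under the usual simplifying assumption of homogeneous mixing; the R0 ranges are commonly cited values, not figures from the surveys above.

```python
# Herd immunity threshold from the standard SIR rule of thumb: 1 - 1/R0.
# Assumes homogeneous mixing; R0 ranges are commonly cited, illustrative values.
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    return 1.0 - 1.0 / r0

for disease, r0_low, r0_high in [("measles", 12.0, 18.0),
                                 ("SARS-CoV-2", 2.5, 5.0)]:
    lo = herd_immunity_threshold(r0_low)
    hi = herd_immunity_threshold(r0_high)
    print(f"{disease}: {lo:.0%}-{hi:.0%}")  # measles ~92-94%, SARS-CoV-2 ~60-80%
```

The arithmetic makes the stakes plain: the more contagious the disease, the smaller the unvaccinated minority that can keep it circulating.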

As COVID-19 vaccines move into the final stages of testing for safety and efficacy, there will no doubt be a ferocious counterattack by anti-vaccine forces and those strongly opposed to genetic engineering. These efforts, unfortunately, are likely to fan confusion and hesitancy among some segments of the public already distrustful of the government’s response to this crisis. As is the case with other vaccines, some people will pay dearly for listening to the false prophets who claim to know “the truth” about vaccines.

Steven E. Cerier is a freelance international economist and a frequent contributor to the Genetic Literacy Project  


CRISPR cows could boost sustainable meat production, but regulations and wary consumers stand in the way

When Ralph Fisher, a Texas cattle rancher, set eyes on one of the world’s first cloned calves in August 1999, he didn’t care what the scientists said: He knew it was his old Brahman bull, Chance, born again. About a year earlier, veterinarians at Texas A&M extracted DNA from one of Chance’s moles and used the sample to create a genetic double. Chance didn’t live to meet his second self, but when the calf was born, Fisher christened him Second Chance, convinced he was the same animal.

Scientists cautioned Fisher that clones are more like twins than carbon copies: The two may act or even look different from one another. But as far as Fisher was concerned, Second Chance was Chance. Not only did they look identical from a certain distance, they behaved the same way as well. They ate with the same odd mannerisms and lay in the same spot in the yard. But in 2003, Second Chance attacked Fisher and tried to gore him with his horns. About 18 months later, the bull tossed Fisher into the air like an inconvenience and rammed him into the fence. Despite 80 stitches and a torn scrotum, Fisher resisted the idea that Second Chance was unlike his tame namesake, telling the radio program “This American Life”: “I forgive him, you know?”

In the two decades since Second Chance marked a genetic engineering milestone, cattle have secured a place on the front lines of biotechnology research. Today, scientists around the world are using cutting-edge technologies, from subcutaneous biosensors to specialized food supplements, in an effort to improve safety and efficiency within the $385 billion global cattle meat industry. Beyond boosting profits, their efforts are driven by an imminent climate crisis, in which cattle play a significant role, and growing concern for livestock welfare among consumers.

Gene editing stands out as the most revolutionary of these technologies. Although gene-edited cattle have yet to be granted approval for human consumption, researchers say tools like Crispr-Cas9 could let them improve on conventional breeding practices and create cows that are healthier, meatier, and less detrimental to the environment. Cows are also being given genes from the human immune system to create antibodies in the fight against Covid-19. (The genes of non-bovine livestock such as pigs and goats, meanwhile, have been hacked to grow transplantable human organs and produce cancer drugs in their milk.)

The hornless offspring of a gene-modified bull (L), alongside a horned control cow, are seen at the University of California-Davis

But some experts worry biotech cattle may never make it out of the barn. For one thing, there’s the optics issue: Gene editing tends to grab headlines for its role in controversial research and biotech blunders. Crispr-Cas9 is often celebrated for its potential to alter the blueprint of life, but that enormous promise can become a liability in the hands of rogue and unscrupulous researchers, tempting regulatory agencies to toughen restrictions on the technology’s use. And it’s unclear how eager the public will be to buy beef from gene-edited animals. So the question isn’t just if the technology will work in developing supercharged cattle, but whether consumers and regulators will support it.

Cattle are catalysts for climate change. Livestock account for an estimated 14.5 percent of greenhouse gas emissions from human activities, of which cattle are responsible for about two thirds, according to the United Nations’ Food and Agriculture Organization (FAO). One simple way to address the issue is to eat less meat. But meat consumption is expected to increase along with global population and average income. A 2012 report by the FAO projected that meat production will increase by 76 percent by 2050, as beef consumption increases by 1.2 percent annually. And the United States is projected to set a record for beef production in 2021, according to the Department of Agriculture.
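Combining the two FAO figures quoted above gives a rough sense of cattle’s overall contribution; a quick back-of-the-envelope calculation, using no inputs beyond those two numbers:

```python
# Quick arithmetic on the FAO figures cited above: cattle's share of all
# human-caused greenhouse gas emissions.
livestock_share = 0.145        # livestock: ~14.5% of anthropogenic emissions
cattle_within_livestock = 2/3  # cattle: about two thirds of the livestock share

print(f"cattle ≈ {livestock_share * cattle_within_livestock:.1%} of emissions")
# ≈ 9.7% of anthropogenic greenhouse gas emissions
```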

Alison Van Eenennaam

For Alison Van Eenennaam, an animal geneticist at the University of California, Davis, part of the answer is creating more efficient cattle that rely on fewer resources. According to Van Eenennaam, the number of dairy cows in the United States decreased from around 25 million in the 1940s to around 9 million in 2007, while milk production has increased by nearly 60 percent. Van Eenennaam credits this boost in productivity to conventional selective breeding.

“You don’t need to be a rocket scientist or even a mathematician to figure out that the environmental footprint or the greenhouse gases associated with a glass of milk today is about one-third of that associated with a glass of milk in the 1940s,” she says. “Anything you can do to accelerate the rate of conventional breeding is going to reduce the environmental footprint of a glass of milk or a pound of meat.”
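Her arithmetic can be checked on the back of an envelope with the herd and production figures above, under the simplifying assumption that total emissions scale with herd size; the sketch below ignores changes in feed, manure handling, and per-cow emissions over the decades, so it is only a rough consistency check.

```python
# Back-of-the-envelope check of the "about one-third" claim, assuming total
# emissions scale with herd size -- a simplification that ignores changes in
# feed, manure handling, and per-cow emissions over the decades.
cows_1940s, cows_2007 = 25e6, 9e6   # US dairy herd sizes cited above
milk_growth = 1.60                   # production up nearly 60% over the period

# Footprint per glass is proportional to (herd size / total milk produced).
relative_footprint = (cows_2007 / cows_1940s) / milk_growth
print(f"per-glass footprint today ≈ {relative_footprint:.1%} of 1940s level")
# Prints ≈ 22.5% -- the same ballpark as Van Eenennaam's "about one-third".
```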

Modern gene-editing tools may fuel that acceleration. By making precise cuts to DNA, geneticists insert or remove naturally occurring genes associated with specific traits. Some experts insist that gene editing has the potential to spark a new food revolution.

Jon Oatley, a reproductive biologist at Washington State University, wants to use Crispr-Cas9 to fine-tune the genetic code of rugged, disease-resistant, and heat-tolerant bulls that have been bred to thrive on the open range. By disabling a gene called NANOS2, he says he aims to “eliminate the capacity for a bull to make his own sperm,” turning the recipient into a surrogate for sperm-producing stem cells from more productive prized stock. These surrogate sires, equipped with sperm from prize bulls, would then be released into range herds that are often genetically isolated and difficult to access, passing the premium genes on to their offspring.

Furthermore, surrogate sires would enable ranchers to introduce desired traits without having to wrangle their herd into one place for artificial insemination, says Oatley. He envisions the gene-edited bulls serving herds in tropical regions like Brazil, the world’s largest beef exporter and home to around 200 million of the approximately 1.5 billion head of cattle on Earth.

Brazil’s herds are dominated by Nelore, a hardy breed that lacks the carcass and meat quality of breeds like Angus but can withstand high heat and humidity. Put an Angus bull on a tropical pasture and “he’s probably going to last maybe a month before he succumbs to the environment,” says Oatley, while a Nelore bull carrying Angus sperm would have no problem with the climate.

The goal, according to Oatley, is to introduce genes from beefier bulls into these less efficient herds, increasing their productivity and decreasing their overall impact on the environment. “We have shrinking resources,” he says, and need new, innovative strategies for making those limited resources last.

Nelore cows, the most common breed in Brazil. Credit: Christopher Borges/Flickr

Oatley has demonstrated his technique in mice but faces challenges with livestock. For starters, disabling NANOS2 does not reliably prevent the surrogate bull from producing some of its own sperm. And while Oatley has shown he can transplant sperm-producing cells into surrogate livestock, researchers have not yet published evidence showing that the surrogates produce enough quality sperm to support natural fertilization. “How many cells will you need to make this bull actually fertile?” asks Ina Dobrinski, a reproductive biologist at the University of Calgary who helped pioneer germ cell transplantation in large animals.

But Oatley’s greatest challenge may be one shared with others in the bioengineered cattle industry: overcoming regulatory restrictions and societal suspicion. Surrogate sires would be classified as gene-edited animals by the Food and Drug Administration, meaning they’d face a rigorous approval process before their offspring could be sold for human consumption. But Oatley maintains that if his method is successful, the sperm itself would not be gene-edited, nor would the resulting offspring. The only gene-edited specimens would be the surrogate sires, which act like vessels in which the elite sperm travel.

Even so, says Dobrinski, “That’s a very detailed difference and I’m not sure how that will work with regulatory and consumer acceptance.”

In fact, American attitudes towards gene editing have been generally positive when the modification is in the interest of animal welfare. Many dairy farmers prefer hornless cows — horns can inflict damage when wielded by 1,500-pound animals — so they often burn them off in a painful process using corrosive chemicals and scalding irons. In a study published last year in the journal PLOS One, researchers found that “most Americans are willing to consume food products from cows genetically modified to be hornless.”

Still, experts say several high-profile gene-editing failures in livestock and humans in recent years may lead consumers to consider new biotechnologies to be dangerous and unwieldy.

In 2014, Recombinetics, a Minnesota startup with which Van Eenennaam’s lab has collaborated, used the gene-editing tool TALENs, a precursor to Crispr-Cas9, to create a pair of cross-bred Holstein bulls, cutting the bovine DNA and altering the genes so the bulls would not grow horns. Holstein cattle, which almost always carry horned genes, are highly productive dairy cows, so using conventional breeding to introduce hornless genes from less productive breeds can compromise the Holstein’s productivity. Gene editing offered a chance to introduce only the genes Recombinetics wanted. The company hoped the experiment would prove that milk from the bulls’ female progeny was nutritionally equivalent to milk from non-edited stock, results that could inform future efforts to make Holsteins hornless but no less productive.

The experiment seemed to work. In 2015, Buri and Spotigy were born. Over the next few years, the breakthrough received widespread media coverage, and when Buri’s hornless descendant graced the cover of Wired magazine in April 2019, it did so as the ostensible face of the livestock industry’s future.


But early last year, a bioinformatician at the FDA ran a test on Buri’s genome and discovered an unexpected sliver of genetic code that didn’t belong. A stretch of bacterial DNA called a plasmid, which Recombinetics had used to make the edit, had been left behind in the bull’s genome, and it carried genes linked to antibiotic resistance in bacteria. After the agency published its findings, the media reaction was swift and fierce: “FDA finds a surprise in gene-edited cattle: antibiotic-resistant, non-bovine DNA,” read one headline. “Part cow, part… bacterium?” read another.

Recombinetics has since insisted that the leftover plasmid DNA was likely harmless and stressed that this sort of genetic slipup is not uncommon.

“Is there any risk with the plasmid? I would say there’s none,” says Tad Sonstegard, president and CEO of Acceligen, a Recombinetics subsidiary. “We eat plasmids all the time, and we’re filled with microorganisms in our body that have plasmids.” In hindsight, Sonstegard says his team’s only mistake was not properly screening for the plasmid to begin with.

While the presence of antibiotic-resistance genes from the plasmid in beef probably does not pose a direct threat to consumers, according to Jennifer Kuzma, a professor of science and technology policy and co-director of the Genetic Engineering and Society Center at North Carolina State University, it does raise the risk of introducing those genes into the microflora of people’s digestive systems. Though the scenario is unlikely, gut organisms could integrate the resistance genes into their own DNA, spreading antibiotic resistance and making bacterial diseases harder to fight.

“The lesson that I think is learned there is that science is never 100 percent certain, and that when you’re doing a risk assessment, having some humility in your technology product is important, because you never know what you’re going to discover further down the road,” she says. In the case of Recombinetics, “I don’t think there was any ill intent on the part of the researchers, but sometimes being very optimistic about your technology and enthusiastic about it causes you to have blinders on when it comes to risk assessment.”

The FDA eventually clarified its results, insisting that the study was meant only to publicize the presence of the plasmid, not to suggest the bacterial DNA was necessarily dangerous. Nonetheless, the damage was done: the blunder quashed Recombinetics’ plan to raise an experimental herd in Brazil.

Backlash to the FDA study exposed a fundamental disagreement between the agency and livestock biotechnologists. Scientists like Van Eenennaam, who in 2017 received a $500,000 grant from the Department of Agriculture to study Buri’s progeny, disagree with the FDA’s strict regulatory approach to gene-edited animals. Typical GMOs are transgenic, meaning they carry genes from other species, but modern gene-editing techniques allow scientists to stay roughly within the confines of conventional breeding, adding and removing traits that occur naturally within the species.

That said, gene editing is not yet error-free, and intended changes sometimes produce unintended alterations, notes Heather Lombardi, division director of animal bioengineering and cellular therapies at the FDA’s Center for Veterinary Medicine. For that reason, the FDA remains cautious.

“There’s a lot out there that I think is still unknown in terms of unintended consequences associated with using genome-editing technology,” says Lombardi. “We’re just trying to get an understanding of what the potential impact is, if any, on safety.”

Bhanu Telugu, an animal scientist at the University of Maryland and president and chief science officer at the agriculture technology startup RenOVAte Biosciences, worries that biotech companies will migrate their experiments to countries with looser regulatory environments. Perhaps more pressingly, he says strict regulation requiring long and expensive approval processes may incentivize these companies to work only on traits that are most profitable, rather than those that may have the greatest benefit for livestock and society, such as animal well-being and the environment.

“What company would be willing to spend $20 million on potentially alleviating heat stress at this point?” he asks.

On a windy winter afternoon, Raluca Mateescu leaned against a fence post at the University of Florida’s Beef Teaching Unit while a Brahman heifer sniffed inquisitively at the air and reached out its tongue in search of unseen food. Since 2017, Mateescu, an animal geneticist at the university, has been part of a team studying heat and humidity tolerance in breeds like Brahman and Brangus (a mix between Brahman and Angus cattle). Her aim is to identify the genetic markers that contribute to a breed’s climate resilience, markers that might lead to more precise breeding and gene-editing practices.

“In the South,” Mateescu says, heat and humidity are a major problem. “That poses a stress to the animals because they’re selected for intense production — to produce milk or grow fast and produce a lot of muscle and fat.”

Like Nelore cattle in South America, Brahman are well-suited for tropical and subtropical climates, but their high tolerance for heat and humidity comes at the cost of lower meat quality than other breeds. Mateescu and her team have examined skin biopsies and found that relatively large sweat glands allow Brahman to better regulate their internal body temperature. With funding from the USDA’s National Institute of Food and Agriculture, the researchers now plan to identify specific genetic markers that correlate with tolerance to tropical conditions.

“If we’re selecting for animals that produce more without having a way to cool off, we’re going to run into trouble,” she says.

A Brahman cow at the University of Florida’s Beef Teaching Unit. Credit: Dyllan Furness

There are other avenues in biotechnology beyond gene editing that may help reduce the cattle industry’s footprint. Although still early in their development, lab-cultured meats may someday undercut today’s beef producers by offering consumers an affordable alternative to conventionally raised beef, without the animal welfare and environmental costs of raising and slaughtering cattle.

Other biotech approaches aim to improve the beef industry without displacing it. In Switzerland, scientists at a startup called Mootral are experimenting with a garlic-based food supplement designed to alter cows’ digestive chemistry and reduce the methane they emit. Studies have shown the product reduces methane emissions by about 20 percent in beef cattle, according to The New York Times.

Mootral’s owner, Thomas Hafner, believes demand will grow as governments require livestock producers to cut methane emissions in order to meet their Paris climate agreement commitments. “We are working from the assumption that down the line every cow will be regulated to be on a methane reducer,” he told The New York Times.

Meanwhile, a farm science research institute in New Zealand, AgResearch, hopes to target methane production at its source by eliminating methanogens, the microbes thought to be responsible for producing the greenhouse gas in ruminants. The AgResearch team is attempting to develop a vaccine to alter the cattle gut’s microbial composition, according to the BBC.

Genomic testing may also allow cattle producers to see what genes calves carry before they’re born, according to Mateescu, enabling producers to make smarter breeding decisions and select for the most desirable traits, whether it be heat tolerance, disease resistance, or carcass weight.

Despite all these efforts, questions remain as to whether biotech can ever dramatically reduce the industry’s emissions or afford humane treatment to captive animals in resource-intensive operations. To many of the industry’s critics, including environmental and animal rights activists, rearing livestock for human consumption is fundamentally at odds with sustainable food production. Rather than revamp the industry, these critics propose alternatives such as meat-free diets to fulfill our need for protein. Indeed, data suggest many young consumers are already incorporating plant-based meats into their meals.

Ultimately, though, climate change may be the most pressing issue facing the cattle industry, according to Telugu of the University of Maryland, which received a grant from the Bill and Melinda Gates Foundation to improve productivity and adaptability in African cattle. “We cannot breed our way out of this,” he says.

Dyllan Furness is a Florida-based science and technology journalist. His work has appeared in Quartz, OneZero, and PBS, among other outlets. Follow him on Twitter @dyllonline

This article was originally published at Undark and has been republished here with permission. Follow Undark on Twitter @undarkmag


Gene therapy for hemophilia delayed until 2022 after FDA rejects one-time treatment, shocking doctors and scientists

U.S. regulators rejected [BioMarin’s] potentially game-changing hemophilia A gene therapy over concerns it might not really be a one-and-done lifetime treatment.

The U.S. Food and Drug Administration’s rejection late [August 18] means the San Rafael, California-based company will have to complete an ongoing late-stage patient study, likely delaying possible approval until late 2022.

The infused therapy, called Roctavian, could have freed hemophilia A patients from frequent, extremely expensive infusions of a blood-clotting therapy to prevent dangerous internal bleeding. It had been highly anticipated by doctors, patients and investors.

In a statement, BioMarin said the company and the FDA had previously agreed on how much patient testing data the agency required to review the therapy, but in its rejection letter the FDA for the first time recommended BioMarin finish the late-stage study and provide two years of follow-up data on the therapy’s safety and efficacy in preventing internal bleeding for all study participants.

Questions about whether it would work for a lifetime or just a few years came amid rumors that BioMarin might set a price tag as high as $3 million per patient. That would top the price of the most expensive therapy ever approved by the FDA… BioMarin has estimated the lifetime cost of current treatments to prevent bleeding at about $25 million, arguing its gene therapy would save far more than it costs.

Read the original post from ABC here

The Genetic Literacy Project’s Ricki Lewis previously addressed BioMarin’s gene therapy:

Hemophilia A, a clotting disorder, may become the third disease for which the US Food and Drug Administration approves a gene therapy, joining treatments for a form of retinal blindness in 2017 and spinal muscular atrophy in 2019.

BioMarin Pharmaceutical Inc. has submitted a biologics license application to the FDA and documentation of clinical trial results to the European Medicines Agency, with reviews slated to begin early this year at both agencies.


An article in the January 2 New England Journal of Medicine from a UK research team presents a phase 3 analysis based on the continuing success of a phase 1/2 trial (rather than a new phase 3 trial). The hemophilia gene therapy, called valoctocogene roxaparvovec for now, is a one-time infusion that could replace the 100 to 150 infusions of clotting factor a patient takes each year, and could also alleviate the painful joint bleeding that is the hallmark of the disease.

The different clotting disorders result from mutations in different genes in the pathway that knits a clot from protein fibrils. Hemophilia A, also called classic hemophilia, is a deficiency of clotting factor VIII and accounts for about 80 percent of hemophilia cases. The clotting disorder that threaded through the royal families of Europe was hemophilia B, a deficiency of factor IX.

Both hemophilias are transmitted by genes on the X chromosome and therefore almost exclusively affect males. One in 10,000 males has hemophilia A, and in about a third of cases it arises as a new mutation rather than being inherited.
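The arithmetic behind that male skew is straightforward. As a simplified sketch under Hardy-Weinberg assumptions, ignoring new mutations and rare effects such as skewed X-inactivation in carrier females: a male has a single X chromosome, so his chance of being affected equals the mutant allele frequency \(q\), while an affected female needs a mutant copy on both of her X chromosomes:

\[
P(\text{affected male}) = q \approx \frac{1}{10{,}000}, \qquad P(\text{affected female}) \approx q^2 = \frac{1}{100{,}000{,}000}
\]

so affected females are expected to be roughly 10,000 times rarer than affected males.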

Read the full GLP article here