
Global food shortage? How advanced breeding could domesticate 50,000 wild, edible plants

January 5, 2016

We’re always talking about improving the crops we have — disease-resistant corn, vitamin-enhanced rice — but what about the plants we can eat but don’t? What about the hundreds of tubers, flowers, grains, and beans that people have eaten in some capacity since time immemorial but have never been put through the process of domestication?

If we want a second Green Revolution, so often promised by supporters of biotech, we’re going to need to leverage some of these ‘orphan crops’; this is the argument of Hillary Rosner in a fascinating Wired feature, “How We Can Tame Overlooked Wild Plants to Feed the World.” In it, she chronicles a small but inspired cadre of scientists who see the future not in staples that were domesticated “when people were just learning to weave clothing,” but in using everything we’ve learned from the several-thousand-year history of domestication to do it again — and do it better — with a new suite of staples better suited to a warming world.

Famed physiologist and author Jared Diamond has called plant and animal domestication “the most important development in the past 13,000 years of human history.” In a review of the history of domestication published in Nature in 2002, Diamond paints a compelling picture of the accidental process by which the first farmers and ranchers got their start.

“[T]he transition from hunting and gathering to farming eventually resulted in more work, lower adult stature, worse nutritional condition and heavier disease burdens” for those who took the first steps toward domestication, wrote Diamond. He offered an example of how the first “farmers” were, at best, only semi-aware of what they were doing, and how their actions eventually came to dramatically change the biology of the plants they gathered:

For example, wild wheats and barley bear their seeds on top of a stalk that spontaneously shatters, dropping the seeds to the ground where they can germinate (but where they also become difficult for humans to gather). An occasional single-gene mutation that prevents shattering is lethal in the wild (because the seeds fail to drop), but conveniently concentrates the seeds for human gatherers. Once people started harvesting those wild cereal seeds, bringing them back to camp, accidentally spilling some, and eventually planting others, seeds with a non-shattering mutation became unconsciously selected for rather than against.

But a shift in climate at the end of the Pleistocene — the end of the last great ice age — led to a drop in the availability of big game and to more stable conditions for farming, so hunter-gatherers eventually turned to cultivation to stabilize their food supply. Then the process of domestication began in earnest. It took, by Rosner’s reporting, some two to four thousand years to domesticate wheat, rice, and barley. And it was done by necessity, circumstance, and a human hand guiding evolution before humanity even had a concept of evolution. When Mendel famously bred his peas and worked out the concept of the gene, he could only observe the outward expression of the plants’ genes.

Today, we can quickly and efficiently examine the genes of a given plant directly; we can read genotypes instead of phenotypes. That means new domestication efforts could be completed inside of a century, rather than taking longer than most of us can even effectively conceptualize. If we start now, your grandkids could have an entirely new suite of staple crops. Some people have already started. Rosner profiles the efforts of U.S. Department of Agriculture geneticist Steven Cannon to domesticate Apios americana, the potato bean, a North American legume.

“Native Americans gathered them and may even have served them at the first Thanksgiving,” writes Rosner. “European settlers found them living in their cranberry bogs — places with low light, few nutrients, and bad soil. But they didn’t bother domesticating them into an agricultural staple.”

The promise of domesticating more plants is provocative: of some 50,000 edible plants, we rely on fewer than 150 for most of our nourishment, and “just three cereal crops — wheat, rice, and corn — make up two-thirds of the world’s calories.”

Our climate appears to be changing again, perhaps as it did at the end of the Pleistocene. The solution may not only be to keep teaching old dogs new tricks — applying biotechnology to the same crops we developed before biotech was born; we might also be served by a suite of entirely new staples. This ties back into an idea I explored some weeks ago, although then in the context of farming methodologies. “Can we improve on nature?” I asked, echoing a blog post from Andrew McGuire at Biofortified.

Improving on nature is the central tenet of any domestication effort. In light of the work of scientists like Cannon, the question becomes, simply, “Can we improve on our improvement?” With 13,000 years of accumulated practical, scientific, and technological experience, the answer should be yes.

Kenrick Vezina is a writer for the Genetic Literacy Project and an educator and naturalist based in the Greater Boston Area. Follow him on Twitter @Rickken.

The GLP featured this article to reflect the diversity of news, opinion and analysis. The viewpoint is the author’s own. The GLP’s goal is to stimulate constructive discourse on challenging science issues.
