CHAPTER 1
LIBERATE YOUR INNER COW: LIFE UNGRAINED
Goldfish do not eat sausages.
—John Cleese, "How to Feed a Goldfish,"
Monty Python's Flying Circus

SINCE YOU ARE reading this book, I take it that you are a member of the species Homo sapiens. You are likely not a giraffe, toad, or woodpecker. Nor are you a ruminant, those taciturn creatures that graze on grass.
Ruminants, such as goats and cows, and their ancient, wild counterparts, ibex and aurochs, enjoy evolutionary adaptations that allow them to consume grasses. They have continuously growing teeth to compensate for the wear generated by coarse, sandlike phytolith particles in grass blades; produce in excess of 100 quarts of saliva per day; have four-compartment stomachs that host unique microorganisms to digest grass components, including a compartment that grinds its contents and regurgitates them as a cud for rechewing; and have a long, spiral colon that also hosts microorganisms to further digest grassy remains. In other words, ruminants have a gastrointestinal system uniquely specialized to consume grasses.
You don't look, smell, or act like a ruminant. Then why would you eat like one?
Those of you who have already forgone wheat do not, of course. But if you remain of the "healthy whole grain"-consuming persuasion, you have fallen victim to the belief that grasses should be your primary source of calories. Just as the Kentucky bluegrass and ryegrass in your backyard are grasses from the biological family Poaceae, so are wheat, rye, barley, corn, rice, bulgur, sorghum, triticale, millet, teff, and oats. You grow teeth twice in your life, then stop, leaving you to make do for a lifetime with a prepubertal set that erupted around age 10; you produce a meager quart of saliva per day; you have a one-compartment stomach--three fewer than a cow's--unpopulated by grass-digesting organisms and incapable of grinding; you don't chew a cud; and you have a relatively uninteresting, linear, nonspiral colon. This anatomy allows you to be omnivorous--but not to consume grasses.
Early members of our species found nourishment by scavenging, and later hunting, animals such as gazelles, turtles, birds, and fish, and by consuming the edible parts of plants, including fruit and roots, as well as mushrooms, nuts, and seeds. Hungry humans instinctively regarded all of these as food. About 10,000 years ago, during a period of increasing temperature and dryness in the Fertile Crescent, humans observed the ibex and aurochs grazing on einkorn, the ancient predecessor of modern wheat. Our hungry, omnivorous ancestors asked, "Can we eat that, too?" They did, and surely got sick: vomiting, cramps, and diarrhea. At the very least, they simply passed the wheat plants through undigested, since humans lack the ruminant digestive apparatus. Grass plants in their intact form are unquestionably unappetizing. We somehow figured out that, for humans, the only edible part of the einkorn plant was the seed--not the roots, not the stem, not the leaves, not the entire seed head--just the seed, and even that was edible only after the outer husk was removed and the seed was chewed or crushed with rocks and then heated in crude pottery over fire. Only then could we consume the seeds of this grass as porridge, a practice that served us well in times of desperation when ibex meat, bird eggs, and figs were in short supply.
Similar grass-consuming adventures occurred with teosinte and maize (the ancestors of modern corn) in the Americas, rice in the swamps of Asia, and sorghum and millet in sub-Saharan Africa, all requiring similar manipulations to allow the edible part--the seed--to be consumed by humans. Some grasses posed additional obstacles: sorghum, for instance, contains poisons (such as hydrocyanic acid, or cyanide) that can cause sudden death when the plant is consumed before maturity. Natural evolution produced wheat strains such as emmer, spelt, and kamut as wheat exchanged genes with other wild grasses, while humans selected strains of corn with larger seeds and seed heads (cobs).
What happened to those first humans, hungry and desperate, who figured out how to make this one component of grasses--the seed--edible? Incredibly, anthropologists have known this for years. The first humans to consume the grassy food of the ibex and aurochs experienced explosive tooth decay; shrinkage of the maxillary bone and mandible, resulting in tooth crowding; iron deficiency; and scurvy. They also experienced a reduction in bone diameter and length, resulting in a loss of as much as 5 inches in height for men and 3 inches for women.1
The deterioration of dental health is especially interesting, as dental decay was uncommon prior to the consumption of the seeds of grasses, affecting less than 1 percent of all teeth recovered--despite the lack of toothbrushes, toothpaste, fluoridated water, dental floss, and dentists. Even with no notion of dental hygiene (aside from perhaps using a twig to pick the fibers of wild boar from between their teeth), early humans enjoyed sturdy, intact teeth for their entire lives; the notion of toothless savages is all wrong. It was only after humans began to resort to the seeds of grasses for calories that mouths of rotten and crooked teeth began to appear in children and adults. From that point on, decay was evident in 16 to 49 percent of all teeth recovered, along with tooth loss and abscesses, making tooth decay as commonplace as bad hair among humans of the agricultural Neolithic Age.2
In short, when we started consuming the seeds of grasses 10,000 years ago, this food source may have allowed us to survive another day, week, or month during times when the foods we had instinctively consumed over the preceding 2.5 million years fell into short supply. But this expedient represents a dietary pattern that constitutes only 0.4 percent--less than one-half of 1 percent--of our time on earth, and the change in dietary fortunes came at a substantial price. From the standpoint of oral health, humans remained in the Dental Dark Ages from their first taste of porridge all the way up until recent times. History is rich with descriptions of toothaches, oral abscesses, and stumbling, painful efforts to extract tainted teeth. Remember George Washington and his notoriously miserable false teeth (made of ivory and human teeth, not wood, despite the legend)? It wasn't until the 20th century that modern dental hygiene was born and we finally managed to keep most of our teeth through adulthood.
Fast-forward to the 21st century: Modern wheat now accounts for 20 percent of all calories consumed by humans; the seeds of wheat, corn, and rice combined make up 50 percent.3 Yes, the seeds of grasses provide half of all human calories. We have become a grass seed-consuming species, a development enthusiastically applauded by agencies such as the USDA, which advises us that increasing our consumption to 60 percent of calories or more is a laudable dietary goal. It's also a situation celebrated by all of those who trade grain on an international scale, since the seeds of grasses have a prolonged shelf life (months to years) that allows transoceanic shipment, are easy to store, don't require refrigeration, and are in demand worldwide--all the traits desirable in a commoditized food. Transforming a foodstuff into a commodity tradeable on a global scale opens the door to financial manipulations--buying and selling futures, hedges, and complex derivative instruments, the tools of mega-commerce. You can't do that with organic blueberries or Atlantic salmon.
Examine the anatomy of a member of the species Homo sapiens and you cannot escape the conclusion that you are not a ruminant, have none of the adaptive digestive traits of such creatures, and can only consume the seeds of grasses--the food of desperation--by accepting a decline in your health. But the seeds of grasses can be used to feed the masses cheaply, quickly, and on a massive scale, all while generating huge profits for those who control the flow of these commoditized foods.
MUTANT NINJA GRASSES
The seeds of grasses, known to us more familiarly as "grains" or "cereals," have always been a problem for us nonruminant creatures. But then busy geneticists and agribusiness got into the act. That's when grains went from bad to worse.
Readers of the original Wheat Belly know that modern wheat is no longer the 4 1/2-foot-tall traditional plant we all remember; it is now an 18-inch-tall plant with a short, thick stalk; a long seed head; and larger seeds. It has a much greater yield per acre than its traditional predecessors. This high-yield strain of wheat, now the darling of agribusiness, was not created through genetic modification, but through repetitive hybridizations--mating wheat with non-wheat grasses to introduce new genes (wheat is a grass, after all)--and through mutagenesis, the use of high-dose x-rays, gamma rays, and chemicals to induce mutations. Yes: Modern wheat is, to a considerable degree, a grass that contains an array of mutations, some of which have been mapped and identified, many of which have not. Such uncertainties never faze agribusiness, however. Unique mutated proteins? No problem. The USDA and FDA say they're okay, too--perfectly fine for public consumption.
Over the years, there have been many efforts to genetically modify wheat, such as by using gene-splicing technology to insert or delete a gene. However, public resistance has dampened efforts to bring genetically modified (GM) wheat to market, so no wheat currently sold is, in the terminology of genetics, "genetically modified." (There have been recent industry rumblings, however, that make true GM wheat a probable reality in the near future.) All of the changes introduced into modern wheat are the result of methods that predate the technology used to create GM foods. That does not mean those methods were benign; in fact, crude and imprecise techniques such as chemical mutagenesis have the potential to be worse than genetic modification, yielding a greater number of unanticipated changes in the genetic code than the handful introduced through gene-splicing.4
Corn and rice, on the other hand, have been genetically modified, in addition to undergoing other changes. For instance, scientists introduced genes to make corn resistant to the herbicide glyphosate and to express a Bacillus thuringiensis (Bt) toxin that kills insects, while rice has been genetically modified to make it resistant to the herbicide glufosinate and to express beta-carotene (a variety called Golden Rice). Problem: While, in theory, the notion of just inserting one silly gene seems simple and straightforward, it is anything but. The methods of gene insertion remain crude. The site of insertion--which chromosome, within or alongside other genes, within or outside various control elements--not to mention disruption of the epigenetic effects that control gene expression, cannot be controlled with current technology. And it's misleading to say that only one gene is inserted, as the methods used usually require several genes to be inserted. (We discuss the nature of specific changes in GM grains in Chapter 2.)
The wheat, corn, and rice that make up 50 percent of the human diet in the 21st century are not the wheat, corn, and rice of the 20th century. They're not the wheat, corn, and rice of the Middle Ages, nor of the Bible, nor of the Egyptian empire. And they are definitely not the same wheat, corn, and rice that were harvested by those early, hungry humans. They are what I call "Frankengrains": hybridized, mutated, and genetically modified to suit the desires of agribusiness, and now available at a supermarket, convenience store, or school near you.
Wheat: What Changed . . . and Why Are the Changes So Bad?
All strains of wheat, including traditional strains like spelt and emmer, are problems for nonruminant humans who consume them. But modern wheat is the worst.
Modern wheat looks different: shorter, with a thicker stalk and larger seeds. The reduction in height is due to mutations in Rht (reduced height) genes, which blunt the plant's response to gibberellin, the growth hormone that controls stalk length. These mutations did not occur in isolation: changes in Rht genes are accompanied by other changes in the genetic code of the wheat plant.5 There's more here than meets the eye.
Gliadin
While gluten is often fingered as the source of wheat's problems, it's really gliadin, a protein within gluten, that is the culprit behind many destructive health effects of modern wheat. There are more than 200 forms of gliadin proteins, all incompletely digestible.6 One important change that has emerged over the past 50 years, for example, is increased expression of a gene called Glia-α9, which yields a gliadin protein that is the most potent trigger for celiac disease. While the Glia-α9 gene was absent from most strains of wheat from the early 20th century, it is now present in nearly all modern varieties,7 likely accounting for the 400 percent increase in celiac disease witnessed since 1948.8
New gliadin variants are partially digested into small peptides that enter the bloodstream and then bind to opiate receptors in the human brain--the same receptors activated by heroin and morphine.9 Researchers call these peptides "exorphins," or exogenous morphine-like compounds. Gliadin-derived peptides, however, generate no "high," but they do trigger increased appetite and increased calorie consumption, with studies demonstrating consistent increases of 400 calories per day, mostly from carbohydrates.
Gluten
Gluten (gliadin + glutenins) is the stuff that confers the stretchiness unique to wheat dough. Gluten is a popular additive in processed foods such as sauces, instant soups, and frozen foods, which means the average person ingests between 15 and 20 grams (g) per day.10 Wheat has been genetically manipulated to improve the baking characteristics of its gluten: geneticists have crossbred wheat strains repeatedly, bred wheat with non-wheat grasses to introduce new genes, and used chemicals and radiation to induce mutations. These breeding methods do not result in predictable changes in gluten quality. Hybridizing two different wheat plants can yield as many as 14 unique glutenin proteins never before encountered by humans.11
Wheat Germ Agglutinin
The genetic changes inflicted on wheat have also altered the structure of wheat germ agglutinin (WGA), a protein in wheat that provides protection against molds and insects; the WGA of modern wheat differs in structure from that of ancient wheat strains.12 WGA is indigestible and toxic: it resists breakdown in the human body and is unchanged by cooking, baking, and sourdough fermentation. Unlike gluten and gliadin, which require genetic susceptibility to exert some of their negative effects, WGA does its damage directly. WGA alone is sufficient to generate celiac disease-like intestinal damage by disrupting microvilli, the absorptive "hairs" of intestinal cells.13