You Are What You Eat

Early farmers ate foods that hunter-gatherers did not eat, or at least they ate them in much greater quantities, and at first they were not well adjusted to the new diet. In Europe and western Asia, cereals became the dietary mainstay, usually wheat or barley, while millet and rice became the primary foods in eastern Asia. Those early farmers raised other crops, such as peas and beans, and they ate some meat, mostly from domesticated animals, but it looks as if the carbohydrate fraction of their diet almost tripled, while the amount of protein tanked.6 Protein quality decreased as well, since plant foods contained an undesirable mix of amino acids, the chemical building blocks from which proteins are made. Almost any kind of meat has the right mix, but plants often do not—and trying to build muscle with the wrong mix is a lot like playing Scrabble with more Q's than U's.

Shortages of vitamins are also likely to have been a problem among those early farmers, since the new diet included little fresh meat and was primarily based on a very limited set of crops. Hunter-gatherers would rarely have suffered vitamin-deficiency diseases such as beri-beri, pellagra, rickets, or scurvy, but farmers sometimes did. There is every reason to think that early farmers developed serious health problems from this low-protein, vitamin-short, high-carbohydrate diet. Infant mortality increased, and the poor diet was likely one of the causes. You can see the mismatch between the genes and the environment in the skeletal evidence. Humans who adopted agriculture shrank: Average height dropped by almost five inches.7

There are numerous signs of pathology in the bones of early agriculturalists. In the Americas, the introduction of maize led to widespread tooth decay and anemia due to iron deficiency, since maize is low in bioavailable iron. This story is not new: Many researchers have written about the health problems stemming from the advent of agriculture.8 Our point is that, over millennia, populations responded to these new pressures. People who had genetic variants that helped them deal with the new diet had more surviving children, and those variants spread: Farmers began to adapt to an agricultural diet. Humanity changed.

We are beginning to understand some of the genetic details of these dietary adaptations, which took several forms. Some of the selected alleles appear to have increased efficiency—that is to say, their bearers were able to extract more nutrients from an agricultural diet. The most dramatic examples are mutations that allow adults to digest lactose, the main sugar in milk. Hunter-gatherers, and mammals in general, stop making lactase (the enzyme that digests lactose) in childhood. Since mother's milk was the only lactose-containing "food" available to humans in days of yore, there wasn't much point in older children or adults making lactase—and shutting down production may have decreased destructive forms of sibling rivalry. But after the domestication of cattle, milk was available and potentially valuable to people of all ages, if only they could digest it. A mutation that caused continued production of lactase originated some 8,000 years ago and has spread widely among Europeans, reaching frequencies of over 95 percent in Denmark and Sweden. Other mutations with a similar effect have become common (despite starting several thousand years later) in some of the cattle-raising tribes in East Africa, so that 90 percent of the Tutsi are lactose tolerant today. These mutations spread quite rapidly and must have been very advantageous.

When you think about it, the whole process is rather strange: Northern Europeans and some sub-Saharan Africans have become "mampires," mutants that live off the milk of another species. We think lactose-tolerance mutations played an important role in history, a subject we will treat at some length in Chapter 6.

Some genetic changes may have helped to compensate for shortages in the new diet. For example, we see changes in genes affecting transport of vitamins into cells.9 Similarly, vitamin D shortages in the new diet may have driven the evolution of light skin in Europe and northern Asia. Vitamin D is produced by ultraviolet radiation from the sun acting on our skin—an odd, plantlike way of going about things. Less is therefore produced in areas far from the equator, where UV flux is low. Since there is plenty of vitamin D in fresh meat, hunter-gatherers in Europe may not have suffered from vitamin D shortages and thus may have been able to get by with fairly dark skin. In fact, this must have been the case, since several of the major mutations causing light skin color appear to have originated after the birth of agriculture. Vitamin D was not abundant in the new cereal-based diet, and any resulting shortages would have been serious, since they could lead to bone malformations (rickets), decreased resistance to infectious diseases, and even cancer. This may be why natural selection favored mutations causing light skin, which allowed for adequate vitamin D synthesis in regions with little ultraviolet radiation.

There were other changes that ameliorated nasty side effects of the new unbalanced diets. The big increase in carbohydrates, especially carbohydrates that are rapidly broken down in digestion, interfered with the control of blood sugar and appears to have caused metabolic problems such as diabetes. A high-carbohydrate diet also apparently causes acne and tooth decay, both of which are rare among hunter-gatherers. More exactly, both are caused by infectious organisms, but those organisms only cause trouble in the presence of a high-carbohydrate diet.

Some of the protective changes took the form of new versions of genes involved in insulin regulation. Researchers in Iceland have found that new variants of a gene regulating blood sugar protect against diabetes.10 Those variants have different ages in the three populations studied (Europeans, Asians, and sub-Saharan Africans), and in each population the protective variant is roughly as old as agriculture. Alcoholic drinks, also part of the new diet, had plenty of bad side effects, and in East Asia there are strongly selected alleles that are known to materially reduce the risk of alcoholism.

Clearly, the evolutionary responses to an agricultural diet must differ from one population to another, since different peoples adopted different kinds of agriculture at different times and in different environments. This variation has caused biological differences in the metabolic responses to an agricultural diet that persist today, but it has also generated differences in every other kind of adaptive response to the new society. Agriculture began in the Middle East 10,000 years ago and took almost 5,000 years to spread throughout Europe. Amerindians in the Illinois and Ohio river valleys adopted maize agriculture only 1,000 years ago, while the Australian Aborigines never domesticated plants at all. Peoples who have farmed since shortly after the end of the Ice Age (such as the inhabitants of the Middle East) must have adapted most thoroughly to agriculture. In areas where agriculture is younger, such as Europe or China, we'd expect to see fewer adaptive changes—except to the extent that the inhabitants were able to pick up genes from older farming peoples. And we'd expect to see fewer adaptive changes still among the Amerindians and sub-Saharan Africans, who had farmed for even shorter times and were genetically isolated from older civilizations by geographical barriers. In groups that had remained foragers, there would presumably be no such adaptive changes—most certainly not in isolated forager populations.

Populations that have never farmed or that haven't farmed for long, such as the Australian Aborigines and many Amerindians, have characteristic health problems today when exposed to Western diets. The most severe such problem currently is a high incidence of type 2 diabetes. Low physical activity certainly contributes to that problem today, but genetic vulnerability is a big part of the story: Navajo couch potatoes are far more likely to get adult-onset diabetes than German or Chinese couch potatoes. The prevalence of diabetes among the Navajo is about two and a half times higher than among their European-descended neighbors, and diabetes is about four times as common among Australian Aborigines as among other Australians. We think this is a consequence of a lesser degree of adaptation to high-carbohydrate diets. Interestingly, Polynesians are also prone to diabetes (with roughly three times European rates), even though they practiced agriculture, raising crops such as yams, taro, bananas, breadfruit, and sweet potato. We believe that their case still fits our general picture of incomplete adaptation, however. Among the Polynesians, adaptation would have been limited by the relatively small population size and the correspondingly low rate of protective mutations it would have generated. In addition, settlement bottlenecks and limited contacts between the populations of the far-flung Polynesian islands would have interfered with the spread of any favorable mutations that did occur.

Our explanation of this susceptibility pattern differs from the well-known "thrifty genotype" hypothesis originally promulgated by James Neel. He suggested that pre-agricultural peoples were especially prone to famine and that the metabolic differences that led to diabetes in modern environments had helped people survive food shortages in the past.11 This seems unlikely. The lower rungs of agricultural societies in Europe and East Asia usually suffered food shortages severe enough to cause below-replacement fertility, and weather-related crop failure often struck whole nations or even larger regions. Sometimes this led to famines severe enough to produce widespread cannibalism, as seems to have occurred in the great famine that struck most of northern Europe from 1315 to 1317.

Hunter-gatherers should have been, if anything, less vulnerable to famine than farmers, since they did not depend on a small set of domesticated plant species (which might suffer from insect pests or fungal blights even in a year with good weather), and because local violence usually kept their populations well below the local carrying capacity.12 State societies limited local violence, but in a Malthusian world, something always limits population growth. In this case, fewer deaths by violence meant more deaths due to starvation and infectious disease. Moreover, hunter-gatherer societies do not appear to have been divided into well-fed elites and hungry lower classes, a situation that virtually guarantees malnourishment and/or famine among a significant fraction of the population, whereas agricultural societies did have divisions of this sort. We believe that our explanation, based on the evolutionary response to a well-established increase in carbohydrate consumption among farmers, is more likely to be correct than an explanation based on the idea that hunter-gatherers were particularly prone to famine, a notion that has no factual support.

Most populations that are highly vulnerable to type 2 diabetes also have increased risks of alcoholism. This is no coincidence. It's not that the same biochemistry underlies both conditions, but that both stem from the same ultimate cause: limited previous exposure to agricultural diets, and thus limited adaptation to such diets.

Booze inevitably accompanies farming. People have been brewing alcoholic beverages since the earliest days of agriculture: Beer may date back more than 8,000 years. There's even a hypothesis that barley was first domesticated for use in brewing beer rather than baking bread. Essentially all agricultural peoples developed and routinely consumed some kind of alcoholic beverage. In those populations with long exposure, natural selection must have gradually increased the frequency of alleles that decreased the risk of alcoholism, given alcoholism's medical and social disadvantages. This process would have gone furthest in old agricultural societies and presumably would not have occurred at all among pure hunter-gatherers.

One might wonder why farming peoples didn't simply evolve an aversion to alcohol. Most likely because an outright aversion would have been a bad strategy: Moderate consumption of traditional, low-proof alcoholic drinks was almost certainly healthful, since people who drank wine or beer avoided waterborne pathogens, a lethal threat in high-density populations. Alleles that reduced the risk of alcoholism, rather than an aversion to drink, therefore prevailed.

There is also some reason to believe that populations that have been drinking alcohol for hundreds of generations may have also evolved metabolic changes that reduced some of alcohol's other risks. In particular, we know that alcohol consumption by pregnant women can have devastating effects on their offspring. Those effects, called fetal alcohol syndrome, or FAS, include growth deficiency, facial abnormalities, and damage to the central nervous system. FAS is, however, far more common in some populations than in others: Its prevalence is almost thirty times higher in African American or Amerindian populations in the United States than it is among Europeans—even though the French, for example, have been known to take a drink or two. Some populations, such as those of sub-Saharan Africa and their diaspora, may run higher risks of suffering from FAS than others consuming similar amounts of alcohol. If so, study of the alleles protecting against FAS in resistant populations might lead to greater understanding of the biochemical mechanisms underlying the syndrome. With luck, we might be able to use that information to decrease the incidence of FAS in vulnerable populations.

This picture of adaptation to agricultural diets has two important implications: Populations today must vary in their degree of adaptation to such diets, depending on their historical experience, and populations must have changed over time.

For example, there must have been a time when no one was lactose tolerant, a later time in which the frequency was intermediate, and finally a time when it reached modern levels. In this instance, we have hard evidence of such change. In a 2007 study, researchers examined DNA from the skeletons of people who died between 7,000 and 8,000 years ago. These skeletons were from central and northern Europe, where today the frequency of the lactase-persistence variant is around 80 percent. None of those ancient northern Europeans had that allele.13 In another study, a different group of researchers looked at central European skeletons from the late Bronze Age, some 3,000 years ago. Back then, the gene frequency (judging from their sample) was apparently around 25 percent.14 This shows that the frequency of lactose tolerance really has changed over time in the way indicated by the HapMap genetic data. The theory made sense, but experimental confirmation is always welcome. We expect many similar results (showing ongoing change in genes undergoing selective sweeps) from studies of ancient DNA over the next few years.
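To give a rough sense of how strong that selection must have been, here is a minimal back-of-envelope sketch, not a calculation from those studies: It assumes a simple genic-selection model (in which the log-odds of the allele frequency change by roughly the selection coefficient each generation), 25-year generations, and the approximate frequencies just cited, about 25 percent some 3,000 years ago versus roughly 80 percent today.

```python
import math

# Back-of-envelope sketch: implied per-generation selection coefficient for the
# lactase-persistence allele, under a simple genic-selection model in which the
# log-odds of the allele frequency grow by about s per generation.
p_then, p_now = 0.25, 0.80      # ~25% some 3,000 years ago vs. ~80% today (from the text)
generations = 3000 / 25         # assuming 25-year generations

s = (math.log(p_now / (1 - p_now)) - math.log(p_then / (1 - p_then))) / generations
print(f"implied selection coefficient s ~ {s:.3f} per generation")  # ~0.02, i.e. about 2%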

Over time, if our argument is correct, farming peoples should have become better adapted to their agricultural diets in many ways, and we might expect that some of the skeletal signs of physiological stress would have gradually decreased. Although such genetic adaptation clearly occurred, cultural changes that improved health must have occurred as well. For example, the adoption of new crops and new methods of food preparation would have improved the nutritional quality of the average peasant's diet. Of course, some of those new methods (polishing rice) and new crops (sugarcane) actually made things worse. Adaptive change is slow and blind, but it is also sure and steady. Cultural change is less reliable.

But cultural change is important. Although many traditional archaeologists and anthropologists will probably see us as biological imperialists out to explain everything that ever happened with our pet genetic theories, we firmly believe that cultural changes—new ideas, new techniques, new forms of social organization—were powerful influences on the historical process. We're simply saying that the complete historical analyst must consider genetic change as well as social, cultural, and political change. Once a list of battles and kings seemed plenty good enough, but life keeps getting more complicated.
