Hunter-gatherer diet

This article is about a modern nutritional approach. For information on the dietary practices of Paleolithic humans, see Paleolithic#Diet and nutrition.

The paleolithic diet (abbreviated paleo diet or paleodiet), also popularly referred to as the caveman diet, Stone Age diet and hunter-gatherer diet, is a modern nutritional plan based on the presumed ancient diet of wild plants and animals that various hominid species habitually consumed during the Paleolithic era—a period of about 2.5 million years which ended around 10,000 years ago with the development of agriculture and grain-based diets. In common usage, the term "paleolithic diet" can also refer to actual ancestral human diets, insofar as these can be reconstructed.[1]

Centered on commonly available modern foods, the contemporary "Paleolithic diet" consists mainly of fish, grass-fed, pasture-raised meats, eggs, vegetables, fruit, fungi, roots, and nuts, and excludes grains, legumes, dairy products, potatoes, refined salt, refined sugar, and processed oils.[2][3]

First popularized in the mid-1970s by gastroenterologist Walter L. Voegtlin,[4][5] this nutritional concept has been promoted and adapted by a number of authors and researchers in several books and academic journals.[6] A common theme in evolutionary medicine,[7][8] Paleolithic nutrition is based on the premise that human genetics have scarcely changed since the dawn of agriculture, and that modern humans are therefore genetically adapted to the diet of their Paleolithic ancestors. On this premise, an ideal diet for human health and well-being is one that resembles this ancestral diet.[3][9]

Proponents of this diet argue that modern human populations subsisting on traditional diets, allegedly similar to those of Paleolithic hunter-gatherers, are largely free of diseases of affluence.[10][11] They assert that multiple studies of the effect of Paleolithic diet in humans have shown improved health outcomes relative to other widely recommended diets.[12][13] Supporters also point to several potentially therapeutic nutritional characteristics of preagricultural diets.[9][14][15][16]

The paleolithic diet is a controversial topic among some dietitians[17][18] and anthropologists.[6][19] An article on NHS Choices, the website of the United Kingdom's National Health Service, refers to it as a fad diet.[20]

History

Gastroenterologist Walter L. Voegtlin was one of the first to suggest that following a diet similar to that of the Paleolithic era would improve a person's health.[5] In 1975, he self-published The Stone Age Diet: Based on In-depth Studies of Human Ecology and the Diet of Man,[4] in which he argued that humans are carnivorous animals. He held that the ancestral Paleolithic diet was that of a carnivore: chiefly fats and protein, with only small amounts of carbohydrates.[21][22] His dietary prescriptions were based on his own medical treatments of various digestive problems, namely colitis, Crohn's disease, irritable bowel syndrome and indigestion.[23][24]

In 1985, S. Boyd Eaton and Melvin Konner, both of Emory University, published a paper on Paleolithic nutrition in the New England Journal of Medicine,[25] which attracted wider mainstream medical attention to the concept.[26] Three years later, S. Boyd Eaton, Konner, and Marjorie Shostak published a book about this nutritional approach,[27] which was based on achieving the same proportions of nutrients (fat, protein, and carbohydrates, as well as vitamins and minerals) as were present in the diets of late Paleolithic people. It did not exclude foods that were not available before the development of agriculture. As such, this nutritional approach included skimmed milk, whole-grain bread, brown rice, and potatoes prepared without fat, on the premise that such foods supported a diet with the same macronutrient composition as the Paleolithic diet.[21][28][29] In 1989, these authors published a second book on Paleolithic nutrition.[30][31]

Starting in 1989, Staffan Lindeberg, a Swedish physician and researcher, now an associate professor at Lund University, led scientific surveys of the non-westernized population on Kitava, one of the Trobriand Islands of Papua New Guinea. These surveys, collectively referred to as the Kitava Study, found that this population apparently did not suffer from stroke, ischemic heart disease, diabetes, obesity or hypertension. Starting with the first publication in 1993,[32] scholars with the Kitava Study have published a number of scientific works on the relationship between diet and western disease.[33] In 2003, Lindeberg published a Swedish-language medical textbook on the subject.[34] In 2010, this book was wholly revised, updated, translated and published for the first time in English.[35]

Since the end of the 1990s, a number of medical doctors and nutritionists[36][37][38] have advocated a return to a so-called Paleolithic (preagricultural) diet.[6] Proponents of this nutritional approach have published books[39][40][41] and created websites[42][43][44][45] to promote their dietary prescriptions.[46][47][48][49][50] They have synthesized diets from modern foods that emulate nutritional characteristics of the ancient Paleolithic diet. Some of these allow specific foods that would have been unavailable to pre-agricultural peoples, such as some animal products (e.g. dairy), processed oils, and beverages.[39][51][52]

Practices

The paleolithic diet is a modern dietary regimen that seeks to mimic the diet of preagricultural hunter-gatherers; it generally corresponds to what was available in any of the ecological niches of Paleolithic humans.[2][3] Based upon commonly available modern foods, it includes cultivated plants and domesticated animal meat as an alternative to the wild sources of the original pre-agricultural diet.[2][53] The ancestral human diet is inferred from historical and ethnographic studies of modern-day hunter-gatherers as well as archaeological finds, anthropological evidence and application of optimal foraging theory.[9][54][55][56]

The Paleolithic diet consists of foods that can be hunted and fished, such as meat, offal, and seafood, and foods that can be gathered, such as eggs, insects, fruit, nuts, seeds, vegetables, mushrooms, herbs, and spices.[2] Recommended meats are free of food additives, preferably wild game and grass-fed beef, since these contain higher levels of omega-3 fats than grain-fed domestic meats.[2][53][57] Food groups that advocates claim were rarely or never consumed by humans before the Neolithic (agricultural) revolution are excluded from the diet, mainly grains, legumes (e.g. beans and peanuts), dairy products, salt, refined sugar and processed oils.[2] Some advocates consider the use of oils with low omega-6/omega-3 ratios, such as olive and canola oils, to be healthy and advisable.[53]


On the Paleolithic diet, practitioners drink mainly water, and some advocates recommend tea as a healthy drink.[53] Eating a wide variety of plant foods is recommended to avoid high intakes of potentially harmful bioactive substances, such as goitrogens, which are present in some roots, vegetables, and seeds.[2][54][58] Unlike raw food diets, all foods may be cooked, without restrictions.[2][59] However, some Paleolithic dieters believe that humans have not adapted to cooked foods, and so eat only foods that are both raw and Paleolithic.[60][61]

According to certain proponents of the Paleolithic diet, practitioners should derive about 56–65% of their food energy from animal foods and 36–45% from plant foods. They recommend a diet high in protein (19–35% energy) and relatively low in carbohydrates (22–40% energy), with a fat intake (28–58% energy) similar to or higher than that found in Western diets.[53][62][63]
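As a rough illustration, the percentage ranges above can be converted into daily gram targets once a total energy intake is fixed. The sketch below assumes an intake of 2,500 kcal/day (an illustrative figure, not drawn from the cited sources) and the standard Atwater factors of 4 kcal/g for protein and carbohydrate and 9 kcal/g for fat.

```python
# Convert the proponents' macronutrient ranges (percent of energy) into
# gram targets for an assumed daily intake. The 2,500 kcal figure is an
# example only, not taken from the sources cited above.
ATWATER = {"protein": 4.0, "carbohydrate": 4.0, "fat": 9.0}  # kcal per gram

RANGES = {  # percent of total energy, per the estimates in the text
    "protein": (19, 35),
    "carbohydrate": (22, 40),
    "fat": (28, 58),
}

def gram_targets(total_kcal):
    """Return {macronutrient: (low_g, high_g)} for a given daily intake."""
    out = {}
    for macro, (lo_pct, hi_pct) in RANGES.items():
        kcal_per_g = ATWATER[macro]
        out[macro] = (total_kcal * lo_pct / 100 / kcal_per_g,
                      total_kcal * hi_pct / 100 / kcal_per_g)
    return out

for macro, (lo, hi) in gram_targets(2500).items():
    print(f"{macro}: {lo:.0f}-{hi:.0f} g/day")
```

At 2,500 kcal/day the protein range of 19–35% of energy works out to roughly 119–219 g/day, which makes concrete how protein-heavy the recommended composition is relative to typical Western intakes.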

Staffan Lindeberg advocates a Paleolithic diet, but does not recommend any particular proportions of plants versus meat or macronutrient ratios.[2][54] According to Lindeberg, calcium supplementation may be considered when the intake of green leafy vegetables and other dietary sources of calcium is limited.[2]

Rationale and evolutionary assumptions

According to S. Boyd Eaton, "we are the heirs of inherited characteristics accrued over millions of years; the vast majority of our biochemistry and physiology are tuned to life conditions that existed before the advent of agriculture some 10,000 years ago. Genetically our bodies are virtually the same as they were at the end of the Paleolithic era some 20,000 years ago."[64]

Paleolithic nutrition has its roots in evolutionary biology and is a common theme in evolutionary medicine.[7][8][65] The reasoning underlying this nutritional approach is that natural selection had sufficient time to genetically adapt the metabolism and physiology of Paleolithic humans to the varying dietary conditions of that era. But in the 10,000 years since the invention of agriculture and its consequent major change in the human diet, natural selection has had too little time to make the optimal genetic adaptations to the new diet.[2] Physiological and metabolic maladaptations result from the suboptimal genetic adaptations to the contemporary human diet, which in turn contribute to many of the so-called diseases of civilization.[3]

More than 70% of the total daily energy consumed by all people in the United States comes from foods such as dairy products, cereals, refined sugars, refined vegetable oils and alcohol. Advocates of the Paleolithic diet assert these foods contributed little or none of the energy in the typical preagricultural hominin diet.[9] Proponents of this diet argue that excessive consumption of these novel Neolithic and industrial-era foods is responsible for the current epidemic levels of obesity, cardiovascular disease, high blood pressure, type 2 diabetes, osteoporosis and cancer in the US and other contemporary Western populations.[9]

Physical activity

Researchers have applied the evolutionary rationale of the paleolithic lifestyle to argue for high levels of physical activity in addition to dietary practices. They suggest that human genes "evolved with the expectation of requiring a certain threshold of physical activity" and that a sedentary lifestyle results in abnormal gene expression.[66][67] Compared to ancestral humans, modern humans often have increased body fat and substantially less lean muscle, which is a risk factor for insulin resistance.[68] Human metabolic processes evolved in the presence of physical activity-rest cycles, which regularly depleted skeletal muscles of their glycogen stores.[69] To date it is unclear whether these activity cycles universally included prolonged endurance activity (e.g. persistence hunting) and/or shorter, higher-intensity activity. S. Boyd Eaton estimated that ancestral humans spent one-third of their caloric intake on physical activity (1,000 kcal/day out of a total caloric intake of 3,000 kcal/day),[70] and that the paleolithic lifestyle was well approximated by the WHO recommendation of a physical activity level of 1.75, or 60 minutes/day of moderate-intensity exercise.[71] L. Cordain estimated that the optimal level of physical activity is on the order of 90 kcal/kg/week (900 kcal/day for a 70 kg human).[67]
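The arithmetic behind these two estimates is simple to check; the sketch below reproduces Eaton's one-third fraction and scales Cordain's per-kilogram weekly figure to the 70 kg example given in the text.

```python
# Check the activity-energy figures quoted above.

# Eaton's estimate: 1,000 kcal/day of activity out of a 3,000 kcal/day
# total intake, i.e. one-third of energy spent on physical activity.
activity_fraction = 1000 / 3000

# Cordain's estimate: ~90 kcal per kg of body mass per week.
def daily_activity_kcal(kcal_per_kg_week, mass_kg):
    """Convert a weekly per-kilogram energy figure to kcal per day."""
    return kcal_per_kg_week * mass_kg / 7

cordain_daily = daily_activity_kcal(90, 70)  # 900.0 kcal/day for 70 kg
```

Both published figures are internally consistent: 90 kcal/kg/week for a 70 kg person is 6,300 kcal/week, or exactly the 900 kcal/day quoted.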

Opposing views

Critics have questioned the accuracy of the science on which the diet is based. John A. McDougall, MD, author of The Starch Solution, has attempted to discredit the science used to formulate the paleolithic diet, proposing instead that the Paleolithic human diet was based primarily on starches.

The evolutionary assumptions underlying the Paleolithic diet have been disputed.[1][19][28][72] According to Alexander Ströhle, Maike Wolters and Andreas Hahn, of the Department of Food Science at the University of Hanover, the statement that the human genome evolved during the Pleistocene (a period from 1,808,000 to 11,550 years ago) rests on the gene-centered view of evolution, which they consider controversial.[72] They rely on Gray (2001)[73] to argue that the evolution of organisms cannot be reduced to the genetic level with reference to mutation, and that there is no one-to-one relationship between genotype and phenotype.[72] They further question the notion that 10,000 years is an insufficient period of time to ensure an adequate adaptation to agrarian diets.[72] They note that alleles conferring lactose tolerance increased to high frequencies in Europe just a few thousand years after animal husbandry was invented, and that recent increases in the number of copies of the gene for salivary amylase, which digests starch, appear to be related to the development of agriculture.[74] Referring to Wilson (1994),[75] Ströhle et al. argue that "the number of generations that a species existed in the old environment was irrelevant, and that the response to the change of the environment of a species would depend on the heritability of the traits, the intensity of selection and the number of generations that selection acts."[76] They state that if the diet of Neolithic agriculturalists had been in discordance with their physiology, this would have created a selection pressure for evolutionary change, and that modern humans such as Europeans, whose ancestors have subsisted on agrarian diets for 400–500 generations, should therefore be adequately adapted to them.

In response to this argument, Wolfgang Kopp states that "we have to take into account that death from atherosclerosis and cardiovascular disease (CVD) occurs later during life, as a rule after the reproduction phase. Even a high mortality from CVD after the reproduction phase will create little selection pressure. Thus, it seems that a diet can be functional (it keeps us going) and dysfunctional (it causes health problems) at the same time."[76] Moreover, S. Boyd Eaton and colleagues have indicated that "comparative genetic data provide compelling evidence against the contention that long exposure to agricultural and industrial circumstances has distanced us, genetically, from our Stone Age ancestors";[11] however, they mention exceptions such as increased lactose and gluten tolerance, which improve the ability to digest dairy and grains, while other studies indicate that human adaptive evolution has accelerated since the Paleolithic.[77]

Referencing Mahner et al. (2001)[78] and Ströhle et al. (2006),[79] Ströhle et al. state that "whatever is the fact, to think that a dietary factor is valuable (functional) to the organism only when there was ‘genetical adaptation’ and hence a new dietary factor is dysfunctional per se because there was no evolutionary adaptation to it, such a panselectionist misreading of biological evolution seems to be inspired by a naive adaptationistic view of life."[72]

Katharine Milton, a professor of physical anthropology at the University of California, Berkeley, has also disputed the evolutionary logic upon which the Paleolithic diet is based. She questions the premise that the metabolism of modern humans must be genetically adapted to the dietary conditions of the Paleolithic.[19] Relying on several of her previous publications,[80][81][82][83] Milton states that "there is little evidence to suggest that human nutritional requirements or human digestive physiology were significantly affected by such diets at any point in human evolution."[19]

There is some evidence suggesting that Paleolithic societies were processing cereals for food use at least 23,000 years ago. These findings are a matter of dispute.[84][85][86][87][88]

Plant-to-animal ratio

The specific plant-to-animal food ratio in the Paleolithic diet is also a matter of some dispute. The average diet among modern hunter-gatherer societies is estimated to consist of 64–68% animal calories and 32–36% plant calories,[63][89] with animal calories further divided between fished and hunted animals in varying proportions (most typically, with hunted animal food comprising 26–35% of the overall diet). As part of the Man the Hunter paradigm, this ratio was used as the basis of the earliest forms of the Paleolithic diet by Voegtlin, Eaton and others. To this day, many advocates of the Paleolithic diet consider a high percentage of animal flesh to be one of the key features of the diet.

However, great disparities exist even between different modern hunter-gatherer societies. The animal-derived calorie percentage ranges from 25% in the Gwi people of southern Africa to 99% in the Alaskan Nunamiut.[90] The animal-derived percentage is skewed upwards by polar hunter-gatherer societies, which have no choice but to eat animal food because plant foods are inaccessible. Since those environments were populated only relatively recently (for example, the Paleo-Indian ancestors of the Nunamiut are thought to have arrived in Alaska no earlier than 30,000 years ago), such diets represent recent adaptations rather than conditions that shaped human evolution during much of the Paleolithic. More generally, hunting and fishing tend to provide a higher percentage of energy in forager societies living at higher latitudes. Excluding cold-climate and equestrian foragers results in a diet structure of 52% plant calories, 26% hunting calories, and 22% fishing calories.[89] Furthermore, those numbers may still not be representative of a typical Stone Age diet, since fishing did not become common in many parts of the world until the Upper Paleolithic period, 35–40 thousand years ago,[91] and early humans' hunting abilities were relatively limited compared to those of modern hunter-gatherers (the oldest incontrovertible evidence for the existence of bows dates only to about 8000 BCE,[92] and nets and traps were invented 20,000 to 29,000 years ago).

Another view is that, up until the Upper Paleolithic, humans were frugivores (fruit eaters) who supplemented their meals with carrion, eggs, and small prey such as baby birds and mussels, and only on rare occasions managed to kill and consume big game such as antelopes.[93] This view is supported by studies of the higher apes, particularly chimpanzees. Chimpanzees are the closest to humans genetically, sharing more than 98% of their DNA with humans, and their digestive tract is functionally very similar to that of humans. Chimpanzees are primarily frugivores, but they can and do consume and digest animal flesh, given the opportunity. In general, their actual diet in the wild is about 95% plant-based, with the remaining 5% made up of insects, eggs, and baby animals.[94][95] In some ecosystems, however, chimpanzees are predatory, forming parties to hunt monkeys.[96] Some comparative studies of human and higher-primate digestive tracts do suggest that humans have evolved to obtain greater amounts of calories from sources such as animal foods, allowing them to shrink the size of the gastrointestinal tract relative to body mass and to increase brain mass instead.[82][97]

A difficulty with the frugivore hypothesis is that humans are known to conditionally require certain long-chain polyunsaturated fatty acids (LC-PUFAs), such as arachidonic acid (AA) and docosahexaenoic acid (DHA), from the diet.[98] Human LC-PUFA requirements are much greater than chimpanzees' because of humans' larger brain mass, and humans' ability to synthesize these fatty acids from other nutrients is poor, suggesting reliance on readily available dietary sources.[99] Pregnant and lactating females require 100 mg of DHA per day.[100] Yet LC-PUFAs are almost nonexistent in plants and in most tissues of warm-climate animals.

The main sources of DHA in the modern human diet are fish and the fatty organs of animals, such as brains, eyes and viscera. Microalgae are a farmed, plant-based source commonly used by vegetarians. Despite the general shortage of evidence for extensive fishing, which is thought to require relatively sophisticated tools that have become available only in the last 30–50 thousand years, it has been argued that exploitation of coastal fauna somehow provided hominids with abundant LC-PUFAs.[99] Alternatively, it has been proposed that early hominids frequently scavenged predators' kills and consumed parts that were left untouched by predators, most commonly the brain, which is very high in AA and DHA.[100] Just 100 g of scavenged African ruminant brain matter provides more DHA than a typical modern U.S. adult consumes in the course of a week.[100][101] Other authors have suggested that the human ability to convert alpha-linolenic acid into DHA, while poor, is nevertheless adequate to prevent DHA deficiency in a plant-based diet.[102]

Nutritional factors and health effects


Since the end of the Paleolithic period, several foods that humans rarely or never consumed during previous stages of their evolution have been introduced as staples in their diet.[9] With the advent of agriculture and the beginning of animal domestication roughly 10,000 years ago, during the Neolithic Revolution, humans started consuming large amounts of dairy products, beans, cereals, alcohol and salt.[9] In the late 18th and early 19th centuries, the Industrial Revolution led to the large-scale development of mechanized food-processing techniques and intensive livestock farming methods that enabled the production of refined cereals, refined sugars and refined vegetable oils, as well as fattier domestic meats, which have become major components of Western diets.[9]

Such food staples have fundamentally altered several key nutritional characteristics of the human diet since the Paleolithic era, including glycemic load, fatty acid composition, macronutrient composition, micronutrient density, acid-base balance, sodium-potassium ratio, and fiber content.[9]

These dietary compositional changes have been theorized as risk factors in the pathogenesis of many of the so-called "diseases of civilization" and other chronic illnesses that are widely prevalent in Western societies,[3][9][103][104][105][106] including obesity,[107][108][109] cardiovascular disease,[110][111][112] high blood pressure,[113] type 2 diabetes,[114][115] osteoporosis,[116][117] autoimmune diseases,[118] colorectal cancer,[119][120][121] myopia,[122] acne,[123][124][125][126] depression,[127] and diseases related to vitamin and mineral deficiencies.[118][128][129][130]

Macronutrient composition

Protein and carbohydrates

"The increased contribution of carbohydrate from grains to the human diet following the agricultural revolution has effectively diluted the protein content of the human diet."[131] In modern hunter-gatherer diets, dietary protein is characteristically elevated (19–35% of energy) at the expense of carbohydrate (22–40% of energy).[62][63][132] High-protein diets may have a cardiovascular protective effect and may represent an effective weight loss strategy for the overweight or obese.[9] Furthermore, carbohydrate restriction may help prevent obesity and type 2 diabetes,[133][134] as well as atherosclerosis.[112] Carbohydrate deprivation to the point of ketosis has been argued both to have negative[135] and positive effects on health.[136][137]


The notion that preagricultural hunter-gatherers would have typically consumed a diet relatively low in carbohydrate and high in protein has been questioned.[138] Critics argue that there is insufficient data to identify the relative proportions of plant and animal foods consumed on average by Paleolithic humans in general,[6][19][79][139] and they stress the rich variety of ancient and modern hunter-gatherer diets.[1][19][72] Furthermore, preagricultural hunter-gatherers may have generally consumed large quantities of carbohydrates in the form of carbohydrate-rich tubers (plant underground storage organs).[1][18][72] According to Staffan Lindeberg, an advocate of the Paleolithic diet, a plant-based diet rich in carbohydrates is consistent with the human evolutionary past.[2][3]

It has also been argued that relative freedom from degenerative diseases was, and still is, characteristic of all hunter-gatherer societies irrespective of the macronutrient characteristics of their diets.[140][141][142] Marion Nestle, a professor in the Department of Nutrition and Food Studies at New York University, has suggested that plant-based diets may be most associated with health and longevity, judging from research relating nutritional factors to chronic disease risks and from observations of exceptionally low chronic disease rates among people eating vegetarian, Mediterranean and Asian diets.[17][139]

Fatty acids

Hunter-gatherer diets have been argued to maintain relatively high levels of monounsaturated and polyunsaturated fats, moderate levels of saturated fats (10–15% of total food energy[143]), and a low omega-6:omega-3 fatty acid ratio.[9][63][144] Cows fed a grass-based diet produce significantly more omega-3 fatty acids than grain-fed animals, with lower levels of trans fats and saturated fats.[145] This high ratio of polyunsaturated to saturated fats has been challenged: while some have argued for a low saturated fat intake,[63] others contend that hunter-gatherers would selectively hunt fatter animals and utilise the fattiest parts of the animals (such as bone marrow).[146]

Energy density

The Paleolithic diet has a lower energy density than the typical diet consumed by modern humans.[147] This is especially true of primarily plant-based/vegetarian versions of the diet, but it still holds if substantial amounts of meat are included in the calculations. For example, most fruits and berries contain 0.4 to 0.8 calories per gram, and vegetables can be even lower (cucumbers contain only 0.16 calories per gram).[148] Game meat, such as cooked wild rabbit, is more energy-dense (up to 1.7 calories per gram), but it does not constitute the bulk of the diet by mass or volume at the recommended plant/animal ratios, and it does not reach the densities of many processed foods commonly consumed by modern humans: most McDonald's sandwiches, such as the Big Mac, average 2.4 to 2.8 calories per gram,[149] and sweets such as cookies and chocolate bars commonly exceed 4 calories per gram.
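The comparison above reduces to a per-gram calculation. The sketch below tabulates the energy densities quoted in the text (in kcal per gram) and shows how sharply equal portion masses diverge in energy content; the food names and portion sizes are illustrative.

```python
# Energy densities quoted in the text, in kcal per gram.
KCAL_PER_GRAM = {
    "cucumber": 0.16,
    "mixed berries": 0.6,        # fruit ranges roughly 0.4-0.8
    "cooked wild rabbit": 1.7,
    "Big Mac": 2.4,              # lower end of the 2.4-2.8 range
    "chocolate bar": 4.0,
}

def portion_kcal(food, grams):
    """Energy contained in a portion of the given mass."""
    return KCAL_PER_GRAM[food] * grams

# Equal masses, very different energy: gram for gram, chocolate carries
# 25 times the energy of cucumber (4.0 / 0.16).
ratio = portion_kcal("chocolate bar", 200) / portion_kcal("cucumber", 200)
```

The 25-fold spread between the least and most energy-dense items is the point of the passage: at low energy density, a satiating mass of food carries comparatively little energy.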

There is substantial evidence that people consuming high-energy-density diets are prone to overeating and are at a greater risk of weight gain. Conversely, low-energy-density diets tend to provide a greater feeling of satiety at the same energy intake, and they have been shown to be effective at achieving weight loss in overweight individuals without explicit caloric restriction.[150][151][152]

Even some authors who may otherwise appear to be critical of the concept of Paleolithic diet have argued that high energy density of modern diets, as compared to ancestral/primate diets, contributes to the incidence of diseases of affluence in the industrial world.[83]

Micronutrient density


Fruits, vegetables, meat and organ meats, and seafood, which are staples of the hunter-gatherer diet, are more micronutrient-dense than refined sugars, grains, vegetable oils, and dairy products in relation to digestible energy. Consequently, the vitamin and mineral content of the diet is very high compared with a standard diet, in many cases a multiple of the RDA. Fish and seafood represent a particularly rich source of omega-3 fatty acids and other micronutrients, such as iodine, iron, zinc, copper, and selenium, that are crucial for proper brain function and development.[128] Terrestrial animal foods, such as muscle, brain, bone marrow, thyroid gland, and other organs, also represent a primary source of these nutrients.[58] Two notable exceptions are calcium (see below) and vitamin D, both of which may be present in the diet in inadequate quantities. Calcium-poor grains and legumes are excluded from the diet, although leafy greens such as kale and dandelion greens, as well as nuts such as almonds, are rich sources of calcium, and components in plants make their lower calcium content more easily absorbed than that of high-calcium foods such as dairy.[153][154][155][156] Modern humans require much more vitamin D than hunter-gatherers did, because they do not get the same amount of sun exposure. This need is commonly satisfied in developed countries by artificially fortifying dairy products with the vitamin. To avoid deficiency, a modern human on a hunter-gatherer diet would have to take vitamin D supplements, ensure adequate intake of fatty fish,[157] or increase sun exposure (it has been estimated that 30 minutes of exposure to mid-day sun twice a week is adequate for most people).[158]

Fiber content and glycemic load

Despite its relatively low carbohydrate content, the Paleolithic diet involves a substantial increase in the consumption of fruit and vegetables compared to the Western diet, potentially as high as 1.65 to 1.9 kg/day.[159] Hunter-gatherer diets, which rely on uncultivated, heavily fibrous fruit and vegetables, contain even more. Fiber intake in preagricultural diets is thought to have exceeded 100 g/day.[64] This is dramatically higher than the current U.S. intake of 15 g/day.[64]


Unrefined wild plant foods like those available to contemporary hunter-gatherers typically exhibit low glycemic indices.[160] Moreover, dairy products, such as milk, have low glycemic indices, but are highly insulinotropic, with an insulin index similar to that of white bread.[161][162] However, in fermented milk products, such as yogurt, the presence of organic acids may counteract the insulinotropic effect of milk in mixed meals.[163] These dietary characteristics may lower risk of diabetes, obesity and other related metabolic syndrome diseases by placing less stress on the pancreas to produce insulin due to staggered absorption of glucose, thus preventing insulin insensitivity.[164]

Sodium-potassium ratio

It has been estimated that people in the Paleolithic era consumed 11,000 mg of potassium and 700 mg of sodium daily.[25]

The dominance of sodium over potassium in the U.S. diet adversely affects cardiovascular function and contributes to hypertension and stroke:[117][165] the Paleolithic diet inverts this ratio.
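The inversion can be made concrete by computing the sodium-to-potassium ratio directly. The Paleolithic figures below come from the estimate above; the modern U.S. figures (roughly 3,400 mg sodium and 2,600 mg potassium per day) are typical survey values added here purely for illustration and are not drawn from the cited sources.

```python
# Sodium-to-potassium ratio: a value below 1 means potassium dominates,
# above 1 means sodium dominates.
def na_k_ratio(sodium_mg, potassium_mg):
    return sodium_mg / potassium_mg

# Paleolithic estimate quoted in the text: 700 mg Na, 11,000 mg K.
paleolithic = na_k_ratio(700, 11000)   # ~0.06: potassium-dominant

# Assumed typical modern U.S. intake (illustrative figures only).
modern_us = na_k_ratio(3400, 2600)     # ~1.31: sodium-dominant
```

On these figures the ratio flips by a factor of roughly twenty between the two diets, which is what the text means by the Paleolithic diet "inverting" the modern sodium-potassium relationship.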

Calcium and acid-base balance

Diets containing high amounts of animal products, animal protein, processed foods, and other foods that induce and sustain increased acidity of body fluid may contribute to the development of osteoporosis and renal stones, loss of muscle mass, and age-related renal insufficiency due to the body's use of calcium to buffer pH.[166][167] The paleo diet may not contain the high levels of calcium recommended in the U.S. to prevent these effects.[168] However, because of the absence of cereals and energy-dense, nutrient-poor foods in the ancestral hunter-gatherer diet—foods that displace base-yielding fruits and vegetables—that diet has been estimated to produce a net base load on the body, as opposed to a net acid load,[116] which may reduce calcium excretion.[169]

Bioactive substances and antinutrients

Cereal grains, legumes and milk also contain bioactive substances, such as gluten and casein, which have been implicated in the development of various health problems.[3] Consumption of gluten, a component of certain grains such as wheat, rye and barley, is known to have adverse health effects in individuals suffering from a range of gluten sensitivities, including celiac disease. Since the Paleolithic diet is devoid of cereal grains, it is free of gluten. The paleo diet is also casein-free. Casein, a protein found in milk and dairy products, may impair glucose tolerance in humans.[3]

Compared to Paleolithic food groups, cereal grains and legumes contain high amounts of antinutrients, including alkylresorcinols, alpha-amylase inhibitors, protease inhibitors, lectins and phytates, substances known to interfere with the body's absorption of many key nutrients.[3][104][118] Molecular-mimicking proteins, amino acid sequences that closely resemble those of other, entirely different proteins, are also found in grains and legumes, as well as in milk and dairy products.[3][104][118] Advocates of the Paleolithic diet have argued that these components of agrarian diets promote vitamin and mineral deficiencies and may explain the development of the "diseases of civilization" as well as a number of autoimmune-related diseases.[3][104][118]

Research

Archeological record

One line of evidence cited in support of the Stone Age diet is the decline in human health and body mass that accompanied the adoption of agriculture at the end of the Paleolithic era.[1][118] The introduction of domesticated and processed plant foods, such as cereal grains, into the human diet was associated in many areas with a general decrease in body stature and dentition size and an increase in rates of dental caries. There is evidence of a general decline in health in some areas; whether that decline was caused by dietary change remains academically debated.[6][170][171]

Observational studies

Based on the subsistence patterns and biomarkers of hunter-gatherers studied in the last century, advocates argue that modern humans are well adapted to the diet of their Paleolithic ancestors.[172] The diet of modern hunter-gatherer groups is believed to be representative of human dietary patterns of fifty thousand to twenty-five thousand years ago.[172] Individuals from these and other technologically primitive societies,[173][174] including those who reach the age of 60 or beyond,[32][175] appear to be largely free of the signs and symptoms of chronic diseases, such as obesity, high blood pressure, nonobstructive coronary atherosclerosis, and insulin resistance, that universally afflict the elderly in western societies (with the exception of osteoarthritis, which afflicts both populations).[3][11][172] Moreover, when these people adopt western diets, their health declines and they begin to exhibit signs and symptoms of "diseases of civilization".[10][172] In one clinical study, stroke and ischaemic heart disease appeared to be absent in a population on the island of Kitava, in Papua New Guinea, which maintained a subsistence lifestyle uninfluenced by western dietary habits.[32][176]

One of the most frequent criticisms of the Paleolithic diet is that preagricultural hunter-gatherers were unlikely to suffer from the diseases of modern civilization simply because they did not live long enough to develop these illnesses, which are typically associated with old age.[11][18][177][178][179] According to S. Jay Olshansky and Bruce Carnes, "there is neither convincing evidence nor scientific logic to support the claim that adherence to a Paleolithic diet provides a longevity benefit."[179] In response, advocates of the paleodiet state that while Paleolithic hunter-gatherers did have a short average life expectancy, modern human populations with lifestyles resembling those of our preagricultural ancestors exhibit few or no diseases of affluence, despite having sufficient numbers of elderly.[11][180] In hunter-gatherer societies for which demographic data are available, the elderly are present but have high mortality rates and rarely survive past the age of 80, with causes of death (when known) ranging from injuries to measles and tuberculosis.[181]

Critics further contend that food energy excess, rather than the consumption of specific novel foods, such as grains and dairy products, underlies the diseases of affluence.[1][18][182] According to Geoffrey Cannon,[18] science and health policy advisor to the World Cancer Research Fund, humans are designed to work hard physically to produce food for subsistence and to survive periods of acute food shortage, and are not adapted to a diet rich in energy-dense foods.[183] Similarly, William R. Leonard, a professor of anthropology at Northwestern University, states that the health problems facing industrial societies stem not from deviations from a specific ancestral diet but from an imbalance between calories consumed and calories burned, a state of energy excess uncharacteristic of ancestral lifestyles.[182]

Intervention studies

The first animal experiment on a Paleolithic diet, conducted in 24 domestic pigs, suggested that this diet, compared with a cereal-based diet, conferred higher insulin sensitivity, lower C-reactive protein and lower blood pressure.[184] There was no difference in basal serum glucose.[184] The first randomized controlled trial in humans involved 29 people with glucose intolerance and ischemic heart disease; it found that those on a Paleolithic diet had a greater improvement in glucose tolerance than those on a Mediterranean diet.[12][185] The Paleolithic diet was also found to be more satiating per calorie than the Mediterranean diet.[186]

A randomized, controlled, cross-over clinical study in the primary care setting compared the Paleolithic diet with a diet commonly prescribed for type 2 diabetes. The Paleolithic diet resulted in lower mean values of HbA1c, triacylglycerol, diastolic blood pressure, body mass index and waist circumference, and higher values of high-density lipoprotein, compared with the diabetes diet. Glycemic control and other cardiovascular risk factors improved on both diets, without significant differences between them. The Paleolithic diet was also lower in total energy, energy density, carbohydrate, dietary glycemic load and glycemic index, saturated fatty acids and calcium, but higher in unsaturated fatty acids, dietary cholesterol and some vitamins.[187] Two clinical trials designed to test various physiological effects of the Paleolithic diet are currently underway,[188][189] and the results of one completed trial[190] showed metabolic and physiologic improvements.[13]

The European Journal of Clinical Nutrition published a study[191] of a trial of the Paleolithic diet in 20 healthy volunteers. The study had no control group, and only 14 individuals completed the diet. Over three weeks, these participants showed an average weight reduction of 2.3 kg, an average reduction in waist circumference of 1.5 cm (about one-half inch), an average reduction in systolic blood pressure of 3 mm Hg, and a 72% reduction in plasminogen activator inhibitor-1 (which might translate into a reduced risk of heart attack and stroke). The NHS Knowledge Service pointed out that this study, like most human diet studies, relied on observational data, and concluded that the lack of a control group and the small sample size compromised its conclusions. With only 14 participants, the study lacked the statistical power to detect health improvements, and the mere fact that these individuals knew they were on a diet program may have made them more aware of their weight and exercise habits, skewing the results.[192]

Reception

Critics have argued that to the extent that hunter-gatherer societies fail to suffer from "diseases of civilization", this may be due to reduced calories in their diet, shorter average lifespans, or a variety of other factors, rather than dietary composition.[140] Some researchers have also taken issue with the accuracy of the diet's underlying evolutionary logic or suggested that the diet could potentially pose health risks.[1][72][139][140]

A 2011 ranking by U.S. News & World Report, involving a panel of 22 experts, placed the Paleo diet lowest of the 20 diets evaluated, based on factors including health, weight loss and ease of adherence.[193] The result was repeated in the 2012 ranking, in which the diet tied with the Dukan diet for last place out of 29 diets; U.S. News & World Report stated that its experts "took issue with the diet on every measure".[193] However, one expert involved in the ranking stated that a "true Paleo diet might be a great option: very lean, pure meats, lots of wild plants. The modern approximations… are far from it."[193] He added that "duplicating such a regimen in modern times would be difficult."[193]

The U.S. News ranking assumed a low-carb version of the paleo diet, specifically containing only 23% carbohydrates.[194] Higher carbohydrate versions of the paleo diet, which allow for significant consumption of root vegetables,[195] were not a part of this ranking.[193] Dr. Loren Cordain, a proponent of a low-carbohydrate Paleolithic diet, responded to the U.S. News ranking, stating that their "conclusions are erroneous and misleading" and pointing out that "five studies, four since 2007, have experimentally tested contemporary versions of ancestral human diets and have found them to be superior to Mediterranean diets, diabetic diets and typical western diets in regard to weight loss, cardiovascular disease risk factors and risk factors for type 2 diabetes."[15][16] The editors of the U.S. News ranking replied that they had reviewed the five studies and found them to be "small and short, making strong conclusions difficult".[16]
