Background
In 1871, the naturalist Charles Darwin, working without a fossil record or the benefit of modern research, proposed that humans and African apes must have shared a common ancestor. Today, genetic studies confirm that humans' closest living relative is the African chimpanzee. Incredibly, our two species share about 98% of the same genes, but this close relationship does not mean that humans evolved from chimpanzees, nor that chimpanzees will evolve into humans. What it does mean is that, millions of years in the past, we shared a common ancestor. Ancestors look less human and more ape-like as they are traced further back into ancient history. The fossil record gives important clues about where we come from. Eight million years ago a major portion of the African continent was covered in lush forests. A diversity of apes thrived in these forests, feeding and easily navigating this world above the ground. Evolutionary adaptations in their anatomy, such as grasping toes and joint mobility in the arms and shoulders, made the apes extremely successful animals in a forest environment 4. But beginning about 6 million years ago, the world became a much drier and colder place. In the African forests, trees started to thin and were gradually replaced by open woodlands. Although most of the ape species went extinct, in time a few started to adapt to the new environment. One species that survived was a common ancestor of the African apes and humans.
The direct lineage from the common ancestor of humans and the modern apes to modern man is not known, but evidence is accumulating, and thousands of fossil finds fit the general pattern. The word hominid describes the member species of the human family that have lived since the last common ancestor of both man and the apes; an individual hominid is a single species within that family. The field of science that studies the human fossil record is known as paleoanthropology, the intersection of paleontology (the study of ancient life forms) and anthropology (the study of humans). Each hominid name consists of a genus name (e.g. Australopithecus, Homo), which is always capitalized, and a species name (e.g. africanus, erectus), which is always lower case.
Human Evolutionary History
The fossil record has so far shown the following evolutionary sequence. The most primitive hominid yet found, Ardipithecus ramidus, lived approximately 4.4 million years ago (mya), in the early Pliocene epoch; it has features more similar to chimpanzees than to humans. Fossils found near A. ramidus imply that this species lived in a forest and may have walked upright. 'Ardi' means ground or floor in the Afar language, suggesting these were among the first species to live not in the trees but on the forest floor. From 4.2 to 3.9 mya (still early Pliocene), Australopithecus anamensis had a very primitive jaw yet a quite human-like humerus (upper arm bone), and the structure of its tibia (lower leg bone) suggests upright movement on two feet. The famous Lucy fossil (Australopithecus afarensis), found by Donald Johanson in 1974, indicates a small brain and chimpanzee-like oral muscles, but human-like hip and leg bones. Lucy and her species lived 3.5 to 2.9 mya (late Pliocene). The species following Australopithecus afarensis was similar but had a somewhat larger brain; Australopithecus africanus' teeth indicate a diet of vegetation. Next, Australopithecus robustus lived 2.1 to 1.6 mya (late Pliocene to early Pleistocene); the 'robustus' in its name refers to its thicker skull and strong jaw. Skulls and teeth continued to grow with Australopithecus boisei, 2.3 to 1.1 mya (late Pliocene to early Pleistocene). Homo habilis, which lived about 2.4 to 1.5 mya (late Pliocene to early Pleistocene), was the first species to show the use of tools. From 1.8 mya to 300,000 years ago (early to late Pleistocene), Homo erectus roamed the Earth; it marks a departure from previous hominid forms in its augmented body size and basic digestive anatomy 5. Homo sapiens (archaic) continued skull and brain growth and rounded out the back of the skull.
Skeletal remains of archaic Homo sapiens have been found all over Africa and Europe, dating from 500,000 to 200,000 years ago (Middle to Late Pleistocene). The last fossil record before modern humans is Homo sapiens neanderthalensis, which lived 230,000 to 20,000 years ago (Late Pleistocene) and had shorter limbs and a larger brain than modern humans.
These changes eventually produced the modern Homo sapiens familiar today. Interestingly, the genes of the earliest modern humans and the newest modern newborn are almost identical. "Genetically, we are still cavewomen and cavemen despite living in ultra modern homes 6." However, human lives, compared with those of other primates and most mammals, have at least three unique characteristics: children are dependent upon parental figures longer than in other mammals, humans have an unusually lengthy lifespan, and humans have a large brain with advanced psychological attributes, including an increased ability for learning, understanding, and intuition 7. The combination of these qualities is specific to humans and is a distinct example of evolution in process. Each of the ancestors of modern Homo sapiens evolved over time. Evolution can occur quickly or slowly, and it has no completion; it is not a marathon with a beginning and an end, but an ongoing process. Many hominid ancestors existed concurrently, yet this was not immediately evident and has taken paleoanthropologists more than a century to discover.
Dietary Evolution
In addition to the attributes mentioned above, diets and eating habits have inevitably changed between modern humans and our ancient ancestors. Early ancestral species are often stereotyped as herbivores, but this generalized belief is simply incorrect. The evolution from herbivore to omnivore, however, was not an overnight process and is not clearly understood; fossil evidence supports many theories. There seems to have been a gradual increase in dietary flexibility between Ardipithecus ramidus and Australopithecus africanus 8. A great deal can be told from the differences in the structure, size, and shape of teeth between species, but to understand these changes one must understand the uses of each type of tooth. Foods that are hard to break apart are usually sheared by the edges of sharp crests (incisors); this type of food may first be held by the canine teeth before tearing begins; and hard, brittle foods are crushed by flat surfaces (molars). All australopithecines have relatively flat molar teeth compared with many living and fossil apes 9. These teeth were best at breaking down hard but brittle foods, including most fruits and nuts, and soft, weak foods were also not a problem. But the teeth were not equipped to break down tough, pliable foods like stems or meat 9.
Australopithecus anamensis shows the first indications of thicker molar enamel in a hominid, and its molar teeth were equivalent in size to those of A. afarensis. Its lower jaw is intermediate in size between those of the living great apes and the later australopithecines. This combination of features suggests that A. anamensis may have been the first hominid able to eat hard, and perhaps abrasive, foods. Australopithecus afarensis was similar to A. anamensis in tooth size and enamel thickness, yet it showed a large decrease in lower jaw size. Decreased molar size, a smaller lower jaw, and changes in incisor shape all imply a greater emphasis on foods requiring less grinding and more tearing, such as meat 10. Either way, hard and perhaps abrasive foods may have become even more important components of the diet of A. afarensis, and the subsequent "robust" australopithecines do show such patterns.
Nutritionally, the ability to grind foods, especially nuts and seeds, would greatly increase consumption of fatty foods. This change (increased consumption of hard and abrasive foods) probably increased intake of vegetable fat and would have facilitated overall access to food energy 10. Additional energy may have supported additional cognitive processes. Finally, sometime in the late Pliocene epoch, during the reigns of Australopithecus africanus and Homo habilis, evidence begins for the consumption of animals as food. About 2.5 million years ago, animal foods began to occupy an increasingly prominent place in our ancestors' dietary subsistence 10. Animal protein is far more complete and concentrated than plant protein, so animal protein and fat in the diet dramatically increase caloric density; Homo habilis could therefore eat less and yet consume more calories. These additional calories may have facilitated the increase in stature from Homo habilis to Homo erectus. Animal fat also contains three essential fatty acids that are found in brain gray matter 10, which might explain the changes in brain structure and size recorded in the fossil record.
With a need to feed less often, Homo habilis had more time to invest in hunting, developing cooperative relationships, and the ability to delay gratification 11. Homo erectus evolved the adaptive ability to hunt wild game, developed specialized tools, and spread from tropical Africa into Europe and Asia, where hunting was plentiful. For more than 99 percent of their time span, hominid societies depended upon hunting and gathering for survival. The first humans led an extremely active lifestyle: hunting and gathering food, combined with protecting oneself from both the elements and predators, made for a vigorous routine. Archeological evidence has shown that hunter-gatherers were lean, fit, and largely free of chronic diseases 12.
It is only within the past 10,000 years that these habits have shifted to agricultural production. Hominids were collectors of food for millions of years and have been producers of food for only a few thousand. The shift from the nomadic hunter-gatherer lifestyle to a settled existence based on agriculture was mankind's first step on the path toward the modern world. The transition was anything but sudden: people had to learn how to gather and process wild cereal grains before they began cultivating them deliberately, around 10,000 years ago 13.
As communities shifted from a diet rich in lean meat, fruit, and vegetables to a grain-based diet, the change had dramatic effects on their stature and ability to thrive. Average heights shrank, lifespans decreased, many mineral deficiency diseases became prevalent, and children were less likely to survive 14. A simple change in diet brought about these significant changes.
Many genetic changes are also linked to this switch from collector to producer. First, adults sometimes had to become accustomed to digesting lactose, a practice not known to adult hunter-gatherer populations. The ability to utilize and absorb milk is believed to be a product of natural selection 15, and many people still struggle with lactose intolerance or aversion today. Another interesting genetic change involves the phenylthiocarbamide (PTC) tasting system, which allows humans to taste bitterness. Interestingly, in African populations there is a relationship between insensitivity to bitter taste and the occurrence of malaria, which may suggest that insensitivity was selected for in areas where eating bitter plants would offer some protection from malaria 16. Additionally, skin color can be linked to nutritional origins. Darker skin, which evolved in areas of high ultraviolet exposure (sunlight), decreases vitamin D production beneath the skin; under conditions of low sunlight, darker-skinned individuals may suffer from a lack of vitamin D and develop rickets. Areas with low sunlight would therefore select for individuals with lighter skin pigment and a greater ability to produce vitamin D, while in areas with higher sun exposure, darker pigments still permit sufficient vitamin D production 15. This is in no way a complete list of the genetic changes of the past millennia; it is simply a set of examples of applicable adaptations.
Simultaneous Evolution in Food Preservation
The preservation of food made it possible for our ancestors to settle down in one place and organize into communities, building agrarian societies where they were confident they would not lack food. Without the ability to store and preserve foods free of disease and pathogens, it could be argued, man would still be foraging for food. The ability to preserve foods allowed for many cultural and technological advances. Storage of food also aided travel, as longer journeys became possible when packed with preserved food (e.g. the Spice Trade, the Oregon Trail).
Small, isolated communities around the ancient world searched for methods of preserving food, and therefore of preserving life. What accumulated was a variety of techniques that postponed or prevented the natural process of food decay. Methods included drying, salting, smoking, and fermenting; later, the processes of refrigeration, canning, dehydration, and artificial chemical additives were discovered and heavily utilized. Every community sought out methods of preserving that fit its culture, climate, and environment 17. Almost everything we eat today has been treated in some way to ensure safety, prolong shelf life, or enhance flavor.
Drying is the simplest and oldest method of food preservation. It is plausible that man was drying food even before cooking it, as any hunter-gatherer might have noticed a connection between hanging a piece of meat in the sun and its freedom from insect and maggot infestation. The result may even have encouraged repetition, as dried foods concentrate in flavor. This, of course, is speculative, as little evidence confirms or contradicts it. However, in almost every ancient culture the drying of food was practiced from the earliest days. Drying dates back to at least 12,000 B.C., when Egyptians dried fish and poultry in the hot sun along the Nile; ancient Babylonians made a concentrate of pounded dried fish paste that was used to make broths 17. Human ancestors may also have stumbled upon drying through experience: fruits fallen from branches may have dried on the hot sands, and dried dead snakes and insects were edible. These accidental discoveries may have begun the first surpluses of food.
Food that failed to dry effectively would be hung over a campfire or stove to complete the process. The exact date of this preservation method is unknown; it was probably invented by people trying to speed up the drying of foods 18. The chemical makeup of smoke includes alcohols, acids, and phenolic compounds. Smoke, however, does not act only as a drying mechanism; it also carries formaldehyde, which preserves. The main influence on the flavor of smoked food is the fuel or wood burned beneath it. Sophisticated smoking communities dating back to the ninth century have been discovered in Poland 17.
As with other preservation methods, hunter-gatherers would first have encountered fermenting fruit and meat naturally in the wild. Through observation and curiosity they would also have noticed that foods brought back for storage fermented. Partially fermented food tastes different and is actually easier to digest; it cooks more quickly and keeps for quite a long time. All sorts of foods ferment: grains that have been moistened, grapes, figs, and so on 17. When herding developed, someone must have noticed that milk left out soured and curdled. Thousands of different brewing and fermenting traditions developed from these observations. Fermentation requires the correct combination of atmospheric conditions, ingredients, and time, and it was especially prevalent in cool, damp climates where drying foods was a slow and laborious process.
Dry salting food is similar to drying, but includes a chemical additive. Salt mining began in Europe as early as the second millennium B.C. 18. In dry salting, the food is coated several times in salt, then completely covered in salt, placed in a container, and hung out to dry. Salt speeds up the drying process by drawing out additional moisture through osmosis. It also serves as an antibacterial agent: some salt is absorbed into the item, but most remains on the outside and acts as a barrier, and salt naturally dehydrates some bacteria. Salted fish were a staple of fishermen on long journeys throughout the thirteenth century 18. Dry salting, however, is effective only for short-term preservation; if an item required longer storage, a wet salting process, or brine, was used. Salt has been used since ancient times to cure fish. Although fish was immensely important to the diets of Japan and Southeast Asia, salt deposits are not natural to that part of the world; however, around 200 B.C. it was found that salt could be extracted from dried seaweed 17.
Pickling in vinegar is quite common today, but it also has an ancient heritage. Vinegar works by creating a highly acidic environment, and vinegars are as diverse in taste and strength as humans are in height. Pickling became all the rage in England in the sixteenth century, after salting had become a staple of the food of the poor. Eggs, cooked meat, fruit, and even nuts were popularly pickled. Specialized jars were made, as the acidity of vinegar makes lead-based containers useless and taints their contents. In Middle Eastern households, jars of pickled vegetables are still a staple, chosen for vibrant colors and flavors. Relishes and ketchups, also derived from a vinegar base, came into use around 1760 19.
Like vinegar, sugar, honey, and molasses create environments where no living organism can thrive, and they were common means of preserving food while adding flavor. Honey-coated and preserved ham can be traced back to the Roman Empire 17. In the sixteenth century, sugar cane became readily available from India via China and was used increasingly for banquets and the affairs of the wealthy. Foods could be kept for months in wet or dry brines and usually maintained their consistency, color, and texture well. The introduction of sugar also allowed unripe or barely ripe fruit to be utilized preseason by boiling it with sugar into a sticky sweet concoction 20.
The need for more industrial methods of food preservation continued through the following centuries. On the practical side, salted food eventually tasted too salty, dried or smoked food hardened into an inedible form, and humidity and seasonal changes made poor harvests and great hunger real possibilities. In the wake of the French Revolution of 1789, the French government offered a reward to anyone who could develop a new method of preserving food; the requirements were a product that was transportable, inexpensive to produce, and nutritionally superior to salted meat 17. It was not a scientist but the son of a wine maker who answered the call: in 1810, Nicolas Appert was awarded 12,000 francs on the condition that he make his process available to the general public. He had single-handedly created the first industrial version of canning. Using glass bottles, corks, and wires (not unlike the mason jars used today), Appert heated the contents and glassware, creating a seal that locked in freshness but locked out unwanted bacteria and parasites. Appert's ideas, combined with the English invention of the metal can, give us a process similar to the canning seen today.
Keeping foods at low temperatures was also an ancient practice: chambers, caves, and pits all served to keep foods as cool as possible. Hunters in icy climates froze their meats with the same tenacity with which warm-weather hunters dried theirs. The cooler the temperature, the slower bacterial growth, since cold slows the metabolism of microorganisms; the warmer the food becomes, the more active their metabolism. Even freezing does not kill bacteria; it simply makes them dormant. Nor was frozen food limited to northern climates: ice houses preserved meats and specialties in warmer regions. China had ice houses before 2000 B.C., they were popular in England in the seventeenth century, and American butchers kept such facilities 21. The first true refrigerators (1834) worked on the principle that a decompressed gas cools and absorbs heat, with the compressed gas cooled by a water-cooled coil. Today's refrigerators use essentially the same parts, but the refrigerant is a Freon. Freon chemicals are known to damage Earth's ozone layer but are still widely used as refrigerants. Industrially frozen foods are subjected to one of three processes: in an immersion bath, food is wrapped and then placed in a liquid refrigerant; in spraying, food is sprayed with liquid nitrogen or a similar chemical; and in blasting, food is subjected to a stream of fast-moving, low-temperature air.
Chemical additives are now far more effective in food preservation than ever before. Sugar and salt are rarely used as sole preservatives; instead we have supplemented them with various acids, sulphur dioxide, antibiotics, and many unfamiliar, unpronounceable, multi-syllable compounds. Chemicals are added to extend shelf life, preserve color, enhance flavor, protect against contamination, retain moisture, maintain aroma, or all of the above 22. However, this mode of preservation is too young for its effects on human health to be fully understood; evidence from longitudinal studies is still too premature to affirm or deny harm. How will our evolved bodies adapt to these artificial chemical additives? How will natural selection play a part in future generations?
Links to Human Disease
The combination of Paleolithic genetics, preserved food, and 21st-century dietary habits is wreaking havoc on civilization as we know it. The evolutionary collision of our ancient genome with the nutritional quality of recently introduced processed and synthetic foods underlies virtually all of the chronic diseases of Western civilization 23. Twenty-first-century living has provided many advantages that have led to a more sedentary way of life: from vehicles, to news delivered via the internet, to pre-packaged foods, modern humans have mechanized previously athletic endeavors. Unfortunately, these advantages have cost not just the quality of human lives but also their quantity. Many chronic and some degenerative conditions are believed to exist in such abundance because the rate at which natural selection operates cannot match the pace of environmental change.
Lifelong health problems and diagnoses can be linked to the modern diet and lifestyle. The ways in which diets have been altered from Paleolithic patterns impact health on many fronts: genes and traits that were positive or neutral in the past are now potentially harmful within the contexts of industrialism and modern culture. The average total cholesterol level in adult Americans is approximately twice the physiologically normal level 12; high cholesterol is a direct result of inappropriate dietary intake and is directly linked to the intake of saturated fatty acids. Cardiovascular disease remains the number one cause of death, accounting for 41% of all fatalities, and the prevalence of heart disease in the United States is projected to double during the next 50 years 24. The American Heart Association states that 1 in every 2.9 deaths is related to cardiovascular disease 25. For ancestral humans, cholesterol-laden foods made up only about five percent of caloric intake; American intake is fifteen percent or more 10. The evidence indicates that high cholesterol, and the cardiovascular disease related to it, was minimal in Paleolithic hominids. Diets high in saturated fat and habits of gluttony have overwhelmed our bodies' natural defenses, and cholesterol now causes arterial blockage and advanced cardiopulmonary problems, problems that past hominids did not experience.
Humans are the only species to experience an increase in blood pressure with age. What causes this difference? Humans are the only mammals to consume more sodium than potassium. Recall that salt has been a purposeful and highly aggressive additive to our food for about 3,000 years: ninety percent of consumed sodium comes from preservation processes, preparation, and seasoning, and only ten percent is innate in foods. Earlier species consumed only about 25 percent of the sodium consumed today, and human ancestors were plagued neither by clinical hypertension nor by rising blood pressure with age 26.
Refined grains have been, since the beginning of agriculture, the most significant source of food energy in the human diet, yet they are completely unnecessary. Grains contribute on average 65 percent of human caloric intake, having displaced the fruits, foliage, and vegetables of our ancestors; ketchup, essentially a high-fructose carbohydrate product, is even labeled and served as a vegetable alongside grain-derived foods. Over five million years of successful dietary intake has been displaced by the refining of grains 10. While fruits and vegetables have not been conclusively proven to lower all cancer risks 27, convincing data exist for many specific cancers, including colon, stomach, and other digestive tract cancers 10. Refined grains, by contrast, show no cancer-preventing characteristics. Recently, whole grains left unrefined with the germ intact have shown some positive relationships, which could be due to a phytochemical makeup resembling that of noncereal plant foods. However, whole grains are not as mainstream or as publicly advertised as refined grains. Current human biology adapted to the phytochemicals of noncereal plants millions of years ago; that evolutionary relationship had millions of years to develop 23. The phytochemical makeup of grains, by contrast, has had only a few thousand years to affect the human genome and may still be in the process of exerting selection.
The existence of abundant food has allowed civilization to pursue many other fields and technologies; however, it has also cut the need for physical exertion exponentially. Until industrialization, physical effort was unavoidable, and regularly scheduled meals, much less three meals a day, were far from guaranteed. Currently, obtaining energy through food requires almost no physical effort: calories are consumed without thought to caloric expenditure, and lunch is served in public schools while recess is limited and often denied. One result of inactivity or low activity is a shortage of skeletal muscle, known as sarcopenia. Unburned calories are stored in the body not as muscle but as fat cells and fatty tissue, and muscle tissue's ability to respond to insulin by taking up glucose from the bloodstream far outperforms that of fat tissue. The phrase "thrifty genotype" was introduced by human genetics pioneer James Neel to describe the benefit of a sustained hyperglycemic response after an occasional hefty meal by hunter-gatherers 28. This continued response to overly caloric eating, in conjunction with a shortage of glucose-disposing tissue, means an excess of glucose in the bloodstream and an increased need for insulin. Hence, the insulin over-production that was an asset at an early stage in human evolution is now a liability 29. The more proportional the body composition (muscle tissue to fat), the less likely the body is to have adverse reactions; and the more closely caloric intake is aligned to output, the less unused glucose remains in the bloodstream.
Our genetic and evolutionary knowledge indicates important interactions between genetic characteristics and nutritional patterns. There are strong suggestions that recent changes in our diets have set in motion evolutionary processes that are probably still going on, and we have only an inkling of some of them. More could have been cited, especially connections to preventable diseases, asthma, and specific allergies. Our knowledge is restricted to a very small fraction of the human genome; our curiosity, however, should not be. Despite major advances in medicine, preventative health care, and pharmaceuticals, human choices, rather than circumstances, are now limiting life expectancy. These detriments indicate a need to review current dietary intake and possibly realign current standards with our ancient genome; such efforts could impact human health in a multitude of positive ways.