The world is living dangerously: either because it has little choice, or because it is making wrong choices. Let me put it another way. Six billion people co-existing on our fragile planet. On the one side are the millions who are dangerously short of the food, water and security they need to live. On the other side are the millions who suffer because they use too much. All of them face high risks of ill-health.
Gro Harlem Brundtland
Director-General of the World Health Organization
Address to the Fifty-fifth World Health Assembly
Geneva, 13 May 2002
Sometime during late 1999, the human population exceeded six billion. This fact is astonishing, remarkable, and not just a little disconcerting. What’s more, the conservative prediction is that, given our current rate of growth, we will top out at some 9 billion by the year 2050. Considering our humble beginnings some 3-4 million years ago on the open plains of the Serengeti, our rise to prominence as a dominant mammalian species seems improbable. The success of a ponderously slow-moving bipedal species could occur only under very special ecological and evolutionary circumstances. In order to survive, we had to rely largely on cunning and guile, rather than on physical prowess. There are numerous hypotheses, many of them grounded in a well-established fossil record, as to how we managed to avoid early extinction. Some physical anthropologists reason that the rapid evolution of a slightly enlarged brain mass capable of rational thought, together with the advent of opposable thumbs, allowed the development of acute dexterity, which in turn led to the manufacture of tools and weapons. Advanced techniques for hunting, as well as for defense against predators and competing hominid species, as portrayed in the opening scenes of "2001: A Space Odyssey," could then displace less efficient gathering and scavenging strategies. Another, less popular theory holds that, rather than making tools for hunting, we functioned in those savannas as omnivores whose main activity was scavenging. Hand axes and other simple tools were used to break open the long bones of game carcasses for the nutrient-rich marrow inside. There is much evidence in the fossil record to support this idea.
The notable absence of hyena-like ancestors in Africa, though not in western Europe, during our early development further strengthens this hypothesis, since the present-day hyena, a notably aggressive species, is the only Serengeti predator/scavenger capable of cracking open bovine long bones with its incredibly strong jaws. Surely its earlier presence would have negated any opportunity for us to harvest that energy-rich food source unencumbered by competition. Perhaps it is most prudent to view the modest success we enjoyed during our early development as the deployment of a combination of survival strategies, a sort of “whatever works best” at the time of need. Most would agree that, if nothing else, early hominids were resourceful, thoughtful mammals capable of a wide range of behavior, firmly grounded in a strong, instinctual drive to “live long and prosper.”
Regardless of how we made it through that tenuous period, the fact remains that we are now firmly embedded in the middle of most of the world’s terrestrial ecosystems, and we have quickly and efficiently learned how to manipulate them to our own benefit. Unfortunately, our penchant for encroachment has had grave consequences for the other life forms in those regions, often leading to severe reductions in populations of indigenous animals and plants and, in some cases, to extinctions (the dodo and the passenger pigeon, to mention two familiar examples). It is estimated that, because of our mere presence on earth, we currently cause some 17,000 species extinctions per year. The displacement of plants and animals in favor of food production and settlement has frequently led to increased health risks for us as well. That is in fact the theme of this section of our web site.
Each adult person's target caloric intake is approximately 2,500 calories daily, regardless of the source (regrettably, many people never reach this goal) [link2]. Producing that food requires an enormous commitment of land [link2]. In fact, nearly 35% of the earth’s surface is currently devoted to food production of one kind or another. With the addition of another 3 billion people, the extra land that would need to be set aside for agriculture is estimated to be approximately the size of Brazil, a country larger in square miles than the contiguous United States, but one with nutrient-poor soils. How did we come to this state of affairs, and what are some of the major health consequences associated with land use as it relates to food production?
It is generally acknowledged that, in order to flourish, we had to engineer a series of agricultural revolutions. With more food available, the populations responsible for them increased and spread outward. Natural harvesting of food items by small bands of hunter-gatherers could not provide enough food to allow for the development of permanent settlements, without which the advancement of human culture would not have been possible. Technologies for sustaining food production (domestication of plants and animals, development of irrigation schemes [links: 2, 3, 4], and food processing and storage systems) were in full swing in many places around the world some 10,000 years ago, and seem to be associated with the development of modern spoken and written languages. The origins of our root language go back much earlier in our history, perhaps 100,000 years or more (see: From Lucy to Language. Donald Johanson and Blake Edgar. Simon and Schuster Editions, 1996). Even prior to written historical records, there is much evidence for sustainable crop production. The builders of Stonehenge conceived of and erected that monument, the Anasazi made use of the sun dagger spiral, and the Aztecs created a complex calendar known as the “Sun Stone”. All of these inventions, and probably many more that we are unaware of, allowed these advanced cultures to predict the seasons. Planting and harvesting practices revolved around these predictions, and almost certainly gave rise to religions based upon them. As agriculture became firmly established, populations grew, establishing permanent settlements which, in turn, led to a brilliant series of blossoming cultures. Europe, Asia, South Asia, the Middle East, and South and Central America all spawned dominant civilizations, each revolving around religions based on the production of food.
We have inherited the legacy of those ancient cultures, refining and redefining the ways in which food is provided and reinterpreting the religious tenets upon which those practices were founded.
A philosophical subtext arose out of most of the popular western religions established following the first few waves of the agricultural revolution: God has given us permission to hold dominion over the land and its natural processes. Even the earliest versions of the Old Testament explained that all the plants and animals were placed here on earth to satisfy our own needs, and make use of them we have, to the point of genetically rearranging their intrinsic biological capabilities. This pervasive attitude of dominance over natural processes gave rise to a multi-cultural arrogance and a false sense of being in control. In contrast, natural events (floods, droughts, heat waves, cyclones, dust storms, hurricanes, tornadoes, earthquakes, volcanic eruptions) continued to give us pause, and forced us to modify this central theme. As a result, numerous insightful and creative mythologies, and a robust literature of fiction, pitted “man against nature.” Nature usually came out ahead.
The current collective world view of how we should conduct our lives in the context of the rest of earth’s living entities recognizes the same basic facts, regardless of the cultural context of time and place: nature is never wholly predictable, often poses threats to our very existence, and, above all, can never be fully tamed. When we alter our living space without an awareness of the environmental consequences, risks to our health frequently result. The global scientific community is rapidly reaching a consensus that the only real choice, if we are to carry out our lives at both the individual and population level while avoiding health consequences, is to strive for ecological balance. Failure to do so will ultimately force our resignation from the ranks of the living.
Despite this enlightened view, we continue to invent newer, more pervasive (i.e., ecologically disruptive) ways of engineering the environment for our own benefit [links: 2, 3], while remaining blithely ignorant of the disconnects in ecosystem function induced by an encroachment-driven imperative. The enabling factors for this kind of behavior arose soon after the advent of the industrial revolution. Most significant was the development of the internal combustion engine. It allowed those fortunate enough to take advantage of it to replace beasts of burden with more efficient machines, which rapidly proved highly useful for all kinds of agricultural tasks. We were finally free to exploit and transform the world’s arable landmasses on a nearly unimaginable scale. The invention of dynamite and other powerful explosives facilitated the clearing of large tracts of land, and the race was on in earnest to level the soil-rich, temperate-zone deciduous forests, allowing the mass production of fruits and vegetables. Grasslands gave way to monocultures of a wide variety of grains. Produce of all kinds became commonplace, so that today we take its presence on our table for granted. The development of food processing and storage (refrigeration, freezing, freeze-drying, pasteurization, food additives for preservation, etc.) ensured a certain measure of safety in food consumption, and has led to the globalization of world markets for an astoundingly wide variety of plant and animal food items. Feeding densely settled populations became not only feasible but, in most cultures, highly desirable. Today, some 60 percent of the earth’s population lives in or near urban centers, thanks to a constant supply of calories. It has been conservatively predicted that over the next 50 or so years, that figure will rise to 80 percent or higher.
In contrast, a few societies, most notably the Maasai in East Africa, a few tribes of Australian aborigines, numerous small tribes throughout the Amazon basin, and some Native American tribes in the southwestern United States, have remained remarkably well integrated into the world around them. Physical and biological processes are part and parcel of their daily lives. Their philosophies reflect these links, allowing them to survive in what seems to many of us to be inhospitable environments. These societies never invented large-scale methods for producing food surpluses. Instead, their populations remained stable and small compared to those of Europe, Asia, and the urbanized areas of the Americas. There was no need to produce vast quantities of food if the only people to be fed were themselves. Agricultural revolutions made it possible to produce more than enough to meet our immediate needs, permitting the luxury of exporting large quantities of produce to foreign markets for pure profit. The ecological footprint expanded exponentially as agricultural technologies improved to provide food for resource-poor areas. As already mentioned, 35 percent of the earth’s land surface is devoted to some form of agriculture. This represents nearly 80 percent of the total available land that can be put into use. As populations expand even more over the next 50 years, it almost certainly will be necessary to use the remaining 20 percent, if current agricultural practices continue to be employed.
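The land-use arithmetic above can be made explicit. The following is a minimal back-of-the-envelope sketch: the two percentages (35 percent of land surface farmed, representing roughly 80 percent of all usable land) come from the passage; the calculation itself is ours.

```python
# Back-of-the-envelope land-use arithmetic, using figures from the text:
# 35% of the earth's land surface is farmed, and that farmed share
# represents ~80% of all the land that could be put into agricultural use.

farmed_share = 0.35      # fraction of total land surface under agriculture
share_of_usable = 0.80   # farmed land as a fraction of all usable land

usable_share = farmed_share / share_of_usable   # total usable land, as a
                                                # fraction of land surface
remaining_share = usable_share - farmed_share   # usable land still unfarmed

print(f"usable: {usable_share:.2%}, still unfarmed: {remaining_share:.2%}")
```

On these figures, only about 44 percent of the land surface is usable at all, leaving less than 9 percent of the land surface in reserve for future farming.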
No one questions the value of the contributions modern agriculture has made in enabling us as a species to reach our present state. The growth and development of culturally sophisticated societies, countless medical advances, improved sanitation, and high standards of living all owe their origins to the development of food production. The “Green Revolution” of the 1960s further changed the world with new plant varieties and more potent agrochemicals (pesticides, herbicides, and fertilizers). Global food production has doubled in the past 35 years, and 92 percent of this remarkable increase comes from higher yields per acre. The world average amount of productive land and near-shore sea that it currently takes to feed one human being, i.e., the ecological footprint, is 2.1 hectares. Inequalities abound, however. Developing nations use only 1 hectare per capita, while the United States uses almost ten times that, or 9.6 hectares. Fertile bottomlands along riverine floodplains are prime farmland, but they are also favored sites for development. As a result, more and more of our food-growing land is at risk of being consumed by urban sprawl.
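The footprint figures above lend themselves to a quick sanity check as a weighted average. In this minimal sketch, the per-capita values for developing nations (1.0 hectare) and the United States (9.6 hectares) come from the passage, while the population split and the figure for other industrialized nations are hypothetical assumptions, inserted only so the arithmetic has something to chew on.

```python
# Illustrative weighted-average ecological footprint per person.
# Hectare values of 1.0 and 9.6 are from the text; the population
# split and the 5.0 value for "other industrialized" are assumptions.

def average_footprint(groups):
    """groups: list of (population, hectares_per_person) tuples."""
    total_pop = sum(pop for pop, _ in groups)
    total_area = sum(pop * ha for pop, ha in groups)
    return total_area / total_pop

# Hypothetical split of a six-billion-person world population:
groups = [
    (5.0e9, 1.0),   # developing nations: ~1 hectare per capita (text)
    (0.3e9, 9.6),   # United States: ~9.6 hectares per capita (text)
    (0.7e9, 5.0),   # other industrialized nations (assumed value)
]

print(f"{average_footprint(groups):.1f} hectares per person")
```

Even with these rough assumptions, the weighted average lands near the 2.1-hectare world figure quoted above, which is the point: a small wealthy minority with a large footprint pulls the global mean well above what most of humanity actually uses.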
Wealthier nations have chosen to devote much of their agricultural acreage to the production of livestock. In the last 50 years, world meat consumption has quadrupled, and livestock now consume 40 percent of all grain produced. At the same time, there are world food surpluses that fail to reach the hungry. More than 1.3 billion people (nearly 20 percent of the earth’s population) live on less than a dollar a day; 70 percent of them are rural poor, who survive mainly by subsistence farming. One-third of the world lives with stressed drinking-water supplies, and agricultural runoff is the dominant reason for the stress. Humans already use more than half of all accessible, renewable fresh water, and 70-80 percent of that goes to modern agriculture, more than to any other human activity. Currently, over 40 percent of world food production occurs on irrigated land. Irrigation in some instances contributes to the spread of infectious diseases: malaria is transmitted by vectors that breed in standing freshwater [link2], while schistosomes, the causative agents of schistosomiasis, use aquatic snails living in similar environments as intermediate hosts. Improper irrigation practices also contribute to waterlogging, salinization, and siltation of cropland and adjacent riverine environments. Soil erosion is another major problem facing the world’s arable lands: of all the land set aside for farming, more than 15 percent has already been degraded by human activity beyond use for growing crops.
In more naturally managed farm settings, leftover vegetative cover, food by-products, and animal manures are allowed to decompose in place, and plowing is kept to a minimum. This creates carbon-rich compost that allows for the slow release of nutrients. Topsoils improve, water-retention capacity increases, and crop yields rise. Yet such settings remain rare compared to the fertilizer-intensive farming practiced throughout most of the grain and corn belts of the world. What can be done to reverse this trend and, at the same time, return land to its original state?
In some ways, the urbanization of Homo sapiens has made it easy to lose sight (both literally and figuratively) of the natural systems that sustain us. Urban societies have become alienated from their formerly visceral understanding that cyclic, interdependent food and land-use systems, not a grocery store or a restaurant, feed a city. The root issue is a lack of ecological awareness: our much-heralded drive for survival has blinded us to the need to maintain the health of the ecosystems surrounding us. If we continue to alter these relationships by our presence, we run grave risks of destabilizing agricultural productivity, perhaps irrevocably. The cyclic flow of nutrients, materials, and energy is what sustains a nourished, nourishing ecosystem throughout its evolutionary history. In order to maintain secure, healthy human societies and diverse, healthy ecosystems in the 21st century and beyond, we must re-invent how we feed ourselves.