High Stakes at Tule Springs
New theories appear, new arguments rage, and a fully satisfying solution has not been reached. The peopling yarn is still coming in installments, like the Pickwick Papers, without a plot or denouement. Gary Haynes, The Early Settlement of North America: The Clovis Era
The "deadly syncopation" argument does not work, of course, if the dates of near-time extinctions do not correlate with the dates humans arrived in new lands. Particularly in the Americas, the latter are the subject of ongoing debate. The prevalent opinion is that foragers or hunters suddenly appeared in the western United States around 13,000 years ago; their archaeological sites can be dated by geochronological techniques (C. V. Haynes 1991, 1993). They left artifacts, including Clovis spear points, distinctive fluted blades tightly lashed to wooden spears or throwing sticks and used for thrusting and throwing. As I discuss in chapter 8, Clovis points are occasionally associated with mammoth bones.
Some anthropologists, however, believe that people reached America well before 13,000 years ago. These claims have generated tremendous interest among prehistorians and in the media. If people did arrive and grow to appreciable numbers thousands of years before the extinction of the megafauna, fewer objections could be made to a noncultural explanation for the extinctions.
Through the early 1960s, I accepted the claims of some archaeologists of an early human occupation of the Americas. Based on various publications, especially the first publications of radiocarbon dates, over ten millennia appeared to separate the initial human arrivals in the New World from the last occurrences of extinct megafauna. The former arrived 20,000 years ago or much earlier; the latter vanished by 8,000 years ago or even later. I had no reason to doubt the claims of professional archaeologists I had met and in some cases interviewed, such as Alan L. Bryan, Ruth Gruhn, George Carter, Tom Lee, Scotty MacNeish, and Ruth Simpson. I listened to their talks and read their reports of archaeological sites that long predated Clovis time. Then some intensive efforts to confirm this early occupation of the Americas yielded negative results and I turned into a skeptic.
In the late 1950s, Ruth Simpson, an archaeologist with the Southwest Museum in Los Angeles, undertook excavations of colluvial fans (accumulated rock detritus and soil) in the Calico Hills, in the Mojave Desert east of Los Angeles. Interpreting sharp-edged cherts (selected from a very large quantity of fan gravels) as artifacts, she reported finding an ancient pre-Clovis site. Other archaeologists were dubious.
Then the famous African archaeologist and paleontologist Louis Leakey inspected Simpson's excavations and endorsed her finds (Leakey, Simpson, and Clements 1968). Simpson invited a blue-ribbon panel of professional archaeologists and geomorphologists to inspect the site. They did, and gave it a thumbs-down. Although closely resembling manmade tools (see Tankersley 2002, 192, photograph of a pseudo-artifact produced by earthquake-generated liquefactions at the Calico Hills site), the cherts were widely scattered, rather than concentrated, as one would expect of artifacts in an archaeological site. In addition, they could not be distinguished from rocks in transport in a fan, which may fracture naturally. The archaeologists were also bothered by the burial of artifacts supposedly 40,000 years old in alluvial fans that geologists viewed as at least 400,000 years old. Although some people still regard Calico Hills as an ancient cultural deposit, most professional archaeologists dismiss it, including some who accept certain other claims of pre-Clovis colonization. This is only one example of the many proposals of an early New World antiquity that have failed to pass the test of tangible and reproducible evidence (Martin 1974).
An earlier test of the pre-Clovis invasion hypothesis had failed. The circumstances seemed so promising at first: the site, Tule Springs, would nail down the presence of people and extinct animals long before 11,000 radiocarbon years ago. In those years before explosive urban growth, Tule Springs lay just northwest of the tourist mecca of Las Vegas (the site has since been built over). In 1933 the Southwest Museum conducted excavations there, turning up an obsidian flake of human manufacture in association with remains of an extinct camel. Mark Harrington, the excavator of Gypsum Cave in the 1920s, was part of this team. At Tule Springs Harrington found the bones of extinct bison and other species associated with charcoal, presumably cultural in origin, and on this basis reported early man. Then, in the early 1960s, Willard Libby began searching for groundbreaking new applications of his radiocarbon dating method. Two earlier radiocarbon dates, one by Libby, the other by Wally Broecker and Larry Kulp of the Lamont-Doherty Earth Observatory at Columbia University (see Wormington and Ellis 1967, 3), suggested that humans were present at Tule Springs over 23,000 years ago. These dates cried out for verification. Libby convened an ad hoc committee of leading archaeologists and geologists of the region; at their suggestion, the Southwest Museum returned to the site for multidisciplinary excavations between October 1, 1962, and January 31, 1963 (Wormington and Ellis 1967).
The Tule Springs project was funded in part by the National Science Foundation, along with massive assistance from the private sector. Herschel Smith, a Southern California contractor with an active interest in archaeology and geology, offered the services of the construction industry gratis. International Harvester provided two of the largest bulldozers then manufactured in the United States. Allis-Chalmers contributed a large motor scraper. Pafford and Associates of Los Angeles made available aerial photos, surveyed a grid, and prepared a detailed contour map. Union Oil donated all fuels and lubricants. Members of the International Operating Engineers Union, Local 12, ran the heavy equipment, donating their time (C. V. Haynes 1967b, 16).
Several of my friends and colleagues participated in the Tule Springs dig, and I was as excited about it as they were. Dick Shutler, by then with the Nevada State Museum, helped organize the project and became its field director. He invited Vance Haynes and Pete Mehringer, at the time grad students in the geochronology program at the University of Arizona, to join him. Both were enthusiastic, talented, and experienced. Funds were available to bring in leading archaeologists of the time to inspect the results.
With all that support, Dick, Vance, and their team opened a remarkable 3-mile maze of trenches, in some places to a depth of 30 feet through strata well over 15,000 radiocarbon years old. Vance had the glorious opportunity to map the cuts, fills, and soils and (with some 80 radiocarbon dates) determine in detail the chronology of the exposures. Even now such an abundance of controlled radiocarbon dates at such a crucial site is exceptional. Forty years ago it was state of the art. In addition, the field team could count on weekend turnaround of radiocarbon dates from Libby's lab. Normally such service takes weeks or months. This control on stratigraphy allowed at least some of the uncertainties in correlating units to be resolved as the units were being uncovered and mapped.
With abundant radiometric and stratigraphic control, Pete Mehringer extracted and analyzed fossil pollen from the exposed alluvium and from sectioned spring mounds. The mounds form when dust storms intercept artesian springs. The conifer pollen and plant macrofossils gave evidence of biotic and climatic change during the Quaternary. Meanwhile, the paleontologists identified fossils of mammoths, large camels, large and small horses, ground sloths, pronghorn antelope, the extinct American lion, and a teratorn (Mawby 1967).
All these were valuable data well worth obtaining. But when Vance Haynes analyzed the organic material that Harrington had considered to be cultural charcoal, it proved to be naturally oxidized plant material, soluble in alkaline washes. It was not charcoal. Equally disappointing, no evidence of chipping of lithics or workshop activity could be detected at the site. There were no Clovis points or, for that matter, spear points or knives of any kind. The entire recovery of artifacts was minuscule. According to Haynes, "All I know of are a single scraper from Locality 4 and a crude caliche bead (perhaps natural) and polished bone awl tip from Locality 3" (see Wormington and Ellis 1967, 39, 360). Moreover, the date of more than 23,800 years ago (lab catalogue number C-914) for the extinct bison discovered by Harrington, allegedly with cultural remains, could not be matched or replicated, and the stratigraphic overlap between the oldest artifacts and the youngest extinct faunas narrowed to a unit dated at 13,000 to 15,000 calendar years ago.
The members of the Tule Springs team and their consultants had hoped that the magnificent new sections would yield direct evidence of hunting or butchering of extinct animals. The National Geographic Society was ready to trumpet such a find; in anticipation of success, one of the society's best artists, Jay Matternes, prepared a dramatic color illustration of a kicking camel being speared by early hunters. (The illustration ended up as a black-and-white frontispiece to Wormington and Ellis 1967, a classic monograph that is still in print.) Despite the great expectations and three months of careful field work by a superbly supplied team overseen by the best professional archaeologists in the West, the excavations at Tule Springs satisfied none of those dreaming of an early archaeological site, particularly a kill site. But they did yield something more fundamental: an appreciation of negative evidence. It is true that "the absence of evidence is not evidence of absence"—but there are limits to how long and how strongly one can keep believing when supporting evidence is lacking. Apparently, professional archaeologists cannot all agree on what constitutes an archaeological site, a disagreement that erupts from time to time at archaeological sites throughout the Americas, in recent years no less than in the days of Tule Springs.
Now a new set of sites and a new generation of advocates champion pre-Clovis inhabitants of the New World. The "sites" are the result of an ardent search over the last 20 to 30 years, on the heels of conspicuous failures at Gypsum Cave, Calico Hills, Tule Springs, and elsewhere, along with the embarrassment of Sandia points (see p. 146). Among some of those who advocate for early sites and trumpet a "paradigm shift" in the absence of solid evidence, I detect a passion similar to that exhibited by cryptozoologists in their eternal search for Bigfoot. I acknowledge that the subject of pre-Clovis colonization is one for experienced archaeologists and geologists to resolve. However, this time around, unless and until the heady claims can be replicated independently by skilled skeptics, I am going to stay on the sidelines.
From time to time colleagues ask why I do not accept Tom Dillehay's Monte Verde site in Chile, or Jim Adovasio's Meadowcroft site in Pennsylvania, as predating Clovis (see Bonnichsen and Turnmire 1999). In the view of Adovasio and David Pedler (1997), another main contender is Bluefish Caves, Yukon; there are perhaps a dozen more (Lavallee 2000; Roosevelt, Douglas, and Brown 2002), including Pedra Furada in Brazil. But none has been excavated and verified independently by neutral (or skeptical) parties. This is no reflection on the optimism of the original investigators; no one should be denied the opportunity to search for the unknown and to report their discoveries. But the fact remains that no claim for pre-Clovis archaeology has been put to what I call the "Tule Springs test." Sadly, many sites have been totally excavated by their claimants. Scotty MacNeish deserves credit for responding positively to my urging that he not completely excavate Pendejo Cave, east of Oro Grande on the Fort Bliss military reservation in southern New Mexico, a site he pronounced to be over 30,000 years old. I was amazed at how few professional archaeologists came to have a look at Scotty's claim, in particular those who are convinced that such evidence exists. Although he was a lifelong professional archaeologist trained at the University of Chicago and one-time president of the Society for American Archaeology, many members of his profession interested in early sites did not bother to inspect his site. (For an insightful, experienced overview of pre-Clovis claims, I recommend chapter 1 in G. Haynes 2002b.)
In the past decade geologists Jay Quade, Vance Haynes, and Erv Taylor accepted Scotty's kind invitation to visit the site and were less than convinced. The problem was not a matter of its antiquity—radiocarbon dates and extinct fauna are among the evidence for a pre-Clovis age—but of the claim for human occupation. I hope that other archaeologists will also recognize their responsibility to spare significant portions of their sites, ideally at least half, for independent, reasonably unbiased verification teams. Such teams have not been permitted to excavate Meadowcroft in Pennsylvania. The famous Monte Verde site in Chile was visited by a team of outside archaeologists only after excavation terminated. Not all in the group agreed on the claims of antiquity. Moreover, if there was a pre-Clovis population in Chile 13,000 radiocarbon years ago, in Pennsylvania 18,000 radiocarbon years ago, or in Alaska even earlier, as has been claimed, those brave pioneers did not demonstrate the environmental adaptations seen at the end of the Stone Age (Upper Paleolithic) in the Old World. Where are the large numbers of large sites with many distinctive stone tools? Where are the cave drawings? Where are the Upper Paleolithic huts made of concentrations of mammoth bones (see Klein 1999, 538-540)? We know Clovis hunters of mammoth occupied North America 11,000 radiocarbon years ago. Had people been here at an earlier time and lived as their Upper Paleolithic ancestors did in Asia, there should be no difficulty finding archaeological sites older than Clovis.
The heart of the argument appears to be that although the chronology of megafaunal extinction falls in the late glacial, in well-dated deposits close to 11,000 radiocarbon years ago (the age of Clovis points), there are few kill sites, most of them involving mammoths. Archaeologists impressed with the claims for much older archaeology in the New World wash their hands of the matter of megafaunal extinction by assigning it to climatic change, rarely if ever the special research interest of those making this interpretation. Can this dilemma be resolved?
The contrast of the New World with the Australian experience is striking. In 1962 John Mulvaney discovered 16,000-year-old cultural charcoal in Kenniff Cave in central Queensland. Within four years, radiocarbon dates associated with cultural material for Koonalda Cave, Burrill Lake on the New South Wales coast, and three rock shelters in Kakadu National Park all demonstrated occupation older than 20,000 years.
The 1970s brought still earlier dates for human occupation on the shore of Lake Mungo in western New South Wales and at Devil's Lair, a cave in southwest Australia. These findings established firmly a prehistory of more than 30,000 radiocarbon years and the possibility of at least 40,000. By 1980 over 20 sites of similar vintage had been identified. By 1999 Mulvaney and Johan Kamminga recognized more than 150 Pleistocene-age archaeological sites (Mulvaney and Kamminga 1999, 136). On the hypothesis that a flush of charcoal from vast and intense brush fires would occur during or very soon after human invasion, Peter Kershaw and others (2002) reported such evidence in marine sediments dating back about 42,000 years, supporting the archaeological record and matching another consequence of colonization: the known time when Australian megafauna suffered mass extinction.
By 1980 new discoveries were running into the limits of radiocarbon dating as it could be applied at the time. At Lake Mungo human skeletons had been found and dated to over 30,000 radiocarbon years. According to Richard Gillespie (n.d.), they are now thought to exceed 40,000 years. The Australian record has yielded early sites from Tasmania in the south to the vicinity of Darwin in the north and from Perth in the southwest to Cooktown in the northeast. Extinction of the Australian megafauna, once thought to have occurred later in time, is dated in the 40,000- to 50,000-year bracket. If there is a lengthy gap between the time of human arrival in Australia and prehistoric extinctions, robust evidence for such a chronology has yet to appear. In addition, the Australian extinctions took place tens of thousands of years earlier than any known extinctions of megafauna (moas) in New Zealand, hippos, elephant birds, and giant tortoises in Madagascar (Burney, Robinson, and Burney 2003), or ground sloths in South America. Therefore, whatever caused extinction of megafauna in Australia, the Younger Dryas or any other worldwide climatic pulse of the last 20,000 years can be ruled out, since it would have affected these areas as well.
Australia is about the size of the contiguous United States. Its population is more than an order of magnitude smaller, which translates roughly into about a tenth the number of archaeologists searching for early sites. Australia's primary productivity is lower, meaning its mean annual production of plant dry matter is less than in the United States.
As a result, there would have been smaller numbers of prehistoric people to leave fossils or artifacts. Finally, there has been no vast program of salvage archaeology to expose buried sites that might yield new discoveries. Despite these handicaps, Australian archaeologists have in the last four decades radically extended their chronology of human arrival to or beyond the limit of radiocarbon dating at dozens of sites, while archaeologists hot on the trail of pre-Clovis colonization have failed to nail down any robust evidence of North American sites that is acceptable to the community of archaeologists as a whole. The discrepancy should trigger serious revisionary thinking. Perhaps American archaeologists in search of pre-Clovis sites need to hire some Australians. Aussies seem to be capable of finding and agreeing on the existence of sites tens of thousands of years older than the late-glacial fluted points and fishtail points that are the oldest artifacts unclouded by controversy in the Americas (G. Haynes 2002b).
But there is more to my position than the lack of pre-Clovis artifacts. I have grave doubts about the existence of a widespread and biologically effective human population in the Americas before 13,000 years ago precisely because large, slow-moving, eminently huntable animals such as ground sloths continued to occupy their favorite dung caves in North and South America as late as they did. The youngest dates on ground sloth dung may have as much to say about human presence as the oldest dates on artifacts. I admit that such inductive reasoning takes me onto treacherous ground. For example, Alejandro Garcia (2003) recently reported radiocarbon dates from La Gruta del Indio in Argentina that are 2,000 years younger than accepted dates on ground sloth extinction in America. Did these huntable ground sloths escape discovery and destruction by the first people into southern Argentina? A 9,000-year date on survival of ground sloths in North or South America is almost as noteworthy as the discovery of pre-Clovis archaeology would be.
Those of us not only skeptical of claims for human arrivals long before Clovis time in the late glacial but also willing to entertain the idea of human involvement in the extinction process find ourselves labeled as conservatives by European archaeologists who happily accept much older claims and choose not to consider the possibility of overkill (Lavallee 2000). Nevertheless, I think that what we know and can deduce of the behavior of various species in the late Stone Age, including our own, supports my argument. Once a species as adaptable as ours entered a continent as rich in resources as America, what would have prevented our species from immediately exerting an extraordinary impact on large, vulnerable native animals, in particular the ground sloths? An understanding of our species and its capabilities, as well as the morphology of giant xenarthrans, reinforces the theoretical case, as I see it, for overkill hard on the heels of first human arrival.
Any hypothetical colonists reaching the New World well in advance of Clovis time would have had to be inept indeed to leave no ecological trace of their presence. Gifted with the natural resources of a continent of Eden, a land extraordinarily rich in edible wild plants and animals, would the first humans fail to multiply and adapt? This would indeed be puzzling, since their economies must have been derived from those of the late-Paleolithic hunters and foragers of Eurasia. The stone and bone tools of those peoples are commonly found in association with bones of large mammals. Bison, reindeer, and red deer were popular prey, as evidenced by large numbers of Old World sites (Klein 1999). In contrast, our hypothetical pre-Clovis "flower people," as Daniele Lavallee's "radicals" appear to view them, must have ignored the doomed large animals in the New World. Instead, the self-styled radicals would have us believe that the First Americans tapped the resources of the Americas so modestly that humankind managed to remain scarce for thousands or tens of thousands of years before the large mammals suddenly vanished (to the great surprise, no doubt, of these early innocents).
Even if these First Americans had nothing to do with the extinctions, there is every reason to expect they would somehow have been involved with large animals, such as by using ivory or bones in crafting art or constructing huts (C. V. Haynes 1991). But little if any evidence of the use of animal remains has been detected to support the concept that humans populated America appreciably before 11,000 radiocarbon years ago. In the Old World, where far fewer animals became extinct, the bones or ivory of large extinct animals are abundantly present in an archaeological context, including in huts of mammoth bones constructed by late-Paleolithic foragers.
And what of the animal-related art? Old World cave paintings and stone and ivory carvings are widespread. They date back to and beyond 40,000 years ago, the limit of radiocarbon dating. In the Aurignacian (about 40,000 years ago, when the first modern Europeans appeared), some cave artists magnificently portrayed woolly mammoths, woolly rhinoceroses, giant deer, and other now extinct or extirpated mammals (Klein 1999). There is none of this in pre-Clovis or Clovis America. Clovis sites with mammoth or mastodon remains are tightly constrained to 13,000 years ago (Taylor, Haynes, and Stuiver 1996). In the New World, archaeological remains are virtually unknown in secure association with extinct animal remains. For example, we found no archaeological remains associated directly with the mammoth dung layer in Bechan Cave, which was a few thousand years older than Clovis time.
Some archaeologists discount the skills, the abilities, the very genius of the Clovis pioneers, depicting them as timid, tentative, and diffident foragers baffled at first by the major changes in climate and vegetation to be found as they spread south through the Americas. I find it much more likely that the first people here were skillful, robust, accomplished, highly adaptable, and above all, persistent and very likely passionate hunters. Their remote ancestors had overrun the forests and savannas of the Asian and African tropics, the shrub lands of the Mediterranean, the mixed conifer and oak forests of central Europe. More recently they had penetrated the cold, wind-swept high plateaus of Central Asia, the boreal taiga, and finally, in late-glacial Beringia, the arctic and subarctic steppe tundra—where winter temperatures now may drop to 50 degrees below zero Fahrenheit (about minus 46 degrees Celsius)—with minimal resources for months at a time. In the process of crossing the Bering Land Bridge into the New World, the early Americans must have found many animals familiar to them, such as woolly mammoths, musk oxen, bison, horses, caribou or reindeer, wapiti, and Dall's sheep, as well as the unfamiliar mastodons and megalonychid ground sloths, both relatively scarce. The American animals had no prior experience of the new invaders and, like elk in Yellowstone National Park, long separated from wolves and no longer fearing them, almost surely responded fearlessly to the sight and scent of these strange bipeds.
Very likely, wolfish dogs and/or ravens (Corvus corax) accompanied the First Americans and helped the invaders locate prey (Heinrich 1999). In lower latitudes they could have found fresh kills by observing scavenging birds such as condors, eagles, teratorns, and vultures. They could have followed fresh game trails to watering holes, especially those of proboscideans; droughts would have offered them particularly good opportunities for locating and dispatching mammoths (Jelinek 1967; G. Haynes 2002b). The foraging habits of native herbivores would have shown them some of the edible native plants. In addition, they must have had the geological and geographical insight to locate and remember the best outcrops of cryptocrystalline rocks, such as cherts, widely sought for making stone tools. The early people knew lithology.
In short, the idea of Homo sapiens as a subdued or ineffective inhabitant of the Americas, one who had the skills to get here but not the biotic potential to make a difference after arrival, neglecting in particular to hunt desirable and vulnerable large prey, seems to me simply absurd. As they arrived, people immediately became the keystone species.
Beyond the skills of our late-Stone Age predecessors was the unusual vulnerability of many large animals in America and on any other landmass unknown to hominids. Among the large animals, ground sloths should have been among the most vulnerable to human predation. Being large and leaving large and distinctive droppings, the ground sloths and their relatives, the glyptodonts, would have been easy to locate and track. Moreover, the sloths most likely defended themselves by sitting up and clawing their attackers, as do their modern relatives, the giant anteaters. Presumably some such strategy would have served ground sloths for millions of years against contemporary carnivores, but human hunters would soon have learned to stay out of reach of the slow-moving animals and would have speared or stoned them to death from a safe distance.
Even if the sloths shared the giant anteater's other defense of a foul taste, their unusual vulnerability would have tempted younger hunters or their preteen followers to use the luckless animals for target practice. Similarly, the armored carapace of the pampatheres and glyptodonts suggests a passive defense that might ward off big cats and dire wolves, and the glyptodonts' clublike or macelike tails must have packed a lethal wallop against unwary attackers. Neither defense would have lasted long against adaptable human hunters.
Turning to persistent proposals of a pre-Clovis culture, the question is how humans could have been skulking around the hemisphere for thousands of years without depleting the megafauna or hastening extinctions. Harrington's goats were better suited than the ground sloths to escape human predators, so why did they succumb? To be sure, mountain goats at the generic level did survive; Harrington's goats, living at lower altitudes, may have been more exposed to the newcomers than Rocky Mountain goats.
Joel Berger's recent findings suggest that at first contact the American fauna would have lacked behavioral defenses against humans, including the fear and alarm response necessary to inspire potential prey to fight or flee. Berger, Swenson, and Persson (2001) have reported a naive response by moose (normally highly suspicious creatures) and elk when wolves were reintroduced to the Yellowstone region after an absence of at least a century. Very likely it would have taken longer for potential prey to learn to fear the new human predators than it did for the moose to learn to fear the wolves, which their ancestors had known to be dangerous.
In general, naïve prey species are utterly unprepared for the intense predation humans can inflict. In the undisturbed Galapagos Islands, Charles Darwin wrote of doves so tame they were killed steadily and left in a growing pile by a boy with a switch who waited for them at a spring (Voyage of the Beagle): "The birds are Strangers to Man & think him as innocent as their Countrymen the huge Tortoises" (quoted in Terrell 1986, 94). Sea lions, Darwin's Finches, and many other animals in the Galapagos are still famous for their fearless behavior in the presence of humans. I believe they give a fair indication of how the large animals of any uninhabited lands, islands or continents, might have responded to their first contact with humans (see plate 14).
A similar example can be found in the Gauttier Mountains, an isolated, uninhabited, and largely unvisited range rising out of tropical lowlands in New Guinea. It was the tree kangaroos in these mountains that led ecologist Jared Diamond to shed his doubts about the possibility of prehistoric overkill in America. Diamond wrote:
Until I had worked in the Gauttiers, I was mystified to understand how the few Maoris in the vastness of New Zealand's South Island could have killed all the moas, and how anyone could take seriously the Mosimann-Martin hypothesis of Clovis hunters eliminating most large mammals from
North and South America in a millennium or so. I no longer find this at all surprising when I recall the large kangaroo Dendrolagus matschiei remaining on a tree trunk at a height of 2 meters, watching my field assistant and me as we talked nearby in full sight. The low densities of these mammals elsewhere in New Guinea, even in areas visited annually only by nomadic hunters, illustrate how susceptible large, k-selected mammals with low reproductive rates are to hunting pressure. (Diamond 1984, 847)
Interestingly, many of the North American large mammals that survived human contact originated in or were closely related to species found in the Old World, where they overlapped with humans (Kurten and Anderson 1980). Examples include caribou (Rangifer), elk (Cervus), moose (Alces), and mountain sheep (Ovis). When they first arrived, the moose, elk, caribou, and bison that we now consider natives of North America were every bit as foreign to the continent as the humans who followed soon after. The extinction of large mammals of North America eliminated mostly long-established natives. Most of the newcomers survived (Ward 1997; Kurten and Anderson 1980).
Some are very reluctant to accept that our species killed off the large mammals. Would the first people have been prudent predators, adjusting their harvest to their needs? Or would they have killed freely, without restraint, if prey did not attempt to escape? We cannot know for certain one way or the other, but we do have some interesting data on how predatory species behave when killing is easy.
From his research on the Serengeti Plains of Tanzania, ethologist Hans Kruuk (1972) records an excessive slaughter of Thomson's gazelle by spotted hyenas. On a moonless and stormy night in November 1966, hyenas killed 82 gazelle and badly injured 27 more in a single 4-square-mile area. Kruuk reconstructed the killings "from tracks in the wet mud, injuries of the victims, and other evidence." Interestingly, Kruuk noted, "Of a sample of 59 dead ones, 13 had been partly eaten (almost only soft parts). . . . Tracks indicated that spotted hyenas had walked very quietly from one victim to the next at a normal walking pace. . . . When gazelle are hunted by hyenas in the usual manner, there is a long and fast chase before the gazelle is caught, over distances of up to 5 km." Presumably the gazelle had been dazed and traumatized by the storm, thus becoming uniquely vulnerable to the hyenas, which had not hesitated to kill far more than they could eat.
Kruuk's account set off a flood of literature on "surplus killing" or "excessive killing." Wildlife ecologists discovered that wolves fed on only half of the carcasses of newborn caribou they killed in minutes in a 1-square-mile area of the Northwest Territories of Canada (Miller, Gunn, and Broughton 1985). In deep snow in especially snowy winters in Minnesota, wolves killed more deer than they could consume (Del Giudice 1998). Especially during seasons when the animals were under stress, making their dispatch even easier, might humans have been similarly disposed to surplus killing of their naive American prey? Some research suggests an affirmative answer: "In recent years anthropologists and conservation biologists have suggested that the hunting strategies of subsistence hunters are opportunistic, not density dependent or designed for sustained yield" (Kay 1994; see also Winterhalder and Lu 1997 and references therein). Both historically and prehistorically, an opportunistic hunting strategy reduced or suppressed high-ranked (highly desired) resources in parts of western North America (Kay 1994; Truett 1996; Broughton 1997; Martin and Szuter 1999).
In addition, anthropologists have noted that what might appear to be wasteful killing may not be. Much of the carcass of a large animal is inedible; consuming too much protein can even be poisonous. People active in an outdoor life have high caloric requirements; often their prime need is fat. In times of drought and for much of the late winter and spring, bison, and presumably other Quaternary mammals, would have been in poor condition, with minimal body fat. Most of the carcass would have been unfit to eat, as Lewis and Clark discovered. Thus, as John Speth (1983) observed, the historic killing of bison for the fat-rich tongues and hump alone may not have been as wasteful as it appears, at least in lean seasons. The foot pads of proboscideans would have been another tempting source of fat.
Occasionally the argument may be heard that First Americans were skilled conservationists who would not have exterminated potentially valuable and attractive species of big game. There is little doubt about First Americans' abilities as shapers of habitat; the ethnographic and paleoecological evidence for knowledgeable manipulation of fire, in particular, is overwhelming (Bonnicksen and others 1999; Davis and others 2002). But this does not prove that the First Americans were incapable of surplus killing. Some prefer to think of them as vegetarians who lived in harmony with nature. They would not have done violence to large animals, and certainly would not have exterminated the "gentle giant" ground sloths. Perhaps the elders would have sought to protect ground sloths. Who knows? But there is no reason to believe that the first hunters could perceive, much less control, the negative impacts of their arrival. Enjoying an ample food supply and not threatened by any serious diseases or enemies themselves, the hunters would more likely have rapidly increased their numbers, expanded their range, and eliminated their more vulnerable preferred prey.
Emotional objections to a view of our ancestors as surplus killers may influence some theorists. As Peter Murray has written, "The notion that aboriginal hunters may not have been conservation-minded will be obstinately denied by those committed to the idea. Climatic change is a conveniently neutral causal factor that can extinguish a megafauna without any emotive connotations" (Murray 1991, 1141). Alas, though peaceful coexistence is a condition greatly to be desired, yearnings are not enough to create it.
Let me make clear that identifying Quaternary humans as agents of mass extinction denigrates neither the ancestors of present-day Indian or Aboriginal people nor the rights of present-day Indian nations to manage their game resources as they see fit. I am horrified to be told that the theory of overkill has been used against both Native Americans and Australian Aborigines in managerial controversies. The most that can be said is that our species, Homo sapiens, appears to have been involved. We may blame our species, for all the good that will do. Indeed, should we, in the next 12,000 years, cause as few extinctions of large mammals as the Native Americans have in the 12,000 calendar years since the days of the ground sloths, we would be able to consider ourselves incredibly lucky. Alas, the record of American conservation in the years since Columbus suggests we will fall far short of that standard—with a vast number of smaller species at risk even now, as Science and other sources report evidence of the onset of Earth's sixth mass extinction.
A final argument sometimes raised regarding the overlap of humans with extinct mammals is that humans could not have spread rapidly enough to account for virtually simultaneous extinctions in North America and southern South America, at least not without leaving field evidence of having done so (Jelinek 1967; Meltzer 1993). Those archaeologists who are gradualists look for equilibrium between resources and human populations. Nothing would be expected to happen catastrophically, and an empty continent would be populated very slowly. Once upon a time I too ruled out the possibility of a catastrophe, both for large mammals in the late Quaternary and for dinosaurs at the end of the Cretaceous (Martin 1967b). However, to me the evidence (in both cases) is now too strong to ignore. The archaeological record simply does not preclude the possibility of a prehistoric blitz in which the invaders swept the hemisphere in 1,000 years or less, leaving dozens of extinct taxa in their wake (particularly larger, more slowly reproducing species) (Martin 1973).
Models of maximum population growth rates in a favorable environment of previously unhunted animals have yielded some fascinating results. In 1973, with help from Dave Adam and other young faculty in the geosciences department at the University of Arizona, I whipped off a back-of-the-envelope article in which I maximized the rate of human invasion and the magnitude of impact of hunters while minimizing the time required to attain a large population and to sweep the continent. I knew some would find the parameters extreme, but the editors and reviewers of Science, bless their hearts, accepted the article.
A few years later, Jim Mosimann, a biometrician with the National Institutes of Health and a close friend from graduate school days, designed a more respectable model based on difference equations and the work of Russian climatologist Mikhail Budyko (1967). To our knowledge Budyko was the first to treat mathematically the extinction of mammoths by human predation. Using then state-of-the-art software running on an IBM 650, we generated different versions of a discovery scenario (Mosimann and Martin 1975). These were revised by Stephen Whittington and Bennett Dyke (1984) and most recently by John Alroy. Alroy (2001) found he could attain rapid extinctions of many large North American mammals with a much smaller human population and a more modest kill rate. Graeme Caughley (1988) and Richard Holdaway and Charles Jacomb (2000) have devised versions for New Zealand, and Steve Mithen (1997) has modeled mammoth extinction in Eurasia.
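The flavor of such difference-equation models is easy to convey. The sketch below is not the published Mosimann and Martin model; its parameters and functional forms are illustrative inventions chosen only to show the qualitative behavior — hunters multiply while prey is easy to find, harvest eventually outruns the prey's recruitment, and the prey population collapses within a few centuries:

```python
# Illustrative sketch of an overkill difference-equation model.
# All parameters are hypothetical, not those of Mosimann and Martin (1975).

def step(hunters, prey, r_h=0.03, r_p=0.2, K=1_000_000, kill_per_hunter=5):
    """Advance one year: prey recruits logistically and loses the harvest;
    hunters grow at up to r_h per year while the harvest meets their needs,
    and decline once prey becomes scarce."""
    harvest = min(prey, kill_per_hunter * hunters)
    prey_next = max(0.0, prey + r_p * prey * (1 - prey / K) - harvest)
    food_ratio = harvest / (kill_per_hunter * hunters) if hunters else 0.0
    hunters_next = hunters * (1 + r_h * (2 * food_ratio - 1))
    return hunters_next, prey_next

hunters, prey = 100.0, 1_000_000.0   # a small founding band, abundant prey
for year in range(1, 2001):
    hunters, prey = step(hunters, prey)
    if prey < 1:
        print(f"prey driven extinct in year {year}")
        break
```

With these made-up numbers the harvest overtakes the prey's maximum recruitment within a couple of centuries and the prey population crashes soon after, which is the qualitative point the models make: no equilibrium is required, only compound growth meeting slow-breeding prey.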
The computations are actually fairly simple. We can start with a very small group—say, 100—arriving from Asia. The maximum rate of population growth observed today anywhere in the world is roughly 3.3 percent per year, or a doubling every 22 years. Historical records of population growth on newly colonized tropical islands are consistent with this figure. In a large and lush New World, well supplied with resources and relatively free from contagious diseases (most of which appear to have originated in the tropics), an annual growth rate of 3 percent is not unreasonable. Anthropologists have estimated that a hunting population requires, for its support, at least a square mile per person, if the game supply is ample (Ward 1997, 146). This means North America could have supported roughly one million Clovis people. At a 3 percent annual growth rate, the Clovis invaders would have reached this number in only 350 years (20 generations). At a growth rate half as high, they would have taken 800 years.
Geographic expansion would have required far more modest movement at any one time than may at first appear. It is probably safe to assume that the Clovis people moved outward from their camps as easy-to-hunt game became locally scarce. If they moved only 10 miles a year, they would have reached the Gulf of Mexico in 350 years. Progressing in this fashion, they could have caused the mass extinctions without ever even reaching their theoretical population maximum. They would simply have abandoned each region after hunting it out. (A succession of such short-term stays would also help explain why they left so little trace of their presence.) After the megafauna were gone, the human population may have crashed unless people rapidly learned alternative survival skills, such as fishing, hunting smaller game, and gathering.
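The arithmetic behind the last two paragraphs can be reproduced in a few lines. The growth rates and the 10-miles-a-year pace are those given in the text; the roughly 3,500-mile distance from the ice margin to the Gulf of Mexico is my assumed round figure for illustration:

```python
import math

def years_to_reach(start, target, annual_rate):
    """Years of compound growth for `start` people to reach `target`."""
    return math.ceil(math.log(target / start) / math.log(1 + annual_rate))

# Doubling time at the 3.3 percent maximum observed growth rate
print(years_to_reach(100, 200, 0.033))       # 22 years

# 100 founders growing at 3 percent a year toward one million people
print(years_to_reach(100, 1_000_000, 0.03))  # 312 years with these round numbers

# A hunting front advancing 10 miles a year over an assumed ~3,500 miles
print(3500 / 10)                             # 350.0 years
```

Pure exponential growth gives roughly three centuries to fill the continent, the same order of magnitude as the text's figures; the point is that nothing about the timetable requires implausible rates.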