The Hanging of Ephraim Wheeler: A Story of Rape, Incest, and Justice in Early America
Cases of trouble, as Karl Llewellyn and E. Adamson Hoebel called them in The Cheyenne Way (Norman, Okla., 1941), define the bonds that hold a community together. Rape and incest are always cases of trouble in our society. In early America, they tested the absolute authority of the father in the patriarchal family. Far more than any other crimes, rape and incest tell us much about power and powerlessness in the family. Irene and Richard Brown’s remarkable retelling of the case of Ephraim Wheeler proves Llewellyn and Hoebel’s point.
The setting of this case was Berkshire County, in the far-western hills of Massachusetts. With woods of startling beauty crosscut by streams and valleys, Berkshire might have been an Eden. For its farmers, however, it was anything but a “sylvan paradise” (15). They saw not paradise but commodity: “timber, firewood, and fencing” (15). New arrivals jostled with the older settlers for land. The losers became day laborers on the winners’ homesteads. A few Indians remained, and a handful of African Americans (one percent of the total population in 1790) had joined the natives on the margins of a largely English population. But Berkshire was not picky: “those who worked hard and behaved decently could be accepted on their merits” (16). Such were the Odels, a mixed-race family into which Ephraim Wheeler married.
Ephraim Wheeler and the Odels were people at the bottom of a society that still kept track of rank and order. Even on the old frontier, those who were at the top took pains to let everyone know. In town, “the conflict between gentlemen creditors and mortgaged farmers was often personal” (17). Only twenty years before, these conflicts had led to open rebellion, reprisals, and lasting animosity. Wheeler was one of the rebels. From where he stood, slouched in poverty, he could barely look up to the heights of political and legal authority. There, erect in posture, stood the patriarchs: the judges of the Supreme Judicial Court. Simeon Strong, a Yale graduate and leading lawyer; Theodore Sedgwick, a second Federalist whose political rise had taken him to the Continental Congress and then the United States Senate; and Samuel Sewall, whose family had helped rule the colony and the state for a hundred years, were the men who would sit in judgment of Wheeler, his life in their hands.
On June 8, 1805, Wheeler allegedly raped his thirteen-year-old daughter Betsy. Though he was never a good parent or provider, his daughter still agonized before telling her mother what her father had done. Furious, Hannah Wheeler turned to trusted family members, and they, to the justice of the peace. Wheeler’s incarceration was swift. For two days in September 1805, the Commonwealth of Massachusetts prosecuted Wheeler for the capital offense while his lawyers defended him. For the prosecution, state attorney James Sullivan, whose family background was little different from Wheeler’s but whose fortunes were exactly opposite, thundered that Wheeler must be guilty. For the defense, local attorneys John Hurlbert and Daniel Dewey could neither shake Betsy’s testimony nor convince the jury that Hannah had ulterior motives for bringing her daughter’s shame to light.
How then could this silent, violent, and alienated man defend himself? It is not clear from the surviving accounts what part Wheeler took in his own defense. His counsel could not call his wife (though, since she was the party who had accused him, her spousal privilege not to testify could have been challenged). He did not testify on his own behalf. He had the right to do so but could not be compelled, and defense counsel often keep their clients off the stand: once sworn, defendants can be cross-examined, and their prior statements can be used to impeach their present testimony. In any case, the jurors knew him or his reputation and could see in his demeanor his immense hostility to all his betters.
The jurors did not deliberate long. Without meat, drink, or light, as was the custom, they reached a verdict—guilty of the rape—that under the criminal code of the day carried only one penalty. They could have “mitigated” by finding guilt of incest only. In effect that would have credited Wheeler’s defense of consent and saved him from the noose. Juries often enough nullified harsh law by such means, and, in New England, courts allowed such jury discretion. But in this case, given the impact of Betsy’s testimony and the impression that Wheeler made on the jury, mitigation was unlikely. Found guilty by a jury from his own county of Berkshire, Massachusetts, Wheeler died by hanging on February 20, 1806.
Wheeler’s motive is hard to recover. Perhaps it was simply an irresistible impulse building over decades of sexual as well as social and economic frustration. Perhaps the real target was Hannah, a fault-finding and back-biting mate. Rape is always a crime about power. The Browns reveal that, according to the emerging reform criminology of that day, evil came not out of congress with the devil (an earlier view) but from prior experience. Piecing together Wheeler’s hard-knock life gains this monster a modicum of sympathy. Orphaned young, beaten and abused during his apprenticeship, he learned his passionate and harsh ways in a hard school. His marriage to Hannah Odel, the child of a mixed-race marriage, did not make his life any easier. She never loved and rarely obeyed him, and she whined. She had good reason: he could not hold a job, left (or was thrown out) periodically, and surely abused her as well as his three children. Thus, naturally, his affections turned to his little mother, Betsy.
She was her mother’s helper, but her docility and desire to please, if natural, could easily be misread by a man who had trouble reading, in more ways than one. He tried twice before he succeeded in raping her. Both times she concealed her state, hoping, the Browns surmise, that each episode was the last. But abusers only grow bolder, and each time Wheeler was frustrated in his efforts to act the head of household, his subversive desires grew stronger. Finally, announcing that he would leave Hannah and take the two older children, he would no longer take “no” from Betsy for an answer.
The case was notorious in its time, for the capital crime and the sanguinary punishment. Wheeler, a tight-lipped man who could not read or write, gave an account of himself the night before he was executed. A local newspaperman sat through the trial and hired a note taker. The newspapers carried their own accounts. The daughter of one of the judges turned the story into a novel. The defense counsel, Hannah and Betsy, and ninety-three of Wheeler’s fellow Berkshiremen petitioned Governor Caleb Strong and his executive council to commute the sentence. The Browns have researched every similar case and report that most often these petitions for reprieve from the death sentence were granted. This time, there was no reprieve. The nature of the crime and the absence of character testimony at the trial or in the petitions outweighed the antipathy of many for the ultimate sanction.
At the heart of the story is a darkness. While the Browns stretch themselves to see into the heart of the man, to gain for him and for the reader some empathy, he remains an incestuous father and a rapist. The Browns, both accomplished historians, must rest with the obvious conclusion, “Ephraim Wheeler was a vicious man, but still a man” (290), entitled to this one more day in court.
Wheeler’s story still has the power to shock, but it is hardly unique. What makes this retelling so valuable is that Wheeler’s crime and the historical record his trial created throw light on a world too often hidden. History affords little scope to those who could not read or write and hence left no documentary or anecdotal evidence of their passing. They wrote their lives on water and sand. But those lives can be surmised, weighed, and assessed nonetheless. The Browns pore over all the sources, peer into their corners, lift their edges, and when documentary evidence ends, make astute surmises. “We do so in order to better understand the meaning of the events” (10). This venture into what may be called novelesque historical interpretation, recently made popular by John Demos’s The Unredeemed Captive (New York, 1994), is increasingly popular among microhistorians.
The genre of crime stories has grown large. Individual cases—such as those discussed in Elaine Forman Crane’s Killed Strangely (Ithaca, N.Y., 2002) and John Ruston Pagan’s Anne Orthwood’s Bastard (Oxford, 2002), for example—and crime waves—such as those discussed in Peter Charles Hoffer’s The Great New York Conspiracy of 1741 (Lawrence, Kans., 2003) and Jill Lepore’s New York Burning (New York, 2005)—illuminate the dark corners of our founders’ world. Though many would prefer to celebrate our history uncritically, saving their criticism for the scholars and teachers, these cases reveal the underside of life.
This article originally appeared in issue 6.2 (January, 2006).
Peter Charles Hoffer is distinguished research professor of history at the University of Georgia. His Seven Fires: Urban Infernos that Reshaped America is forthcoming from Public Affairs.
Sometimes a Chair is Only a Chair
Fortress of the Soul: Violence, Metaphysics, and Material Life in the Huguenots’ New World, 1517-1751
Fortress of the Soul is an immensely ambitious, learned, and imaginative but fundamentally flawed book. The chronological sweep is impressive, as indicated by the anagrammatic time frame of 1517 (the year of Luther’s ninety-five theses) to 1751 (which witnessed the publication of Franklin’s Experiments and Observations on Electricity, among other events). Even more notable, however, is the thematic range. As Kamil notes in his preface, the book “is intended to engage historians of science as well as historians of religion, technology, art, and artisanry, sexuality (and the body), agriculture, human geography, textual criticism, the book, ecology, and I hope most of all, the colonization of pluralistic New World societies” (xix-xx). With such an agenda, it is no wonder that Kamil is unable to deliver fully on his promise.
The first part of the book revolves around the teachings of Huguenot potter, alchemist, and Paracelsian natural philosopher Bernard Palissy (1510?-1590?). Stunned by the martyr’s death in 1557 of his religious mentor, Calvinist preacher Philibert Hamelin, Palissy grew increasingly skeptical of the ability of the walled fortress of La Rochelle to protect the region’s Protestants, particularly in the rural hinterland. By 1563, aided by his ruminations on natural history, he had devised an alternate model for the safety of his fellow Huguenots: “security had either to be internalized as skill and industriousness or carried by fleeting refugees ‘on their backs’ the way an artisan carried his tools, or the limace [snail] its portable, inside-out shell” (97). What Calvin called “Nicodemism,” or the concealment of inner belief for purposes of self-protection, was justified rather than condemned, enabling industrious, camouflaged Huguenots to survive and prosper in hostile environments.
Palissy’s program was not only practical but metaphysical. As a Paracelsian adept, he subscribed to the Neoplatonic notion of cosmological harmony between man-microcosm and universe-macrocosm, since the divine soul of the creator animated all creation. This animate materialism, by positing the monistic unity of all souls in God, provided a philosophical basis for Nicodemism, indeed for reconciliation across the Protestant-Catholic divide. While the ultimate recombination of matter and spirit, what Kamil calls the material-holiness synthesis, would be the task of the rapidly approaching Millennium, the adept could anticipate the process of redemption through alchemy, the separation of pure from impure by fire.
Palissy’s ceramic experiments, culminating in rustic lead-glazed basins encrusted with naturalistic figures such as snails and amphibians, embodied both his belief in camouflage and his industrious search for the alchemical Millennium. His obsession was to make the perfect translucent glaze: “For Palissy, the white glaze signified the flash of astral spirit materialized and then merged with the macrocosm in enamel . . . ” (325).
According to Kamil, Palissy’s ideas became widely diffused among artisans in his corner of central western France, where they persisted for over a century. Acquiring further urgency after the siege and fall of La Rochelle in 1628, they then traveled with the Huguenot Diaspora of the seventeenth century, eventually reaching across the Atlantic to the American colonies. In particular, Kamil finds traces of Palissy’s esoteric worldview in the material culture of Huguenot furniture makers in New York, for instance, viewing a high chest of drawers made by Huguenot descendant Samuel Clement in 1726 as an emblem of “the force of cosmological unity” (847).
Fortress of the Soul contains a number of interesting arguments, among them a new critique of Max Weber’s much-debated classic, The Protestant Ethic and the Spirit of Capitalism. Like Weber, Kamil views innovation and commercial profit as intrinsic to the Protestant project, but he takes issue with Weber’s interpretation of that project as an example of modern man’s “disenchantment of the world.” On the contrary, he argues, Palissian metaphysics reenchanted the world, as its practitioners recognized its spiritual and economic potential for their respective crafts. Kamil thus provides a useful corrective to the teleological narrative of modern history as disenchantment, an approach that fails to account for the continuing strength of faith-based belief systems in contemporary times.
Kamil also goes against the grain in arguing that metaphysics, not skepticism, laid the basis for early modern “multiculturalism,” defined as Christian inclusiveness rooted in Neoplatonism. Such pluralism was embodied in a multilingual sundial engraved in New York by a son of Huguenot refugees in 1751, which juxtaposed rhyming texts about industry and God’s protection in Greek, Latin, Dutch, French, and English. This pluralism was also responsible for a Huguenot-Quaker convergence in New York, as spiritual affinity between the two groups bore fruit in the merging of familial craft networks.
Kamil traces the resulting process of Anglo-French creolization in furniture making, a colonial industry characterized by multitalented artisanship and abundant slave labor. Of the upholstered leather chair, “the single most influential moveable” produced in colonial America (712), he writes, “Thus the New York leather chair . . . was not purely French, English, Dutch, Bostonian, or American. Instead the New York leather chair is a material manifestation of the interactive and competitive discourse of cultural convergence, quotation, and creolization whereby different regional cultures communicated their perception of difference to themselves and others” (749). Successful artisans, moreover, were able to transcend ethnic identity with social status, becoming part of the local elite.
Despite such perceptive observations, Fortress of the Soul is not a successful book. The fundamental weakness is Kamil’s inability to show conclusively that Palissy had a committed following among Huguenot artisans in Aunis and Saintonge and that the descendants of these artisans went on to spread his ideas throughout the Atlantic world in the seventeenth century. His claim that Charentais artisans shared a religious outlook rooted in animate materialism and Paracelsian alchemy rests on the assumption that Palissy, in his role as lay preacher, had evangelized them. Yet Kamil is forced to admit that “we can only guess how many other ‘simple artisans’ shared Palissy’s association with this international tradition . . . ” (191).
Of course, if Kamil is correct in his claim that Huguenot artisans concealed their innermost beliefs in order to survive, then proving their allegiance to those beliefs is no simple task. The only method, in fact, consists of deciphering the cryptic messages encoded within their artifacts. Kamil writes, “By the mid-sixteenth century, southwestern Huguenots had developed a mobile, mutable, largely artisanal culture that expressed its values, attitudes, and beliefs obliquely, usually in material form, by converging invisibly, yet within plain sight, with the most powerful symbols of the dominant host culture. A marginalized people, they chose to display their personal symbols on the margins of their work” (757).
While I am sympathetic to Kamil’s argument, his esoteric readings of the French and American artifacts left me unconvinced. Frankly, the examples of seventeenth-century pottery from La Chapelle-des-Pots (the village where Palissy first learned his potter’s trade) do not much resemble the Palissy basins to which they are compared, though Kamil makes much of the common moldings and lead glazes. His attempt to interpret these ceramics as mystical allegories based on a comparison with texts by Rosicrucian alchemist Jakob Böhme and various occult prints likewise falls flat in the absence of concrete evidence for intertextuality (339-386). The fact that an image was (or may have been) available to an artisan or artist does not constitute proof of influence.
Similarly, Kamil’s interpretation of American furniture and woodworking styles is more successful when he focuses on creolization than on Palissian metaphysics. American craftsmanship had much in common with that of central western France, specifically the Île de Ré near La Rochelle. Although many of the island’s artisans were Protestant in the early seventeenth century, all the examples of insular woodworking discussed by Kamil are associated with the Catholic Church: a baptismal screen, a confessional, a pew door, a choir screen. Even if one accepts Kamil’s argument about Huguenots hiding in plain sight—accepting patronage from the enemy while pursuing cosmic nullification of religious difference—it is difficult to imagine them willingly carving confessionals. Since the chairs made by Huguenots in New York also resemble those—presumably the work of Catholic Frenchmen—of the St. Lawrence and Mississippi valleys (739-740), it seems that Kamil is documenting the transplantation of a regional rather than a specifically Protestant or esoteric tradition.
Kamil indulges in numerous digressions in the course of the book, retracing the ramifications of esoteric thinking far beyond the Huguenot community. Featuring such colorful figures as John Winthrop Jr., Sir Kenelm Digby, and William Hogarth (the subject of a 150-page chapter), these diversions are sometimes more erudite than convincing. In interpreting Hogarth’s 1736 painting portraying a Huguenot church, Noon, L’Église des Grecs, Hog Lane, Soho, Kamil purports to get inside the artist’s mind: “How, he asked himself, does one represent an absence?” (566). Yet here as elsewhere Kamil’s hermeneutics relies more on imagination than evidence.
As a work of 923 pages (1031 with the notes), Fortress of the Soul will not be widely read, at least not in its entirety. Parts 1 and 2 (the first seven hundred pages) will appeal primarily to specialists in the intersection of religion, philosophy, and science in the sixteenth and seventeenth centuries. Part 3, with its careful reconstruction of the craft networks of colonial New York, will be of considerable value to historians of colonization. Although the book could have benefited from editorial intervention—to eliminate redundant quotes, for example—there are surprisingly few errors in the lengthy and complicated text. It is beautifully produced with abundant and helpful illustrations. In the end, readers must make up their own minds as to the validity of Kamil’s original and provocative thesis.
This article originally appeared in issue 6.2 (January, 2006).
Leslie Choquette, L’Institut français Professor of Francophone Cultures and director of the French Institute at Assumption College, is the author of Frenchmen into Peasants: Modernity and Tradition in the Peopling of French Canada (Cambridge, Mass., 1997).
The Revolution Heard Round the World
Empire and Nation: The American Revolution in the Atlantic World
Scholars have long rejected the view that the American Revolution was a limited and staid affair. Indeed, it was profoundly disruptive and belongs in the pantheon of great western revolutions. The American Revolution provoked a series of developments crucial in the move from the early modern to the modern world. The essays in Empire and Nation contribute to this view by situating the events in the thirteen colonies in their broader Atlantic-world context. Arguments that began over taxation and the proper interpretation of the British Constitution ended up, either directly or indirectly, altering political, social, economic, and cultural values and relations around the globe.
The five essays in part 1, “Reconstituting the Empire,” explore how British statesmen unwittingly provoked the colonists and how independence led to significant changes in American political and legal culture. Eliga Gould contributes to a growing scholarly sense that British policymakers were reformers struggling to manage their dizzying success. The misunderstandings of the 1760s and 1770s resulted from the complexities of imperial management and the difficulty of absorbing new and diverse territories.
The remaining essays in part 1 explore the directions Americans headed as they refashioned their political and legal institutions and culture. David Hendrickson shows how determined Americans were, after declaring independence, to allow only the weakest national government. Donald Higginbotham describes how the logistical nightmare of war taught many Americans the benefits of nationalism and a vigorous national state.
Though the Constitution provided a far more centralized government than Americans could have imagined in 1776, Richard Alan Ryerson explains that it was acceptable because Americans ratified it in a climate where politics and law were becoming more democratic. Ryerson closely examines the political thought of John Adams and reveals how quickly American political thought was changing during these years. Adams sought to protect the many from the few, but he always viewed democracy as a distinct social order rather than a mode of government. By the late eighteenth century, Americans were well on their way to understanding democracy as a process, really the process, for pursuing politics.
This sense held implications for jurisprudence, as Ellen Holmes Pearson shows. British common law was far too entrenched and vital for Americans to fully declare their independence from it. Yet, as the sum of immemorial custom, it coexisted uncomfortably with a political culture of popular sovereignty that was determined to continue democratizing. After the Revolution, what a democratic polity wanted trumped tradition as the determinant of lawfulness. States molded common law to their needs. Common law remained significant, but present needs always shaped understandings of what common law meant. Thus common law was more likely to be invoked to alter rather than maintain the status quo.
The six essays in part 2, “Society, Politics, and Culture in the New Nation,” explore how the lives of ordinary people changed with independence. The Revolution gave meaning to the Mason-Dixon line and contributed to the rise of multiethnic politics, which in turn shattered an older commonwealth faith in a unitary public good. It also began a process that democratized the public sphere, while establishing a boundary between the legitimate concerns of voluntary societies and the business and responsibility of governments empowered by popular sovereignty.
In a close examination of the Appalachian valley in Pennsylvania and Virginia, Mary Schweitzer describes how the Revolution divided what had been an integrated backcountry society where colonial borders were once largely irrelevant. This essay shows how slowly but surely Appalachian Pennsylvanians and Virginians became northerners and southerners. Maurice Bric and Stephen Sarson also suggest the growing significance of the Mason-Dixon line. Bric explains that after the Revolution northern cities such as Philadelphia grew increasingly diverse. A new era of ethnically based and interest-group politics challenged the traditional notion that unified elites could articulate and pursue one public good meaningful to all. Sarson shows that the Revolution did not significantly transform the Chesapeake. It remained committed to tobacco and slavery, and, as the North gradually abolished slavery, living with it in the South contributed to the growth of a southern white male identity that appealed across class lines.
The remaining essays in this section build on the themes of democratization and the growing sense of the need for union as the survival of slavery portended troubles. Melvin Yazawa shows that the violent political speech of the 1790s and afterwards must be viewed in a context where all involved considered union preferable to disunion, which lessened the danger of incendiary talk. Yazawa suggests that southerners continued to prefer union to disunion for precisely as long as they considered slavery and union compatible. Marc Harris explores how the Revolution made the public sphere more democratic. But the public sphere itself became more complicated after 1776 since governments now drew their authority from popular sovereignty. After the Revolution, Americans had carefully to delineate which concerns were the proper business of their relentlessly voluntary and egalitarian public sphere and which obligations and duties belonged to their governments alone. Closing part 2, Robert Calhoun describes how religious denominations made themselves compatible with republican political theory and objectives. As with law, religion had to accommodate a democratizing culture. Thus by the 1820s mainstream denominations reinforced growing northern and southern divisions over slavery and, by doing so, strengthened each region’s faith in its own righteousness.
Part 3, “The American Revolution in the Atlantic World,” brings the British Empire front and center. The four essays show that the American Revolution also had a profound impact on the empire Americans left behind. Keith Mason discusses how the Revolution dispersed tens of thousands of loyalists throughout the empire and, in particular, swelled the empire’s free black population. After 1780 it rapidly became better to be black in the British Empire than in the republican United States. James Sidbury explains how the first slave narratives appeared, why there was a market for them, and why their authors knew to look to the empire for a hearing and not the United States.
Edward Cox situates the development of British abolitionism within the broader Age of Revolution. The American and especially the French Revolutions spread ideas to the Caribbean that made once stable slave societies increasingly ungovernable. Haiti became a symbol of a better future. As the British began to reconsider slavery, they were driven in part by slaves who were forcing the issue. In the volume’s final essay, Trevor Burnard argues that, with regard to slavery, the American Revolution placed the British Empire and the United States on sharply divergent courses. The Revolution threw slavery into sharp relief in the United States and led to its gradual abolition in the North. Yet in South Carolina between 1780 and 1800 the slave population increased, and southerners made it quite clear that the rise of the republic did not mean abolition.
Yet Burnard shows that idealism and the British conviction that the empire cared deeply for liberty meant that British abolitionists could find a hearing. After the Revolution, imperial statesmen concluded that their authority depended on gaining greater control of colonial elites. As slavery became embarrassing and as slaves embraced revolutionary ideas and became harder to govern, imperial managers found that defining British liberty as fully antislavery simultaneously centralized imperial power and provided a deeply appealing moral position. Between 1800 and 1840 slavery declined and was abolished within the empire while it flourished and became central to the development of the United States. Burnard thus poses the provocative question: whose was the empire of liberty?
Taken together, the essays in Empire and Nation show that the American Revolution transformed the Anglophone world and had vital consequences for the cultures that those who spoke English encountered. More particularly, the essays join a literature that connects the American Revolution to the democratization of virtually every facet of American life and that highlights the centrality of slavery and racial prejudice in the history of the United States.
Further Reading:
For general context P. J. Marshall, ed., The Oxford History of the British Empire: The Eighteenth Century (Oxford, 1999) is superb. Jack P. Greene, ed., The American Revolution: Characteristics and Limits (New York, 1987) examines the Revolution and its implications from multifarious perspectives. Michael A. Morrison and Melinda Zook, eds., Revolutionary Currents: Nation Building in the Transatlantic World (Lanham, Md., 2004) contrasts the American Revolution with the era’s other revolutions; the article by John M. Murrin is the best concise statement about the American Revolution that I have read. Gordon S. Wood’s The Americanization of Benjamin Franklin (New York, 2004) explores why the British Empire mattered so much to colonists and how, rather suddenly, they came to loathe it. T. H. Breen’s The Marketplace of Revolution (Oxford, 2004) shows how the desire for independence emerged from increasingly intimate connection to the empire, not the other way round.
This article originally appeared in issue 6.2 (January, 2006).
Andrew Shankman is assistant professor of history at Rutgers University, Camden, and is the author of Crucible of American Democracy: The Struggle to Fuse Egalitarianism and Capitalism in Jeffersonian Pennsylvania.
Bones of Contention
Until I heard this exchange between reporter Lesley Stahl and Smithsonian anthropologist Douglas Owsley, I wondered why historians should concern themselves with the peculiar saga of Techamnish Oytpamanat (the Ancient One), better known in media circles as Kennewick Man. I knew those who were arguing over the remains of the nine-thousand-year-old man were talking about archaeology, biology, human migration, and the rights of Native Americans to claim as kin an individual who witnessed the murky dawn of human habitation of the Americas. But apparently, Owsley told me, they were also talking about “American history.” “[O]ur history.” According to Owsley and his colleagues, who have sued the government for access to the remains, they are fighting for nothing less than the right to pursue knowledge, to search for truth. Since Owsley has taken up the cause of curiosity, enlisting all disinterested seekers of truth in his battle, I figure historians should at least pause to consider which side of this court fight they might actually like to join.
The case is a complicated one. The plaintiffs, led by Oregon State anthropologist Robson Bonnichsen, have sued the government for the right to study the ancient remains. The government and the tribal claimants counter that, under the Native American Graves Protection and Repatriation Act of 1990 (NAGPRA), the bones are those of a Native American and, by rights, belong to the tribes. But the plaintiffs insist that Kennewick Man is too old to be claimed as kin by anyone now alive. The law, they say, simply cannot apply to anything so old. By extension the plaintiffs’ case suggests that if Kennewick Man cannot be said to belong to one group or tribe, he must somehow belong to all of us, to all humanity.
This sense of belonging to us all is, I think, what Douglas Owsley meant to explain to Lesley Stahl. As such comments suggest, when we talk about Kennewick Man, we are not talking only about American history, but about what might be called species history–the history of Homo sapiens in the Americas. Recovering the history of ancient humanity may in fact be an undertaking in which we all share an interest. But the case of Kennewick Man is teaching us that the pursuit of species history is troubled by the political and intellectual history of the last five hundred years.
For those of you who may have forgotten the story, Kennewick Man reappeared among us in the summer of 1996. Like so many celebrated archaeological finds, the discovery of his bones was an accident, pure and simple. On a hot July afternoon, two young men sneaked into the hydroplane races on the Columbia River near Kennewick, Washington. There on the banks of Lake Wallula, a man-made reservoir under the control of United States Army Corps of Engineers, they happened on a human skull. Figuring their grisly find would keep through an afternoon of boat racing, they stashed the head in the bushes and watched the meet. That night they delivered the skull to the county coroner, who promptly got in touch with an independent forensic anthropologist friend, James Chatters. Later Chatters and the coroner returned to the site and picked from the mud the bones of a nearly complete human skeleton.
It was Chatters who literally put a face on Kennewick Man. The bones, he decided, did not belong to a Native American. The skull was just the wrong shape–long, not round. At first Chatters was sure he had the remains of a long-dead Euro-American settler, a man who had come west some hundred and fifty years ago looking for a nice place to farm, or for gold and glory. But on closer examination, the forensic details contradicted Chatters’s original hunch. For on his way west, it seemed, the poor pioneer had run into some hostile primitives; lodged in his right pelvic bone was a projectile point of the sort favored by Stone Age hunters. The Man had also survived a few broken ribs and a minor skull fracture. Curious to verify that the remains belonged to an ancient American and not to either an Oregon pioneer or a more recent victim of foul play, Chatters sent a small piece of bone off to a lab at the University of California, Riverside. To his surprise, the lab reported back that the bone was about eighty-four hundred years old, far older than he had suspected.
We have come upon only a handful of American skeletons this old, and Chatters no doubt recognized he had in his home laboratory a prize with enormous scientific potential. If he were to be the lucky one to publish findings on the remains, the skeleton might prove quite valuable to him professionally and personally. As it turned out, Chatters had little time to enjoy his treasure. Basking in the publicity of his sensational find, the aptly named Chatters started talking to the press. Instead of using the surprising lab report to question his initial assumptions about the skull’s European look, Chatters stuck to his first impression, reporting that he had found the skull of a man of European descent, an ancient American with caucasoid features.
The skull of Kennewick Man. Illustration by John McCoy.
Reporters ate it up. Suddenly, the old bones took on flesh and began to resemble British actor Patrick Stewart, best known as Captain Jean-Luc Picard of the starship Enterprise. (Others note a resemblance to the Ainu of northern Japan, more plausible kinsmen for ancient Americans.) In the popular press, Chatters’s “caucasoid,” a loosely descriptive term, hardened into the racial category Caucasian. And before we knew it, we had the story of an ancient European (with a pretty brave heart) wandering around the Columbia Plateau some nine millennia ago.
The early flurry of media attention taught Chatters that he was on to a good thing, and he quickly contacted well-placed acquaintances at the Smithsonian, offering to share the fame likely to come to those able to solve the riddle of Kennewick Man. But in his pursuit of publicity, Chatters made a few mistakes. The bones, remember, were found on land under control of the Army Corps of Engineers. About discoveries on government land, NAGPRA is quite specific: local tribes must be notified of human remains found on federal land. Furthermore, any bones more than five hundred years old are presumed to be those of a Native American. We can excuse Chatters and the local coroner who first thought the remains those of an Oregon pioneer, but once Chatters learned the age of the remains, he should have alerted the local tribes, as NAGPRA clearly required.
NAGPRA, whose provisions Chatters disregarded at his peril, is an important piece of legislation. Without it, there would be no controversy over custody of Kennewick Man. Chatters could have spirited the bones off to his lab, studying them and publishing his findings more or less as he saw fit. But in 1990, Congress had passed a law that made such independent action illegal.
One story traces NAGPRA’s genesis to a mission to Washington in 1986 by William Tallbull to retrieve a sacred pipe taken from his Cheyenne people some hundred years earlier by the United States Army. Tallbull found the pipe at the Museum of Natural History, where he also discovered, quite by accident, storage bins containing the remains of some eighteen thousand individual Native Americans. Outraged, he took his case to Congress, where he found a sympathetic audience. While some museum professionals and archaeologists initially objected to a bill that mandated the return of human remains and sacred objects to tribal members, the law that finally passed represented, as Senator John McCain put it in 1990, a “true compromise” in the face of “very difficult and emotional issues . . . I believe this legislation effectively balances the interest of Native Americans in the rightful and respectful return of their ancestors with the interest of our Nation’s museums in maintaining our rich cultural heritage, the heritage of all American peoples.”
Of course, NAGPRA would affect not just “our Nation’s museums,” as McCain stated, but also archaeologists doing fieldwork and even little boys playing at skullduggery. Still, McCain was optimistic. And so, at least at first, were many archaeologists and museum professionals who accepted the opportunity NAGPRA offered to redefine their historically troubled relations with Native Americans. But behind many a compromise, I suspect, lurk sore losers. The story of Kennewick Man in court suggests that all were not content with the NAGPRA regulations that now governed museum holdings and archaeological digs. The law said that consultation and collaboration should precede independent inquiry, and to a few of the disgruntled, such provisions seemed to privilege Native claims over those of professional archaeologists and anthropologists.
Those determined to challenge the law recognized that among its weakest points was the assertion that what NAGPRA referred to as “cultural affiliation” would determine the disposition of a find as rare as the Kennewick skeleton. In effect, the plaintiffs’ case asks whether we can use “culture” to connect the present with the very remote past. What evidence establishes a convincing case for a cultural link between present-day tribes and a prehistoric wanderer? Is “cultural affiliation” an immutable connection? Could “cultural affiliation” trump a more scientific definition of inheritance, for example? In Bonnichsen, et al. v. U.S., the anthropologist-plaintiffs–who would surely under other circumstances defend the importance of culture–ask whether the modes of behavior and belief that bind human beings into communities and link those communities through memory and ritual to their pasts can reach back into “prehistory.”
To counter, some, like the Denver repatriation coordinator Roger Echo-Hawk, insist that tribal memories do indeed stretch to the dawn of time. Echo-Hawk questions the assumptions that make it possible to divide history from prehistory, finding in stories told aloud and passed from generation to generation traces of cultural memories that reach further into the past than we have heretofore imagined. Traditional stories may not look like good evidence to the scientifically trained plaintiffs, but NAGPRA allows the courts to consider oral traditions and sacred beliefs as evidence of cultural affiliation. Much more than a battle over bones, then, the case amounts to nothing less than a contest between two different ways of looking at the world, two different ways of thinking about facts and evidence.
But ultimately, such competing philosophies of culture may be less important to the Kennewick case than a simple matter of law that regulates how such discoveries are to be handled. Neither Chatters nor his friend the coroner bothered to contact the tribes who might be very interested in Kennewick Man: the Confederated Tribes of the Umatilla Indian Reservation, the Yakama Indian Nation, the Nez Percé, the Wanapum band, and the Confederated Tribes of the Colville Reservation, all still resident in eastern Washington. But partly with Chatters’s help, the find generated publicity. And the first printed report in the Tri-City Herald, the region’s local paper, prompted a representative of the Umatilla tribe to contact federal authorities. The law was clear. The skeleton was the property of the tribes.
With Native claimants in the picture, Chatters’s dream of professional good fortune evaporated. To be sure, Chatters had gotten a lot of publicity, but his luck started to turn. Much to Chatters’s dismay, agents from the Army Corps of Engineers arrived at his house and took the bones. The government then set about figuring out how to return the remains to the Native claimants–the Umatilla, the Nez Percé, the Yakama, the Wanapum, and the Colville–all of whom, whether by dint of history, tradition, culture, or geography, thought they might be related to the Ancient One. In September 1996, less than two months after the boat race, the Corps of Engineers published an official “Notice of Intent to Repatriate” Kennewick Man’s remains, as NAGPRA said they must. This time, it was not a contending tribal group that challenged repatriation, but a group of eight anthropologists, often identified as “eminent” or “prominent” in press reports. Under the lead of Robson Bonnichsen, head of the privately funded Center for the Study of First Americans at Oregon State University, the anthropologists sued to prevent repatriation. They contended, as Owsley would later suggest to Lesley Stahl, that the Kennewick remains were too old to belong to anyone in particular and therefore must belong to everyone. Furthermore, even if the man did have descendants among the tribes of the Columbia Plateau, the only way to find them would be through the kinds of genetic tests some of the tribal claimants sought to prevent.
It is tempting to turn the case of Kennewick Man either into a contest between selfless scientists and shortsighted Native Americans, or a struggle between selfish anthropologists and right-minded Native claimants. But both constructions risk simplifying matters that are considerably more complex. The anthropologists are perhaps right to question the assumption that we can easily trace a community’s ancestry far into the past, but they should not be surprised that many will dispute their contention that they speak from neutral ground and for all humanity. The checkered history of their discipline haunts the plaintiffs’ case. It is also apparent that their large claims about a noble and disinterested pursuit of truth serve the much smaller purposes of challenging NAGPRA. Antone Minthorn, Chairman of the Board of Trustees of the Confederated Tribes of the Umatilla Reservation, characterized the struggle over Kennewick Man as a struggle over the interpretation and application of NAGPRA. “It is not science versus religion,” Minthorn wrote, “it is science versus the law.”
While this custody fight, of sorts, continues in the U.S. District Court in Portland in the form of Bonnichsen, et al. v. U.S., the remains of Kennewick Man are locked away for safekeeping in the basement of the Thomas Burke Museum of Natural History and Culture. In January 2000, U.S. magistrate John Jelderks decided that Kennewick Man was a Native American, as defined by NAGPRA. Left open, however, was how the law governing Native American remains should apply in the case. To help determine the proper “cultural affiliation” for the remains, the court ordered the genetic tests the tribes objected to in the first place.
Labs are still struggling to produce results, but the Interior Department recently determined that the bones should be turned over to the tribes. According to Secretary Bruce Babbitt, geography and oral tradition establish “a reasonable link between these remains and present-day Indian tribe claimants.” “The oral tradition evidence,” he writes, “reveals that the claimant Indian tribes possess similar traditional histories that relate to the Columbia Plateau’s past landscape. The oral tradition evidence lacks any reference to a migration of people into or out of the Columbia Plateau.” Now that the Interior Department has weighed in, accepting as good evidence the kinds of information the plaintiff-scientists surely consider suspect, the case is back in federal court.
Regardless of whether Interior Department policy, a decision in the courts, or DNA testing ultimately seals the fate of Kennewick Man, the issues raised in the battle of the bones will continue to ripple outward. The case has spilled out of the courts and into the culture at large because it forces us to think about how we sort human beings. What does it mean to label Kennewick Man a Native American? How far back in time can we trace a cultural lineage? How far into the past can we extend our contemporary racial categories? On some of these issues, the language of NAGPRA is simple enough. The law states: “‘Native American’ means of, or relating to, a tribe, or culture that is indigenous to the United States.” In practice, any remains from the period before European contact are assumed to be Native American.
This assumption is precisely why the strange looking Kennewick Man seemed to offer such a good opportunity for discontented anthropologists to challenge the law. What if Kennewick Man was a European wanderer and not a Native American at all? What if Kennewick Man has no descendants? What if neither blood nor culture ties him to a contemporary tribe? Raising such complicated questions–questions that wreak havoc with NAGPRA’s neat categories–serves the interests of the plaintiffs. Who better to claim the odd orphan, their suit suggests, than the anthropologists, who have nothing but the best interests of science, and by extension of all humanity, at heart?
And this skeleton, the plaintiffs tell us, could be really important. In the court of public opinion, the plaintiffs bolster their case by churning up renewed curiosity about the peopling of the Americas. Most of us learned in grade school that near the end of the last ice age, around twelve thousand years ago, human beings followed game from Asia into the Americas, crossing the Bering Strait on a land bridge. But recent archaeological findings suggest human habitation far older than initially suspected. A Bering land bridge and an ice-free corridor down the center of the continent may have been available more than once, and humans may have migrated in waves.
Reconstructed head of Kennewick Man. Illustration by John McCoy.
From some newer studies an even more complicated picture of multiple coastal migrations seems to be emerging. According to the innocent-sounding organization, Friends of America’s Past, “Exciting new scientific theories about the peopling of the Americas are changing our understanding of the past.” Kennewick Man promises to add an “important piece to this puzzle.” Outraged, Friends of America’s Past continue, “This unique, nearly complete skeleton was almost reburied without any study.”
In fact, “Friends of America’s Past: A nonprofit organization dedicated to promoting and advancing the rights of scientists and the public to learn about America’s past,” is an organization dedicated to raising funds to help the plaintiffs in Bonnichsen, et al., v. U.S. pursue their challenge to NAGPRA. It is hardly surprising that on their own Website they cast themselves in heroic terms, as valiant Davids struggling against the Goliath of the federal government, a far better villain for their piece than the small tribes of eastern Washington. How ironic as well that for once the government is “for” the Indians and “against” the academics. “It’s time to stand up for the right to learn about the past,” the Friends of America’s Past write. “We all share the past–no one owns it. Imagine if a few people could decide by whom, when, and how evidence from the past can be studied. Is this the legacy we want to leave to future generations?” Of course it isn’t.
What’s more, in the plaintiffs’ scenario, the tribes are not only religious reactionaries, they are guarding ill-gotten privileges. And the mainstream media, particularly reporters covering the story for CBS’s 60 Minutes in October 1998, readily accepted the plaintiffs’ portrait of their opponents. The story 60 Minutes aired underlined the plaintiffs’ contentions: over images of Indian revival–a powwow and a casino–reporter Lesley Stahl repeated Chatters’s suspicion that the “tribes’ fight against further testing of Kennewick Man is based largely on fear, fear that if someone else was here before they were, their status as sovereign nations and all that comes with it–treaty rights and lucrative casinos, like this one on the Umatilla Reservation–could be at risk.”
Of course, as Stahl herself noted, “The Indians say that’s nonsense.” And the Indians are right: under the law, the claim that treaty rights and casinos depend on Native Americans’ status as “first peoples” is indeed nonsense. As the Umatillas’ Minthorn put it, “The outcome of this case has no legal bearing whatsoever on tribal treaties and tribal sovereignty.” His people, he explains, are not motivated by money, but rather by a desire to defend NAGPRA and by a strong belief that human remains deserve the dignity of permanent burial. Native beliefs about death, burial, and the afterlife are as diverse as the hundreds of cultural traditions they represent, but many tribes do maintain a particular respect for the remains of the dead. For once a law passed by the United States Congress gives the tribes an opportunity to act on their cultural beliefs.
Whatever the merits of the plaintiffs’ case, their position is tainted by their apparent kinship with generations of greedy whites who ignored laws and treaties when laws and treaties thwarted their plans. To back their legal challenge, the plaintiffs have resuscitated some troubling arguments: they hint that Kennewick Man may have been here before the ancestors of contemporary Native Americans. Scholars aired such ideas in the early 1800s, and something pernicious lingers in the assertion that Native Americans came late to the continent. The first generation of American archaeologists who tried to solve the mysteries of the Mound Builders decided that the architectural wonders of ancient America were built by people unrelated to the Indians they knew. Early nineteenth-century scholars imagined a sophisticated and peaceful people eliminated by the violent forebears of the Indians they now fought.
Taken to the extreme, images of battles among the ancients cast a righteous glow over white violence against Native peoples: nineteenth-century white settlers fought in the name of the late, vanquished Mound Builders–the true and rightful owners of the continent. In a milder form, such ideas perhaps eased whatever qualms of conscience came with conquest: when all was said and done, Indians had no better claims to their lands than the European and American usurpers who came after them. This legacy of disputes over treaty rights and first settlement complicates the simple neutrality that Owsley, Bonnichsen, and the rest of the Kennewick plaintiffs seem so ready to adopt. Perhaps in an ideal world, science is disinterested. Scientists, however, rarely are.
It is not hard to understand why the plaintiffs would like to work on the Kennewick remains. We have happened on only a handful of American bones this old, and scientists are sure Kennewick Man can answer many questions about ancient America. Any anthropologist who solves the riddles of the bones–riddles about everything from ancient diet to the origins of American society–will surely receive accolades from the profession and the public. The plaintiffs, who unfortunately missed the opportunities for collaboration and compromise that NAGPRA encourages, fear that repatriation means reburial. They suspect that once the bones of the Ancient One return to the tribes, whatever information they contain will be lost forever. Lost to the plaintiffs, perhaps. But who is to say that an anthropology reconceived in consideration of Native concerns will not extract plenty of information from old bones?
Yet on some level, I sympathize with the plaintiffs’ position. I acknowledge I too experienced a twinge of regret over the repatriation of a collection of Native American skulls, and not a one of them promised to shed light on anything so momentous as who first peopled the Americas. I was just surprised by how beautiful the skulls were and by how much the skulls seemed to tell us, not about the individuals they once belonged to, but about the nineteenth-century Americans who collected them. Last summer I began reading the correspondence of the American naturalist Samuel George Morton. I decided to track down the remnants of the skull collection he built in the 1830s and 1840s. By the time Morton died in 1851, the scientist had gathered in more than one thousand human crania, the bulk of them the skulls of Native Americans. Morton rarely left his native Philadelphia, but with the help of amateur naturalists and doctors in the frontier army, the shelves of his study filled with skulls from the battlefields of Florida and the American West and spoils from the burial grounds of tribes forced off ancestral lands. Morton measured his skulls, but he also cleaned, polished, varnished, and labeled them and then put them on display, inviting the public to visit his collection, free of charge, on Tuesdays and Saturdays.
On his death, Morton’s friends donated his collection to the Academy of Natural Sciences in Philadelphia. In 1893, the academy sent forty of the Native American skulls to Madrid as part of the official United States entry in the exposition commemorating the four-hundredth anniversary of Columbus’s discovery of the New World. Judges awarded Morton’s crania a silver medal. But when the skulls were returned to the United States, tastes had apparently changed, and academy curators decided not to put the bones back on display. The remnants of Morton’s collection eventually wound up in the storerooms of the University of Pennsylvania Museum of Archaeology and Anthropology. The old skulls are now shrouded in bubble-wrap and sealed in plastic containers, awaiting repatriation.
Morton’s collection lost its coherence more than a century ago, but I suspect the beauty of the skulls may have motivated him every bit as much as pure science. Morton collected all sorts of heads–not just the heads of humans, but the heads of birds, mammals, and apes. Viewed from one angle, his cranial collection prompted visitors to consider the small physical things that separated human beings from the rest of creation. While Morton presented his collection as a portrait of the continent’s past inhabitants, no doubt visitors saw in the bony forms, the hollow sockets, the toothy grins reminders of their certain future.
But Morton also sorted his skulls by race, measuring the cranial capacity of each of the great families of man. No surprise that white men like him turned out to have the best and biggest heads. His two ways of looking at skulls–as objects that represent humanity and as objects that represent specific races–approximate the two sides in the struggle over Kennewick Man. Should Kennewick Man be made to shed light on a common story of all humanity? Or should he be reserved for a chapter in the history of a particular group?
Morton’s collection, of course, has no direct bearing on the case of Kennewick Man. But when I began to look at the correspondence that accompanied the skulls to Philadelphia, I understood why Native communities feel so strongly about the principles of NAGPRA. Morton measured his skulls, hoping to discover in them an index of racial difference. For his empirical project to work, he needed to know who was who among his heads, and so each skull came with a pedigree of sorts, a provenance laid down by Morton’s friends in the field. Morton’s agents told him how they came by the heads they sent him. Describing their collecting, they detailed the violence behind the scientist’s tranquil speculations. One man collected two “fine” Seminole skulls left unburied after the battle of Lake Okee-Chobee. He apologized that “only two out of twelve killed . . . could be taken the others being very offensive.” A correspondent from Indiana promised to procure for Morton “the skulls of Chapodicac and Rushynble both eminent chiefs,” just “as soon as the Indians are removed from our neighborhood which will be this Fall.” And Morton’s memoirist described one man who “exposed his life robbing an Indian burial place in Oregon, and carried his spoils for two weeks in his pack in highly unsavory condition, and when discovery would have involved danger and probably death.”
Are the plaintiffs in Bonnichsen, et al. v. U.S. the direct descendants of Samuel Morton and his headhunting friends? Yes and no. On the one hand, Bonnichsen and his associates had nothing to do with Kennewick Man’s death. As the plaintiffs contend, his life so predates the disputes NAGPRA was designed to settle that it seems absurd to subject his remains to regulations devised to correct excesses of nineteenth- and twentieth-century science. But then again, Morton is a father of American physical anthropology. Perhaps in the Kennewick case, we witness a moment when the sins of the father are indeed visited on the sons. Little wonder that an atmosphere of hostility and mistrust surrounds the case.
The plaintiffs in the Kennewick case have used the media to create an artificially polarized situation, pitting enlightened professionals against narrow-minded reactionaries, Western science against Native mumbo jumbo. But, as I have tried to suggest, the plaintiffs’ claim to the moral and intellectual high ground is subject to question. So is their claim to speak for science and the unencumbered pursuit of truth.
For one thing, the plaintiffs simply do not represent the entire range of opinion in the community of “enlightened” professionals. Both Keith Kintigh, professor of archaeology at Arizona State University and president of the Society for American Archaeology, and David Hurst Thomas, curator of anthropology at the American Museum of Natural History in New York, have expressed support for NAGPRA. They see the law as an invitation to reinvent their profession, as an occasion to replace the racist arrogance that characterized old projects, like Morton’s, with cooperation and consultation. Thomas suggests we need to take a broad view of the issues involved in the Kennewick case. It may be easy, as the media have done, to tell this story as a struggle between disinterested professionals and unenlightened Indians, but “[u]ltimately,” Thomas writes, echoing sentiments expressed by Umatilla chairman Minthorn, “the Kennewick dispute is not a matter of science v. religion, or even Indians v. scientists. At its heart, the matter of the Kennewick skeleton involves political power and property rights.” To protect their rights–among them the right to dispose of their dead according to their own traditions, and the right to explore their history according to their own definitions–the tribes have taken a strong position in the dispute. For once, the law is on their side.
Mr. Owsley, then, may be right after all. The Kennewick case is about “American history.” But perhaps not quite in the way he intended. Our route to the ancient history of the continent is troubled by the history of the last five hundred years. Past relationships haunt the current dispute. Sometime this fall, Magistrate Jelderks will decide how best to dispose of the ancient remains. When the case is closed, Kennewick Man likely will have taught us nearly as much about who we are as a people as about who the people were who dwelled on the banks of the Columbia River nine thousand years ago.
Further Reading: A Note about Kennewick Man on and off the Web
Kennewick Man emerged from the mud of the Columbia River in 1996, only to be caught in the strands of the World Wide Web. To write this piece for Common-place I decided to play the student-researcher and try to reconstruct this story by following Kennewick Man all over the Web. Pursuing Kennewick Man, a sort of model citizen of cyberspace, made me dizzy: there seemed an endless supply of information, an infinite number of links. Consider that the search engine Google lists some 11,100 sites for Kennewick Man and another ninety-three for those inclined to call him Kenniwick Man. The contending parties in the suit (the university-based anthropologists, professional archaeologists, the tribes, the National Park Service, the museums, the newspapers and the networks) maintain Websites with pages devoted to Kennewick Man. The Burke Museum touts its celebrated skeleton on its home page. On your visit to the virtual museum you can even attend a symposium on Kennewick Man and listen to brief presentations by several of the players in the story.
For the local media, the story has also been a good thing. Tri-City Herald maintains a Kennewick Man Virtual Interpretive Center where you can review the paper’s coverage of the case and even sign up to receive by e-mail the breaking news on Kennewick Man. You can also read the story from the point of view of the Archaeological Institute of America. Or from “America’s Leading Indian News Source,” Indian Country. Or from the Journal of Indian Justice. By following the links offered by Friends of America’s Past, you can even give money to the plaintiffs.
When I found the Website that asked for money, I knew I was getting in over my head, and turned to the historian’s more traditional printed sources as an antidote. I needed a community of scholars to help sort out the voices on the Web. Last spring, I attended a conference at Harvard’s Peabody Museum on the tenth anniversary of NAGPRA and read an article by Scott L. Malcomson, “The Color of Bones: How a 9,000-year-old Skeleton Called Kennewick Man Sparked the Strangest Case of Racial Profiling Yet,” New York Times Magazine (April 2, 2000): 40-45, and a book by David Hurst Thomas, Skull Wars: Kennewick Man, Archaeology and the Battle for Native American Identity (New York, 2000). I also found useful Roger Echo-Hawk’s “Forging a New Ancient History for Native America,” in Native Americans and Archaeologists: Stepping Stones to a Common Ground, eds. Nina Swidler, Kurt E. Dongoske, Roger Anyon, and Alan S. Downer (Walnut Creek, Calif., 1997): 88-102.
Perhaps the best single source on Kennewick Man is Roger Downey’s recent Riddle of the Bones: Politics, Science, Race, and the Story of Kennewick Man (New York, 2000). Downey, a reporter who has been following the story for the “alternative” Seattle Weekly, sorted out the figures in the case, offering his explanations for their various positions. You can read some of Downey’s original columns online. But as the workings of the Web would lead us to expect, Downey has detractors in cyberspace. In his account, he cast Kennewick Man’s self-proclaimed Norse kinsmen, the Asatru Folk Assembly, as the New Age buffoons of the story. And not exactly as harmless buffoons, either. Although the followers of Odin dropped their claim to the remains of the Ancient One, Downey paused to note ties of some Asatru leaders to the Afrikaner Resistance Movement and the white supremacist Church of the Creator. The Odinists struck back, using Amazon’s readers’ forum to denounce Downey’s book as “blatent [sic] lies.” A reader signing herself “maryscats” tried to give the book no stars at all: Amazon, to her consternation, insisted she give it at least one.
I wish I could say that my pursuit of Kennewick Man on the Web turned me into an adept electronic researcher. Not quite. I confess I reverted to form and consumed nearly a ream of paper printing out the contents of the Websites I visited. In some cases this was useful (particularly with the reports on the Kennewick remains compiled by National Park Service archaeologist Francis P. McManamon) but in others hardly necessary. I was also left with the impression that even though this story is a relatively manageable one, I would never be able to visit every Website devoted to it or to assess every opinion expressed on it.
Poor Kennewick Man: nine thousand years of repose interrupted by such a lot of chatter.
This article originally appeared in issue 1.2 (January, 2001).
Ann Fabian teaches American studies and history at Rutgers University. Her most recent book is The Unvarnished Truth, a study of personal narratives. She is working on a new book about Samuel George Morton and his collection of skulls.
Publick Occurrences 2.0 February 2008
February 29, 2008
Seems Like Old Times, I: Nationalizing the state militias
One of the features I have planned for this blog is a series of items highlighting issues from the Early Republic that have come back or never gone away.
One of those issues is the drive to concentrate as much control as possible over the nation’s armed forces in the federal government and its military leadership. A perennial sticking point in this drive has been what used to be called the state militias, known in modern times (speaking broadly) as the Reserves and the National Guard. As both military officers and civilian officials, George Washington and Alexander Hamilton were famously dissatisfied with their dependence on poorly trained and equipped militia troops, questioning the citizen-soldiers’ ability to stand and fight against regular troops, and, just as importantly, doubting their reliability when called upon to apply force to their fellow citizens in times of domestic unrest.
During the French war scare of the late 1790s, the Federalist Congress authorized President John Adams to call out 80,000 militiamen and create a 10,000-man Provisional Army in case of a declaration of war or foreign invasion. Nothing was ever done with this authority except the appointment of a few officers. Instead, Adams, Hamilton, and other Federalists struggled to create (with different agendas) a sizable Additional Army that, along with volunteer units who paid for themselves, would be usable “at the President’s discretion” whether there was a war or invasion or not. [The clearest explanation I have ever found of these matters is: William J. Murphy Jr., “John Adams: The Politics of the Additional Army, 1798-1800,” New England Quarterly 52 (1979): 234-249.]
Admittedly I found the story several weeks ago, but I find it interesting that, more than two centuries later, when Reserve and National Guard units have been deployed overseas for years at a time and on a regular basis, the Pentagon feels that it still does not have enough control of state troops and also wants a greater role in policing what I guess we now have to call the “homeland.”
Pentagon control over Reserves, Guard proposed
By Philip Dine
POST-DISPATCH WASHINGTON BUREAU
Friday, Feb. 01 2008
WASHINGTON — More than six years after the terrorist attacks of Sept. 11, 2001, the nation’s plans for meeting the threats to the homeland are so thin they could be written “on the back of an envelope,” the chairman of a national military commission said Thursday. While the country has detailed contingency options for military action overseas, the capacity for responding to a terrorist attack or natural disaster within the United States is dangerously low, retired Maj. Gen. Arnold Punaro, chairman of the Commission on the National Guard and Reserves, said Thursday.
“You couldn’t move a Girl Scout unit” with the amount of planning federal officials are doing for domestic contingencies, he said, likening it to a disorganized “sandlot game.”
“You cannot do that in dealing with weapons of mass destruction,” Punaro said.
Among the shortfalls are a lack of equipment for the National Guard, with Missouri and Illinois particularly hard hit in some categories, according to the commission’s report released Thursday.
The panel called for a drastic overhaul of the military structure that would put the National Guard and Reserves under the direct control of the Army and Air Force and essentially integrate the nation’s “citizen-soldiers” into the military structure. The plan would include integrated training, pay, promotions, medical care and retirement — and improved resources and equipment.
Meanwhile, the Pentagon would be put in charge of homeland security, which would be carried out by the Guard and Reserves.
Those changes are necessary both to meet homeland security shortfalls and to allow the over-extended military to focus on overseas missions, commissioners said. Many can be implemented by the Pentagon while some require legislation by Congress.
The Guard’s current status made sense during the Cold War when it was “designed as a reserve force to be dusted off once in a lifetime,” but no longer when reservists are being used as a wing of the military, Punaro said. The current problems are heightened by the personnel limitations of an all-volunteer military, he said.
The commission, which was authorized by Congress, found that the only other alternative for dealing with a stretched-thin military — increasing the size of the active-duty component — is prohibitively expensive.
Adding the 600,000 active-duty soldiers that would be required for current needs would cost more than a trillion dollars, Punaro said. Beyond that, the military couldn’t recruit enough people to meet that target, he said.
Not only are there enough reserve forces to take over homeland security, they are highly skilled and are already in the states and cities, he said.
February 28, 2008
Barack Hussein Obama: Inheritor of an All-American Semitic Naming Tradition
Juan Cole had a great post yesterday, in part explaining the Semitic origins of the names of a long list of presidents and other American heroes. Here is one of the key passages:
Let us take Benjamin Franklin. His first name is from the Hebrew Bin Yamin, the son of the Right (hand), or son of strength, or the son of the South (yamin or right has lots of connotations). The “Bin” means “son of,” just as in modern colloquial Arabic. Bin Yamin Franklin is not a dishonorable name because of its Semitic root. By the way, there are lots of Muslims named Bin Yamin.
As for an American president bearing a name derived from a Semitic language, that is hardly unprecedented.
John Adams really only had Semitic names. His first name is from the Hebrew Yochanan, or gift of God, which became Johan and then John. (In German and in medieval English, “y” is represented by “j” but was originally pronounced “y”.) Adams is from the biblical Adam, which also just means “human being.” In Arabic, one way of saying “human being” is “Bani Adam,” the children of men.
Thomas Jefferson’s first name is from the Aramaic Tuma, meaning “twin.” Aramaic is a Semitic language spoken by Jesus, which is related to Hebrew and Arabic. In Arabic twin is tau’am, so you can see the similarity.
James Madison, James Monroe and James Polk all had a Semitic first name, derived from the Hebrew Ya’aqov or Jacob, which is Ya`qub in Arabic. It became Iacobus in Latin, then was corrupted to Iacomus, and from there became James in English.
Zachary Taylor’s first name is from the Hebrew Zachariah, which means “the Lord has remembered.”
Abraham Lincoln, of course, is named for the patriarch Abraham, from the Semitic word for father, Ab, and the word for “multitude,” raham. Abu, “father of,” is a common element in Arab names today.
So, Mr. Cunningham, Barack Hussein Obama fits right in this list of presidents with Semitic names. In fact, we haven’t had one for a while. We are due for another one.
February 27, 2008
Scary Pictures
An outbreak of anti-Obama Muslim-baiting flared up recently after a controversy over the Clinton campaign’s alleged “circulation” of an old photo of Barack trying on traditional Somali garb on a congressional trip, in the dress-up routine innumerable politicians have been subjected to over the years. That controversy, and some likely Clintonian pressure, led even respectable mainstream media sources Tim Russert and CNN to make a number of crude, witless efforts to link Obama with hated Muslim or Muslim-ish figures like Louis Farrakhan and the dictator of Libya, the latter of whom is actually chummier with the Bush administration than Obama. In what I expect to be a campaign comedy cliché by week’s end, Hillary Clinton and America’s Toughest Dumb Journalist made a big stink at the last Democratic debate about trying to pin Obama down on the literally meaningless question of whether he would merely “denounce” Farrakhan (the weaker option, apparently) or “reject” his support, whatever that means. I am picturing Obama campaign operatives waiting in ambush at Farrakhan’s Chicago polling place to make sure their candidate doesn’t receive his support.
Of course, as an early American political historian, I was reminded by this use of an image to “other” an opposition political figure in the eyes of religious and patriotic voters of the famous Jefferson cartoon from the election of 1800, one of the very few cartoons from that era. Jefferson’s only suspect article of dress is a cloak — to hide his shame, no doubt — but the effect is pretty similar, the candidate’s true, anti-American self revealed. There was also quite a lot of Federalist verbiage along the lines of calling on Jefferson to denounce and reject the French Revolution, Thomas Paine, the Enlightenment, the opposition press, and many of his own writings. I can picture Tim Russert reading Jefferson’s infamous letter to Philip Mazzei aloud during an Adams-Jefferson debate and truculently demanding what Jefferson could do to reassure English-Americans that he did not consider their mothers harlots.
February 21, 2008
The Evitability of the Inevitability Strategy
The primary campaign is by no means over, but the media and the blogosphere have now realized that the unstoppable Hillary Clinton juggernaut they have been building, image-wise, for these past 3 years is, in fact, eminently stoppable. The inevitable Democratic nominee is now one more big loss away from having to get the nomination Corrupt Bargain-style, and/or risk digging herself an even deeper hole by breaking out the racial codewords again. That seemed to work short-term in New Hampshire but also galvanized black primary voters down south behind Barack Obama and turned the race around. Ezra Klein of The American Prospect has one of the better recent commentaries on Hillary’s troubles, “The Underperformer.”
We historians know that Olympian historical contextualization of everyone else’s opinions is a sure way to alienate friends and family, so I say, keep it on the blog. To wit:
As historians could have told Hillary, and the media, “inevitability” is about the most evitable thing in politics. Has the “inevitability strategy” ever worked? Let’s ask the long line of prohibitive front-runners whose proud ships ran immediately aground as soon as actual voters were sighted: Ed Muskie, Nelson Rockefeller, Mitt Romney’s dad, the list could go on and on. I remember when John Connally and Howard Baker were big presidential names. Incumbent presidents have gotten the nomination through inevitability, only to have it flop in the general election. Remember Carter and Bush I’s Rose Garden strategies?
Inevitability may have worked occasionally in the Early Republic, for John Adams in 1796 and James Madison in 1808, but that was before such a thing as a nomination process was even invented. Alexander Hamilton’s plan of swapping Adams for a Pinckney might have done the job if there had been a Federalist Super Tuesday in 1796 or 1800. De Witt Clinton might have given Madison quite a shock if he could have taken him on in a Pennsylvania or Massachusetts primary. Congressional caucus nominations meant never having to burst the Beltway bubble, if I may be permitted one final anachronism, er, counterfactual.
Back here in the modern world, when will the media learn that those early poll numbers measure nothing but name recognition? For the vast majority of citizens who do not follow politics closely, telling a pollster that they supported Hillary Clinton for president 1 or 2 years before the election was more akin to saying yes, they had heard that the most famous woman in America (non pop-star category) was running for president against that Jock Edwardson — the haircut guy — and noted Irish revolutionary or Muslim poet Brock O’Bama.
Once the identities of everyone else in the race came into focus, Hillary Clinton’s weaknesses as a candidate did likewise: she was a deeply polarizing figure who brought along most of her husband’s baggage — especially his penchant for calculating triangulation — and little of his charisma; she was on the wrong side of the issue that Democratic primary voters cared most about, the war; and her track record of “proven leadership” began with mismanaging the only real chance at national health care the U.S. has had in my adult life. In addition, she just has not run a very effective campaign. How could Clinton possibly have been such a towering figure in the Democratic party for as long as she has and still not have state organizations strong enough to do well in caucuses and navigate the delegate selection rules? Like most inevitable front-runners, she took the DC-centric view that fundraising and press coverage were more important, and waited for the electoral tides to come in. Oops.
February 20, 2008
Another dog in the hunt
Turns out there is yet another Lewis, Clark & Dog monument in the St. Louis area, this one down on the waterfront by the Arch (see below). It appears not to be as big, or as accessible, but just as canine-centered as the one in suburban St. Charles. The earlier post’s comments included some worthwhile links for those who want to explore the history of pets. One of the commenters also pointed out that Franklin D. Roosevelt’s dog Fala appears in one of the scenes at the FDR monument in Washington. Unfortunately, unless the D.C. representation of Fala is approximately the size of a bear, which I do not remember, it is no match for the Seaman of St. Charles.
February 19, 2008
Wisconsin Primary Cheesehead Special: Andrew Jackson’s even more mammoth cheese
In honor of the primary election being held today in America’s Dairyland, I offer a fromage-related item that recently came to my attention. (Sadly this post does not actually mention Wisconsin.) As many readers of the blog will know, I wrote an article a few years back on the Mammoth Cheese presented to Thomas Jefferson by the dairy farming Baptists of Cheshire, Massachusetts, in 1802. (It was published as a chapter in the Beyond the Founders collection I co-edited with David Waldstreicher and Andrew W. Robertson, but seemingly read by far more people in the earlier version posted on my web site.) One of those readers, Loyola College student Erin Bacon, wrote last week with news that I had missed the biggest cheese of all, a 1400-pound specimen that is apparently common knowledge among residents of Oswego County, New York. I had mentioned a 100-pound cheese sent to Andrew Jackson by a Cheshire couple, but Ms. Bacon’s “local pride” impelled her to inform me of her hometown’s far more imposing tribute, a dairy product that was indeed as giant as Old Hickory’s self-regard. She sent a link to an old Oswego County history available online. Here is the account from the 1895 Landmarks of Oswego County:
Dairying, and especially cheese-making, had become an important industry, particularly in the south part of the town [Sandy Creek, NY] in the Meacham neighborhood. In 1835 it made the locality famous. Col. Thomas S. Meacham was a man of enthusiastic temperament and fond of remarkable things, and in that year he conceived the idea of making a mammoth cheese as a gift for President Jackson. He had 150 cows, and for five days their milk was turned into curd and piled into an immense cheese-hoop and press constructed for the purpose. The cheese weighed half a ton, but was not large enough, so the colonel enlarged his hoop and correspondingly enlarged the cheese until it tipped the scales at 1,400 pounds. It was then started on its journey to Washington. Forty-eight gray horses drew the wagon on which it rested to Port Ontario, whence it was shipped November 15, 1835, the boat moving away amid the firing of cannon and the cheering of the people. Colonel Meacham accompanied it. It was conveyed by water by way of Oswego, Syracuse, Albany, and New York, and along the entire route its projector was given a series of ovations. Reaching Washington the huge cheese was formally presented to the President of the United States in the name of the “governor and people of the State of New York.” In return General Jackson presented Colonel Meacham with a dozen bottles of wine. The mammoth production was kept until February 22, 1836, when the President invited all the people in the capital to eat cheese. The scene is thus described by an eye-witness:
This is Washington’s birthday. The President, the departments, the Senate, and we, the people, have celebrated it by eating a big cheese! The President’s house was thrown open. The multitude swarmed in. The Senate of the United States adjourned. The representatives of the various departments turned out. Representatives in squadrons left the capitol – and all for the purpose of eating cheese! Mr. Van Buren was there to eat cheese. Mr. Webster was there to eat cheese. Mr. Woodbury, Colonel Benton, Mr. Dickerson, and the gallant Colonel Trowbridge were eating cheese. The court, the fashion, the beauty of Washington, were all eating cheese. Officers in Washington, foreign representatives in stars and garters, gay, joyous, dashing, and gorgeous women, in all the pride and panoply and pomp of wealth, were there eating cheese. It was cheese, cheese, cheese. Streams of cheese were going up in the avenue in everybody’s fists. Balls of cheese were in a hundred pockets. Every handkerchief smelt of cheese. The whole atmosphere for half a mile around was infected with cheese.
Colonel Meacham also sent a cheese to Vice President Van Buren, another to Gov. William L. Marcy of Albany, a third to the mayor of New York, and a fourth to the mayor of Rochester, each weighing 700 pounds. In return he received from the latter a huge barrel of flour containing ten ordinary barrels.
My Mammoth Cheese article made no pretensions to cataloging every single instance of presidential food tributes, but I will say that this Super-Mammoth Jacksonian Cheese makes one of my points fairly well. Thomas Jefferson’s cheese was a homely salute from a whole community, and it had a political message — New England Baptists’ support for Jefferson’s free-thinking, tolerant approach to religious freedom and many common Americans’ excitement at what promised to be a more democratic era. Jefferson’s Federalist opponents, still clinging to power in many places, sneered at the gesture and turned up their noses at the cheese. (It did smell.)
On the other hand, at least from the account above, Jackson’s cheese was something of an advertising stunt*, and only political in the sense of being the then-existing political establishment’s tribute to itself. A hard-charging local entrepreneur conceived the idea, and Whigs and Democrats and all of Washington society embraced it. Like most of the political festivities of the mid-19th century (as opposed to the earlier period), the Jacksonian Mammoth Cheese was bigger chiefly in the amount of money and ballyhoo that went into it.
*I wonder if the writer of the children’s book I complained about in the article, A Big Cheese for the White House, conflated the two cheeses. In that story, it was the original Mammoth Cheese that was an advertising stunt.
Superdelegates team up against the Spectre (of a brokered convention)
Following up on the earlier “brokered convention” post, I noticed that at least some of the key “superdelegates” (Democratic officeholders who can vote for any candidate they please) seem to share Ed Kilgore’s fears of a convention fight. Rep. Charles Rangel and Sen. Charles Schumer were both quoted warning the Hillary Clinton forces against relying on superdelegates or parliamentary maneuvers (like the seating of delegates selected in the non-sanctioned Michigan and Florida primaries) to take the nomination away from Barack Obama at the convention:
“It’s the people (who are) going to govern who selects our next candidate and not super delegates,” Rangel said Sunday night at a dinner for the New York State Association of Black and Puerto Rican Legislators conference in Albany, N.Y.
“The people’s will is what’s going to prevail at the convention and not people who decide what the people’s will is,” he added.
This is a better argument than the fear of a chaotic convention projecting a bad image for the party.
The idea that the party’s decision should reflect “the people’s will” can be traced back to Jacksonian and “Old School” complaints about the use of congressional and legislative caucus nominations in the 1810s and 1820s. The Democrats adopted the delegate convention system partly in response to the outrage over the denial of the nomination and the presidency in 1824 to alleged popular favorite Andrew Jackson. The 1824 front-runner, William Crawford, was nominated by the Democratic-Republican congressional caucus despite the fact that the candidate was medically incapacitated and the competing candidates’ supporters boycotted the caucus. Party mastermind Martin Van Buren and other Crawford supporters then discovered firsthand how perilous and self-defeating it was to seize the nomination for their favorite when the perception existed that a majority of the party did not support him.
You have to admit that it would be fun if the first black presidential nominee ends up owing his nomination partly to Jacksonian arguments.
February 15, 2008
Nicolas Cage, National Menace
Most people, even most historians, do not seem to share my degree of detestation for the National Treasure movies starring the once-noted thespian Nicolas Cage. The first one finally caught up with me on a trans-Atlantic flight a couple of years ago. Now, as readers of this blog will discover to their peril, I am not remotely averse to popular culture, even when it takes some liberties with early American history. Cotton Mather as a super-villain? No problem! Who doesn’t love a mind-controlling Puritan with a fire-shooting cross? I can tell you from personal experience that reading those comics as a kid only made it more interesting to learn who Mather really was when I got older.
But National Treasure? Holy hand grenades, Batman, that thing combines the egregious dumbness of most pop-culture historical window dressing with an earnest “love” of history as treasure hunt and Chamber of Cool Secrets that resonates unfortunately well with the version of actual history sold on basic cable and the sale table books at Barnes & Noble. It also feeds into the public tendency to treat great historical documents like holy relics with special powers rather than texts with content and context. The real Declaration of Independence, with its complicated backstory and sometimes paradoxical legacy, can’t compete with the one that has a treasure map on the back or might be worth somethin’ if you find a copy in Granddad’s old truck. Actual anecdote there.
I understand that even most average moviegoers do not consciously take a dopey thriller like National Treasure seriously, but I also know that people do pick up information from sources like that, especially if they know little or nothing else about the subject. Hollywood should consider how wide an array of topics that statement covers. In National Treasure (and The Da Vinci Code), a wide international audience was exposed to the basics of a number of legends and conspiracy theories that had previously been limited to devotees of paranoid pseudo-knowledge, and the scholars who follow them. Now I see that the federal government has actually found it necessary to put on an exhibit correcting the misinformation on the Great Seal of the United States, a.k.a. the brand of masonic slavery, that the Cage vehicles have spread.
Were the brain cells Nic destroyed with Con Air and Ghost Rider not enough? Did he have to go after our history as well?
UPDATE: No sooner did I finish writing the post above last Friday than I got in a car and heard a cutesy NPR piece on the same exhibit. While the reporter took one of those audibly, breathily grinning tones that drive me nuts, and while she let the curator debunk the Great Seal myths, the story begins with the voice of Christopher Plummer (from the movie) reciting the myths in much more dramatic and convincing form.
This article originally appeared in issue 8.3 (April, 2008).
Jeffrey L. Pasley is associate professor of history at the University of Missouri and the author of “The Tyranny of Printers”: Newspaper Politics in the Early American Republic (2001), along with numerous articles and book chapters, most recently the entry on Philip Freneau in Greil Marcus’s forthcoming New Literary History of America. He is currently completing a book on the presidential election of 1796 for the University Press of Kansas and also writes the blog Publick Occurrences 2.0 for some Website called Common-place.
To Our Readers
The American Antiquarian Society (AAS) is very proud of Common-place and the contributions it makes to scholarship and a broader public. We believe it is the premier on-line journal of early American history and culture and we are convinced that its editors, beginning with Jane Kamensky and Jill Lepore, and continuing with Edward Gray, Cathy Kelly, Anna Mae Duane, and Walt Woodward, are chiefly responsible. We remain deeply indebted to Jane and Jill for their inspiring conception and leadership of the journal from its formation seventeen years ago. Truly Common-place has served as a common forum between the academy and the general public – illuminating, educating, and engaging a wide variety of people in many aspects of America’s past.
Over the next twelve months the American Antiquarian Society is undertaking a comprehensive evaluation of all our programs so as to create a strategic plan that will provide a fresh, forward-looking road map for how we will engage and serve all our constituents in the coming decade. To do so, we are including Common-place in our assessment of the current state of electronic publishing and the Society’s strengths and capabilities.
Here is the current plan: Anna Mae and Walt Woodward will remain as editors through Volume 18, Issue 2, to appear in the spring of 2018. That will be followed by two special issues in the summer and fall of 2018. As of October 2018, the strategic plan including Common-place will be completed and it is our expectation that a plan for the journal’s direction and control will be in place.
We encourage everyone to join us in this examination of Common-place. Please contact James David Moran, AAS vice president for programs and outreach, at jmoran@mwa.org with your thoughts about Common-place and the Society’s program evaluation and planning. To all of you who have created the Common-place community, we extend our deepest appreciation and thanks; your help going forward will, we trust, make the AAS an even more productive contributor to the understanding of American history and culture.
In the final years of the seventeenth century, an Italian adventurer going by the name of Giovanni Francesco Gemelli Careri became one of the first Europeans to complete a solo tour of the world using public transportation. This pioneer of global tourism traveled without any other apparent purpose than to hang around exotic places, chatting with strangers and checking out the views. In order to finance his extravagant lifestyle, Gemelli Careri bribed customs officials all over the world, smuggling valuable merchandise from one country to sell it in the next, while charging people for his services as an unlicensed healer. After returning to Europe, the adventurer published an account of his voyage under the title Giro del Mondo. This book, which was translated into English and French, became an instant bestseller, and today constitutes an invaluable source for those interested in the cultural history of travel literature as a genre. Moreover, the fact that Gemelli Careri joined the Spanish treasure fleet for almost two thirds of his trip gives the modern historian an opportunity to see this transportation system at work through his detailed account.
The history of the treasure fleet is one of imperial crossroads. After obtaining from the Vatican some exclusive but questionable rights over most of the territories newly discovered on the other side of the Atlantic, Spain entered the sixteenth century as a nominal superpower challenged by the hemispheric scale of its ambitions. Fabulous mountains of gold and silver, the two pillars of the European monetary economy, were apparently up for grabs in America. For many a British, French, or Dutch open-minded entrepreneur, that proved to be too much of a temptation. Soon enough, the Spanish Crown would need an enormous amount of ingenuity and resources to keep such competitors out of the profitable pilfering operation mounted by a handful of conquistadors in the Indies.
At the time, crossing the Atlantic in small overloaded ships was still a tricky business that required both excellent navigation skills and favorable meteorological circumstances. With the additional threat of armadas and independent looters, keeping the American riches flowing into Spain became a very complex problem. An obvious solution was found in the form of larger vessels sailing in convoy, during the most favorable months of every year, under the supervision of nautical experts and the protection of powerful warships. By the end of the century this basic idea had evolved into a highly synchronized scheme involving several fleets, mule trains, and caravans of Indian porters linking dozens of mines, garrisons, shipyards, markets, taverns, and seaports from Manila to Seville or Cadiz. The Spanish Flota de Indias became more than just an imperial hauling service, spanning two wide oceans, an isthmus covered by nearly impenetrable rainforests and a labyrinth of pirate-infested islands. For almost three hundred years, this system allowed an unprecedented circulation of exotic products, ideas, and people between Asia, Europe, and America, in what could be considered the first global communication network.
Strategically situated on the northern coast of Cuba and facing the Gulf Stream, the port of Havana became a critical node in this web. It was the final meeting point for all the ships of the Spanish treasure fleet, right before the last and most dangerous leg of their long trip back to Europe. With an ample harbor, an excellent shipyard, and a fertile hinterland, Havana constituted the perfect place for refreshment and repairs. The fleet stayed there several months at a time, waiting for clear horizons, superior orders, or propitious winds. The idle crews of the galleons mingled with the locals. An exchange of forbidden merchandise, alternative ideologies, musical rhythms, and bodily fluids took place every year. Naval officials and illustrious passengers from faraway lands visited with the most distinguished among the Creole elites, while humble sailors gambled their wages against the meager savings of African slaves employed as harbor workers. Culinary innovations, creative architectural arrangements, and flexible forms of prostitution were developed to accommodate thousands of newcomers. Daily life in Havana was determined by the seasonal presence of the fleet, with commodity prices, moral standards, and levels of hygienic tolerance changing with the sharp fluctuations of a service-oriented economy.
Gemelli Careri started his trip in 1693, with a visit to Egypt, Constantinople, and the Holy Land. At the time, this Middle Eastern routine was already becoming a standard ingredient of any excursion into foreign lands, a hike that was almost not worth writing home about. However, from there on our Italian “tourist” would take less traveled routes. After crossing Persia and Armenia, he visited Southern India and entered China, where the Jesuit missionaries assumed that such an unusual Italian visitor ought to be some sort of spy working for the pope. This fortuitous misunderstanding opened for Gemelli Careri many of the most tightly closed doors of the country. He got to visit the emperor in Beijing, attended the Lantern Festival celebrations, and toured the Great Wall.
From Macao, our globetrotter sailed to the Philippines, where he stayed two months while waiting for the departure of the Manila galleon. As Gemelli Careri described it in his journal, the half-year-long transoceanic trip to Acapulco was a nightmare plagued with bad food, epidemic outbreaks, and the occasional storm. In Mexico, our Italian traveler became a celebrity by the simple expedient of telling his anecdotes over and over to the local aristocrats. His insatiable curiosity would take him beyond the capital, visiting several mining towns and the ancient ruins of Teotihuacan. After five years of wandering around the world, Gemelli Careri was finally on his way back to Europe when he joined the treasure fleet in Cuba.
The ship that brought him from Mexico entered Havana on December 28, 1697, after two weeks of rough sailing. It took a couple of days for the passengers to disembark and find lodging. Apparently, that New Year’s Eve was not one of the most animated in the history of the town. The next morning, however, the governor threw a little party to celebrate the election of new city hall officials, with the bishop joining them right after mass. All through that first week, Gemelli Careri visited every possible convent, church, and chapel in the city, observing the holiday liturgy with customary fervor. Soon, he would get to know the neighborhood very well. For such a seasoned traveler, spending a couple of months in Havana doing nothing was obviously too much. That is why his account of this uneventful vacation starts with enthusiastic descriptions of nocturnal soirees and hunting excursions to the outskirts of the city but ends with an impatient countdown of the tedious days it took the fleet to get ready for departure.
In the imagination of a Dutch engraver, seventeenth-century Havana appears to be a city of stylized churches and oriental domes. All that naval action in the foreground contributes to the exotic atmosphere of the image. The chain at the entrance of the bay is clearly visible. From John Ogilby, America: Being the Latest, and More Accurate Description of the New World (London, 1671), 332-33. Courtesy of the Department of Special Collections, Charles E. Young Research Library, UCLA.
During the seventeenth century, Havana was little more than a fortified village with cosmopolitan pretenses. Founded in 1519, it didn’t become the official capital of Cuba until 1607. From then on, its population started to grow faster and faster. Five hundred families were accounted for in 1608. Eighty years later, they amounted to three thousand, a sixfold increase. Almost twenty-seven thousand people lived in Havana when Gemelli Careri visited the town, and they represented more than half of the total population of the island. One third of the urban dwellers were considered to be of African descent, but only a handful of them were free from slavery.
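For readers inclined to check such figures, the compound annual rate implied by those two household counts is easy to work out. The sketch below is only a back-of-the-envelope calculation under the assumption of steady compound growth between the counts; the function name is illustrative, not from any source.

```python
def annual_growth_rate(initial, final, years):
    """Compound annual growth rate implied by two counts taken `years` apart."""
    return (final / initial) ** (1 / years) - 1

# Havana households: 500 counted in 1608, about 3,000 by 1688 (a sixfold rise).
rate = annual_growth_rate(500, 3000, 1688 - 1608)
print(f"{rate:.1%}")  # about 2.3% per year, sustained over eight decades
```

Roughly 2 percent a year may sound modest, but compounded over eighty years it is enough to multiply a town's population sixfold.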
Prejudices of the Spanish Crown against any foreign force (from Huguenots to hurricanes) had endowed the city with a cute lighthouse, three solid castles, and a walled perimeter. A floating wooden chain reinforced with iron hardware was extended across the narrow entrance to the harbor, while a formidable battery of cannons humorously nicknamed the Twelve Apostles covered the same channel. At the time, this integration of religious allegory and modern ballistics was not unusual. In fact, protection for Havana was provided by the best combination of heaven’s favor and civil engineering that any system of sanctimonious prayers and imperial subsidies could buy. Of course, while churches and fortresses competed for a visual monopoly of the modest skyline, dark convents and overcrowded garrisons became active centers of political rivalry. Apparently, Havana was a town of petty intrigues, as famous for its cigars as for the inevitable gossip that accompanied their consumption.
On the other hand, its role as a major port on the route of the treasure fleet helped inflate the city’s defense budget. In order to pay for the maintenance of the fortifications and the salaries of the troops, Spain diverted considerable sums to Havana in the form of Mexican silver and gold. This money fueled the city’s economy for many years, because it was spent mostly to cover the costs of local labor, supplies, and services. However, as often happens with subsidies, the Mexican situados had an ambiguous impact on the society as a whole, creating a dependence on easy financing for the development of public works. Private contractors were overpaid for second-rate jobs done with the inefficiency that is usual in such cases. This combination of paternalism and sloppiness, marked by a trail of unfinished buildings and chronic deficits, would become an essential ingredient of Cuban municipal culture.
The social center of Havana was its harbor. Something interesting happened there almost every day. The movement of royal, private, and clandestine freight was equally heavy. Dozens of merchants from everywhere in the Caribbean converged on the city during the winter months, despite severe rules limiting access to strategic information about the schedule of the treasure fleet. While the inflated cost of living in Havana brought many providers of basic staples, hundreds of travelers and sailors with an appetite for American souvenirs attracted to the city other colorful characters trafficking in exotic items. According to Gemelli Careri, a handful of half-naked Indians with long braided hair came from the Florida Keys in a small vessel at the beginning of January. These people were allowed to trade with the galleons in exchange for welcoming fourteen Franciscan missionaries on their land. Every year they brought to Havana hundreds of caged cardinals that were immensely popular among the fleet’s passengers. In the following weeks, several other boats loaded with cages came from Florida. At the moment of departure, our witness affirmed, more than eighteen thousand pesos fuertes in North American live birds were on board the galleons.
Being a merchant himself, Gemelli Careri was impressed by the enormous value of the fleet’s cargo. In a complicated operation that took several weeks, thirty million pesos in gold and silver were carried from La Fuerza castle to the galleons anchored nearby. In the meantime, a controversy ensued between some port functionaries and the captains of the ships. According to the local experts, the safety of the entire fleet would have been compromised had the galleons not been loaded all the way: any combination of faster and slower ships would amount to a very inefficient convoy. After a weekend of intense negotiations involving all interested parties, the general in command of the fleet agreed to increase the cargo levels in the name of the king. Apparently, similar maneuvers were concocted in the harbor almost every year. They were part of a continuous strategy to circumvent the standard regulations established by the Spanish Consejo de Indias, which limited the volume and nature of the commodities transported from Havana to Cadiz.
On February 1, a ship with provisions for the fleet arrived from Spain. Six hundred sacks of flour and roughly the same number of biscuits were distributed among the galleons. In a solemn procession, a blessed image of the Virgin Mary was carried from the cathedral to the general’s ship, while cannons and muskets were discharged in its honor. Apparently, everything was ready for departure, but nothing happened for weeks. Leaving Havana was a complicated decision to make. It involved the evaluation of nautical, meteorological, and geopolitical variables: a sort of gamble with too many careers and lives on the line. One Sunday in mid-February, a public crier blowing the horn used on such official occasions made clear that the fleet would eventually leave Havana . . . in a month or so. Another ship loaded with wheat, fresh fruit, and wine came from the Canary Islands on February 17. From that day on, local authorities made sure that nobody was able to leave the harbor until the final departure of the galleons. With so much money on board, it was important to keep a reasonable level of secrecy regarding the movements of the fleet.
At the beginning of March things started to move a bit faster. Sailors were paid their full salaries in order to encourage the general performance of the crews, but many of them deserted soon afterward. This was another chronic problem faced by the high command of the convoy. Many young men from the Spanish countryside joined the treasure fleet as a way of getting an easy ticket to America, and Cuba was already becoming a popular destination. Ship officials reacted to this phenomenon by recruiting locals lured with the prospect of European adventures. In some extreme cases, youngsters were practically kidnapped to complete the crews and slaves were “confiscated” in the name of the king to do most of the hard work on the ships.
On the evening of March 12, after enjoying his last dinner in town, Gemelli Careri embarked on the galleon El Gobierno. It was difficult for him to get any rest at all, because of the noise made by a hundred pigs that were accommodated in the ship that night as part of a living cargo that also included several cows and lambs. The laborious process of sailing out of the harbor started early next morning, but the first ship in the long convoy didn’t cross the entrance until sunset. Every galleon saluted each fortress with six or seven salvoes that were dutifully reciprocated. It was, all in all, another noisy night. The next day, a formidable storm hit the fleet just off Havana. One of the largest vessels was lost, and it took several days for the convoy to recover from the shock. In the meantime, a woman dressed as a man was found in one of the galleons. According to Gemelli Careri, the captain allowed her to stay with the rest of the female passengers only because it was already too late to take her back to Cuba. On its way to Europe, the fleet had passed the point of no return.
Every city is an urban palimpsest, a used parchment covered with the fragmentary scrawls of its own past. That is probably why some of the institutions and values generated by three centuries of providing services as a transatlantic center of commerce and entertainment are still visible in Havana today. It has the appearance of a colonial city falling apart under the rigors of tropical socialism, but an entire urban fabric of remarkable flexibility is still alive and well behind this mask. Almost every Atlantic trait has been woven into the local culture, and despite four decades of self-imposed isolation, Havana is still an open city. The chain at the entrance of the harbor is gone. In front of the Morro Castle, the little fortress of La Punta is today a museum of maritime history. There, two old keys made of gold and silver are exhibited in a window as part of the treasure of historical information salvaged from a local shipwreck. Like any other key, these were designed to keep doors closed but also to open them again. In that sense, they are symbols of Havana’s turbulent past and signs of hope for its future. In our story, however, these keys play a most modest role. Local archeologists found them in the sunken hull of a galleon named Nuestra Señora de las Mercedes, the very same ship lost in front of the harbor in 1698, when Gemelli Careri survived his last Cuban storm.
Further Reading: This piece is partially based on Giovanni Francesco Gemelli Careri, Voyage du Tour du Monde, Traduit de l’Italien… vol. 6 (Paris, 1727). For further reading on Havana and Cuba in the seventeenth century, see José Martín Félix de Arrate y Acosta, Llave del Nuevo Mundo; Antemural de las Indias Occidentales. La Habana Descripta: Noticias de su Fundación, Aumentos y Estados (Havana, 1964); Leví Marrero, Cuba, Economía y Sociedad, vols. 3-5 (Madrid, 1975); Joseph L. Scarpaci et al., Havana: Two Faces of the Antillean Metropolis (Chapel Hill and London, 2002). Also relevant are Timothy R. Walton, The Spanish Treasure Fleets (Sarasota, 1994) and Fundación ICO, El Oro y la Plata de las Indias en la Epoca de los Austrias (Madrid, 1999).
This article originally appeared in issue 3.4 (July, 2003).
Adrian L. Denis is a graduate student at the University of California, Los Angeles, and has written about the history of medicine, health, and disease in colonial Cuba. He is author of Saco, Sagra y el Cólera Morbo (Pinar del Rio, 2000).
The Undigested History of the Nantucket Atheneum
Tearing up floorboards in search of hidden treasure is a bad idea. A messy process, it seldom yields anything of value. As a rule, attics are a better bet.
Figure 1: Nantucket Atheneum
Every rule has its exception. In 1994, the Nantucket Atheneum began an extensive renovation that rescued the island’s public library from shabby gentility, restoring the neoclassical beauty the building was known for when it first opened in 1847. The renovation added a striking new wing for the children’s library and highlighted the soaring ceilings in the Great Hall upstairs, where the windows stream sunlight or the gray Nantucket sky, depending on the day. The renovation also uncovered an archive. As construction began, floorboards came up, walls and doors came down. Eventually the building itself was lifted off its foundation as the crawl space was turned into a ground floor. But to my mind, the removal of the floorboards marked the turning point of the renovation. For as it happens, the space beneath them sheltered a collection of over five hundred nineteenth-century pamphlets.
Figure 2: Floorboards removed from the first floor in Nantucket Atheneum
Charlotte Louisa Maison, then the director of the Atheneum, remembers the day the pamphlets were revealed. An island carpenter, there to remove a set of bookshelves, found that he needed to pull up the floorboards on the librarian’s dais on the first floor. Not long into his work, he called Maison into the room. There beneath the dais, where a hundred years’ worth of librarians had sat, lay a carefully organized collection of pamphlets, untouched for more than a century. A deep layer of dust covered the boxes holding them. Breathing masks were in order, and Maison rushed out to buy some. Although they could not be sure, Maison and Betsy Tyler, the Atheneum’s Great Hall librarian, began to speculate that the pamphlets were placed in their uncommon vault late in the nineteenth century as a stopgap measure. Thanks to the renovation, the Atheneum is now a winter refuge for year-round islanders. But in prefiberglass 1890s Nantucket, when the last major overhaul of the building was undertaken, the Atheneum was a notoriously drafty and cold structure in the dreary winter months. Pamphlets seem to have done the trick of blocking those persistent island winds whistling through the floorboards.
Figure 3: Pamphlet from the Nantucket Atheneum
It is an odd collection of materials the Atheneum belched forth. Among the oldest of the pamphlets is the 1825 Sermon on the Death of His Excellency, William Eustis, Late Governor of the Commonwealth of Massachusetts, Preached in the First Church in Roxbury, February 13, 1825. The author, Thomas Gray, “Minister of the Church on Jamaica Plain,” laments the passing of Governor Eustis in a nationalistic language common to many of these short publications. “Most of the revolutionary patriots,” he tells us, “after having served their generation faithfully, have, at length, fallen to sleep.” In The Morals of Freedom. An Oration Delivered before the Authorities of the City of Boston, July 4, 1844, Peleg Chandler chastises Americans for past Independence Day indiscretions and urges a more sober approach to freedom:
The men of the revolution have been accustomed to join in the festivities of the occasion, and their presence has added interest to the scene, imparting to the orators of the day an animation, that has sometimes exceeded the bounds of sound discretion and cultivated taste. But with the present generation a change is taking place. The scenes of the revolution are growing dim in the distance; those venerable men are no longer with us, and our independence is celebrated by the children’s children of those who achieved it. The period for declamation has passed away. A more somber hue now rests upon the day.
Francis C. Gray’s 1832 Oration Delivered before the Legislature of Massachusetts, at Their Request, on the Hundredth Anniversary of the Birth of George Washington similarly marks the passing of the Revolutionary generation and the coming of a new type of American.
The collection includes abolitionist arguments, religious tracts from a variety of sects, and my personal favorite: an illustrated short history of hats. Beyond their obvious draw as historical documents, the pamphlets gathered in this chaotic collection are an appealing site for speculation about the life of readers and reading in the nineteenth-century U.S. How was it that the Atheneum ended up with a haphazard collection such as this one? When, if ever, were these pamphlets read on the island? Other than their status as ephemera, what do these short works share that would have warranted their being organized together? Answers to these questions may lie in the common history of these pamphlets, a history intimately tied to the larger story of Nantucket itself. A history punctuated by every bibliophile’s nightmare: fire.
In 1846, Nantucket suffered a tragedy still known to islanders as the Great Fire. This conflagration swept through downtown Nantucket, destroying businesses, homes, and the original Nantucket Atheneum. It was there in the Atheneum’s first building that Frederick Douglass gave his first major speech before a mixed-race audience in 1841, as he recounts in his Narrative. It was there that both Nantucketers and visitors from the mainland, or “coofs,” publicly debated the death penalty and phrenology, insanity, and Chinese poetry.
Figure 4: Nantucket Atheneum
On January 4, 1847, the Atheneum proprietors wrote the following in their meeting minutes, recalling the events of the previous summer’s fire:
On the night of the 13th of July [1846] every effort was made to save the contents of the Atheneum, and too much praise can hardly be awarded to those who used such great exertions to preserve what they could, from the devouring element, neither does it detract from the credit due to those who labored in removing the articles, that their efforts were of little avail. Books, Pictures, and busts were removed from the Library Room, and were overtaken and destroyed by the Fire, when they were supposed to be in a place of safety . . . The portrait of Professor Silliman and a few Books were all that was saved, about 130 volumes [out of 3200], most of them at the time, in the hands of Proprietors, and out of the Library have been recovered, nearly half of which are odd volumes. About 250 coins, and a few relics from the ruined cities of Europe were picked up from among the ashes, & are little injured.
Mourning their deep loss, the proprietors conclude, “The destruction of this Library, and the loss at the same time of the Coffin School, and of many private libraries, left the inhabitants of the island more destitute of reading resources than ever before. The circumstances called for some unusual exertion.”
As their early standing in the bloody and dangerous whaling industry indicates, Nantucketers were nothing if not tenacious. When the Atheneum’s proprietors learned that its officers had failed to pay the Atheneum’s insurance, and that as a result there would be no money forthcoming to replace the Atheneum’s lost resources, they did not falter. They rattled the cup up and down the Atlantic seaboard, writing the Providence Atheneum, the American Philosophical Society of Philadelphia, and a variety of other organizations, asking each for a donation of books. The proprietors’ minutes acerbically note the results: “Of the Institutions addressed, only one, the ‘Young Men’s Institution of Hartford,’ has yet sent any books. Efforts were made by some members of the Providence Atheneum to procure a vote (of a committee to whom the subject was referred,) giving their duplicate-volumes to the Nantucket Atheneum, but the committee were equally divided on the question of its constitutionality. The American Philosophical Society of Philadelphia, and the Philadelphia Atheneum sent letters of Sympathy.” Undeterred, the Atheneum’s leaders wrote to “Certain Gentlemen of Boston” seeking counsel. They, in turn, “advised that Individuals, as well as Institutions should be addressed.” At last the Atheneum advocates met with success.
In a remarkable demonstration of generosity, books, newspapers, and printed matter of all kinds began to pour into the Atheneum. In the first week after the call to individuals was sent, the Atheneum received “One Hundred and seventy volumes from mess. Little & Brown . . . and One Hundred Dollars, from Amos Lawrence, Esq. of Boston.” From that point on, the tide turned so drastically that the Atheneum faced a virtual flood:
“Citizens of Boston” who are unwilling of a public appearance of their names . . . collected the sum of Seven Hundred dollars, purchased and forwarded to the Atheneum about 800 volumes . . . From Ticknor & Co. Boston, 44 vols. from Edwd. Everett 39 vols. Two daily and two weekly papers are forwarded to the Atheneum by Mr. Everett on the day after their publication. From C. P. Curtis of Boston 32 Vols. and the unbound nos. of two foreign Reviews. From Wiley & Putnam of N.Y., their Library of Choice Reading in 29 vols.
The list goes on at length, closing finally with the following reference to two native Nantucketers: “from Mrs. Lucretia Mott of Philadelphia, 6 vols. & from other ladies 16 volumes, and 20 vols . . . from Paul Mitchell, Esq.”
Several pieces of evidence link the treasure in the floorboards to this moment of unparalleled giving. The boxes holding the pamphlets were labeled according to subject matter in a careful script, which appears to be the handwriting of Maria Mitchell, the Atheneum’s first librarian. Mitchell would go on to be a noted astronomer, a professor at Vassar College, and the first woman admitted to the American Academy of Arts & Sciences. But in the late 1840s, immediately after the fire, she was still the caretaker of the Atheneum’s holdings.
From all indications, it appears that Mitchell processed these pamphlets in the aftermath of the Great Fire and the highly successful call for donations. Each pamphlet bears the stamp of the Atheneum and an accession number. Each pamphlet also carries a different handwritten name on its inside or outside cover. Near as anyone can tell, these names belong to those donors who, responding to the island’s plea for “reading resources,” sent pamphlets old and new.
This collection poses several intriguing problems of interpretation. There is the obvious challenge of confirming that in fact these materials were donated in the aftermath of the fire. But beyond that, assuming they were, how do we understand them? Do they represent what the Atheneum’s generous donors perceived as important materials of the day, things that any major cultural repository should hold? Maison believes they do. She thinks that the pamphlets were, at one time, an “essential part of the library’s collection,” as Mitchell’s meticulous cataloguing might suggest.
On the other hand, maybe the pamphlets were more akin to a current feature of the Atheneum: a swap bin where islanders drop off magazines they have read and take those they haven’t. Lucretia Mott’s name is on the inside cover of one of those pamphlets. Perhaps Mott was, for the good of the Atheneum, sacrificing a piece she cared for. Or perhaps she was simply weeding her collection.
And maybe, finally, the lesson to be drawn is that these two possibilities are not, in fact, mutually exclusive. Maybe at some point it was simply decided that a lamentation of the death of Governor Eustis was more useful as insulation than as reading material. Which leaves us asking the most significant question of all: at what point, and for what reasons, does print “culture” start to look like pulp? When, and at whose command, does treasure turn into trash?
For more information on the Nantucket Atheneum’s pamphlets and its historic nineteenth-century collection, contact Betsy Tyler, the Great Hall librarian.
This article originally appeared in issue 2.1 (October, 2001).
Lloyd Pratt teaches nineteenth-century American literature in the history and literature program at Harvard University. In 1999, he was the Massachusetts Foundation for the Humanities/Bay State Historical League Scholar-in-Residence at the Nantucket Atheneum.
My Dinner with GW
My eighth-grade U.S. history students read letters and speeches by George Washington, John Adams, and Thomas Jefferson. Washington seems “dignified” and “formal,” they say, Adams “dutiful” and “pessimistic,” and Jefferson “fashionable” and “witty.”
With whom would you most like to have dinner? I ask. Rachel chooses Washington for his straightforward judgments, and Matt likes Jefferson for his way with words. Some worry about having dinner with any of them, concerned that they would be intimidated in the presence of such learned men.
Would you feel intimidated by talking to today’s presidents, say, George Bush or Bill Clinton? No way, the students respond, because these modern men seem much more approachable.
The exercise might seem lighthearted, but, in teaching eighth graders at a private school outside Los Angeles, I have found that the more I focus on personality, the better. Middle-school students, deep into discovering their own identities, are entranced by the piercing look in John Calhoun’s eyes and the emotional appeal of Harriet Beecher Stowe’s characters. I supplement the textbook [Henry F. Graff’s America: The Glorious Republic, rev. ed. (Boston, 1998)] with maps, slides, songs, primary sources, literature excerpts, and snippets of videos, from A&E Biographies to Ken Burns’s The West.
One of the most approachable ways to tie past to present is to focus on issues that directly affect students’ lives, such as women’s rights and methods of communication. As students pore over the Declaration of Sentiments from the Seneca Falls Women’s Rights Convention, they commonly cite two complaints as among the most grievous. In 1848, women were “without representation in the halls of legislation” because they lacked the vote and, in cases of divorce, the one who cared for the children was determined “wholly regardless of the happiness of the women.”
Fig. 1. Portrait of Susan B. Anthony and Elizabeth Cady Stanton, between 1880 and 1902. Courtesy of the Library of Congress.
When I ask whether these grievances have been fully addressed today, Melody insists women will not be truly represented in government until the Senate has more than fourteen female members. Rhett wonders why it is now mostly women who keep the children after a divorce. Suddenly the apparent turnaround in women’s rights over the past 150 years does not seem quite so black-and-white.
Fig. 2. The 108th Congress opens with a record fourteen women senators.
Communication changes over two hundred years bring up other interesting personal questions. After we look at Abigail and John Adams’s “Remember the Ladies” letters from 1776, I ask if this couple would have written in the same style if they had instant messaged on the Internet instead of writing in longhand. My eighth graders say no, and they suggest that the two might have corresponded more because it would have been easier!
When we look at cartoons about the bank from Jackson’s 1832 reelection campaign, the students marvel at their complexity, and we compare them to the latest simple missives from current papers. One recent cartoon by Pulitzer Prize-winning artist Michael Ramirez, for example, shows three “looters”–France, Germany, and Russia–tiptoeing into the Iraq Ministry of Reconstruction just after the war ended. People have less time to read the newspaper today, Billy says, and maybe that’s why today’s political drawings are made to be understood in split seconds. Or maybe we are a savvier society and do not need such detail to get the gist of events, one girl says. These young consumers are well aware of the media’s influence on content.
Once we establish personal links from past to present, it is a short step for students to bring their critical skills to bear on broader questions of American culture. When we read John O’Sullivan’s 1845 essay on the “manifest design of Providence in regard to the occupation of this continent,” I ask whether the United States still has a sense of Manifest Destiny. Perhaps not geographically, students say–we seem to be satisfied with the land we have, and conquest is somewhat out of fashion. But Gabriella insists that we have a cultural and technological sense of power instead. And Bryan suggests that we have a new sense of destiny in spreading democracy to other nations. For these students, our modern world seems at once far from the insular mid-1800s and yet also all too close.
Knotty or mundane national topics often come into relief through a current reference. John Calhoun’s nullification argument with Andrew Jackson over the Tariff of 1832 can seem so intricate in the textbook that students wonder why someone would care so much about a tax. After we discuss the South’s resentment toward the North over the issue, I bring in an article from the New York Times on why some steel companies today still want a protective tariff. When we talk about the importance of the Mississippi River for trade throughout the period, I show them an article about a man who swam the length of the Mississippi last year, more than 2,300 miles. “He’s crazy!” the students say, but they are also impressed. Hearing about Martin Strel’s trials from Minnesota to Louisiana makes the United States seem as daunting as it might have been had the students lived in the early 1800s, without cars or airplanes to make their way across the continent.
Fig. 3. Martin Strel swimming the Mississippi River, July 2002. Courtesy of Borut Strel.
I use the same approach in examining international affairs. Every Friday I ask four students each to bring in a relevant article from the newspaper, summarize it, and point out its location on a map. As the war with Iraq loomed through the 2002-03 school year, we heard echoes from conflicts throughout U.S. history. In February, Jessica presented a newspaper article that described the United States’ case for Iraq at the United Nations. “The article reminded me of Madison’s grievances before the War of 1812,” Jessica said of why she chose it. Her comment led to a class discussion comparing the threat created by Iraq’s violations, when the United States is well established, with the danger then posed by Britain’s impressment of sailors, when the U.S. was a fledgling country. We ended by pondering a thorny question: When should a nation fight against aggression?
Fig. 4. Engraving of the Battle of New Orleans, between 1815 and 1820(?). Engraver: Joseph Yeager. Artist: Benjamin West. Courtesy of the Library of Congress.
Later in the year, we studied Polk’s manipulation of the beginning of the Mexican-American War. How was his insistence that Mexico had “shed the blood of our fellow-citizens on our own soil” different from and similar to Bush’s proclamation that Saddam Hussein must be rooted out of Iraq? The more liberal students were eager to link Polk’s fort-building in disputed Texas to Bush’s argument that Iraq had weapons of mass destruction. The more conservative students, of course, said that Polk was obviously in the wrong and Bush was not! A few months later, one student boomeranged the question back into history by wondering how similar Abraham Lincoln’s resupplying of Fort Sumter in 1861 was to Polk’s decision, since both choices involved presidents’ waiting for the other side to provoke an attack.
In the more than two hundred years since the beginning of the period we study, technology has changed, but personalities and ethical struggles have not. Every morning when I read the headlines, it’s a fair bet I’ll find something to bring up that day in class. For my students, current events are their teenage background noise, a buzz that is sometimes curious and often unfamiliar. Was the Supreme Court always as powerful as it is now? they wonder. Is affirmative action necessary? they ask. I cannot answer all of their questions definitively, but I do point them toward history. And I can remind them that adults don’t have all the answers, then or now–and that when they become adults, they can look to history for parallels and, perhaps, for guidance in their own decisions.
This article originally appeared in issue 4.1 (October, 2003).
Sarah Cooper has taught English, history, and journalism for five years at independent schools in Southern California. She currently teaches middle- and high-school U.S. history at Peak to Peak Charter School outside of Boulder, Colorado.