In July 2002, the Readex Corporation, in cooperation with the American Antiquarian Society, began releasing the first installments of “Evans Digital,” a full-text-searchable version of Charles Evans’s American Bibliography, first published on microfiche by Readex and the AAS in the 1950s. When complete, the Evans Digital will include more than thirty-six thousand items: virtually all books, pamphlets, and broadsides published in British North America between 1639 and 1800. The publication of the microfiche collection made astonishingly rare books available at small colleges and large universities across the country and even around the globe. The digital edition, with full-text searching, may well revolutionize the study of early American life. Common-place asked two prominent scholars of early America, whose work has relied extensively on printed materials, to review the Evans Digital.
This article originally appeared in issue 3.3 (April, 2003).
The American Jeremiad at 35
The State of the Americanist Field
The remarks collected here were offered at a roundtable at the 2013 Modern Language Association conference in Boston, held to mark the thirty-fifth anniversary of the publication of Sacvan Bercovitch’s influential 1978 study, The American Jeremiad. The charismatic power of Bercovitch’s scholarship in the late 1970s and ’80s is hard to overstate, although more recently (in the wake of the various turns in Americanist criticism toward multiculturalism, post-nationalism, transnationalism, post-exceptionalism, and so forth) it has been as often contested as not: faulted for its alleged focus on a narrow canon of American literature, for its national frame, and for what some see as its embrace of American exceptionalism, a term that has taken an unexpectedly prominent role in political discussion in the U.S. over the past two years.
The contributors to the roundtable were asked to address both the specific case of Bercovitch’s American Jeremiad and, more generally, the state of the field of Americanist literary scholarship 35 years after the appearance of this influential and powerful interpretation of the “meaning of America,” and in the wake of its 2012 reissue (with a new preface by the author) by the University of Wisconsin Press. The 35-year mark is not the only reason for a timely reconsideration of Bercovitch’s scholarship and its legacy: his earlier book, The Puritan Origins of the American Self (originally published in 1975), was also recently republished with a new preface by the author (Yale, 2011). Another occasion for this discussion was the 2011 publication of a rich collection of essays, gathered in Bercovitch’s honor, The Turn Around Religion in America: Literature, Culture, and the Work of Sacvan Bercovitch, co-edited by two of the contributors here, Nan Goodman and Michael Kramer.
The contributors all brought to the roundtable an affiliation with Bercovitch of one kind or another. Along with their history of affiliation with Bercovitch, however, none of these scholars works in what might be called a specifically Bercovitchian mode. Some of them work on non-canonical literature, on ethnic literatures, on educational theory and practice, on queer cultural production, on popular or mass literature, in legal theory, feminism, and so forth. They bring to bear perspectives that are, while not diametrically at odds with Bercovitch’s focus, nevertheless situated in an at once intimate and orthogonal relationship to it, and thus liable to provoke sympathetic yet critical reconsiderations of the place of The American Jeremiad and of ambitiously synthesizing critical scholarship in our discipline today.
This article originally appeared in issue 14.4 (Summer, 2014).
Christopher Looby is professor of English at UCLA where he also directs the Americanist Research Colloquium. He is the author of Voicing America, the co-editor (with Cindy Weinstein) of a collection of essays called American Literature’s Aesthetic Dimensions (2012), and the general editor of “Q19: The Queer American Nineteenth Century,” a forthcoming series from the University of Pennsylvania Press.
Invitation
The Nat Fuller Committee hereby invites the reader to set up a Nat Fuller Feast in his or her own city in honor of Nat Fuller’s legacy and his dedication to creating a space for reconciliation.
1. Invitation. Courtesy of the Nat Fuller Committee, photo credit Jonathan Boncek.
For more information about Nat Fuller’s life and work, please visit the Lowcountry Digital History Initiative site, and consult Chef Kevin Mitchell’s discussion of Chef Fuller’s repertoire. A still more complete discussion of the food and the feast can be found in the biography of Nat Fuller authored by David Shields and Kevin Mitchell. All proceeds from the biography will be put in service of future feasts.
The committee invites you to stage the feast at any date that would be significant for your community. The Southern Foodways Alliance is planning to hold a Nat Fuller Feast in Birmingham, Alabama, in late February 2016 with chefs Frank Stitt and Kevin Mitchell.
2. Biography of Nat Fuller. Photo by Jonathan Boncek, courtesy of the Nat Fuller committee.
The next feasts in South Carolina will be held on either April 19 (the anniversary of the original event), or June 17 (in commemoration of the Charleston Massacre).
This article originally appeared in issue 15.4 (Summer, 2015).
Electoral College: Bush v. Gore
For scholars interested in courts and their role in American political development, the aftermath of the two most striking recent American political events poses some deeply unsettling questions. The first event was the U.S. Supreme Court’s decision in Bush v. Gore, which effectively settled the 2000 presidential election in favor of the candidate who received fewer votes nationwide. The second event was what we now call, a bit glibly, “9/11.” What’s unsettling for scholars of law and American constitutional development is that the second event seems to have almost wholly eclipsed the first.
Domestic and world politics are being greatly affected by the broad-ranging responses to the September 11 attacks. Ongoing responses to Bush v. Gore are less apparent. Neither President Bush, whose poll ratings recently were at record highs, nor the U.S. Supreme Court, which continues to decide cases and sometimes strike down laws and overturn precedents much as before, seems affected by Bush v. Gore in any major way. Recent scholarly discussions suggest that, though every law professor, “public law” political scientist, and constitutional historian felt impelled to weigh in on the case in the first few months of 2001, in 2002 it no longer appears to have the great historic significance many of us accorded it a scant year ago. Put bluntly, of these two stunning events, “our” event, the Supreme Court decision, now seems not such a much.
Fig. 1. The Constitution. First Draft, August 6, 1787. The Gilder Lehrman Collection, courtesy of the Gilder Lehrman Institute of American History, New York.
But historical judgments require both perspective and ongoing reexamination; and on this topic we need to probe further, not call off the hunt. Commentators were not wrong to think that Bush v. Gore would have enduring importance for many scholarly debates, and I believe it may have even more far-reaching implications than those commonly noted. It suggests much about the standards for governmental legitimacy that actually prevail now and that are likely to prevail in the years ahead.
What are the established debates to which Bush v. Gore seems clearly relevant? Here are some:
Do courts have meaningful independent power in the American constitutional system, or are they essentially instruments of more powerful political forces?
Do courts exercise whatever power they have in ways that express the dictates of authoritative legal sources, or their own ideologies, or their own partisan preferences?
Have courts acquired more power over the course of American constitutional development, and if so, will Bush v. Gore diminish or expand that power?
Have courts become more ideological and partisan, less bound by the rule of law, over the course of American constitutional history, and if so, will Bush v. Gore work to moderate or heighten those trends?
Is the American constitutional system meaningfully democratic, or is it a system that confines its democratic elements within such elaborate systems of indirect representation and checks and balances that it effectively preserves domination by established elites?
Will Bush v. Gore work to make American constitutionalism more democratic or less democratic?
In addition, I suggest, there is a broader academic and political debate about what actually constitute and what should constitute the proper standards for legitimate government today. Francis Fukuyama’s The End of History famously suggested that in its essentials, this debate was settled with the fall of communism: though specific conceptions are sharply disputed, virtually all the world now agrees at least rhetorically on values of democracy and the rule of law that incorporates basic human rights. Strong scholarly currents in modern law, political science, and philosophy similarly suggest that agreement on procedures embodying those values must be the touchstone of political legitimacy today. Around the globe, efforts to craft new political systems that include some form of allegedly meaningful democratic elections, independent judiciaries, and even transnational representative bodies and courts, are gaining ground. So Jürgen Habermas and others have argued that humanity’s political future may well be defined by new forms of “world citizenship” involving overlapping memberships in multiple semi-sovereign political communities that embrace democracy, human rights, and the rule of law.
Those speculations on future history may seem well removed from Bush v. Gore, but I will argue for some real connections–after first going through the case’s significance for these other debates, many with historical dimensions. I will rely throughout chiefly on the most thorough, balanced, and convincing study of Bush v. Gore to date, Howard Gillman’s marvelously readable The Votes That Counted.
The Power of Courts
On the first debate, the power of courts, judicial scholars surely are entitled to claim some vindication of the importance of their objects of inquiry. If the Supreme Court had not decided Bush v. Gore, then the struggle over the election would probably have continued far longer, perhaps including a statewide recount, designation of rival electors by the Florida legislature, and contests in both houses of Congress over which Florida electors to recognize. The outcome of those battles cannot be foreseen. Perhaps all would have come out much as it did, but perhaps deeper divisions and a greater constitutional crisis would have emerged, and perhaps Al Gore might even have become president. Though one can question just how different Gore is from Bush in the big picture, it obviously matters a great deal to many people that Bush, not Gore, is in the White House today; and it is hard to tell the story of the 2000 election in a way that does not affirm that Bush v. Gore really mattered for that result. It is also hard to deny that the Court exercised meaningfully autonomous power here. Initially only a few Bush partisans, at best, thought the Supreme Court could or would get involved in the Florida election disputes, but the Court did get involved, decisively; and no one made it do so.
Did the U.S. Supreme Court and the other courts involved in the election disputes simply adhere to the rule of law as they saw it? Did they follow their own political and legal ideologies, even at the expense of the clear meaning of authoritative legal texts? Or worst of all, did they ignore text and ideology and simply vote for the party they preferred? Gillman’s work focuses on this issue. He argues persuasively that the Florida courts and the lower federal courts, though doubtless influenced by ideology, did not vote consistently enough for their fellow partisans to give the last explanation great resolving power. But he does not reach the same (arguably) reassuring conclusion in regard to the five justices of the U.S. Supreme Court who determined the result in Bush v. Gore.
He finds that on every issue–in their initial ruling that the Florida Supreme Court could not judge Florida election disputes with reference to the state constitution, only by reference to state statutes; in their order stopping the statewide manual recount because it might irreparably damage Bush, without regard for Gore; and in their final ruling that the conduct of that recount violated the Equal Protection Clause, and that further recounting was impossible because it had to be complete under Florida law by December 12, the day Bush v. Gore was handed down–each time the conservatives voted in favor of Bush. And they did so while adopting novel legal doctrines: The Constitution’s Article II means that in structuring federal elections a state legislature is not bound by the state constitution that authorizes all its powers? Equal Protection prevents a court from permitting in recounts the sort of discretion and variation that occurs in elections? The December 12 deadline is mandatory even though it is specified in no Florida statute, just because the otherwise pervasively mistaken Florida Supreme Court indicated that it was? All these rulings were very fresh news to most constitutional scholars. They also depart from the main thrust of the conservative justices’ previous holdings, for they have often preached great deference to state authorities and restrictive readings of the Equal Protection Clause in most nonracial contexts. Hence Gillman concludes that in general, courts cannot be accused of purely partisan voting in regard to the Florida disputes, however debatable some of the Florida Supreme Court’s rulings might be; but the most important court, the U.S. Supreme Court, can be so accused.
The Power of Partisans
But if Bush v. Gore suggests that courts do matter, and that the U.S. Supreme Court, at least, may have acted in partisan fashion here, what does the case suggest for the historical issues concerning judicial power and partisanship? Given the essentially nonexistent institutional role of the U.S. Supreme Court in earlier disputed presidential elections, and given its expanding role in voting disputes such as apportionment since the 1960s, it seems fair to claim that Bush v. Gore indicates that some measure of “judicialization” of American politics is real: courts have become more likely to play a meaningfully autonomous role in important electoral disputes, at least, than in the past. Whether this Supreme Court or courts generally are more partisan than in the past is harder to say, but Bush v. Gore suggests at least that this question ought to be on our agendas. All the more so because of the undeniable fact that we do not have a great, raging controversy over the decision today–for that fact suggests that, contrary to some initial predictions, Bush v. Gore may actually work to make judicial power and partisanship more routine and politically acceptable.
The general willingness to abide by the decision also has significance for debates over how meaningfully democratic the U.S. political system is, because it underlines how extensive acceptance of its undemocratic elements is. This was, after all, a decision by a distant, nonelected national court, overriding a state court over which state voters exercise a measure of control. It permitted the election of a candidate who indisputably was not the democratic winner nationwide. And it could do so because the American political system includes the electoral college, an institution that is hard to defend from premises of democratic legitimacy. As many scholars have shown, the Constitution’s structures of representation arose partly from a compromise in which northerners and easterners got national power to promote commercial growth in exchange for overrepresentation of the slaveholding states in Congress, and thus in the electoral college. Slavery is in the past now, and the college’s defenders today argue first, that it often helps produce candidates who win a majority of the states as well as a majority of the popular vote, and second, that it rarely matters much to the outcome. All true; but those points do not alter the fact that when the college matters most, it selects candidates most people don’t want, so that (for good or ill) it is irretrievably undemocratic.
The 2000 election has sparked what I view as salutary national legislation to make voting processes throughout the country more accurate, as well as similar reforms on state and local levels; and to some degree as yet unclear, Bush v. Gore has led to litigation that raises Equal Protection claims in electoral and other contexts and insists that federalism concerns should not prevent their adjudication. It is certain, then, that the 2000 election and Bush v. Gore have inspired efforts to make the American political system more democratic, with some real results. But, even though the election showed that American democratic processes are far more inaccurate than they need be, no truly sweeping reforms are on the agenda; and some undemocratic elements built into the constitutional structure, including both the electoral college and the power of the nonelected national judiciary, appear to have been effectively ratified and reinforced, not weakened, by these developments.
Prerogative Power
How do we make sense of the widespread acceptance of Bush v. Gore? Though some lawyers and political scientists like George Mason University’s Peter Berkowitz (who is both) have attacked criticisms of the decision as excessive, most of those who approve of the case’s electoral result have not tried to defend the Court’s reasoning. More conservative judges, lawyers, and political scientists like Richard Posner, Richard Epstein, and John DiIulio have instead generally dismissed the Court’s Equal Protection arguments as opening a Pandora’s box. Some have placed more weight on the claim that Article II of the Constitution gives state legislatures special authority to structure federal elections that state courts must not thwart. But this contention lacks all support in precedent; it involves interpreting the term “legislature” in Article II far differently than the Court normally reads related terms like “Congress”; and it seems paradoxical to treat a legislature as having authority independent of the constitution that gives it legal existence.
Most defenders of Bush v. Gore have therefore relied on an extralegal argument. The decision may not be very firmly grounded in law; but, they contend, it was necessary to avert a constitutional crisis and thereby serve “the good of the nation.” The decision thus becomes an example of what John Locke referred to as the “prerogative power”: a “power to act according to discretion, for the public good, without the precept of the law, and sometimes even against it.” I have previously argued that, given the limits of law, there can be judicial decisions that should be labeled “prerogative power” decisions, unconstitutional yet good. The test for such decisions is whether they really are seen in retrospect as doing more good than harm, even given the harm that they may do to the rule of law or to judicial credibility.
The fact that much of the public seemed relieved to have a virtually tied election resolved, the decline in wrangling over the decision among politicians and academics, and the unquestionable broad public support for President Bush, especially after September 11, all combine to suggest that Bush v. Gore was indeed such a “prerogative power” ruling. I confess I remain inclined to dissent. As Gillman stresses, that view assumes, without any real evidence, that the elected branches would have made things much worse if the Court had not intervened; and so it expresses a deep distrust of, and aversion to, the messiness of democratic politics. When we embrace the result in Bush v. Gore, I fear we also embrace the rejection of democracy. And though the maximization of democracy is not and should not be the only criterion for judging governments, I think we are more in danger of having too little meaningful democracy in America than too much. Hence I think the decision probably did more harm than good.
Broader Implications
Such a conclusion may seem overstated, especially since the circumstances of the case were so extraordinary. Nothing in it serves as precedent for a judicial role in interfering with, much less overturning, results when democratic electoral processes have produced clear outcomes. One can argue, moreover, that even if no specific constitutional provisions or established legal doctrines guided the decision, it still vindicates the broader structure of American constitutionalism. The system’s complex structure of divided authority and judicial independence defused a potential crisis in publicly accepted ways; and that result might be thought to be broadly mandated by the laws comprising American constitutional democracy, as they are undeniably aimed at making it work.
I think the decision and its aftermath are more disturbing, because I think they are part of a broader pattern in the political life of this country and indeed many countries. That pattern indicates that, contrary to academic wisdom or wishful thinking, the prevailing standards of governmental legitimacy today are not chiefly adherence to democracy and the rule of law; nor are they clearly becoming more so. Governments are instead legitimated first and foremost by substantive results of which people approve, just as, I suspect, they always have been. Bush v. Gore won wide acceptance not because people found its legal reasoning persuasive, not because they thought it sustained the rule of law, but because nearly half the electorate wanted the result–Bush’s election–and a significant number of others cared more about getting the thing settled than getting Gore into office. Correspondingly, Gore partisans have never really been reconciled to the decision; but, since many have given up on their rather lackluster candidate, debating the case now seems too inconsequential to pursue. President Bush, moreover, has since won massive approval not because most people have decided that he was democratically, legally elected, but because most Americans see him as providing sound leadership in an ongoing national emergency. They do not really care very much about how he got there so long as they think he is doing a good job.
Similarly, though talk of democracy, the rule of law, and human rights is ubiquitous around the globe today, and though various sorts of representative institutions and independent judiciaries are being established, I doubt that this is really because most people think only democratic procedures and conformity to rule of law can legitimate governments. It is rather because people believe that establishing such systems is a way to get more satisfactory governmental results, by gaining support from the U.S., Western Europe, other powerful nations, and also from investors, while simultaneously curbing some dangerous internal political forces. I am personally glad if people associate getting good results with having at least minimally democratic and legalistic processes, but I also do not think these associations mean they will necessarily demand very much in the way of democracy and the rule of law. At home and abroad, we seem instead to be moving ever more toward institutions in which meaningfully democratic decision making is actually quite rare; in which elite management of public policies and markets to achieve what can be presented as good results is routine; and in which we have expanded judicial power in many forms, but chiefly in service of such elite management. In this light, though Bush v. Gore represented an exercise of autonomous power by the U.S. Supreme Court, it did so as an undemocratic, extralegal way of deciding which set of competing elites was to govern–and it was approved because it did a satisfactory job in this regard. I suspect such successful judicial service to elite political management is in fact more key to the strengthening of judicial power that we are witnessing in many areas of the world today than is genuine attachment to government according to the rule of law.
In short, I see the undemocratic, thinly legalistic, but juridically assertive aspects of Bush v. Gore, and its ultimately complacent reception, as representative of broader trends in governance in the U.S. and the world today. Despite my deep reservations, I am not unqualifiedly hostile to these developments. I think it is not only inevitable but also wise and right to consider governmental legitimacy at least as much from the standpoint of governmental substantive results as from procedural criteria, whether democratic or legalistic. And I also think that the adoption of even minimally democratic institutions, along with courts that have meaningful power to protect basic rights, is generally salutary even if they essentially service elite management. My reasons are Churchillian: all the alternatives seem so much worse.
In any case, my chief concern here is not to engage in normative assessments. It is rather to argue that Bush v. Gore does matter for analyses of American constitutional functioning today, for assessment of important features of the constitutional system’s ongoing historical development, and for analysis of more global political trends. The case and its aftermath suggest that we have reason to believe that judicial power may have indeed meaningfully increased over time, for reasons that we therefore need to explore. They also suggest that attachments to anything more than rather minimal conceptions of democracy may not in fact be deepening, even if democratic rhetoric is ubiquitous, nor is there any reason to expect them to do so in the years ahead. Finally they suggest that, in the twenty-first century, it remains true that judgments of governmental legitimacy do not really turn chiefly on democratic authorization or conformity to the rule of law, but rather on judgments of performance.
That point matters greatly, because as the violence of September 11 indicates, standards for good governmental performance vary widely around the world today, as they do within most if not all societies. George W. Bush is now reviled in many Islamic regions for precisely the acts that have won him acclaim from much of the American public. And if proponents of militantly radical Islamic perspectives were to contemplate the judicial decision that helped put Bush into power, they probably would not think much of it either. We scholars of law, politics, and history therefore need to pay a great deal of attention to substantive disputes of all kinds, not only to the establishment of democracy and the rule of law. For as long as politics in fact turns more on issues of substance than on issues of process, and as long as people have deep substantive differences, we are not likely to see an end to striking political events, or an end to history.
Further Reading: Though I have found Howard Gillman’s The Votes That Counted: How the Court Decided the 2000 Presidential Election (Chicago, 2001) the best treatment of the Bush v. Gore controversy, readers wanting a treatment more sympathetic to the Court should read Richard A. Posner, Breaking the Deadlock: The 2000 Election, the Constitution, and the Courts (Cambridge, Mass., 2001). Those seeking a more vitriolic denunciation of the Court will prefer Alan M. Dershowitz, Supreme Injustice: How the High Court Hijacked Election 2000 (New York, 2001). The views of Peter Berkowitz, John DiIulio, and others are summarized in a set of letters discussing a conservative critique of the case previously published by Gary Rosen. The exchanges appear in the March 2002 issue of Commentary. On other points mentioned, Jürgen Habermas’s famous essay, “Citizenship and National Identity,” can be found as an appendix to his Between Facts and Norms (Cambridge, Mass., 1996). Francis Fukuyama’s The End of History and the Last Man (New York, 1992) expands on an earlier article. John Locke’s argument concerning the prerogative power appears in the second of his Two Treatises of Government. I have discussed it in an essay called “The Inherent Deceptiveness of Constitutional Discourse,” found in Integrity and Conscience (Nomos XL), ed. Ian Shapiro and Robert Adams (New York, 1998).
This article originally appeared in issue 2.4 (July, 2002).
Rogers Smith is the Christopher H. Browne Distinguished Professor of Political Science at the University of Pennsylvania. His Civic Ideals: Conflicting Visions of Citizenship in U.S. History (New Haven, Conn., 1997), was a finalist for the Pulitzer Prize and the winner of the Merle Curti Award.
Welcome to the New Common-place
By now you will have noticed that Common-place is sporting a new look. Our new design is the result of a year-long process of community outreach and research, editorial input, and digital design and web development by the University of Connecticut’s Department of Digital Media and Design. We hope you will be pleased with the results. The new site is designed to augment the journal’s unique role in early American scholarship by moving it to a new publishing platform that will better support its growth as a scholarly, pedagogical, and digital resource. Toward that end, Common-place now runs on WordPress, the world’s most popular content management system, which powers a substantial share of websites worldwide and is used by such publications as The San Francisco Examiner and The Nation. This allows for a better on-site search experience for readers and improves retrieval by search engines.
Another goal that our new look–and the technology behind it–achieves is optimizing the reading experience across a range of computing devices. The responsive design and adaptive typography automatically detect the size and orientation of the reader’s screen and adjust font size and line length to suit the reader’s device. This helps ensure comfortable reading across multiple device platforms, from desktops to smartphones.
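For readers curious about the mechanics, the sketch below illustrates in general terms how adaptive typography of this kind can work in the browser: a small script reads the viewport’s width and orientation and scales the base font size, which in turn governs line length. This is a minimal, hypothetical sketch only; the breakpoints, values, and function name are assumptions made for illustration, not the code that actually runs on the new site.

```typescript
// Hypothetical sketch of adaptive typography: scale the root font size
// (and, through em-based measures, line length) to fit the viewport.
// Breakpoints and sizes are illustrative, not Common-place's actual values.
function adaptTypography(): void {
  const width = window.innerWidth;
  const portrait = window.matchMedia("(orientation: portrait)").matches;

  // Choose a base font size for phones, tablets, and desktops.
  let baseSize = 16;                                 // phones
  if (width >= 600) baseSize = 18;                   // tablets
  if (width >= 1000) baseSize = portrait ? 19 : 20;  // desktops

  document.documentElement.style.fontSize = `${baseSize}px`;
}

// Re-run whenever the reader resizes or rotates the device.
window.addEventListener("resize", adaptTypography);
window.addEventListener("orientationchange", adaptTypography);
adaptTypography();
```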
What you’ll find here
The Common-place homepage still features the table of contents of the latest issue, including feature articles in the upper left-hand column and roundtables in the upper right-hand column. Book reviews and your favorite Common-place columns, including Notes on the Text, Poetic Research, Ask the Author, and Tales from the Vault, are immediately below. All of our columns can also be reached at any time from the main navigation at the top of every page on the website.
Other highlights of the new website include an improved Web Library, powered by Zotero, an open-source reference manager.
For now, previous issues can be found at common-place-archives.org in the old format. Over the course of the fall and winter, however, we will be migrating the entire Common-place back catalog to the new database. When this work is finished, a new Archives page will allow for sophisticated faceted searching of all of our past issues.
More to Come in the Months Ahead
Once the task of migrating Common-place’s extensive back catalog to the new site is complete, readers will have new tools for searching and sorting site content. In addition to the usual search box, where one can enter custom queries, all articles, reviews, and other content will be tagged with relevant Library of Congress-derived subject headings. Once fully implemented, this system will enable readers interested in, for example, “Native Americans/Indigenous Peoples” to quickly identify all relevant content–from Volume 1, No. 1 to the current issue. In addition, readers will be able to sort by column type and issue number. In short, Common-place will become easier to search, easier to read, and easier to share with students and colleagues.
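By way of illustration only, the sketch below shows one way such subject tagging and faceted filtering can work in principle: each item carries a list of subject headings, a column type, and an issue number, and a reader’s selections simply filter that list. The data shape, field names, and sample entries are hypothetical assumptions, not Common-place’s actual records or software.

```typescript
// Hypothetical sketch of faceted filtering over tagged articles.
// Field names and sample data are illustrative, not the journal's own.
interface Article {
  title: string;
  issue: string;      // e.g. "1.1" through the current issue
  column: string;     // e.g. "Feature", "Review", "Tales from the Vault"
  subjects: string[]; // Library of Congress-derived subject headings
}

const archive: Article[] = [
  { title: "Example feature", issue: "1.1", column: "Feature",
    subjects: ["Native Americans/Indigenous Peoples"] },
  { title: "Example review", issue: "15.4", column: "Review",
    subjects: ["Print culture"] },
];

// Return every item matching a subject heading and, optionally, a column type.
function findArticles(subject: string, column?: string): Article[] {
  return archive.filter((a) =>
    a.subjects.includes(subject) &&
    (column === undefined || a.column === column));
}

// e.g. all tagged content on "Native Americans/Indigenous Peoples", any column.
console.log(findArticles("Native Americans/Indigenous Peoples"));
```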
Common-place is celebrated for its openness to unconventional content types and experimental texts. We maintain this commitment with the new site, which features a Projects page where you will find such Common-place projects as Just Teach One, Just Teach One Early African American Print, and more.
Finally, in the footer at the bottom of every page, you will find information about the editorial team, how to contribute, author guidelines, and more.
We hope you enjoy the new Common-place.
This article originally appeared in issue 15.4 (Summer, 2015).
Do Not Despair: Suicide in the Archives
When I tell people that I am thinking about suicide, most tell me I should not let graduate school get me so down. In truth graduate school has me anything but down; my dissertation project—a study of the significance of suicide in the newly United States—gets me up in the morning and keeps me up late into the night. While my advisors have been careful to make sure I am not getting too involved with my subject, the topic has proved so fascinating that I have rarely had time to ponder the darker side of dissertating. And I am not the only one fascinated by suicide. When people learn that I study suicide, the questions always follow thick and fast: How many early Americans killed themselves? Why did they do it? How?
Back when I started this project, those were the same questions that drove me; reconstructing an early American suicide rate from coroner reports seemed like the foundation of any substantive analysis of this topic, and I refused to be deterred by the conspicuous absence of previous quantitative work. I soon learned my lesson. With the laws of most states stacked against self-murder (through the 1810s, suicides in Massachusetts were punished with a stake through the heart and a roadside burial), not to mention considerable social and religious stigma, there was every reason to cover up a suicide. When Mary Ware, the wife of Harvard President Henry Ware, took her life in 1807, her intentionality was largely withheld from the public record, meriting only an ambiguous mark next to her name in a church ledger. And she was not the only one. Victims, their families, and coroners often conspired to make difficult deaths disappear. If it was accuracy I was after, I quickly discovered that coroner reports were not going to get me very far.
So I instead began to look for a discrete community with a multifaceted source base. I found one sitting on the shelves of my university library. Notoriously thorough, the eighteen volumes of Sibley’s Harvard Graduates reconstruct the lives and deaths of each graduate of the college from decades-long researches into every archival corner. Even if the coroners did not pick up on a suicide, Sibley, I thought, probably would. Excited, I added information drawn from similar biographical projects at Yale and Princeton to produce a large data set. It turns out that twenty-nine of the seven thousand college graduates whose exploits are recorded in these exhaustive minibiographies definitively committed suicide. The number of possible and likely suicides runs much higher. But as I crunched the numbers, bringing all of my high-school math to bear on the problem, I soon realized that the sample is far too small to say anything beyond the tantalizingly suggestive about the suicide rate of this population. Real scientists would throw out the data.
Despite this setback, the exercise was not a complete failure. Reading in Sibley’s teeming volumes, I became overwhelmed by the range of sources, going far beyond coroner reports, in which he and his team had found references to suicide. Early American diaries, I have discovered, are full of them. Sibley led me to John Pierce, a long-serving minister from Brookline and author of a twenty-volume journal. Pierce, who graduated from Harvard in 1793, spent the next fifty years recording the deaths of his classmates and parishioners, taking unique pleasure in his own longevity. Suicides drew his particular attention. In 1833 he sat down to record details of all the local suicides that came to mind, offering his own speculations as to their motives and even transcribing a suicide note that had come into his possession. Its author, David Ackers, used his final communication to blame a combination of bad luck and lotteries for sending him “fast down the broad road to destruction.”
The diaries of other early Americans evidence the same interest in suicide. There are, if you are keeping score, twelve suicides in Martha Ballard’s diary, and more than forty in William Bentley’s. Elizabeth Drinker goes so far as to critique the choice of exit of an unnamed slave who jumped to a noxious death in his master’s privy. “The river was running within a few yards of the house,” Drinker pointed out. “[W]hy did he not prefere that more cleanly, and I should think, more easy method?” The diary of Eliza Cope Harrison offers even more. In addition to recording its diarist’s considerable curiosity about the suicides of his fellow city-dwellers, it provides a rare portrait of a man considering taking his own life. In an entry written after midnight on March 10, 1803, Harrison, a failing Philadelphia businessman, confided that he had decided to “rush into the presence of my God, uncalled.” Yet the sight of his daughter, asleep in the room, soon got the better of him. “I pause,” he wrote, “shall I do this thing?” Tears followed and the awful moment finally passed. Calming himself, drying his eyes, he eventually rose, closing his entry: “I will embrace the auspicious moment, carry my lovely infant to her mother & seek to compose & drown my cares in sleep.”
Fig. 1. When Romain, a slave being transported south, cut his throat in a Philadelphia street in 1803, the Pennsylvania Abolition Society made his death the subject of an antislavery pamphlet. Illustration from Humanitas, Reflections on Slavery, with Recent Evidence of its Inhumanity, Occasioned by the Melancholy Death of Romain, a French Negro (Philadelphia, 1803). Courtesy of the American Antiquarian Society.
While a treasure like Harrison’s expansive diary is still best discovered in the original, spending wet New England Wednesdays trawling through unindexed diaries and letters has often left me frustrated. Thankfully, those of us engaged in needle-in-haystack projects can now supplement our researches with a wide variety of indexes and databases. Ballard’s diary, the centerpiece of dohistory.org, is now digitized and enabled for full-text searches. The Library of Congress’s public-access American Memory project has also helped me comb large portions of the papers of Thomas Jefferson and George Washington for references to suicide that might have taken months to find otherwise. The North American Women’s Letters and Diaries initiative, a subscription service, allows similar searches among the personal documents of more than a thousand women from the colonial period through 1950. How else would I have discovered that in 1836 Lydia Sigourney had attended a lecture at which the tight lacing of bodices was compared to suicide?
The same fascination with suicide that led so many Americans to detail and debate the subject in their diaries led others to preserve artifacts from the event. Researching a paper about suicide notes in the media, I found to my surprise that the Massachusetts Historical Society not only had a few, but had even indexed them in the card catalog! When I began examining newspapers for evidence of and reaction to self-murder, I found even more suicide notes, transcribed and scrutinized to a degree that seems intrusive even by modern standards of journalism. Indeed, for a subject as sensitive as suicide, the pervasiveness with which it infused the media of the early republic is amazing, especially considering the extent to which newspaper editors were complicit in covering up awkward exits. Of course, everyone knew what was going on—cover-ups were so common as to become subjects in themselves. As the self-styled preacher, poet, and peddler Jonathan Plummer wrote in a broadside in 1807, “The relations of those who dispatch themselves are often desirous of keeping the thing as private as they conveniently can, and printers out of pity to them, or from some other motive, are often induced to withhold in public papers a part of the truth in regard to such matters, mentioning the deaths of the deceased in such a way as to lead a reader to expect that they departed in a natural manner.”
Fig. 2. Having been separated from her husband, this slave woman unsuccessfully attempted suicide to avoid transportation to Georgia. A caption reads, “[B]ut I did not want to go, and I jump’d out of the window.” Detail from Jesse Torrey, A Portraiture of Domestic Slavery, in the United States . . . (Philadelphia, 1817). Courtesy of the American Antiquarian Society.
It is all the more remarkable, then, that suicide appears with such regularity in the newspapers of the early republic. While underreporting was considerable, newspapers, it seems, took a distinct interest in particular types of suicide. Suicides that occurred in public places, for instance, were hard to ignore, and the suicides of those whose families lacked the social and political capital to engineer a cover-up were often splashed across the inside pages. Deaths that were particularly elaborate, instructive, gruesome, or exotic, or those that furnished a suicide note, also appeared with considerable frequency in the prints.
Despite their ubiquity, finding suicide stories, however sensational, amongst the mass of newsprint that washed over the early republic feels like a test of endurance. So in addition to systematic searches of the microfilmed newspaper collections of my university library, I have recently been exploring new electronic newspaper indexes. The CD-ROM index of Niles’ Register and the free Web-based index of the American Periodical Series have both made looking for needles in haystacks a much less daunting prospect. Likewise, the full-text searching possible in the online Pennsylvania Gazette, a subscription service, has saved hours hunched over decrepit microfilm readers, while the Performing Arts in Colonial America CD-ROM has brought together a diverse collection of newspapers and periodicals for subject-level searching. At the American Antiquarian Society this winter, I road-tested a demo for the Readex Early American Newspapers Digital Edition, a new full-text database that I hope will do for newspaper research what their Evans Digital Edition (previously discussed in this column) has done for research in American printed books and ephemera. As with Evans Digital, the real barrier to researchers will be the prohibitive cost of subscribing to this new and powerful tool: when Harvard University balks at the subscription fee, as it has for Evans Digital, you know there is a steep price to be paid.
Fig. 3. An often reprinted feature of temperance literature, Rush’s illustration differentiated distilled spirits from beer and wine. Detail from Benjamin Rush’s “Moral and Physical Thermometer,” The Columbian Magazine, January 1789. Courtesy of the American Antiquarian Society.
As these new databases and indexes are confirming, suicide was a staple of early American newsprint, but early Americans’ infatuation with suicide does not end there. Discussion of suicide infused almost every medium in the early nation. Pamphlets, plays, poetry, and periodicals overflowed with news and opinion about bloody tragedies, both real and imagined. Americans puzzled over gory familicides, sensational slave suicides, and the suicides of convicted criminals in the nation’s prisons and jails. Indeed, lured by the gripping illustrations of slave suicides found in various antislavery publications, I have recently moved suicide and slavery to the center of my dissertation project.
In texts where actual suicides were absent, the language of self-destruction and self-preservation was brought to bear on all kinds of social behaviors deemed threatening to the health of the republic. As Lydia Sigourney discovered at Dr. Mussey’s lecture in 1836, early American reformers compared just about anything to suicide in order to forward their arguments. Dueling was suicide; gambling was too. Habitual drinking was a choice that could only end in death. Benjamin Rush even designed a “Moral Thermometer” to show exactly which spirits would put you on the path to self-destruction. (If you are concerned, steer clear of whisky, gin, and brandy.)
Fig. 4. Drawing on a visual language derived from Copley’s Watson and the Shark, membership certificates of the Humane Society of Philadelphia emphasized the active preservationist agenda of this moral reform group. Detail of Benjamin Rush’s Diploma of Membership in the Humane Society of Philadelphia, issued in 1805. Courtesy of the Library Company of Philadelphia.
Working through Rush’s temperance papers this past summer, I discovered that the good doctor was not all talk; he also actively worked to prevent suicide. Coming across his membership certificate to an unfamiliar reform society, I stumbled upon the Humane Society of Philadelphia, an antidrowning group for which Rush served as vice president. Usually centered around local physicians, these societies attempted to curb the seemingly ever-increasing number of deaths in the nation’s rivers, lakes, and harbors by the strategic placement of medical equipment and instructions on bridges and ferries where drownings often occurred: “Do Not Despair,” one prominently placed riverside sign told those who lingered in its vicinity. Armed with lifeboats and with new ideas about suspended animation and resuscitation, members of these humane societies established what might be properly understood as the first American suicide prevention centers.
My recent interest in prevention and moral reform is symptomatic, I think, of the change in direction my research has taken since I first opened a volume of Sibley’s Harvard Graduates and set about trying to compute a suicide rate. “Who?” “how?” and “why?” now seem less important than “so what?” I may have despaired about ever knowing how many early Americans committed suicide, but my journey through diaries, newspapers, pamphlets, and letters has shown me that that is not really what matters. How Americans in the early republic responded to suicide or the threat of it and what they understood that threat to be are now the questions that keep me up at night. I still do not have any good answers, but I will not despair just yet.
This article originally appeared in issue 4.4 (July, 2004).
Richard J. Bell is a graduate student in the history department at Harvard University. He is embarked upon a dissertation entitled “The Cultural Significance of Suicide, 1760-1830.”
Misimoa: An American on the Beach
They called it, simply, the Beach. It was the settlement that grew on the shores of Samoa’s Apia Bay, in the 1800s. What else would they have called it? Like the Samoan lifeblood that sustained it, two rivers flowed into the bay, where a small community grew into a town, in the interstices of the Samoan world. Perched on the edges of Samoan life, in hurricane times it threatened to go into the sea. It was almost literally a beach, a bit ramshackle at first, and the street was coral and sand. What was not Samoan had come over, or from, the sea. It was a community cobbled together from the indigenous and the foreign, built on the cusps of about half a dozen Samoan villages, around a bay that only a real-estate agent would call a harbor.
A missionary settlement in the 1820s brought whale ships to this place, Apia, and whale ships brought much else. Soon there were hundreds of foreigners—resident carpenters and boat builders, groggers and laborers—with Samoans working overtime to keep everyone in profitable order, while missionaries struggled to teach Samoans of a godly white world different from that which they saw. The Beach at Apia was competing with places like Honolulu in Hawaii, Levuka in Fiji, Pape’ete in Tahiti, and Kororareka in New Zealand. All aspired to be a center of the Pacific, or were dismissed as hellholes, depending on whom you read.
In 1850 the Beach was still a sideshow—the epicenters of Samoa were the capitals of the Samoan world, ancient places such as Leulumoega, Safotulafai, Manono, and Lufilufi. But the next fifty years saw Apia become more and more dominated by foreigners, and Samoa became more and more influenced by Apia. By the 1870s, Apia had become the Pacific’s ground zero for strategic and imperial competition between Germany, Britain, and the U.S. This was the Samoan Tangle, a diplomatic and imperial problem for the three powers from the 1870s until Samoa’s partition in 1899. The Tangle brought imperial conflict directly to the Beach, exacerbating the civil wars brewing since the early 1850s, largely, but not solely, because of the competition between the two great Samoan lineages of Sa Malietoa and Sa Tupua.
Most Papalagi (foreigners) were scared or troubled by these difficult and uncertain times, though a few saw in them opportunity. Harry Moors (1854-1926) was a member of the latter group: he arrived on the Beach in the late 1870s, and soon became the most prominent American in Samoa. Moors, born in Detroit, became known to Samoans as Misimoa, of the Beach at Apia.
Fig. 1. A view of the Beach at Apia, around 1900. From Llewella Pierce Churchill, Samoa ‘Uma: Where Life Is Different (New York, 1902), collection of the author.
Harry Moors was more than a trader or businessman; he was a writer, a politico, a raconteur, a father, even something of a patriot. He is best remembered outside Samoa for his reminiscences of Robert Louis Stevenson, whom he befriended when the writer came, for health reasons, to live in Samoa in 1890 (Stevenson died there in 1894). Moors began as Stevenson’s agent, and for a while became his friend; but in Samoa, at least, Moors is best remembered for his descendants and his store. Moors was known a generation or so ago as an adventurer, but there was more to Moors than this; his heart and his hearth were firmly anchored on the Beach, which he spent the rest of his life crisscrossing.
Moors saw capitalism in coconuts, and made wealth where others saw only natives. Though his first work in Samoa was for a San Francisco trading firm, Moors soon left the company. A key figure in Samoa’s transition to plantation agriculture, he brokered thousands of the first imported laborers while working as a labor recruiter, also known as a blackbirder, in the Gilbert Islands (modern-day Kiribati). By 1878 Moors was working on Samoan plantations as an overseer of mostly Gilbertese laborers. But the turning point for Moors came when he went into business himself, purchasing some land to grow several crops, though primarily coconuts. Moors’s trading business meshed well with this, combining the production and purchase of coconuts for export with the import and sale of trade goods.
The 1890s were a very depressed time in Samoa, as war and a global depression took their toll. Moors’s firm, however, was one of the few local businesses not in financial trouble. Indeed, it prospered, and Moors was able to build an impressive store on the Apia waterfront, a landmark on the Beach that still stands. Moors continued to traverse the Pacific, and even claimed ownership of Niulakita, a guano-rich island in the Ellice group (now Tuvalu). By the early 1900s, he was the fourth largest employer of Chinese indentured laborers in Samoa and had built a large personal residence and an even larger reputation.
Many thought Moors a sharp trader, and at times people suspected that he was crooked. This was hardly unusual during these years in Samoa. More than once Moors or one of his traders was accused of using doctored weights when measuring coconuts for purchase, and at least once this endangered the life of one of his traders. Even Robert Louis Stevenson, who relied heavily on Moors in his early years, worried that he might be a huckster. Still, Stevenson spent something like $12,000 with Moors as his agent, and also employed Moors to purchase his famous Vailima estate.
Despite their close business ties, and initial friendship, Moors and Stevenson grew apart, largely due to antipathy between Moors and Stevenson’s wife, Fanny. After Stevenson’s death, Moors’s book about Stevenson pointedly wondered whether “Stevenson would have done better work if he had never married.” Fanny, Moors jibed elsewhere, “showed her years,” was “a faded spouse” and was “dictatorially inclined.” There was little love lost—Fanny dismissed Harry as “the village grocer of Apia”—though Stevenson himself never published a bad word about Moors.
Fig. 2. Harry Moors’s store at Apia. From The Cyclopedia of Samoa, Tonga, Tahiti, and the Cook Islands (Sydney, 1907), collection of the author.
When Stevenson arrived in Samoa in 1889, the Samoan Tangle had been strained to breaking point. In 1887 Germany had effectively abrogated its treaties with Britain and the U.S. and placed Samoa under German sovereignty, with Tamasese Titimaea (of Sa Tupua) as symbolic head. After a year of this government, civil war broke out, during which Tamasese was repeatedly defeated. The German navy came to his aid, yet against them the warriors of Mata’afa Iosefo, the leading light of Sa Malietoa, were also successful, in one incident killing or wounding over fifty German sailors. Seeking advantage in the strife, the U.S. and Britain also sent warships. And for a moment these Pacific islands threatened to become more than a footnote to history, as global war threatened to begin right there on the Beach.
But, as Samoans say, God intervened. A hurricane came and caught seven warships bristling at anchor in Apia Bay. They were devastated: six capsized and nearly two hundred sailors, mostly American, drowned. But the disaster only temporarily interrupted the escalating tensions. Mata’afa resumed his pursuit of supremacy, while Germany would only accept Malietoa Laupepa as king. Troubles were set to continue, and Moors was in the thick of it.
Like Stevenson and the majority of Samoans, Moors was steadfast in his support of Mata’afa. Yet unlike Samoans, neither Moors nor Stevenson entirely supported Samoan independence. Stevenson thought Samoans children; Moors thought no Samoan capable of ruling Samoa fairly. Moors’s support for Mata’afa, often cast as Samoan patriotism, was veiled American adventurism, his gun running abetted by U.S. officials and encouraged by both his sense of profit and his strong dislike and distrust of German imperial ambitions. Stevenson thought as much: he intimated that he would have trusted Moors’s political advice fully, “if he were not a partisan, but a partisan he is. There’s the pity.” Moors had become a key player, not only for Mata’afa, helping to keep him active and competitive in the field, but for the U.S., as an American citizen actively and privately working to further U.S. interests.
Fig. 3. Robert Louis Stevenson, family, and friends at his house in Samoa. From Moors, With Stevenson in Samoa, collection of the author.
In the midst of this tumult, Moors left on his most adventurous business enterprise yet—the South Sea Islanders Exhibition at the 1893 Chicago World’s Fair. Looking back, it seems Moors deserted a sinking ship, for in mid-1893 Mata’afa surrendered and was exiled to Micronesia (where he remained until 1897). Malietoa did not forget Moors’s disloyalty and forbade any Samoans from traveling with the American to Chicago. In the end, Moors’s “Samoan” village made for a strange display. It was made up mostly of half-castes (people of mixed Samoan and Papalagi descent) and other Pacific Islanders, with only a few full Samoans who had been spirited away.
“It is my object,” Moors wrote, “to present in Chicago a perfect picture of Samoan life under favorable circumstances. Shewing all that is good and attractive and leaving out all that is bad.” This “favorable” representation did not please everyone, and a number of critics shared the “generous indignation at Moors’s troop, and [the] totally false idea” it gave of Samoa.
Still, as a homecoming for Moors, the whole production was impressive, to be sure. Moors had brought with him a huge cargo of Samoan objects, including a seventy-foot canoe of modern design (a taumualua), several smaller watercraft, and three large houses (fale). All this was evidently to good effect. Business in Chicago boomed for Moors and he organized repeat visits to the U.S. He took a group to the San Francisco Mid-Winter Fair, and later toured the Midwest with another group of Samoans. A blossoming impresario, at one point during performances in St. Louis Moors is reputed to have hired a Mexican band to make the Samoan music more appealing to local audiences.
The islanders who went with Moors agreed to rigid terms—strict limits on behavior and dress, and work on Sundays—all for $12 a month. But Samoans seemed eager to accept those terms for a chance to tafao (wander about) overseas. The Samoans who went with him to San Francisco, Moors promised, “o le a toe foi mai i Samoa, ma le fiafia i mea eseese ua latou iloa i la nu’u mamao, ma ni oloa foi ua latou maua mai ai [will return to Samoa happy with the strange things they have seen in distant lands, and the things they have brought with them].”
Soon after settling in Samoa, Moors married a Samoan woman from a prominent family, Fa’animonimo (Nimo). She was to remain an essential element in Moors’s success, and was known as an extremely smart and capable woman. Moors already had two sons from earlier relationships: Kane, with a woman from Wallis Island, and Mark, with the Samoan Epenesa Enari. He recognized and cared for both of these sons. Nimo toured the U.S. with Moors several times, countering suspicions that Moors was only luring her there to divorce her. “If she comes back again still married to Moors,” Stevenson had written in 1893, “I shall think a lot of her savvy.”
Moors’s marriage to Nimo had apparently upset his family. His mother did not approve of her son’s marriage to a nonwhite—one of those “pork-eaters and cannibals,” she is reported to have said. But the marriage between Nimo and Harry ultimately proved a lasting one.
After Samoa was split between the U.S. and German Empires in 1899, Moors’s political adventures slowed down, but did not stop. The German governor thought him active in intriguing against German rule, and was probably right. Then, after New Zealand took over German Samoa in 1914, he seemed to be involved in protests against what many saw as gross misrule. Even shortly before his passing in 1926 he was active in the rising anticolonial movement, the Mau. By then, though, it was his daughter, Rosabel, who was to be a key figure in the anticolonial opposition.
Moors’s daughters—Ramona, Rosabel, Sophia, and Priscilla—united a number of influential Apia families. These daughters all married either prominent whites or half-castes, and were a core part of the half-caste and Samoan elite that centered on the Beach and that lent its support to the Mau. Rosabel married Olaf Frederick Nelson, son of a Swede and a Samoan. As the chief Taisi, Nelson became one of the key Mau leaders in the 1920s. After her husband was exiled from Samoa, Rosabel not only ran Nelson & Sons (which had merged with the Moors Company to become the largest private enterprise in Samoa), but was one of the leaders of the Women’s Mau, which sought to continue the protest in the face of the New Zealand colonial government’s repressive use of mass arrest, deportation, and violence.
Moors’s only adult son with Nimo, Harry Jr. (known to Samoans as Afoafouvale Misimoa), was educated in the U.S. and served Samoa and the Pacific for many years, including as secretary-general of the South Pacific Commission. The Moors family remains one of the most prominent Samoan families, still a feature in the city of Apia, where the Moors store endures as a vestige of its Beach days.
In his later years Moors was a very active writer. This was partly due to Stevenson’s encouragement, and the main fruit was his 1910 book, With Stevenson in Samoa. In addition to With Stevenson he published some short stories, and was an active correspondent to Samoan periodicals, particularly the Samoa Times. He also left a large body of unpublished works, including two novels, and a series of reminiscences (later anthologized and published in 1986).
Samoa never became entirely American, as Moors hoped it would—though the east is still a U.S. colony, or territory, today. Yet Moors serves as an intriguing window onto the times of the Beach and the Tangle. Moors’s life shows how places as diverse and distant as Kiribati, Tuvalu, Wallis Island, Chicago, and Samoa, were part of a vibrant circuitry. For much of the late nineteenth and early twentieth centuries—Moors’s heyday—there was a Brown Pacific, a huge movement of Pacific Islanders throughout the Pacific, and beyond.
Further Reading:
Moors’s main published writings are With Stevenson in Samoa (Boston, 1910) and Some Recollections of Early Samoa (Apia, 1986). Robert Louis Stevenson’s remarks come from his A Footnote to History: Eight Years of Trouble in Samoa (New York, 1892), and The Letters of Robert Louis Stevenson, ed. by Ernest Mehew and Bradford Booth (New Haven, 1994). The other remarks of Moors used above can be found in the unpublished papers of Sir Thomas Berry Cusack-Smith in the Alexander Turnbull Library, Wellington, New Zealand. The source of Fanny Stevenson’s words is her Our Samoan Adventure (New York, 1955). Joseph Theroux wrote briefly of Moors in Pacific Islands Monthly (August and September, 1981), and Tuiatua Tupua Tamasese Taisi Efi writes in his Ia Fa’agaganaina Oe E Le Atua Fetalai (Apia, 1989). For introductions to Samoan history during this period, see Malama Meleisea ed., Lagaga (Suva, 1987) and R.P. Gilson, Samoa 1830-1900 (Melbourne, 1970). On the circulation of Pacific Islanders throughout Samoa and the Pacific, see my “‘Travel Happy’ Samoa,” New Zealand Journal of History (2003), and on whites and half-castes in Samoa, see my Troublesome Half-Castes (forthcoming).
This article originally appeared in issue 5.2 (January, 2005).
Toeolesulusulu Damon Salesa teaches in the history department and in Asian Pacific Islander American studies at the University of Michigan, Ann Arbor; he was educated at the University of Auckland and Oxford University.
Doing History
Carol Berkin
How the worlds of the scholar and the popular historian come together
Common-place asks Berkin to write about how her approach to scholarship and teaching changed as she became involved in public history.
John Edward may be the star of the TV show Crossing Over, but my experiences have been as revelatory as his—and far less reliant on the spirit world. Over the past two decades, I have ventured into the world outside the ivy-covered halls of academe, spending a considerable amount of my time running institutes for teachers, giving public lectures, appearing as a “talking head” in documentaries, and writing trade books designed for what publicists call the “intelligent layperson” rather than for my colleagues in the profession of history. Although my entrance into these fields of what is commonly (and often sneeringly) called popular history was unplanned and serendipitous, I am deeply grateful for the experience. Aside from the fame (I was once recognized in a discount shoe store by an earnest young woman who had seen me in a documentary and asked me to autograph her shoe box) and the fortune (I now only owe an arm and a leg to my children’s colleges), popular or public history has taught me much about my chosen profession of teaching and my chosen field of research, early American history.
I share my thoughts with some trepidation. Not everyone will agree with what I have to say, and no doubt many of the insights I believe this work has given me will seem to be obvious to others. But, having been asked to write this piece and being of an age to be reckless and fearless, I forge ahead.
When I first started out in academia, here is how I understood the relationship between my field and the outside world: In my mind, there was an “us” and a “them,” and if we in the academy were not exactly a secret society, we were a select group set apart from others by our craft. In dealing with “them,” it was important to sound smarter than they were and to make clear that I possessed a body of arcane knowledge that marked me as a skilled practitioner of the esoteric craft of archival research. To the general public, I hoped to appear as a high priestess of the arcane; to students, I wanted to seem the bearer of wisdom and the judge of competence. Knowing something that others did not know mattered; but the real prize was understanding something my students and the layperson could never hope to understand as well as I. While this philosophy might be blamed on callow youthfulness or neurotic insecurity, I fear it might have continued to guide me right through my AARP membership had I not been thrust into the world of popular history.
What did I learn from my experiences in teaching with and writing for people who did not wish to spend lonely hours in a dusty archive or engaged in long conversations about the origins of the first party system? A lot. I have rethought why I teach and how; I have reevaluated my relationship with nonacademics; and I have redefined what being a member of my profession means to me.
From my many years directing or participating in teacher institutes and seminars, I learned to reassess both why I taught and how I taught. For although I came into these institutes as an expert on the role of women in the American Revolution or on Hamilton’s economic programs, I quickly found myself a novice when it came to teaching. The elementary- and secondary-school teachers who participated in these institutes had a far clearer understanding of the goals of teaching and far greater knowledge of how to do it well.
Now, I want to establish right off the bat that I was, by the standards of our semiannual student evaluations and by that most celebrated Website ratemyprofessor.com, a pretty good teacher. I was friendly; I was accessible; I was funny; and I was well prepared for each class. Students thought I knew my stuff—as one of our business students put it in a very backhanded compliment, “You seem pretty smart—how come you do this for a living?” But, I was never really certain what my goal or purpose was in the classroom. The administration, like administrations everywhere, wanted me to evaluate students’ ability, to judge their intelligence and their competence to do the work assigned, to rank them with grades, to grant them a pass or fail them. My department, like departments everywhere, wanted me to get high marks as a “good teacher” and to attract as many students as possible into other history courses. I operated for many years with a nearly schizophrenic set of goals: one, the “rescue fantasy,” which involved the vague notion of joining the ranks of Miss Jane Pittman and other inspirational teachers who were responsible for helping promising youths find themselves; the other, the Robin Williams, Dead Poets Society fantasy, in which my own personal love for history was enough to capture the hearts and minds of students forever.
After working with teachers in a dozen summer institutes, I realized how little I understood about what ought to go on in a classroom. I am talking about more than pedagogy; I am talking about how I teach as well as why. Perhaps I would have come to understand this, in time, on my own. But I believe it was my conversations with teachers that finally led me to define my goals. Over and over again, the best of these teachers conveyed to me the simple fact that their goal was to assist their students—to find effective ways to open up the storehouse of historical knowledge rather than set a series of challenges before them that tested their worthiness. And these teachers retained this philosophy in the face of constraints far more daunting than any we in the academy face: state and national standards, endless testing, overcrowded classrooms, and a perennial shortage of chalk and paper.
Because of these teachers, I have realized how important it is to articulate my goals to myself before I walk into the classroom. What do I want students to get out of this particular hour and fifteen minutes we will spend together—and, more importantly, what do I want them to get out of the study of history? It has become essential to me that I keep these objectives in mind, for they determine the techniques I will use to convey my ideas and information, and they will set the terms by which I judge my success and the students’. My goals are to provide students with the analytic tools to understand their world better, the expressive skills to articulate that understanding well, a greater and more sustained curiosity about how that world came to be as it is, and an empathy for how it was for generations past.
To achieve these goals, I have become a “new light” rather than an “old light” teacher. That is, I do not assume that my job is to speak the truth and theirs is to receive it—or not. Instead, like the evangelical preachers of the Great Awakening, my job is to start from where the students are in their knowledge and help them work their way to a more complex and richer understanding. I first realized this when I heard some elementary school teachers describe their “word trees”—a list on the blackboard made up of words students said they didn’t understand, followed by definitions arrived at by class participation. I thought back on experiences I had in which, for the want of a definition, an interpretation was lost. One freshman survey came to mind. I had given what I was convinced was a brilliant, insightful, even witty lecture on Jackson’s veto of the second Bank of the United States. When I was done, I asked for questions and discussion. The first question was, “Professor Berkin, was Jackson for or against the bank?” The student did not know what the word veto meant. And why should she? She had arrived in this country six months earlier, and veto was not on the list of one thousand common words you should know.
The account of the word tree started me thinking. And since then, when I walk into the classroom, I remember to ask myself: who are your students? Not who do I wish they were? But, who are they? And how can I best reach them? In my classroom, the students are from over one hundred and fifty countries; most are first generation college students; many received their pre-college education abroad; and for many, English is a second or third language. And, while I don’t create a word tree, I have learned to speak with parentheses: if I use a word that may be unfamiliar—in fact, even a word that has the slightest chance of being unfamiliar—I follow it with a comma, then a definition or a synonym, and then I move on. It is now second nature to me, and it does not distract from the point I am making or “dumb down” the argument; all it does is make the information genuinely accessible to more students.
Secondly, my work on television documentaries has proved to be beneficial. Among other things, it has helped me master, or at least begin to master, the art of succinct explanation. I have discussed my experiences as a talking head at great length in a piece I wrote for the OAH Newsletter (February 2005), but here I only want to repeat one thing I learned about myself as a historian. As I answered questions about my field before the cameras I found great pleasure in being asked to share what I knew with a wide audience. It did not matter if everything I knew about revolutionary camp followers or the constitutional convention couldn’t be recounted; what did matter was the discovery that there was a genuine interest “out there” in the subjects I had devoted so much time and effort to. This discovery has changed my understanding of my profession and the scholarly work I do: they are no longer fit only for an ivory tower; they are now intimately connected to that world outside. Corny as it may sound, this adds to my sense of responsibility as I do historical research: I want to tell the story properly, accurately, and with whatever insight I can, because there are many more people for whom that story matters than I ever imagined.
Finally, this recognition that there is a wide and eager audience for history outside the academy has prompted me to begin writing books aimed at that general audience. In writing these narratives, I have had the satisfying experience of revisiting interpretations and arguments I made in earlier scholarly work. In the past, I often joked with my graduate students that all history monographs should be put out in loose-leaf binders so that, as we mature as scholars, we can go back and revise or expand upon what we said in earlier work. Yet, too often, in my career at least, I have moved on to new topics—too busy to reflect upon older ones. Sometimes, in passing, I have thought back on what I have written and said, “Oh, wait, now I really understand.” Writing popular history, much like writing a textbook, has given me an opportunity to—no, it has demanded that I—review, rethink, confirm or alter my earlier interpretations and analyses. I have been surprised to see how much my interpretations have changed—and how satisfying it is to muse about old topics and correct old mistakes.
These projects have profoundly changed my view of my profession. I once thought of writing history as a highly individualistic enterprise. On several occasions, I have referred to historians as the last remaining independent artisans. This may still be so in the sense that we pick our own topics, we do our own research, we organize our writing as we choose. But writing popular history has led me to recognize that our scholarship is also a collective effort. I could not have written a popular history of the constitutional convention without drawing upon the work of the many scholars in the field; I could not have narrated the role of women in the American Revolution without a reliance on the research and interpretations of others. This has proved to be more than a change in outlook. It has led to an increased willingness on my part to collaborate with other scholars. And it has allowed me to see my own work as one piece of a mosaic that will ultimately produce a coherent picture.
When I first began to venture into the world of teacher development, documentaries, and popular history, I thought that this world and the world of my scholarship would be, and would remain, separate. But the relationship between them has grown more intimate; the linkages have become clearer. For me, the two are part of the same enterprise: doing history.
This article originally appeared in issue 6.4 (July, 2006).
Carol Berkin, professor of history at Baruch College and the Graduate Center, CUNY, is known both for her scholarship on the American Revolution and her commitment to public history. Her first book, Jonathan Sewall: Odyssey of An American Loyalist (1974), which was nominated for the Pulitzer Prize, established her reputation as both an impeccable scholar and an engaging writer. These qualities are also evident in her more recent books, A Brilliant Solution: Inventing the American Constitution (2002) and Revolutionary Mothers: Women in the Struggle for American Independence ([2005] 2006). A frequent consultant and contributor to documentary films, she is the author of both secondary- and college-level textbooks and documentary editions. She has also directed or participated in over twenty-five teacher institutes during the past five years.
Parson Weems Fights Fascists
G. W. and the cherry tree in 1939
Parson Weems, imitating Charles Willson Peale’s pose in The Artist in His Museum (1822), opens a red velvet curtain on the legendary scene: Augustine Washington, elegant in crimson coat, white ruffle, tan breeches, silver-buckled pumps, and green tricornered hat, grasps in his right hand the slim trunk of the bent cherry tree. A row of cherries dangles from the perfectly rounded treetop, mirroring the very cherry-like fringe of the Parson’s curtain. Augustine’s outstretched left palm and furrowed brow signal a serious inquiry. His son George, boyish in stature and dress—coatless, with sky-blue breeches and petite buckled pumps—is manly in his expression. In fact, his white-wigged head is that of Gilbert Stuart’s portrait and the dollar bill. He points with his right hand to the hatchet in his left. Wood chips lie in the circle of soil at the base of the tree, its lower trunk smoothly incised and poised to split off. In the background, a well-dressed slave couple harvests the fruit of a second tree.
The man behind the man behind the curtain is Grant Wood, forty-eight years old when he painted Parson Weems’ Fable in 1939 and perhaps the best-known artist in the United States at the time. The Washingtons’ shuttered brick house is Wood’s house in Iowa City. The rolling hills and the precisely spaced spherical trees in the background are those of Iowa or at least the Iowa of Wood’s own well-ordered landscapes. The Parson and Augustine Washington are modeled on two of Wood’s University of Iowa colleagues. George is modeled on the nine-year-old son of the colleague who posed for Augustine.
Sudden fame had come to Wood late in 1930, when American Gothic took third prize at the Art Institute of Chicago’s annual competition. Critics immediately praised it as a sharp send-up of small-town or rural life, and newspapers reprinted it across the country. After that, each of Wood’s major paintings—and his pronouncements about American art and culture—had been greeted with eager media attention. Parson Weems’ Fable, ballyhooed by Time as “artist Wood’s first big canvas in three years,” immediately sold to the novelist John P. Marquand for ten thousand dollars (about one hundred and thirty thousand dollars today). Wood had come a long way in the nine years since the Art Institute bought American Gothic for three hundred dollars.
Seen as a satirist when American Gothic first appeared, Wood had refashioned himself by the mid-1930s as an outspoken celebrant of the heartland and of healthy, “native” art. No longer was he under the spell of H. L. Mencken and Sinclair Lewis, with their “contempt” for “the hinterland” and their “urban and European philosophy.” He and many other citizens, he proclaimed, had embraced an “American way of looking at things.”
“The Great Depression has taught us many things, and not the least of them is self-reliance. It has thrown down the Tower of Babel erected in the years of a false prosperity; it has sent men and women back to the land; it has caused us to rediscover some of the old frontier virtues. In cutting us off from traditional but more artificial values, it has thrown us back upon certain true and fundamental things which are distinctively ours to use and exploit.”
Among these things were national myths and legends, few if any of which were more familiar than the fable of George Washington and the cherry tree.
The cherry-tree story first appeared in 1806, in the fifth edition of Mason Locke Weems’s A History of the Life and Death, Virtues and Exploits of General George Washington, originally published as an eighty-page pamphlet in 1800. By 1808, the sixth edition had swollen into a book of over two hundred pages. Twenty-three more editions followed by the time Weems died in 1825. The “Parson” was an Episcopal minister who hit the road as a book peddler around 1791 and, over the next thirty years, constantly traveled the mid-Atlantic and southern states to create what one of his biographers called “a national reading public.” Weems explained his motives to the Philadelphia publisher Mathew Carey: “This country is large and numerous are its inhabitants[.] [T]o cultivate among these a taste for reading, and by the reflection of proper books, to throw far and wide the rays of useful arts and sciences, were at once the work of a true Philanthropist and a prudent speculator.” The Life of Washington, as it came to be known, epitomized this happy combination of moral instruction and steady profits. (In much less elevated terms, he advised Carey in 1809, “You have a great deal of money lying in the bones of old George if you will but exert yourself to extract it.”)
As far as moral instruction goes, it’s worth noting the context in which the cherry-tree story appears in Weems’s book. Young George is prepared for his legendary truth-telling episode by a particularly strong dose of his father’s didacticism. Unprompted by any bad behavior on George’s part, Augustine tells the boy that he’d rather see him dead than “a common liar.” “Oh George! my son!” he exclaims, “rather than see you come to this pass, dear as you are to my heart, gladly would I assist to nail you up in your little coffin, and follow you to your grave. Hard, indeed, would it be to me to give up my son, whose little feet are always ready to run about with me, and whose fondly looking eyes, and sweet prattle make so large a part of my happiness.” Thereupon follows the anecdote:
“George,” said his father, “do you know who killed that beautiful little cherry tree yonder in the garden?” This was a tough question; and George staggered under it for a moment; but quickly recovered himself: and looking at his father, with the sweet face of youth brightened with the inexpressible charm of all-conquering truth, he bravely cried out, “I can’t tell a lie, Pa; you know I can’t tell a lie. I did cut it with my hatchet.”
A tough question, indeed, given that the alternative to a truthful answer was death.
Thanks to Weems’s itinerancy and salesmanship, the fable circulated nationwide even before McGuffey’s Readers picked it up and succeeding Washington biographers canonized it. No doubt it was still a staple of moral instruction in Iowa’s public schools when Grant Wood was a boy. But by the 1930s, it had also been thoroughly debunked. Historians and critics in the 1910s and 1920s delighted in outdoing one another’s dismissals of the Parson’s work: “grotesque and wholly imaginary stories,” “pernicious drivel,” “a mass of absurdities and deliberate false inventions,” a “slush of plagiarism and piety,” “beneath contempt or criticism.” The editor’s note to a 1927 reprinting of The Life of Washington pointed out that after having run through almost seventy editions and having successfully instilled “the popular legend of Washington” in “millions of American minds,” the book had “died a natural and deserved death.” The modest aim of the reprinting, he said, was simply to preserve “one of the most interesting, if absurd, contributions ever made to the rich body of American legend.”
Grant Wood with his painting, Parson Weems’ Fable, 1939. Courtesy of Cedar Rapids Museum of Art Archives.
Only as a piece of folklore did the cherry-tree story still have some life. Harold Kellock, author of a 1928 biography of Weems, made clear his bemusement with his subject in his antiquated title, PARSON WEEMS OF THE CHERRY-TREE; Being a short account of the Eventful Life of The Reverend M. L. Weems, AUTHOR OF MANY BOOKS AND TRACTS, ITINERANT PEDLAR OF DIVERS VOLUMES OF MERIT: PREACHER OF VIGOUR AND MUCH RENOWN, AND FIRST BIOGRAPHER OF G. WASHINGTON. In a bit of Weemsian exaggeration, Kellock remarked that “the story of Washington and the cherry-tree, which is familiar to children along the upper reaches of the Amazon and the Yangtze . . . is perhaps the most widely known folk-lore in any tongue.” But it was also a curious kind of folklore, as Kellock tacitly acknowledged. It wasn’t “distilled from native legends.” It wasn’t a recording of oral traditions, even though Weems fabricated a tale about hearing it from “an aged lady” who had been a slave in Augustine Washington’s household. Rather, it was a “product of [Weems’s] own fecund imagination” that “rolled into the world from his quill pen.” This was folklore that didn’t grow out of the folk or out of any particular place but from the mind of a famous book peddler and from the pages of his national bestseller. Quaint though Weems and his story seemed in the 1920s and 1930s, they hardly emerged from the conditions laid out by the influential folklorist Benjamin A. Botkin in 1934: “Folk groups are distinguished by cultural isolation. The folk, like the primitive, group is one that has been cut off from progress and has retained beliefs, customs, and expressions with limited acceptance and acquired new ones in kind.” Even when he adopted a more expansive conception in A Treasury of American Folklore (1944)—and included the cherry-tree episode from Weems’s book in the anthology—Botkin argued that “folklore is most alive or at home out of print” and that “its author is the original ‘forgotten man.’”
The Depression and World War II heightened the appeal of folklore as the enduring “germ-plasm of society” and hence a resource for survival in difficult and uncertain times. But it would have been preposterous to see the cherry-tree story as an organic outgrowth of “a continuous life of the folk running through the history of the nation,” “rooted in nature,” “a wild plant,” or “hardier and more fit to endure than any form of the cultivated life” as other folklorists described their subject. What to make, then, of a thoroughly debunked piece of inorganic folklore?
Grant Wood, for one, recommended a strategy of knowing acceptance, somewhere between credulity and cynicism. In his view, offered in the form of a press release issued for the painting’s public debut, the cherry-tree story was at once too good to be true and “too good to lose.”
“It is, of course, good that we are wiser today and recognize historical fact from historical fiction. Still, when we began to ridicule the story of George and the cherry tree and quit teaching it to our children, something of color and imagination departed from American life. It is this something that I am interested in helping to preserve.
As I see it, the most effective way to do this is frankly to accept these historical tales for what they are now known to be—folklore—and treat them in such a fashion that the realistic-minded, sophisticated people of our generation accept them.”
Aside from the formal and art-historical reasons for the Parson’s position in the painting’s foreground, his opening of the curtain lets viewers in on the scene as knowing participants in the fabrication. He points to the cherry tree with his left hand and smiles wryly. He shares a sense of irony with his “sophisticated” viewers, but he also conveys his honest affection for the tale he’s telling. And Wood, too, insisted that his underlying intentions were earnest. “I sincerely hope that this painting will help reawaken interest in the cherry tree tale and other bits of American folklore,” he said. Parson Weems’ Fable reveals its layers of inauthenticity—the contemporary artist creates a faux-Peale image of a clever author making up a story about not telling lies—only to help revive the legend for a “present, more enlightened generation.”
Grant Wood, Parson Weems’ Fable, 1939. Oil on canvas, 38 3/8 x 50 1/8 inches. Amon Carter Museum, Fort Worth, Texas. 1970.43.
What was at stake, aside from preserving “color and imagination,” in reawakening interest in this bit of American folklore? Certainly not old-fashioned moral instruction. Wood’s Weems can’t possibly believe in his own book’s “pious mendacities” and “cloying moral lessons” (as Kellock described The Life of Washington). But the contemporary value of the cherry-tree story didn’t have to reside in its didactic content. “In our present unsettled times,” Wood explained in his press release, “when democracy is threatened on all sides, the preservation of our folklore is more important than is generally realized.” Citing a recent Atlantic Monthly article by the Harvard English professor Howard Mumford Jones titled “Patriotism—But How?” Wood worried that “while our own patriotic mythology has been increasingly discredited and abandoned, the dictator nations have been building up their respective mythologies and have succeeded [in Jones’s words] in ‘making patriotism glamorous.’” A democratic national mythology was necessary to fight totalitarianism, yet the content of that mythology, observes the art historian Cécile Whiting, was much less important to Wood than the commitment to mythmaking itself. Following the historian and folklorist Constance Rourke—who, Whiting notes, “considered both fablemaking and humor to be uniquely American character traits”—Wood saw the knowing and willing acceptance of myths like the cherry-tree story as an alternative to the dangerous credulity demanded by the patriotic mythologies of Nazi Germany, fascist Italy, and the Soviet Union. Self-aware engagement in the process of mythmaking rather than believing in the myths themselves would have to serve as America’s cultural response to the “dictator nations.” Parson Weems’ Fable fostered patriotism by taking the national “trait” of mythmaking as its subject. Only at this remove from the cherry-tree story’s obvious falsity could it serve as one of those “true and fundamental things which are distinctively ours to use and exploit” in resisting totalitarianism.
This conception of patriotism—in which national identity is based to a significant extent on a shared sense of irony and humor—was subtle, and not all of the painting’s viewers got it. Life reproduced Parson Weems’ Fable in full color in its February 19, 1940, issue. The accompanying article eagerly reported on the controversy the painting had already stirred up.
“Almost before the paint was dry on this picture it started a battle. Widely publicized as the first painting by Grant Wood in three years, and one of his new series on American folklore, it recently raised the dander of literal-minded patriots all over the country. They bombarded Wood with angry letters because he depicted the cherry-tree story frankly as a fable invented by Parson Weems, and accused him of “debunking” Washington.”
Viewers’ anger was old hat to Wood by then. He owed his fame in part to the irate responses of Iowa farmwives to American Gothic, a few of whom, he reported, had threatened to bash his head in for portraying them as homely and dour. His Daughters of Revolution, a 1932 portrait of three self-righteous patriotic matrons in front of Emanuel Leutze’s Washington Crossing the Delaware, had also provoked a backlash, this time from the D.A.R.
Charles Willson Peale, The Artist in His Museum, 1822. Oil on canvas, 103 3/4 x 79 7/8 inches. Courtesy of the Pennsylvania Academy of the Fine Arts, Philadelphia. Gift of Mrs. Sarah Harrison (The Joseph Harrison, Jr. Collection). 1878.1.2.
In the case of Parson Weems’ Fable, it was Wood’s own advance hyping that got him into trouble. Nine months before he put brush to canvas, he announced in an interview that honest young George “has got to be smug.” The idea that the Father of the Country would be depicted in such a way riled the humorless editors of the Indianapolis News, who contended that the hero who adventurously set off on a western survey at sixteen couldn’t possibly have been smug at six. “Little is known of Washington’s boyhood,” they lectured, “but if that little proves anything conclusively, it is that he was not a smug boy.” The Chicago Tribune reprinted the Indianapolis paper’s editorial and offered up a more scathing one of its own. Young George would be painted as smug “only because Wood is smug,” intoned the Tribune, and this attitude was typical of artists like Wood who worked for the New Deal’s arts projects. These “parasitic” pinkos, “whose pleasure it is to belittle great men, pretending to correct our memories out of their shallow thought and twisted tempers,” were “supposed to foster American art” but instead produced the equivalents of “mural paintings of Lenin at Valley Forge or Stalin crossing the Delaware.” Parson Weems’ Fable would be another of Wood’s “malicious cartoons,” like American Gothic, which almost everyone but the Tribune editors had long since come to see as a celebratory image of American steadfastness and determination. (Three years later, the Tribune finally came around when it reprinted American Gothic as a morale booster for the World War II home front. American Gothic’s couple now represented a “fixed belief in a better tomorrow . . . an undying patriotism . . . a readiness to sacrifice, that their sons and daughters might go forward!”)
No wonder Wood was ready with his press release as soon as Parson Weems’ Fable was finished. “Grant Wood is an earthy, peaceable Iowan who manages to stir up many an artistic rumpus,” Time reported only two weeks after Wood sent the completed painting to his dealer in New York. According to Time, Wood “placidly ignor[ed] the storms such paintings raise,” but his sister Nan—the model for the woman in American Gothic—recalled that the criticism stung him. Besieged from the right by “literal-minded patriots,” he had also been hammered from the left for what the New Masses called his “frantic patrioteering” and “national chauvinism.”
Wood’s “new series on American folklore” started and ended with Washington and the cherry tree. He announced but never painted his version of the John Smith-Pocahontas legend, and his subsequent statements about a proper patriotic art were not as nuanced as those that accompanied Parson Weems’ Fable. In the spring of 1941, he insisted that art had an important role to play in national defense: not “flag-waving” art with “screaming eagles and goddesses of liberty upholding flaming torches” but not “smart, sophisticated stuff” either; just paintings of “the simple, everyday things that make life significant to the average person.” Wood’s last two paintings before his death in February 1942 were titled Spring in the Country and Spring in Town. Both depicted robust midwesterners working their bounteous fields and gardens. Neither contained even a hint of irony.
Had he been alive to read the October 1955 issue of Western Folklore, Wood might have taken heart. There he would have encountered two versions of the cherry-tree story from Texas, collected by the folklorist George D. Hendricks. The first had the Washingtons living in a “modest cottage on the north bank of the Rio Grande.” It is Christmas morning, and little George is thrilled to discover Santa’s gift of a new machete. He rushes out to hack down his father’s favorite huisache tree. The following exchange ensues:
“George, I want you to tell me the truth. Did you or did you not cut down my favorite huisache tree?”
George looked him right in the eye. “Sir, I cannot tell a lie. I cut your favorite huisache tree down with my own little machete.”
“Well, George, I must say that I am really disappointed in you; and there’s only one course left open to me now. I can see that if you cannot tell a lie, you’ll never make a good Texan. I’ll have to take you back to Virginia.”
The second version involves a boy and his father on a farm near the Neches River. The back end of their privy extends out over the river on stilts. One day, after the boy is punished for playing hookey, he takes his revenge by chopping down the privy with his axe. He fesses up to the deed.
“[I]t was me who cut down the privy. I done it with my own axe. I’ll behave now if you promise me you won’t lick me.” The father glared at his offspring, “I’ve a mind to give you a larrupin’.”
“But, paw, remember George Washington told his paw he cut down the cherry tree and he never got no lickin’.”
“I know. But this here’s a powful sight different. When George cut down the cherry tree, his paw warn’t UP IN THE TREE!”
Here was “color and imagination” galore and not necessarily among those whom Wood envisioned as “the realistic-minded, sophisticated people of our generation.” Here was folklore restored to the kind of oral culture from which it was supposed to derive. Here was Parson Weems’s fable at once embraced and debunked, retold with both affection and irony, acceptance and knowingness.
Further Reading:
There are two superb comprehensive studies of Grant Wood’s life and work: James M. Dennis, Grant Wood: A Study in American Art and Culture (New York, 1975; revised edition, Columbia, Mo., 1986) and Wanda M. Corn, Grant Wood: The Regionalist Vision (New Haven and London, 1983). Both include illuminating discussions of Parson Weems’ Fable. I am particularly indebted to Cécile Whiting, “American Heroes and Invading Barbarians: The Regionalist Response to Fascism,” in Jack Salzman, ed., Prospects: An Annual of American Cultural Studies, Vol. 13, (Cambridge and New York, 1988): 295-324. Karal Ann Marling traces the debunking of the cherry-tree story in George Washington Slept Here: Colonial Revivals and American Culture, 1876-1986 (Cambridge, Mass., and London, 1988). In contrast to Dennis, Corn, and Whiting, Marling sees Parson Weems’ Fable as an unambiguous work of debunking; in her view, little George is indeed a “smug little brat,” and the painting treats Washington as “a bit of an embarrassment, a national joke.” The Texas versions of the cherry-tree story are recounted in George D. Hendricks, “George Washington in Texas, and Other Tales: Old Stories in New Surroundings,” Western Folklore, Vol. 14, No. 4 (October 1955): 269-272. For information about Weems, I especially drew on: Lewis Leary, The Book-Peddling Parson: An Account of the Life and Works of Mason Locke Weems, Patriot, Pitchman, Author and Purveyor of Morality to the Citizenry of the Early United States of America (Chapel Hill, 1984) (evidently, it is a requirement of the Parson’s biographers that they use verbose, antiquated subtitles); Ronald J. Zboray, “The Book Peddler and Literary Dissemination: The Case of Parson Weems,” Publishing History, Vol. 25 (1989): 27-44; and Steven Watts, “Masks, Morals, and the Market: American Literature and Early Capitalist Culture, 1790-1820,” Journal of the Early Republic, Vol. 6, No. 2 (Summer 1986): 127-150.
This article originally appeared in issue 6.4 (July, 2006).
Steven Biel is the executive director of the Harvard Humanities Center and author of American Gothic: A Life of America’s Most Famous Painting, just out in paperback from W. W. Norton.
The Common Dust of Potter’s Field
New York City and its bodies politic, 1800-1860
On Sundays, the work of disinterring the many thousands of bodies from New York’s old Potter’s Field—or paupers’ burial ground—halted for the week, and the lot was unguarded, permitting a scene of macabre joviality. For most of the 1850s, in an open lot at Fourth Avenue and Forty-ninth Street, crowds gathered to view hundreds of unearthed pine caskets, broken apart during the week by workers’ spades and picks, their contents spilling into public view. While the genteel gazed upon the remains, vagrant children played kickball with disembodied skulls, and dogs growled over the long bones of arms and legs.
Even in the chaotic bustle of antebellum New York, this desecration did not escape scrutiny. “Is our City always to be disgraced by some public exhibition?” asked a subscriber to the New York Times in 1858. “For the sake of decency, do call the attention of our City authorities to the exhibition of coffins, skulls and decayed bodies lying exposed.” But the city authorities were already well aware of this gruesome public nuisance. Indeed, the public-health controversy and moral outrage that accompanied the disinterment of the city’s old Potter’s Field and the removal of the bodies to the new locations on Randall’s and Ward’s Islands had been brewing since at least the first decade of the nineteenth century.
The phrase Potter’s Field was not specific to New York City’s paupers’ burial ground. It originated in Matthew 27:7: “And they took counsel, and bought with them the potter’s field, to bury strangers in.” “Potter’s field” referred to a place where potters dug for clay, and thus a place conveniently full of trenches and holes for the burial of strangers. For nineteenth-century New Yorkers, the biblical veneer of the term was perhaps an antidote to one of the distressing costs of life in the chaotic new democratic city. As some prospered, hundreds of others died, remembered by no one and memorialized by nothing. At least, in Potter’s Field, they lay under a vague biblical cope.
The first Potter’s Field burial ground in New York City was located at the site of what would become the militia parade ground and city park at Washington Square. On this nine-and-a-half-acre plot, at the city’s pastoral northern edge, lay the densely packed corpses of about 125,000 “strangers,” many of whom had died during two separate yellow-fever epidemics between 1795 and 1803. Not surprisingly, local residents who had fled crowded lower Manhattan for country estates in the region came to find in Potter’s Field an intense nuisance. Whatever sympathy anyone had for the anonymous dead did not supersede wealthy New Yorkers’ sense of entitlement when it came to their comfortable insulation from the city’s darker side. In a letter to the Common Council, they wrote, “From the rapid Increase of Building that is daily taking place both in the suburbs of the City and the Grounds surrounding the field alluded to, it is certain that in the course of a few years the aforementioned field will be drawn within a precinct of the City.” Within the first two decades of the nineteenth century, their prediction had been realized, and the Potter’s Field began a lengthy series of migrations in a vain effort to stay a step ahead of the city’s relentless growth.
In 1823, the city moved Potter’s Field to an empty lot at the corner of Forty-ninth Street and Fourth Avenue—what would then have been the far northern reaches of the metropolis. This place served as the Potter’s Field until the 1840s when, as the city grew northward, it was relocated once again to Randall’s Island in the East River. Cast off the Island of Manhattan like so many family farms, Potter’s Field would no longer clash with the New Yorkers’ Victorian sensibilities or inhibit the Manhattan real-estate boom.
Just south of Randall’s Island, separated by a treacherous, narrow channel known as Little Hell’s Gate, was Ward’s Island, the site of another Potter’s Field in the mid-1850s. Both Randall’s and Ward’s Islands already housed other city institutions for the indigent, including the Emigrant Refuge and Hospital, the State Inebriate Asylum, the juvenile branch of the Almshouse Department, and the headquarters for the Society for the Reformation of Juvenile Delinquents. As one guide to New York and its benevolent institutions observed, “multitudes of persons went from the dram-shop to the police-station, and from the police courts to the Workhouse from whence, after a short stay, they returned to the dram shop . . . until they at length died on their hands as paupers or criminals, and were laid in the Potter’s Field.” For most of New York’s institutionalized underclass, there was literally a direct path from the door of the asylum or workhouse to the Potter’s Field.
Relocating the city’s cemetery from Manhattan’s urban grid to an island in the East River did not put an end to the city’s problem with the indigent dead. In 1849, the Daily Tribune reported on the political and legal wrangling between the governors of the Almshouse and the Common Council (the nineteenth-century name for the City Council), the former seeking to wrest authority over Potter’s Field from the latter. The governors cited the poor management of the paupers’ burial ground, which the Tribune referred to as “that den of abominations,” as evidence that the Common Council was unable to manage the Potter’s Field. “We do sincerely trust somebody will shoulder the responsibility of the Potter’s Field,” the Tribune pleaded, “and rid the Island of the abomination before the advent of another warm and perhaps an epidemic season.”
The Common Council and the Governors of the Almshouse traded letters, pleas, and vitriol for the better part of a decade. In May of 1851, the Governors warned the Common Council that, “the land now appropriated [for the Potter’s Field] is now nearly full, and the small space left for further interment (which now average upwards of one hundred per week), renders prompt action necessary.” Four years later, it was still unclear who had control over the Potter’s Field, and conditions were worsening. By this time, there were two burial grounds for paupers: the primary site on Randall’s Island and a smaller one on Ward’s Island to the south. The Board of Governors proposed to expand the Ward’s Island site in 1854, and the Times supported the proposition, suggesting that “it is time that the remains of paupers were interred in some quarter better fitted for their last resting-place than the one now used on Randall’s Island.” In their reports to the Board of Health and the Common Council, the Governors of the Almshouse urged that, “humanity, a due regard for the living, and a sense of proper respect for the dead” be part of any effort “to remedy the existing and impending evils.”
In the meantime, the disinterment of bodies at the old site on Fourth Avenue aroused its own controversy. In 1851, a plan was adopted by the Common Council to expand Forty-ninth Street through the old Potter’s Field, which required the disinterment of thousands of bodies. This project stretched on for nearly the entire decade, accompanied by foot-dragging and corrupt contractors. Commenting on the enormity of the project, the Times reported in the spring of 1853 that “the City Authorities are cutting a street through the old Potter’s Field . . . where so many victims of the Cholera were hurriedly interred in 1832. The coffins were then, in many instances, stacked one upon another; and now, in digging through the hill, the remains of twenty coffins may be seen thus piled together.”
As with the active Potter’s Field, the old paupers’ burial ground aroused no small amount of controversy. In the summer of 1858, the Times again reported on the work, claiming that “within three weeks past about 3,000 skeletons have been exhumed from the old Potter’s Field . . . and removed to Ward’s Island.” The winter of 1858-59 passed without any further exhumation, and “meantime the thin layer of earth which covered some hundred half-decayed coffins has fallen away, and . . . crowds of urchins assemble there daily and play with the bones of the dead; troops of hungry dogs prowl about the grounds and carry off skulls and detached parts of human bodies.”
Randall’s and Ward’s Islands. New York City’s Potter’s Field was relocated from Randall’s Island to the southern end of Ward’s Island in the 1850s. The “dead boat” from the city docked at the southern end of the island to deposit its cargo. Image from Map of New York and Vicinity (New York, 1863). Courtesy of the David Rumsey Map Collection.
Newspaper articles and editorials suggested that there was something morally improper about leaving so many thousands of bodies exposed for the amusement of urchins and stray dogs, or left to rot in packed trenches, awaiting medical-school body snatchers. When the Board of Aldermen considered purchasing new lands on Ward’s Island for the expansion of the Potter’s Field in 1852, these scenes were not far from the minds of civic-minded newspaper editors. The Times urged the city “to arrange and adorn” the new Potter’s Field “with suitable shrubbery” so that “if the new grounds are properly laid out, they may be made not only useful but ornamental.” The Times reported that “the new place is to be called the ‘City Cemetery’—so farewell to the old Potter’s Field, a name that is never heard without a thrill of horror.” Frequently throughout the controversial disinterment and relocation of the Potter’s Field in the late 1840s and 1850s, the newspapers referred to the necessity of making both the old and new Potter’s Fields “more proper,” “suitable,” and “respectable,” according to the “dictates of propriety.” But what were these “dictates of propriety”?
The Grave Hath a Voice of Eloquence
In 1858, Harper’s New Monthly Magazine ran an article entitled “Civilization and Health,” which argued that sanitation and “high mental culture” were necessary components of a great democracy. As the individual citizens of the republic might be strengthened by intellectual achievement, so could the social body as a whole be strengthened by a refinement in public health because “the connection of cleanliness with civilization is every where manifest in direct ratio with mental culture.”
The problem of high mortality from disease in New York, however, seemed to indicate that inadequate public sanitation prevented the city from full participation in the progress and benefits of civilization. In 1859, the state of New York issued a special report on the health of New York City in which it acknowledged that mismanagement and lack of sophistication within the city’s sanitation infrastructure were to blame for high mortality. “Great cities are certainly the pride of nations,” reported the investigating committee, “but they require a paternal control, and all Christian and civilized communities recognize the duty of exercising it.” The article from Harper’s agreed that “wherever misery is manifest there always exists at man’s disposal means of mitigating or removing it,” and “to find out and apply these means is advancement in civilization.” Citing the benefits of “the science of public health,” Harper’s proposed that high urban mortality was primarily confined to the poor, but that poverty was not, in and of itself, the cause of death. Rather, “the worst effect of poverty is, that it leads to filth and neglect . . . which affects the whole of the inhabitants.” The civilized solution was “contact with well-cleaned streets and external purity,” which “begets a distaste for internal filth and degradation, and there are none so degraded nor impure as not to be benefited and elevated by association with cleanliness.” Few of Harper’s middle-class readers would disagree.
Of special concern to health reformers in antebellum New York were frequent and devastating epidemics of yellow fever and cholera. While major outbreaks of yellow fever were largely contained by the 1830s, lower-class wards remained especially vulnerable to cholera epidemics. Public-health reformers cited numerous causes for these continuing public-health problems. Slaughterhouses, inadequate sanitation, and urban graveyards were among the most important. Simply put, health reformers assumed that carcasses and cadavers made more carcasses and cadavers.
For this reason, medical science was beginning to call for a new approach to interment. Bodies should lie away from cities, not so much for their own peaceful eternal rest, but for the safety of the living. An 1822 pamphlet entitled Documents and Facts Showing the Fatal Effects of Interment in Populous Cities argued that the pernicious yellow fever epidemics in the city were caused, in part, by the “putrid exhalations arising from grave-yards.” Nearly twenty years earlier in 1806, a committee of the Board of Health had reported that burial in urban churchyards was deleterious to the public health, stating that “interments of dead bodies within the city, ought to be prohibited. A vast mass of decaying animal matter, produced by the superstition of interring dead bodies near the churches . . . is now deposited in many of the most populous parts of the city.”
The committee’s report resulted in an 1813 law granting the corporation of the city the power of “regulating, or if they find necessary, preventing the interment of the dead within the city.” But this power was seldom exercised, and the sudden appearance of yellow-fever epidemics left little time for the healthful burial of the dead. According to Dr. Samuel Akerly of the New York Hospital, the Potter’s Field was already part of this public-health concern in the 1820s and was “known to be frequently offensive, and it sickened a detachment of militia stationed near it in 1814 . . . It becomes the corporation therefore, to prevent its becoming a nuisance, and this may be easily done with quick-lime or ashes of wood.”
“Receiving and Removing Dead Bodies at the Morgue,” from J. F. Richmond, New York and Its Institutions (New York, 1871-72). Workers handling a casket, a horse-drawn hearse, a shrouded body being taken from a cart, and several ladies and gentlemen visiting the morgue. Courtesy of the American Antiquarian Society.
Shortly after the publication of the 1822 pamphlet, the Potter’s Field was relocated to the lot at Forty-ninth Street and Fourth Avenue. For the site of the former graveyard, the city followed the suggestion of the 1806 report that “the present burial-grounds might serve extremely well for plantations of grove and forest trees.” The measure was as much about aesthetics as about health. Rather than emanate sickening vapors, a well-planted Potter’s Field would absorb and digest the remaining “putrefying matter and hot-beds of miasmata,” making it at once “useful and ornamental to the city.” From a useless no-man’s land, it would be transformed into a city park and military parade ground, renamed Washington Square.
The city, it turns out, was not opposed to open space—to the contrary, the same impulse behind the drive to push Potter’s Field from Manhattan Island lay behind the effort to create pockets of pastoral delight within the urban grid. Both were part of that antebellum struggle to sustain areas of genteel quietude amidst the city’s bustle and chaos. Among the most popular pastoral retreats for the antebellum middle and upper classes were the so-called rural cemeteries, which appeared on the suburban fringes of most American cities between 1830 and 1860. From the vantage of these wooded pleasure grounds, the foul stench of the urban Potter’s Field was particularly offensive.
In a lecture delivered in the 1830s before the Boston Society for the Promotion of Useful Knowledge, Dr. Jacob Bigelow, head of America’s first rural cemetery at Boston’s Mount Auburn, reiterated concerns about the disposal of dead bodies when he observed that “within the bounds of populous and growing cities, interments cannot with propriety take place beyond a limited extent” and suggested that the burial of the dead might better occur “amidst the quiet verdure of the field, under the broad and cheerful light of heaven,—where the harmonious and ever changing face of nature reminds us, by its resuscitating influences, that to die is but to live again.”
While rural cemeteries met an urban public-health need, their primary purpose was to contribute to an ongoing discussion about the identity and morality of the nation, quite literally embodied in the illustrious dead. Visitors to rural cemeteries were expected to lose themselves in the harmonies of creation as they contemplated the noble biographies of the departed. In these pastoral classrooms, the illustrious dead were teachers of history and morality, and their timeless presence in such institutional places of rest affirmed the permanence not only of the local community but of the national community as well.
For the greater New York metropolitan area, the principal rural cemetery was Brooklyn’s Green-Wood Cemetery, established in 1842. In advocating for its creation, newspapers, health reformers, and city-park enthusiasts reiterated many of the same assertions about public health and morality that had circulated in the city since the 1820s. Green-Wood quickly became famous, its three hundred acres traversed by fifteen miles of garden pathways. Phelps’ New York City Guide for 1857 described Green-Wood as “one of the most interesting objects of public utility and beauty near New York” and “a favorite rural resort during the summer season” as well as “a holy spot [which] links itself to our being, with a cherished fondness and satisfaction.” Until the completion of Central Park in the 1860s, Green-Wood Cemetery was New York’s primary garden spot.
Having strolled through the rural cemeteries, we can better appreciate why the piles of moldering coffins exposed to the public in the 1850s caused New Yorkers to question their city’s claims to “civilization.” But the Potter’s Field was not only the antithesis of the rural-cemetery ideal (as well as a failure of municipal administration); it was also a site of spiritual death, obliterated social identity, and the graveyard of vice. If, as one proponent of rural cemeteries claimed in 1831, “the grave hath a voice of eloquence,” the Potter’s Field spoke in a dark chorus about the failures of democracy and civilization, the stark and messy exigencies of urban inequality, and thousands of individual lives wrecked on the shores of the great metropolis.
“Bodies at the Morgue Awaiting Identification and Removal,” from J. F. Richmond, New York and Its Institutions (New York, 1871-72). A worker using a hose to wash a room stacked with caskets; two caskets stand upright against the wall at far left. Courtesy of the American Antiquarian Society.
In The Common Dust
In May of 1858, a little-known author named Henry Herbert contemplated and ultimately committed suicide. Before his death, he pleaded to his friend Miles I’anson, “I must not be buried in the Potter’s field, or by charity.” Several days later, when John Aikins, second mate of the ship Mandarin, was killed by a sailor in a fight, Captain Dubois of the California clipper Queen of the Pacific (who had only a passing acquaintance with Aikins) personally took on the cost of a casket and a plot at Green-Wood Cemetery. He could not bear the thought of a fellow officer being taken to Potter’s Field. Similarly, when an elderly man was found “in a fit” on a dock near Washington Market and subsequently died at a police station house, preparations were made to inter the body in Potter’s Field. But at the last minute the body was recognized as a “man of some prominence” by a Brooklyn scientist, who “requested that the body might be given to him for decent burial” in his own plot at Green-Wood. In each of these cases, the deceased either expressed, or was rescued by, a fervid desire to avoid the humiliation and anonymity of a pauper’s burial, which would have signified a fall from the bourgeois society of which they had once been a part. To bury these people in the Potter’s Field would have been an unconscionable breach of propriety.
The sorts of people for whom a Potter’s Field burial was considered acceptable by benevolent institutions and reformers ranged widely from newborn infants abandoned on street corners to inebriates and “fallen” women. These burials were reported with parsimony by New York’s newspapers:
“October 20, 1852: The body of an unknown lad was found yesterday floating in the East River off Burling-slip. The deceased was not recognized, and, upon the Coroner holding an inquest, the body was sent to Potter’s Field for interment.”
“February 4, 1858: Rosa was 19 years of age. She formerly boarded at a lager-bier saloon at No. 257 William-street, and has for nearly a year past borne a questionable character. No one has thus come forward to claim the body, and it is probable she will be buried in Potter’s Field.”
If it was true, as one reformer claimed in 1854, that “many a saddening pang afflicts the last moments of the children of poverty, upon reflecting that their remains must moulder in the common dust of Potter’s Field,” tens of thousands of the city’s poor lived in fear of an eternity spent in anonymity. Likewise, middle-class New Yorkers used the possibility of a Potter’s Field burial as a kind of moral barometer. One can easily imagine parlor gatherings attended by much clucking of tongues over the most recent rumor that if Mr. So-and-so continued in his intemperate ways, a pauper’s grave would provide his final rest.
The moral horror of a pauper’s grave was not just an expression of simple Victorian propriety or Protestant death-ways. It also derived from the simple visibility of the Potter’s Field. There was no privacy in a pauper’s death; one’s bones, like so many family jewels, were exposed to all of the city’s most pernicious impulses. In imagining the horrors of a pauper’s burial, middle-class Americans drew a line between themselves and what Jacob Riis would later refer to as the “other half.” Rural cemeteries like Mount Auburn and Green-Wood were idealized by bourgeois urbanites as didactic landscapes, but the Potter’s Field, as both a physical place and a cultural symbol, was also imbued with instructional possibilities. It warned that the expected fate of the intemperate, the immoral, and the fallen was a gruesome eternal repose. Antebellum New Yorkers wanted to be frightened and disgusted by the Potter’s Field because it provided a startling moral corrective. Despite their cries for reform, they still expected to find the paupers’ burial ground disgusting and degrading and terrifying because they expected that its occupants, in life, had been the same. When this expectation abutted the very real public-health concern, linked as it was with the language of civilization, middle-class New Yorkers reacted within the expectations of their class: they were properly horrified but took only a limited interest in actual reform. As middle-class New Yorkers began leaving the city at midcentury, the need for such symbols of moral suasion diminished. Potter’s Field could now lie quietly isolated by the slow-moving waters of the East River.
Further Reading:
For the role of dead bodies in nineteenth-century medical, anatomical, and cultural discourse, see Michael Sappol’s A Traffic of Dead Bodies: Anatomy and Embodied Social Identity in Nineteenth-Century America (Princeton, 2002), which contains a brief section on the Potter’s Field and its obliteration of social identity; for the development of a health-based idea of propriety and nationalism, see Joan Burbick, Healing the Republic: The Language of Health and the Culture of Nationalism in Nineteenth-Century America (Cambridge, 1994); and for the didactic role of the city park and the rural cemetery in nineteenth-century urban planning, see David Schuyler, The New Urban Landscape: The Redefinition of City Form in Nineteenth-Century America (Baltimore, 1986).
This article originally appeared in issue 6.4 (July, 2006).
Thomas Bahde is a doctoral candidate in U.S. history at the University of Chicago. He is currently writing a dissertation using microhistorical and biographical methods to examine issues of race and justice in the nineteenth-century Midwest.