Girls Just Want to Have Fun

This morning, with great fear and trembling, I entered my stepdaughter’s American Girl room. When my husband and I got married and the three of us moved in together, the room was something of a selling point, a private space of her own, much like a study, and I had promised I would never touch her dolls when she was away. She’d left Addie, the runaway slave, lounging in shorts and hiking boots. Molly was still in her 1940s pajamas but was in the process of climbing out of bed and reaching for her glasses. As for Kit, a child of the Great Depression, she had been dressed in a white silk ball gown and black felt cape and was arranged in a hammock intended for Beanie Babies. She looked a little like Sleeping Beauty.

 

Fig. 1: American Girl Dolls: Addie, Molly, Kit. © by the Pleasant Company.

These girls come from radically different historical circumstances, but as far as my stepdaughter is concerned, they are sisters. Addie, circa 1864, is, logically enough, the oldest. The girls share clothing, lunches, and accessories freely. For a while, Addie didn’t have a bed. Given her history, I took this to heart and got my stepdaughter–or, more honestly, Addie herself–the “official” Addie bed for Christmas, complete with African American story-quilt. I’ll admit that my compassion was misguided. As far as my stepdaughter is concerned, the three dolls are only incidentally connected to their official profiles. It doesn’t matter that Addie may never be reunited with her family, that Kit was thrown into jail with a group of hoboes, or that Molly’s father is off at war. These dolls are refugees from their own histories.

There are seven American Girl dolls in all, spanning a period from the American Revolution to the Second World War. The Pleasant Company, named after its founder, Pleasant T. Rowland, has sold five million dolls since 1986. The company’s staff includes a small team of historians and librarians, and clearly, they are having a terrific time. For anyone with a taste for the history of everyday life, the catalogue makes fascinating reading. Each of the girls is surrounded by meticulously researched doll-sized butter churns or snowshoes or school desks. Kit’s lunchbox is decorated with WPA-style heroic locomotives. Addie’s is an old milk pail big enough to hold a cold meat pie and a bunch of grapes.

In some ways, the dolls have the feel of successful market research. At ninety dollars, they’re hardly cheap, and their price tag is part of their appeal. These are not intended to be superficial toys. They’re a way to spark historical imagination, to make connections between the past and present, to help girls understand that, as the catalogue brightly notes, “You’re a part of history too.” Do American Girl dolls turn young girls into junior historians?
I can only speak from my own experience. My stepdaughter adores her dolls, but was initially ambivalent about their stories. Each doll comes with the first book of its series, and her father had gotten her a full set of the Molly books, but she never asked to read them. Her reaction to the Addie books was even stronger. She had started on the first, only to find the descriptions of slavery so intense that they gave her nightmares. But as time went on, particularly as she became an independent reader, she began to pick up some of the “short stories,” books that can be read in a single sitting, and slowly graduated to the chapter books. She now has most of the Molly series memorized.

 

Fig. 2: American Girl books. © by the Pleasant Company.

Fifty-six million American Girl books have been sold since the Pleasant Company was founded. At $5.95, the books are much less of an investment than the dolls and are sold not only through the catalogue, but are available in libraries and bookstores. Although the books can be purchased independently, the American Girl stories and accessories are intended to have a symbiotic relationship. A dress that plays a key part in Addie’s Surprise is available in the catalogue for $22.00; Kit’s typewriter and rolltop desk, key features in several of her adventures as a budding journalist, can be purchased for $82.00. Still, the stories can be read by girls who have no interest in–or no money for–the dolls. My ten-year-old niece, who prefers gerbils, regularly plows through two American Girl books a day.

These books deserve their popularity. The writing is lively and, remarkably enough, rarely crosses into gross sentimentality. Admittedly, the stories are formulaic. Each of the dolls gets six books with interchangeable titles: Meet _____, _______ Learns a Lesson, ________ Saves the Day. This drives home the central message, that at bottom, all of these girls face the same problems: a family in transition, adjustments at school, a summer where they are thrown into unfamiliar circumstances. The girls themselves are also essentially identical: spirited and resourceful, surrounded by loving grown-ups, and often in a position to help those less fortunate. As a writer of historical novels, I can appreciate the way the authors of these books use basic similarities to ease their readers into less familiar territory. Historical details are woven through the books: we learn, along the way, how school was taught in 1774, 1834, 1864, or 1934. The authors work not only with staff historians, but also with curators of historical museums in the towns where they set the stories.
Each of the chapter books ends with a section called “A Peek into the Past,” a few pages of illustrated historical background. At first, my stepdaughter had no interest in those pages, but as time went on, she began to suffer through them, and sometimes, she would even ask questions. At the end of one of the Molly books was a photograph of Hitler, and suddenly I had an opportunity to carefully broach the subject of the Holocaust.

 

Fig. 3: American Girl books. © by the Pleasant Company.

It would be easy to poke holes in the way the Pleasant Company presents history. Felicity, “a spunky, spritely girl growing up in Virginia in 1774,” visits a local plantation where there are clearly slaves; the issue never arises. Kirsten, a second-generation Scandinavian pioneer, has an entirely predictable friendship with a Native American girl named Singing Bird. The hardest to take, to my mind, is Samantha, a Victorian orphan who lives with her wealthy grandmother and has befriended an Irish servant girl named Nellie. I suspect, with a sinking heart, that Samantha is the most popular doll of the series.

Still, at times the books can take you by surprise. The Addie series, in particular, not only covers her escape from slavery, but moves on to deal with life in Civil War Philadelphia, Northern racism, freedmen’s mutual aid societies, class antagonisms, and gradual, moving reunions with members of her family, including a brother who lost an arm in the war. Josefina lives in 1824 in what is now New Mexico, and her stories don’t dismiss the cultural and historical complexities of that time and place. Even the Samantha books contain a suffragette or two.

One could wish for more. I have recurring fantasies of “Rosa, a strong-willed and clever girl growing up in 1914 in the tenements of New York.” On the third anniversary of losing her older sister in the Triangle Fire, Rosa is comforted by cheerful visits from the old family friend, Aunt Emma Goldman, who takes her to her first strike. I imagine the doll dressed in a rather stained shawl and kerchief, carrying a union card. I don’t know what my stepdaughter would make of her. Chances are, Rosa would simply join her sisters, Addie, Kit, and Molly, under a nine-year-old’s benign dictatorship. Even now, when the American Girl books have become staples in our household, the “official” stories have nothing to do with the world the actual dolls inhabit. My stepdaughter creates stories of her own.
Perhaps it would be too complex to have three histories coexist, or perhaps it is simply a tribute to the power of imagination. The careful research of the historians on the staff at the Pleasant Company is not relevant, and that is as it should be. In any event, I would probably have to buy Rosa a bed.

 

This article originally appeared in issue 2.2 (January, 2002).


Simone Zelitch is the author of The Confession of Jack Straw (Seattle: Black Heron, 1991), Louisa (New York: Putnam, 2000), and a third novel, Moses in Sinai (Seattle: Black Heron, forthcoming). She is currently writing a novel about the Civil Rights movement. She lives in Philadelphia with her husband, stepdaughter, and Addie, Molly, and Kit.




Defining A “Christian Nation”: or, A Case of Being Careful What You Wish For

Steven K. Green’s Inventing a Christian America is an unusual work: one that re-tells an important early American narrative while providing a methodological model for historical debate. Basing his study on an array of primary and secondary sources, Green—Fred H. Paulus Professor of Law and Affiliated Professor of History at Willamette University, author of works on separation of church and state, and frequent contributor to litigation on religious freedom—demonstrates how easily the historical inventions of one era may become the historical facts of another. This tendency is especially common for the history of the American founding given, as Green argues, the almost irresistible connection between myth-making and nation-building as intellectual constructs.

 

Steven K. Green, Inventing a Christian America: The Myth of the Religious Founding. Oxford: Oxford University Press, 2015. 295 pp., $29.95.

Green writes that he “seeks to unravel the myth of America’s religious foundings,” a process that is “crucial if our nation is to come to grips with its religious past and its pluralistic future” (vii, viii). In his introduction he sets out the three goals of his study: to examine the terms of the debate between what he calls the “religionists” and the “secularists”; to identify some of the errors of analysis and method by the former; and to explain that by the term “myth” he is referring to the efforts of early nineteenth-century writers “to forge a national identity, a process that sought to sanctify the recent past” (15).

Green addresses these points in a running critique of the work of historians who, to summarize an often complex position, have argued that the United States was intentionally established as a Christian nation, that first national documents are replete with Christian references demonstrating the intention of the Founders (both well known and “forgotten”) to establish a government based on God’s higher law (rather than Lockean contract theory alone), and that the Founders’ preferred form of Christianity was Protestantism. His work joins others exploring this “religionist” scholarship, especially two useful anthologies edited by Daniel L. Dreisbach, Mark David Hall, and Jeffry H. Morrison (The Founders on God and Government and The Forgotten Founders on Religion and Public Life).

In his four central chapters, Green expertly weaves together his take on the “Christian nationalist” argument with a narrative aiming to historicize major issues in early American religion and constitutional politics. These include the emergence of the idea of America as a religious haven; the transition from Puritan covenant to civil contract in American governance; the disconnect between the religious statements of the Founders and the foundations of civil government; and the emergence of the democratic consensus regarding the origins of constitutional authority, especially in the U.S. Constitution. In his culminating fifth chapter, Green surveys the influence of the writings of biographer Mason Locke Weems, clergyman Lyman Beecher, Supreme Court Justice Joseph Story, and novelist Nathaniel Hawthorne in promoting the myth that the United States was founded as an expressly Christian nation and that the Founding Fathers, especially George Washington, were unusually pious.

Green shows effectively how Antifederalists’ fears of the irreligious character of the U.S. Constitution reflected their understanding that the new government was not expressly Christian: just the opposite of what the “religionist” school has concluded. And while prominent clergy “believed that providence had been instrumental in creating the new nation, they acknowledged that the source of authority for the new government rested with the people” (191). Green deftly explores the impact of the Second Great Awakening on the changing character of American Christianity in subsequent decades, far removed from the ecclesiastical mainstream of the Revolutionary era: “By mid-nineteenth-century America, ‘Christian’ meant not only Protestant but evangelical in belief and practice. Washington [an Episcopal rationalist] had been ‘born again’ in his death” (208).

In the manner of a legal brief, Inventing a Christian America is elegantly written, densely argued, and persuasive. But like any “forensic” work—that is, work designed to prove a case—it has its limitations. For one thing, Green focuses exclusively on the religionists’ arguments, leaving aside those of the secularists. A prime example of the latter is Isaac Kramnick and R. Laurence Moore’s The Godless Constitution: A Moral Defense of the Secular State, itself a forensic-style work, originally subtitled “The Case Against Religious Correctness,” and distinctly less nuanced than Green’s. Indeed, one of the difficulties of writing about religion and the Founding is that secularists have their own founding myths: among these, that the patriots were indifferent to religion in the conflict with Great Britain; that the Founders were largely Deists; and that the absence of any references to God in the U.S. Constitution is prima facie evidence that Americans aimed to create a nation free of religious content. Evidence in recent work on the Revolution, such as T.H. Breen’s American Insurgents, American Patriots: The Revolution of the People, suggests that these conclusions are wide of the mark.

For another thing, Green does not discuss the states’ constitutional provisions regarding religion before and after the ratification of the U.S. Constitution in 1788. In his short survey Church and State in America, James Hutson stresses that under federalism issues relating to religion and churches were left to the states, while separation of church and state did not become constitutional doctrine until the Supreme Court’s Everson v. Board of Education decision in 1947. An important new article by Vincent Phillip Muñoz (“Church and State in the Founding-Era State Constitutions”) sets out the specific provisions of the states’ founding-era declarations of rights and constitutions concerning religion, any number of which permitted government regulation of worship or conditions for office-holding (while, it should be noted, also emphasizing religious freedom and, in several cases, barring clergymen from holding political office).

In fact, as Green’s study suggests but does not explore, a key conceptual problem with the Christian-nation thesis centers on the Founders’ use of the term “Christian,” and, for that matter, “Protestant,” “religion,” and “church.” It’s fair to say that, in keeping with the tradition of kingdoms and nation-states, the creators of an explicitly Christian republic would have been precise about which religious institution they preferred to ally with. But instead, the sources are full of generalities. Several state constitutions, for example, required office-holders to adhere to belief in the Trinity, in the divine inspiration of the Scriptures, in “Christianity,” or the “Christian religion,” or to be Protestants, but made no reference to specific denominations or their clergy. Abstract and generic references pepper Green’s sources, comprising a body of discourse that historians used to call “civil religion” and that Jon Meacham has more recently called “public religion.” Of course, nearly all of the Founders, prominent and not so prominent, were Protestant, by identity if not church-attendance, and many assumed that belief in Providence was a key ingredient of virtue, that piety was the glue that tied Americans together, and that God was on Americans’ side. But such sentiments are far removed from any intention to create a constitutional agreement between government and an organization called a “church,” or, for that matter, to elevate clergy to positions of influence within the government, as would be expected in a religious state.

In short, then as now, the United States was home to myriad churches, denominations, religious societies, and sects—the words were often used interchangeably—that adhered to widely diverging interpretations of the Scriptures and organized themselves in substantially different ways. In fact—the crux of the matter—there was no, and is no, such institution as the “Christian Church” or the “Protestant Church” with which the United States and individual states could, or can, make an official compact. Had such a compact been possible, there seems little doubt that it would have been between the government and orthodox churches. Under such a regime, controversial sects like the Methodists—the most popular denomination in America by the Civil War but stereotyped as Loyalists and charlatans in the founding era—along with the offshoots of the Second Great Awakening, whose descendants form the bulk of Protestant church membership today, likely would have been denied voting rights and access to political office.

Green concludes his book with the warning: “So long as proponents of America’s Christian origins fail to see the narrative as a myth, they will be unable to appreciate the true import of America’s religious heritage” (243). Something of a jeremiad, Green’s study is also an important exercise in historical method: a model of history-as-debate, based on a lucid and consistent reading of the sources and a clear conviction that getting the story right matters. Given his approach, he can’t be blamed that the story of religion and the founding continues to be only partially told.

 

This article originally appeared in issue 16.3 (Summer, 2016).


Dee E. Andrews is professor of history at California State University, East Bay. She is the author of The Methodists and Revolutionary America (2000) and is currently completing a book on Thomas Clarkson as abolitionist author.




Artificial Light

George Washington Plunkitt teaches a lesson

I don’t understand why they choose to sit in the dark.

It’s certainly not a matter of timidity. With my blessing as well as without it, these students are constantly opening up windows, moving around chairs, getting up in the middle of a discussion to grab a tissue or go to the bathroom, and so on. And yet if for some reason I’m not the first person to enter the classroom at the start of the day, they will sit in the liminal early morning light until I enter and flick the switch. Maybe it preserves some notion of receding freedom; as long as the lights are off, school hasn’t really begun.

Anyway, now I’m here, the lights are on, attendance has been taken, and we’re getting down to business. The topic at hand: a discussion of an excerpt from Plunkitt of Tammany Hall, a 1905 portrait of machine politics written—or, perhaps more accurately, filtered—by a young newspaper reporter and editor named William Riordan. Conceived as a response to muckraking journalist Lincoln Steffens’s 1903 exposé The Shame of the Cities, Plunkitt of Tammany Hall is a remarkably rich social document in which we hear—an auditory metaphor seems apt, because the voice is so striking—an indiscreet Irish politician named George Washington Plunkitt reveal far more about himself than he should for his own good. (The book, published at a moment when Plunkitt was seeking to recover his lost seat in the New York state senate, effectively sealed his doom, as Riordan may well have intended.) In one particularly rich passage, which I have assigned as homework and now proceed to read aloud, Plunkitt makes a famous distinction between what comes to be known as “honest graft” and “dishonest graft.”

 

Everybody is talkin’ these days about Tammany men growin’ rich on graft, but nobody thinks of drawin’ the distinction between honest graft and dishonest graft. There’s all the difference in the world between the two.

Yes, many of our men have grown rich in politics. I have myself. I’ve made a big fortune out of the game, and I’m gettin’ richer every day, but I’ve not gone in for dishonest graft—blackmailin’ gamblers, saloon-keepers, disorderly people, etc.—and neither has any of the men who have made big fortunes in politics.

There’s an honest graft, and I’m an example of how it works. I might sum up the whole thing by sayin’: “I seen my opportunities and I took ’em.”

Just let me explain by examples. My party’s in power in the city, and it’s goin’ to undertake a lot of public improvements. Well, I’m tipped off, say, that they’re going to lay out a new park at a certain place. I see my opportunity and I take it. I go to that place and I buy up all the land I can in the neighborhood. Then the board of this or that makes its plan public, and there is a rush to get my land, which nobody cared particular for before.

Ain’t it perfectly honest to charge a good price and make a profit on my investment and foresight? Of course, it is. Well, that’s honest graft.

Or, supposin’ it’s a new bridge they’re goin’ to build. I get tipped off and I buy as much property as I can that has to be taken for approaches. I sell at my own price later on and drop some more money in the bank.

Wouldn’t you? It’s just like lookin’ ahead in Wall Street or in the coffee or cotton market. It’s honest graft, and I’m lookin’ for it every day in the year. I will tell you frankly that I’ve got a good lot of it, too.

 

“Tammany Hall, 1830.” G. Hayward, lithographer (New York, 1865). Courtesy of the American Antiquarian Society, Worcester, Massachusetts.

All right, I tell the students. There you have it. Plunkitt is telling us that dishonest graft means things like running prostitution rings or selling liquor, while the kinds of examples he’s giving us constitute the legitimate practice of politics—whatever “goo-goos” like that Progressive Lincoln Steffens may say. You buy it?

There’s a pause. There often is. J.D. raises his hand first, as he typically does. No, I don’t buy it, he says. Plunkitt’s just trying to justify his corruption.

Well, okay, I say. But just what is it here that constitutes corruption?

He’s using his office to get rich, says Leah, one of my better students. A politician is not supposed to do that.

Oh no? I reply. Maybe it depends on your definition of politics.

She looks at me quizzically.

We’ve already talked about the role of political machines for immigrants earlier in the week, I note. If a guy like Plunkitt will get your brother-in-law a job or deliver a turkey at Thanksgiving, how much do you care if he cuts himself in for a piece of the action? Doesn’t he pretty much need to cut himself in for a piece of the action in order to maintain his position, and thus to help other people? (I launch into a brief discussion here on the role of party patronage as the financial lubricant of the nineteenth-century two-party system and the reformers who tried to break it.) What are you, Leah—one of those Progressive control freaks who uses morality as a cover for hatred of people she considers different from herself?

Leah smiles. She knows I’m only kidding. After all, her people were Jewish immigrants, and she knows I know this. Still, I’m hoping she’ll feel the poke under her ribs.

I kind of like Plunkitt, says Danielle. He’s funny.

Manuel raises his hand. He doesn’t do this often, so I’m eager to draw him into the conversation. I think the guy is right, he says. Let’s be real: this is what politics is all about. Look at Eliot Spitzer.

 

Courtesy of the author

We proceed to discuss the disgraced New York governor, who is in the middle of grappling with revelations of participation in a prostitution ring, over which he will resign shortly. For the next ten minutes or so, we assess how much or how little politics has changed. I ask how much of a difference there is between what Spitzer did and what Plunkitt is describing. (I don’t get into how much of Plunkitt’s persona is a collaboration between him and Riordan; that’s a conversation for another day.) At least Plunkitt is no hypocrite the way the crusading Spitzer is. But I also ask: if Plunkitt buys land at ten dollars an acre and sells it for one hundred dollars an acre, is he stealing ninety dollars an acre from the public?

Absolutely! says Laura.

But isn’t a Wall Street speculator pretty much doing the same thing?

Manuel nods approvingly. J.D. furrows his brow: he’s working this.

Well yes, says Alison. It’s just a different kind of theft, not that she’s particularly surprised by either. I find her sophistication a little unsettling in a sixteen-year-old.

Are you saying that we hold politicians to a different standard than other people? A higher standard than other people?

Alison shrugs, a gesture that says, “Yes I am saying that, but you shouldn’t take me any more seriously than I’m taking you.”

Laura has an incredulous expression on her face: She’s about as bright as Alison but wholly lacks Alison’s sense of irony. I observe to the class that Laura appears bemused by Alison’s position and ask her if she thinks politicians should in fact be held to a higher standard than other citizens.

Yes, she says, but immediately upon doing so she starts to backtrack, recognizing even as she says so that they’re just people too, before her improvised meditations descend into the incoherence of a mouth that moves more slowly than a mind. I might try to untangle her thoughts, but we’re just about out of time. I know this because people like Tess, who never say a word, have begun moving books into their backpacks, part of a growing rustle that’s my not-so-subtle cue to wrap this up.

By way of conclusion, I remind the class of something Leah said: “A politician is not supposed to do that.” Whether or not you agree, I say, a vision of the way the world works is embedded in that assertion. I ask the class to keep thinking about Plunkitt and the way his remarks might help clarify their own notion of what politics should be. (A rather large and amorphous request, to be sure, but you never know who will latch onto what.) And then I say I want them to graduate, go to college, and use that vision of politics to go and make the world a better place. I get some smiles as the desks and chairs slide around and the desperately bored make their exit. They know what I mean in a Jon Stewart kind of way. And they know, just like him, that I’m not entirely joking.

I feel a nagging sense of unease as I head upstairs to my desk to check my messages and grade some papers. I’ve done my job, by my lights anyway: to foster a spirit of thoughtful inquiry in good democratic fashion. As such, I would like to be able to see myself as furthering a long and honorable living tradition. And yet I know not only that this isn’t happening very frequently in thousands of high schools around the country but that it can’t. For one thing, there’s a prescribed curriculum, and someone like George Washington Plunkitt is not likely to show up on it. Insofar as he might, it’s likely to be as a series of facts (file under: Tammany Hall; nineteenth-century urban politics). And even if the students were inclined to have a discussion of the kind I did, any number of forces would militate against it, ranging from administrative procedures, to student apathy, to district requirements to teach to the test. Far from an honest laborer tilling the fields of democracy, I am a cosseted servant of the ruling class inviting the children of an elite to do what our society doesn’t want or can’t afford to do for the majority of its children: invite them to think for themselves. I’m reminded of Marie Antoinette strolling around Le Petit Hameau, her peasant cottage/garden complex at Versailles, fancying herself a farmer.

But that’s not the worst of it. For it might be one thing if, in fact, I knew that the education these students were getting from me and others would allow them to at least preserve the values I’m modeling, if not explicitly espousing. Yet I’m haunted by a fear that insofar as I succeed, their educations will be worse than useless—that, like Marie Antoinette, they will be confronted with a world in which the grooves of their minds make them less, not more, able to adapt to a coming dispensation, one in which a sentimental attachment to democracy will not receive the lip service it does now. I worry, in other words, that they will be old before their time.

It’s at this point that I seek cover in a sense of humility. Don’t kid yourself, I think. They’re not empty vessels, and you’re not a fountain that fills them. These students—the smart ones, anyway—will discard what they need to. Education: a process of figuring out what doesn’t matter. You’re just a teacher.

George Washington Plunkitt is chuckling, softly. Who’s the honest one now, he asks with an almost mirthless smile. Did you remember to turn off the lights before you left the room?

 

This article originally appeared in issue 9.2 (January, 2009).


Jim Cullen, Common-place column editor and regular contributor, teaches at the Ethical Culture Fieldston School in New York, where he serves on the Board of Trustees. He is the author of the newly published Essaying the Past: How to Read, Write and Think about History and other books. This essay is adapted from a work-in-progress about a year in the life of the U.S. History Survey. Names have been changed to protect the identities of his students.




Sex and Public Memory of Founder Aaron Burr

Historian Nancy Isenberg has analyzed the sexualized politics of the early Republic that gave rise to Burr’s reputation as an immoral, sexually dissipated man. As Isenberg explains, Burr became the target of sexually charged attacks in the press for fifteen years beginning with his becoming a U.S. senator for New York in 1792. This depiction of him as sexually corrupt in his private life contrasted sharply with his early pedigree and public accomplishments. The grandson of famed New England minister Jonathan Edwards, Burr was born in 1756 in Newark, New Jersey. He attended Princeton at the age of thirteen, eventually becoming a successful lawyer. He served as a U.S. senator, as the third vice president of the United States, and as a major figure in the development of the political party system in the new nation. In 1800 Aaron Burr stood a “hair’s breadth” away from becoming the third president of the United States, tying Thomas Jefferson in the Electoral College and losing only when the House of Representatives broke the deadlock. He married widow Theodosia Prevost in 1782. Together they had one child, Theodosia. Both his wife and daughter perished tragically and prematurely: His wife died from cancer in 1794, and his daughter was lost at sea in the winter of 1812. Her son (Burr’s only grandchild) had died at the age of ten that same year. In 1833, after almost four decades of being a widower, he married widow Eliza Jumel, separating just four months later. He lived until 1836, dying at the age of eighty—on the very day that their divorce was finalized.

Public memory of Aaron Burr contains fascinating threads that defend his reputation by asserting that his inner self conformed to normative, idealized standards, and thus that he could not have been guilty of the charges of immorality that were leveled against him. There has never been a shortage of negative depictions of Burr, but it has become a nearly two-centuries-old cliché that he “has always been out of favor,” that he has only enjoyed the reputation of “outright villain” among the founders. By tracing defenses of his personal life from the nineteenth century to the recent past, this essay shows that sex has long been used to define the character of the American founders; arguably it continues to be used in this capacity as a window to the nation’s soul.

Two Burrs, Burr the traitor and Burr the rake, were often co-conspirators. In the preface to an 1838 novel titled Burton: Or, the Sieges, the incredibly prolific popular novelist Joseph Holt Ingraham illustrated how negative depictions of Burr explicitly connected his private character and his political person: “In the page of history from which this romance is taken, we see the young aid-de-camp exhibiting the trophies of his conquests, drawn from the wreck of innocence and beauty. If we turn to a later page, we shall see the betrayer of female confidence, by a natural and easy transition, become the betrayer of the trust reposed in him by his country, and ready to sacrifice her dearest interests on the altar of youthful vanity, ripened into hoary ambition.”

His earliest biographer, Matthew L. Davis, stated that he had possession of virtually all of Burr’s letters and that he had met with Burr (at Burr’s request) to discuss them as he worked on the memoirs. Burr’s letters, according to Davis, indicated “no very strict morality in some of his female correspondents.” Acting with the chivalry that his subject supposedly lacked, Davis separated out and destroyed such letters to protect the reputations and virtue, not of Burr, but of the young women and their families. He claimed that Burr would not let the letters be destroyed in his own lifetime, but when Burr died, Davis burned them all so that no one else could publish them. In the absence of such sources, biographers have largely had only the accusations to work with.

“Portrait of Aaron Burr,” engraved by J.A. O’Neill, after portrait by John Vanderlyn (1802). Courtesy of the Portrait Prints Collection, the American Antiquarian Society, Worcester, Massachusetts.

Davis was criticized by numerous biographers for largely depicting Burr as his political enemies had done. Burr’s second biographer, James Parton, set the tone for future defensive accounts. Parton, the most popular biographer of nineteenth-century America, complained: “Mr. Matthew L. Davis, to whom Colonel Burr left his papers and correspondence, and the care of his fame, prefaces his work with a statement that has, for twenty years, closed the ears of his countrymen against every word that may have been uttered in Burr’s praise or vindication.”

Parton’s mid-nineteenth-century account defended Burr from a host of negative depictions, beginning with those that centered on his youth and his reputation as a college lothario. “It has been said … that he was dissipated at college; but his dissipation could scarcely have been of an immoral nature.” Burr, he explained, was not given to the vices typically linked with sexual immorality: gambling, drinking, and general excess.

One such rumor was that during the Revolution, he seduced and abandoned a young woman named Margaret Moncrieffe. Parton’s biography dismissed the story, and additionally cast aspersions on her character. Parton described Moncrieffe as a girl of fourteen, “but a woman in development and appetite, witty, vivacious, piquant and beautiful.” He attempted to discredit her by portraying her as immoral, stating the account had been “published after she had been the mistress of half a dozen of the notables of London.” And he lamented Burr’s legacy: “the man has enough to answer for without having the ruin of this girl of fourteen laid to his charge.”

Later defenders would echo Parton’s response. An 1899 biography of Burr by Henry Childs Merwin explained: “It is evident that, whatever may have been Burr’s conduct toward Margaret Moncrieffe, the lady herself, the person chiefly concerned, had no complaint to make of it.” And Merwin yoked Burr’s sexual reputation to broader character traits. “Burr was all his life an excessively busy, hard-working man; he was abstemious as respects food and drink; he was refined and fastidious in all his tastes; he preserved his constitution almost unimpaired to a great age. It is nearly incredible that such a man could have been the unmitigated profligate described by Mr. Davis.”

Burr’s defenders also trained their sights on his marriage. Similar to popular depictions of Hamilton, Washington, and Jefferson, in the hands of his biographers Burr appears to have experienced the perfect marital union. (And similar to the cases of Hamilton, Washington, and Jefferson, we have little to no documentation to support the characterization of this very personal relationship.) Virtually all of his defenders emphasize the idealized romantic bond that he shared with his wife. Parton insisted: “To the last, she was a happy wife, and he an attentive, fond husband. I assert this positively. The contrary has been recently declared on many platforms; but I pronounce the assertion to be one of the thousand calumnies with which the memory of his singular, amiable, and faulty being has been assailed. … I repeat, therefore, that Mrs. Burr lived and died a satisfied, a confiding, a beloved, a trusted wife.”

Parton made it clear that Burr could have won the hand of any young “maiden” he desired. But that he “should have chosen to marry a widow ten years older than himself, with two rollicking boys (one of them eleven years old), with precarious health, and no great estate,” revealed much about his character. And, indeed, for Parton the marriage countered much that had been written about Burr. “Upon the theory that Burr was the artful devil he has been said to be, all whose ends and aims were his own advancement, no man can explain such a marriage.”

Parton emphasized that Burr was not guilty of marrying for money: “Before the Revolution he had refused, point-blank, to address a young lady of fortune, whom his uncle, Thaddeus Burr, incessantly urged upon his attention.” And he could have married others for personal gain: “During the Revolution he was on terms of intimacy with all the great families of the State—the Clintons, the Livingstons, the Schuylers, the Van Rensselaers, and the rest; alliance with either of whom gave a young man of only average abilities, immense advantages in a State which was, to a singular extent, under the dominion of great families.”

No, it would be made clear that Burr married not for power but instead for love. Parton explained, “no considerations of this kind could break the spell which drew him, with mysterious power, to the cottage at remote and rural Paramus,” where his future wife lived.

Parton wrote in a decade that saw the emergence of a dedicated women’s rights movement, and he portrayed Burr as an early feminist, a view that would later be more fully developed: “He thought highly of the minds of women; he prized their writings. The rational part of the opinions now advocated by the Woman’s Rights Conventions, were his opinions fifty years before those Conventions began their useful and needed work,” Parton claimed. (At the time of the publication of his biography of Burr, James Parton was married to Sara Payson Willis, who had gained fame under her pseudonym Fanny Fern as the author of the proto-feminist novel Ruth Hall.) Parton’s depiction of Burr’s wife as a friend supported the claim that Burr had a deep respect for women. “The lady was not beautiful. Besides being past her prime, she was slightly disfigured by a scar on her forehead. It was the graceful and winning manners of Mrs. Prevost that first captivated the mind of Colonel Burr.”

Burr’s defenders have long recognized the need to defend his personal life as part of the defense of his political life. Virtually all have recognized the significant role that his personal reputation played in his public standing. Parton insisted: “Burr never compromised a woman’s name, nor spoke lightly of a woman’s virtue, nor boasted of, nor mentioned any favors he may have received from a woman.” Indeed, he exclaimed, “he was the man least capable of such unutterable meanness!” Although Burr has remained a lesser-known founder, and one with a tarnished reputation, his ample supply of defenders has long built on Parton’s well-constructed foundation, which yoked positive portrayals of Burr’s sexuality to the effort to shore up his battered political self.

Many of his early twentieth-century biographers decried the fact that his personal life overshadowed his public accomplishments, and they continued to portray his intimate life as one of virtue. The alleged falsity of the tale of the seduction and abandonment of Margaret Moncrieffe, along with the supposed lies behind a story of the intentional “ruin” of one Miss Bullock, was repeatedly used to defend his character. A 1925 biography by Samuel Wandell and Meade Minnigerode prematurely declared that the legend about Bullock had been “finally laid to rest” by the reference librarian at Princeton, who had “showed conclusively, from evidence furnished by the unfortunate lady’s family” that she had died “quite virtuously.” A decade later, Nathan Schachner wrote in his biography of Burr: “Another legend is not so innocuous. It was the forerunner of a whole battalion of similar tales, all purporting to prove Aaron Burr a rake, a seducer, a scoundrel, a man without morals and without principles, wholly unfit to be invited into any decent man’s home. Though, on analysis, not one of these infamous stories has emerged intact.” He then described the “canard” of Burr seducing and abandoning a “young lady of Princeton” who later in “despair committed suicide.” Schachner explained that the girl had in fact died of a “tubercular condition” twenty years after Burr graduated from Princeton.

Some accounts defended Burr as having exposed Moncrieffe as a spy for the British. A 1903 historical novel, Blennerhassett, by Charles Felton Pidgin, depicted the Moncrieffe story as a later burden for Burr, who was, in the novel’s telling, a great patriot. In this regard, rumors about Burr’s sexual history were faulted for overshadowing the truth of his virtue and for hiding his genuine patriotism. Explained the character of Burr in the novel: “‘I became convinced that she was conveying intelligence to the enemy and I wrote a letter to General Washington informing him of my suspicions. By his orders, she was at once sent out of the city. The chain of circumstances was followed up and it was discovered that the mayor of the city, who was a Tory, and Governor Tryon, the British commander, who made his headquarters on board the Duchess of Gordon, a British man-of-war lying below here in the river, were implicated in the plot.'” The man he told this to asked: “‘And were you publicly thanked by the commander-in-chief?'” “‘Not by name,’ said Burr, somewhat abruptly, and he thought of the manner in which his name had been coupled with that of the young lady in question.” Here Burr was portrayed as the victim of his own patriotism. For this author, dismissing the Moncrieffe story not only cleared Burr’s name; it made it possible to depict the true Aaron Burr, a patriot and war hero.

Another early twentieth-century account, by Alfred Henry Lewis, romanticized the incident, notably including only vague reference to the young woman’s age: “On that day when the farmers of Concord turn their rifles upon King George, there dwells in Elizabeth a certain English Major Moncrieffe. With him is his daughter, just ceasing to be a girl and beginning to be a woman. Peggy Moncrieffe is a beauty, and, to tell a whole truth, confident thereof to the verge of brazen… . Young Aaron, selfish, gallant, pleased with a pretty face as with a poem, becomes flatteringly attentive to pretty Peggy Moncrieffe. She, for her side, turns restless when he leaves her, to glow like the sun when he returns. She forgets the spinning wheel for his conversation. The two walk under the trees in the Battery, or, from the quiet steps of St. Paul’s, watch the evening sun go down beyond the Jersey hills.” This account styled Moncrieffe as hardly a victim, but rather as “brazen,” welcoming the advances of the dashing young soldier. The defense of Burr in the case of Moncrieffe would continue through the twentieth century. A mid-century account by Herbert Parmet and Marie Hecht dismissed the story directly, stating that the “lady’s own words contradict this assumption” and calling it a “very good example of the propensity of his chroniclers to link Burr’s name with women, particularly notorious ones.” Milton Lomask’s two-volume biography included the story of Miss Bullock as a “typical example of the many half-factual, half-fanciful tales that have attached themselves to the memory of Aaron Burr.” It continued by explaining that, “fed by Burr’s then growing reputation as a ladies’ man, this macabre tale persisted in the face of evidence, unearthed by a Princeton librarian, that Miss Bullock had died in the home of an aunt, ‘quite virtuously,’ of tuberculosis.”

“President’s Row, Princeton Cemetery,” with Aaron Burr’s name on tombstone in foreground. Detroit Publishing Company (c. 1903). Courtesy of the Library of Congress Prints and Photographs Division, Washington, D.C.

In a similarly defensive move, Burr’s marriage was idealized by his twentieth-century biographers, as it had been by Parton a century earlier. Henry Childs Merwin wrote in his 1899 biography of Burr that “his family life was ideal,” and Charles Burr Todd, writing three years later, stated: “I think it should be mentioned here—because the opposite has been stated—that the marriage was conducive of great happiness to both, and that Colonel Burr was to the end the most faithful and devoted of husbands.” Todd’s account went on to quote a lengthy passage from the Leader, which included the following: “His married life with Mrs. Prevost … was of the most affectionate character, and his fidelity never questioned.” Virtually all of the accounts read in a similar manner. Consider, for example, the following: “This marriage certainly gives no color to the popular belief that Colonel Burr was a cold, selfish, unprincipled schemer, with an eye always open to the main chance.” Similarly, Wandell and Minnigerode defended Burr’s marriage thusly: “It was a love marriage, that of Aaron Burr and Theodosia Prevost,” and “admirable in the last degree.”

The depiction of his marriage as spotless provides a powerful counterweight to the blemishes that mar both his public and private reputations. Biographers implicitly and explicitly use the bond of husband and wife to discredit those who challenge his personal character in the area of romantic relations. One 1930s author noted: “Between Burr and his wife ardent love had deepened to an abiding trust.” This depiction only deepened in the twentieth century. In the early 1970s, Laurence Kunstler described the marriage as “twelve wonderful, happy, and triumphant years,” and Jonathan Daniels lauded the union as “a faithful love which only the most austere historians and venomous critics have questioned.” Samuel Engle Burr Jr., the founder of the Aaron Burr Association, a professor of American studies, and a sixth-generation descendant of Burr, wrote several books in the 1960s and 1970s defending his ancestor’s reputation, and all bolstered his character by defending his marriage. In Colonel Aaron Burr, he depicted the marriage as a “happy experience for both of them.” And in a Mother’s Day lecture delivered to the New York Schoolmasters’ Club, he focused on the “influence of [Burr’s] wife and his daughter” on his “life and career” to underscore his domestic bond, in contrast to the view of him as a vile seducer of women. (Burr Jr. also argued that Madame Jumel divorced Aaron on trumped-up charges of adultery, adultery being the “only legal grounds for divorce” at the time, in a further attempt to wipe the slate clean.) Virtually all authors agree with Jonathan Daniels, who argued that “Nothing is more clear in the record than Burr’s tenderness and concern for his wife.” Still others, including Milton Lomask, contended: “To trace Aaron Burr’s life as a husband and father … is to glimpse the man at his best. Domesticity became him.”

Of particular importance to Burr’s defenders was his choice of spouse. Virtually all biographers note that Mrs. Prevost was no “beauty,” underscoring that no superficial attraction drew Burr to her. In a typical example, Nathan Schachner described her as “not beautiful,” “pious,” and “well read and cultured.” This view continued through the twentieth century. Burr could have married “into any of those powerful prosperous dynasties,” wrote Laurence Kunstler, emphasizing that he had instead married for love. Charles Burr Todd (a descendant of the Burr family) made a similar point in his 1902 biography: “He was young, handsome, well born, a rising man in his profession, and might no doubt have formed an alliance with any one of the wealthy and powerful families that lent lustre to the annals of their State. This would have been the course of a politician. But Burr, disdaining these advantages, married the widow of a British officer, the most unpopular thing in the then state of public feeling that a man could do, a lady without wealth, position, or beauty, and at least ten years his senior, simply because he loved her; and he loved her, it is well to note, because she had the truest heart, the ripest intellect, and the most winning and graceful manners of any woman he had ever met.” Late in life, Aaron Burr married a second time, but as if to underscore the significance of his first marital bond, his biographers rarely dwell on the second marriage.

Virtually all twentieth-century accounts point out that, in contrast to the politicized depiction of Aaron Burr as a man who seduced and abandoned women, Burr “showed an understanding of women.” Such authors typically concede that Burr had numerous affairs with women but maintain that they were not exploitative. As Jonathan Daniels wrote, perhaps over-descriptively: “There was never anything in his life, however, to suggest the bestiality and brutality in sex which his enemies imputed to him. Concupiscent, he may have been, cruel he never was.”

Burr’s most recent biographer, Nancy Isenberg, the only academic historian to take on that task, highlights his support for early feminism as evidenced by the fact that his marriage was “based on a very modern idea of friendship between the sexes.” Calling Burr a “feminist,” she argues that he was alone among the Founding Fathers in this regard: “No other founder even came close to thinking in these terms.”

Today, much as in his own lifetime, debate rages over the salience of his personal life for understanding the “true” Burr. Some contend that the “true biography” of Burr “must be disentangled” from “a mass of legend about his lapses with the ladies.” Others revel in those stories as a way to bring to life the Burr they think existed. The view of Burr as unique, for better or worse, is an old one. James Parton, writing in direct response to the early account of Matthew Davis, set the tone for a defense of Burr’s personal life that has lasted until the present day. Parton could not have been more assertive:

Aaron Burr, then, was a man of gallantry. He was not a debauchee; not a corrupter of virgin innocence; not a despoiler of honest households; not a betrayer of tender confidences. He was a man of gallantry. It is beyond question that, in the course of his long life, he had many intrigues with women, some of which (not many, there is good reason to believe) were carried to the point of criminality. The grosser forms of licentiousness he utterly abhorred; such as the seduction of innocence, the keeping of mistresses, the wallowing in the worse than beastliness of prostitution.

This kind of defense continued through the end of the nineteenth century, with biographers outlining their objections to his detractors and making a strong case for examining both the public and private lives of a man who clearly had intimate relationships outside of marriage and who raised questions in many minds about his allegiance to the nation.

Twentieth-century biographers wrote of Burr as a victim on many scores: of politics in the early republic, of a back-stabbing first biographer, and, in later portrayals, as “one of history’s greatest losers,” as Donald Barr Chidsey put it. The novel Blennerhassett opened on a similar note about Burr’s exceptional status: “For a hundred years, one of the most remarkable of Americans has borne a weight of obloquy and calumny such as has been heaped upon no other man, and, unlike any other man, during his lifetime he never by voice or pen made answer to charges made against him, or presented either to friends or foes any argument or evidence to refute them.” Nathan Schachner, writing in the 1930s, similarly captured the view of many biographers who have chronicled Burr: “Probably of no one else in American history are there more unsupported, and unsupportable, tales in circulation.” And he ended his biography with a similar refrain: “Who in history has survived a more venomous brood of decriers?”

Burr’s legacy dramatically illustrates the various ways that sexual reputation informs public masculine character. Despite the complaints of his biographers who positioned themselves as solitary champions of history’s greatest victim, a man repeatedly “misinterpreted” and “misjudged,” there has never been a shortage of Burr defenders, then or now, and virtually all of them use sex as one means of shoring up his public standing. Our enduring interest in connecting personal with public selves will almost certainly keep competing Burrs alive in popular memory—and will no doubt prevent Aaron Burr from ever being either completely “rescued” or finally banished from the pantheon of great American founders.

Further Reading

This essay comes out of my research for my most recent book, Sex and the Founding Fathers: The American Quest for a Relatable Past (Philadelphia, 2014), which examines the ways in which we have (or haven’t) talked about the sex lives of the founders. The current depiction of Burr as sexually and morally bankrupt was perhaps most popularly captured by the 1973 historical novel Burr by Gore Vidal, in which Burr is gossiped to be the “lover of his own daughter”—a fictionalized rumor created by Vidal. Burr has been the subject of more straightforward biographies since shortly after his death. The earliest is Matthew L. Davis, Memoirs of Aaron Burr with Miscellaneous Selections from his Correspondence (New York, 1836), while the most enthusiastically pro-Burr may be James Parton, Life and Times of Aaron Burr (New York, 1858). There have been numerous biographies since, including Herbert S. Parmet and Marie B. Hecht, Aaron Burr: Portrait of an Ambitious Man (New York, 1967); Donald Barr Chidsey, The Great Conspirator: Aaron Burr and His Strange Doings in the West (New York, 1967); Jonathan Daniels, Ordeal of Ambition: Jefferson, Hamilton, Burr (New York, 1970); Laurence Kunstler, The Unpredictable Mr. Aaron Burr (New York, 1974); and Milton Lomask’s two-volume Aaron Burr (New York, 1979-82). The best modern biography is by Nancy Isenberg, Fallen Founder: The Life of Aaron Burr (New York, 2007). Her essay “The ‘Little Emperor’: Aaron Burr, Dandyism, and the Sexual Politics of Treason,” in Jeffrey L. Pasley, Andrew W. Robertson, and David Waldstreicher, eds., Beyond the Founders: New Approaches to the Political History of the Early American Republic (Chapel Hill, N.C., 2004) is also extremely valuable.

Burr is notable among the founders for the extent to which his descendants have taken up his cause. Charles Burr Todd, a historian of the Burr family from Connecticut, wrote The True Aaron Burr: A Biographical Sketch (New York, 1902). Samuel Engle Burr Jr. not only founded the Aaron Burr Association; he also wrote books loyal to his ancestor’s memory, including Colonel Aaron Burr: The American Phoenix (New York, 1961) and The Influence of his Wife and his Daughter on the Life and Career of Col. Aaron Burr (Linden, Va., 1975).


This article originally appeared in issue 15.1 (Fall, 2014).


Thomas A. Foster is professor of history at DePaul University. He is the author and editor of six books, including Sex and the Founding Fathers: The American Quest for a Relatable Past (2014). Foster tweets at @ThomasAFoster.




The Kingness of Mad George

The roots of the current debate over presidential power

The recent conflict over President Bush’s domestic surveillance program reflects one of the oldest recurring divisions in American politics, dating all the way back to the 1790s. Bush’s Democratic critics have taken a stance that traces back to the Jeffersonian (or Democratic) Republicans, arguing that the U.S. government is rather flexibly bound, but still bound, by the values and rules embedded in our founding documents and, as such, is a government whose power is essentially limited. The Bush administration and its modern (anti-Democratic) Republican defenders have staked out a position that traces back to Alexander Hamilton and the Federalists, reasoning from the inherent nature of government and the overwhelming fearsomeness of the challenges the United States faces that the powers of its government must be essentially unlimited. The GOP-Federalist position applies especially to times of foreign crisis, a state that the Federalists saw as virtually perpetual in the early Republic and that modern Republicans have likewise been warning about ever since the outbreak of the cold war.

This recurring argument has often turned on the question of whether the norms and procedures of democracy and republicanism are adequate to national survival in a dangerous world of terrorists, Commies, and Frenchmen. Federalists and modern Republicans alike have often indicated their belief, expressed with varying degrees of regret, that the methods of democratic, accountable, transparent government are not strong enough to meet these challenges. Jeffersonian Republicans and modern Democrats, in turn, have tended to respond that they are. The essence of the frequently heard rightist refrain that America cannot fight the evildoers of the moment with democracy tying its hands or with one arm tied behind its back (fill in your Goldwaterish/Cheneyesque metaphor) can be found in a recent Wall Street Journal op-ed column about the Pentagon paying Iraqi journalists for favorable coverage. If the U.S. military had elected to “play by Marquess of Queensberry rules,” argued the WSJ, we would have had to “wait decades” for some good Arab press, and we would have created “a heady propaganda win for the terrorist/insurgents, a prolonged conflict, and more unnecessary violence and death”—as opposed to the speedy triumph the writer apparently believes we are experiencing in Iraq right now.

The key difference in the recurring party debate is not so much the government’s or military’s mere use of extraconstitutional powers and undemocratic methods. Those things have happened under many presidents of most of the major U.S. parties, especially during the cold war. The key is the further act of justifying such powers and methods in principle. George W. Bush and Dick Cheney have repeatedly gone out of their way to do this, asserting and exercising an alleged independent presidential authority to do things (like eavesdropping on suspected terrorists) the government was able to do just as swiftly and effectively under existing legal procedures. (A secret court was created in the 1970s with no other purpose than legally authorizing government eavesdropping when national security requires it.) In other cases, they have ordered up briefs to self-legalize obviously unconstitutional powers to have people tortured and to hold American citizens without charge or trial.

A similar tactic was recently deployed against Senator John McCain’s anti-torture resolution, a measure that Bush vehemently opposed but finally signed just before New Year’s Day. Along with the president’s signature, the administration included a “signing statement” explaining that it reserved the right to torture whomever it pleased, no matter what the resolution said.

The executive branch shall construe [the provision], relating to detainees, in a manner consistent with the constitutional authority of the President to supervise the unitary executive branch and as Commander in Chief and consistent with the constitutional limitations on the judicial power, which will assist in achieving the shared objective of the Congress and the President . . . of protecting the American people from further terrorist attacks.

The recipe for this little writ of mandamus is two parts pure executive prerogative and one part the ends justify the means. The statement invokes the president’s “constitutional authority” but employs a concept not found in the Constitution: the idea that the president has the apparently sole and absolute power to supervise a “unitary executive branch.” Advise and consent this, Congress. The only constitutional limitation mentioned is on the judicial branch and any effort it might make to hold the “unitary executive” to any procedural standards when it decides to detain people. Capping things off we have a statement implying that any action the administration deems handy in the “shared objective” of “protecting the American people” is automatically legal and constitutional.

In the same vein, President Bush’s December radio address assured listeners that the National Security Agency’s warrantless domestic-spying program was “fully consistent with my constitutional responsibilities and authorities.” Not legally authorized by Congress, but “consistent” with the general ends of the president’s duties. Bush could not even cite which constitutional duties he might mean because those would actually be quite hard to find in the Constitution. Commander in chief of the armed forces is one thing, but Bush and Cheney clearly have some broader and frankly more king-like role in mind, something along the lines of the monarchical title that John Adams thought presidents should bear: “His Highness the President of the United States and Protector of the Rights of the Same.” Karl Rove might want to add the British monarchs’ tag, “Defender of the Faith,” for the religious Right’s benefit. Even closer to what Bush and Cheney seem to intend would be the title that Richard III used before he finally dealt with those pesky little congressmen, I mean princes, in the Tower: “Lord Protector of the Realm.”

Hamilton, Lincoln, and the Inherent-Powers Tradition

President Bush’s admirers will doubtless be heartened by the knowledge that he shares some aspects of this governing philosophy with the newly re-burnished “Business Class Hero” of the founding era, Alexander Hamilton. Confronted by Thomas Jefferson and James Madison with the fairly credible argument that the brand-new Constitution did not provide the government with the power to create his proposed national bank, Hamilton appealed not simply to the text of the Constitution itself but, more importantly, to the “general principle … inherent in the very definition of government.” The principle was “That every power vested in a government is in its nature sovereign, and includes, by force of the term, a right to employ all the means requisite and fairly applicable to the ends of such power.” While Hamilton recognized (unlike Bush) that a constitutional government could not legally engage in actions that its constitution specifically prohibited, his “definition of government” was in fact far older than the United States and its founding documents, and in truth it was not terribly respectful of those documents. Hamilton derided Jefferson and Madison’s arguments that the text of the Constitution might truly limit the government’s “sovereign power, as to its declared purposes and trusts,” writing that they presented “the singular spectacle of a political society without sovereignty, or of a people governed without government.” It barely dawned on Hamilton that such a spectacle, of a people governed without a traditional European form of government, was exactly what many Americans thought their revolution had sought.

 

Fig. 1

Abraham Lincoln fell back on a similarly ante-constitutional notion of the inherent powers of government in justifying his decision to restore the Union by force. As explained in his first inaugural address, Lincoln held “in contemplation of universal law” that “the Union of these States is perpetual.” Like Bush and Hamilton, Lincoln invoked the Constitution but based his position largely on concepts not mentioned in it. “Perpetuity is implied, if not expressed, in the fundamental law of all national governments. It is safe to assert that no government proper ever had a provision in its organic law for its own termination.” It may not have been as safe to assert this as Lincoln hoped because for many Americans, and not only the defenders of slavery, the U.S. experiment in liberal government had relatively little in common with the fundamental law of all other national governments. They did not see the United States as a “government proper” if that meant it existed in unconditional perpetuity, with the people losing forever the Lockean right of revolution described in the Declaration of Independence.

The Hamilton/Lincoln idea of the “definition of government” or “government proper” amounts, in the final exigency, to the very old and widely embraced idea of government as rulership, the repository of sovereign authority that has no superior within its ambit and cannot be lawfully overruled. Though not necessarily absolute or completely insulated from popular influence, this sort of government derives its authority from some transcendent and irresistible source, a divine source for most of the monarchs who practiced it and a natural source—the nature of government and the practical requirements of nation-building—for Hamilton and other American advocates of inherent powers.

The logic behind this view can seem beguilingly simple and practical. Government is coterminous with the community and the guarantor of its structure, values, and very existence—matters too basic to be left to the whims of political give-and-take. Government is charged with the fundamental tasks of preserving the community from internal disorder, external conquest, and other forces that threaten to destroy it. Burdened with such awesome responsibilities, it needs powers to match, powers limited only by what its subjects will accept as legitimate through their mere acquiescence.

Defenders of the inherent-powers position frequently and significantly direct attention to the necessity or desirability of the ends they seek to achieve: fighting the terrorists or Communists or (in Hamilton’s case) achieving national greatness and economic growth. Such goals may be worthy enough on their own, but loudly proclaiming their transcendent worthiness is a political tactic rather than a constitutional or substantive argument; its real function is to embarrass and silence critics by calling their patriotism or morals into question. At the same time, the tactic expresses a basic tenet of old-school governance, which is that law, procedure, and constitutionalism are minor matters as long as what Hamilton called “the essential ends of political society”—security and prosperity and whatever other states of being a community wants for itself—are being met. State this as a folksy modern politician might, say as “getting the job done,” and it sounds like practical good sense. State it a bit more clearly, and it makes a mockery of the very idea of limited, transparent, and democratic government by dismissing it as so much “red tape.”

Angels in the Form of George W. Bush?

As Reinhard Bendix points out in Kings or People (1978), one of the very first scholarly books I can remember buying, the old-school view of government as a mandate to rule, constrained only by such compromises as were necessary to allow the mandate’s continued existence, is one that any medieval king, pope, god-emperor, or caliph would have found perfectly familiar. A ruler had to do what a ruler had to do. And you knew his actions were legitimate if he got away with them and succeeded in his goals. 

While it has monarchical origins, the reliance on inherent powers does not by itself render a government monarchical. Early American nationalists like Hamilton, Lincoln, and Daniel Webster had made the modernizing transition that Bendix describes from God to “the people” or “the nation” as the inviolate source of governmental authority. By contrast, Bush and Cheney clearly hearken back to the older monarchical model in which everything rests with the supreme ruler and his supreme duties. The key difference between the two models lies in their approach toward law. While different societies tend to have very different legal traditions, in the crudest sense we may say that kings got to be kings by establishing themselves as the sole legitimate source of secular law within their realms. American government has long been celebrated as one of “laws, not men,” where law is created by following certain publicly known and set procedures and, in the process, obtains some form of popular consent.

This is where Bush and Cheney’s views and actions seem quite breathtakingly dangerous. There have likely been absolute monarchs whose lawmaking was more procedurally constrained than that of the present administration. “We have a system of law,” Senator Russ Feingold said of the NSA spying program. “He just can’t make up the law . . . It would turn George Bush not into President George Bush, but King George Bush.” While I hope and believe that George W. Bush has no intention of crowning himself, his mentor Cheney has been seeking “unimpaired” presidential power ever since his days as a junior aide in the Ford White House. Why should his president/boy-prince be forced to endure the insolent effrontery of pesky reporters and congressional investigating committees? For Cheney, the Imperial Presidency is a matter of personal and ideological conviction.

Despite my obvious preferences in present politics, the underlying philosophical question here is still an open one for me. All governments probably do have inherent powers they will have to exercise in times of crisis. Lincoln certainly faced one and probably made the most courageous and far-sighted choice. Yet we should be clear that we are doing just that—making a choice—when we endorse government action based on such thinking. Governing on the basis of inherent powers rather than clear legal-constitutional authority is a distinctly undemocratic, illiberal, and un-American approach to governance. As Lincoln recognized, it should be used sparingly and only when absolutely and indispensably necessary.

The problem comes when leaders manipulate the public sense of crisis to make extraconstitutional powers and presidential monarchy thinkable. The modern American Right has a long record of promoting phony or highly exaggerated crises for political effect, often as a way to attack aspects of democracy, especially the economic, cultural, and intellectual expressions of it that conservatives so dislike. Extensive freedom of expression, strict protections for the rights of the accused, and other civil liberties have never been popular with the dominant elements of the American Right, and strangely enough, the present crisis—whatever it is—always seems to demand that civil liberties be curtailed in some way. The 9/11 terrorist attacks only provided a more easily salable version of the ongoing crisis that the Right has been ringing alarm bells over for the past sixty years or more. The sudden salience of Islamic terrorism as an issue allowed Republicans to revive many of their old cold war themes and policies and provided the opportunity to apply them in Iraq.

There is pretty overwhelming evidence that the intelligence failures regarding al Qaeda and Iraq had more to do with incompetence and ideologically driven inattention and misperception—useful information had been gathered but was not acted on or reported correctly—than with a lack of “tools” such as legalized torture and illegal mass eavesdropping. Given that situation, I will let Thomas Jefferson’s first inaugural address give the last word, for now, on my behalf.

I know, indeed, that some honest men fear that a republican government can not be strong, that this Government is not strong enough; but would the honest patriot, in the full tide of successful experiment, abandon a government which has so far kept us free and firm on the theoretic and visionary fear that this Government, the world’s best hope, may by possibility want energy to preserve itself? I trust not. I believe this, on the contrary, the strongest Government on earth. I believe it the only one where every man, at the call of the law, would fly to the standard of the law, and would meet invasions of the public order as his own personal concern. Sometimes it is said that man can not be trusted with the government of himself. Can he, then, be trusted with the government of others? Or have we found angels in the forms of kings to govern him? Let history answer this question.

Further Reading:

The sources for all the quotations above are linked at the point where a particular document or news item is first introduced.

The Bush administration’s working theory of the executive’s nearly absolute powers in matters relating to national security and foreign policy has been given its most developed form by University of California, Berkeley, law professor John Yoo (a former Department of Justice official) in his book The Powers of War and Peace: The Constitution and Foreign Affairs after 9/11 (Chicago, 2005). Simply put, the Constitution does not seem to have much to do with it, except through the most, er, tortured constructions imaginable. Yoo can be heard defending the presidential power to do just about anything here. (Link via Information Clearinghouse.)

For my money, the most incisive recent commentary on the president’s role in our current system is a chapter in Jon Stewart’s America (the Book): “The President: King of Democracy.”

I don’t claim great expertise on the history of kingship or its theoretical basis, but the remarks above are influenced by Martin Van Creveld, The Rise and Decline of the State (Cambridge, 1999); Richard L. Bushman, King and People in Provincial Massachusetts (Chapel Hill, 1992); Robert Filmer, Patriarcha and Other Writings, ed. Johann P. Somerville (Cambridge, 1991); the first part of Gordon S. Wood, The Radicalism of the American Revolution (New York, 1992); and especially Reinhard Bendix, Kings or People: Power and the Mandate to Rule (Berkeley, 1978). History Book Club dealt in some weighty tomes back in those days. Van Creveld, a military historian based in Israel, recently had some choice words on the Bush administration and the Iraq War in the Forward.

In expectation of the hate mail I will soon be receiving from Alexander Hamilton’s many fans, let me urge any present-day liberals tempted to imagine Hamilton and the Federalists as their guys in the 1790s—I know a lot of historians who incline this way—to first read Mike Wallace’s review essay “Business-Class Hero,” about the New-York Historical Society’s Hamilton exhibit. That said, Max Edling’s recent book, A Revolution in Favor of Government: Origins of the U.S. Constitution and the Making of the American State (New York, 2003), convinced me that Hamilton was a more measured statist than I once believed. A somewhat overdrawn reminder that the early presidents were no strangers to the perennial presidential yen for secrecy and covert action is Stephen F. Knott, Secret and Sanctioned: Covert Operations and the American Presidency (New York, 1996).

While Hamilton and the Federalists strike me as far more respectful of the law than the present administration, one thing that Bush and Cheney still seem to have in common with the Federalists is a largely imaginary sense of social superiority to the rabble engaged in democratic politics. This week’s Time magazine contains a remarkable quotation in which the White House uses frank social prejudice as a way of distancing themselves from disgraced House Majority Leader Tom Delay: “Of the former exterminator, a Republican close to the President’s inner circle says, ‘They have always seen him as beneath them, more blue collar. He’s seen as a useful servant, not someone you would want to vacation with.’”

I imagine this piece will have many detractors, and I hope they and any supporters will take advantage of the Common-place Coffeeshop in making their views known. Future plans call for a blog-like discussion space that will be more directly linked to this column.

 

This article originally appeared in issue 6.2 (January, 2006).


Jeffrey L. Pasley, a former journalist and speechwriter, is associate professor of history at the University of Missouri, Columbia. He is the author of “The Tyranny of Printers”: Newspaper Politics in the Early American Republic (Charlottesville, 2001) and the co-editor (with Andrew Robertson and David Waldstreicher) of Beyond the Founders: New Approaches to the Political History of the Early American Republic (Chapel Hill, 2004).




The Clinton Impeachment: Dr. Clio Goes to Washington

On the weekend after the midterm congressional elections of 1998, I flew east to appear with a score of constitutional scholars before the Judiciary Committee of the House of Representatives. The subject of the hearing was the background and history of impeachment, and the occasion was the impending impeachment of President William Jefferson Clinton. Prior to the hearing, Forrest McDonald and I spent a pleasant hour discussing the subject on C-SPAN’s morning Washington Journal program. The highlight of that hour came when Brian Lamb replayed an excerpt from an interview he had conducted with Forrest back in the mid-1980s. When asked what one would see if one ventured out to the McDonald farm outside Tuscaloosa to catch the historian in the act of writing (in flagrante delicto, as it were), Forrest’s eyes dart to the side, a smirk briefly crosses his face, and then comes the disarming confession that he writes in the nude–at least in the summer. From the studios opposite Union Station, we grabbed a cab to the House office building on the far side of the Capitol, and checked in with the committee staff; then we went to the main hearing room and took our places.

Thinking of that moment ever since has reminded me of the scene in Larry McMurtry’s Lonesome Dove where Gus has to hang Jake Spoon, his Texas Ranger buddy gone bad, for throwing in his lot with the evil Suggs brothers. Gus says something like, “I’m sorry you crossed the line, Jake,” and Jake, distracted by the noose, replies something like, “I never seen no line to cross.” Walking into the committee room, I felt I had crossed a line as well. Testifying as an expert, and effectively taking sides in a highly charged political dispute, is not a role that historians assume readily, nor is it an opportunity that comes our way with any frequency. Forrest professed to be testifying only as an impartial scholar, but I, for one, wasn’t buying his line, nor was I so naive about my own sentiments as to claim to act in the same capacity. I am a native Cook County Democrat, with family ties to the old machine of the elder Richard J. Daley, and proud of it, and I thought then, as I do now, that Hillary Rodham Clinton (coincidentally the mother of one of my better-known students, though we had not yet been introduced) was close to the mark in her famous remark blaming a “vast right-wing conspiracy” for the impeachment.

 

Fig. 1. Alexander Hamilton, The Federalist, 1788, The Gilder Lehrman Collection, courtesy of the Gilder Lehrman Institute of American History, New York.

For many historians, becoming professionally involved in a partisan conflict (such as these hearings) or in litigation (which I have also done) risks crossing the line between scholar and advocate. Professional historians should have no problem in admitting ambiguity or uncertainty in our findings, but political and legal disputes leave little room for scholarly hemming and hawing. Of the nineteen “experts” testifying on November 9, only one, Michael Gerhardt of William and Mary, appeared in a neutral capacity. The others had been summoned by one party or another. My own invitation came through the assistance of my Stanford colleague, Deborah Rhode, then serving as a staff attorney for the committee’s Democratic minority.

For my part, I have to confess that I was not uncomfortable in this role. For one thing, I had already begun writing op-ed essays about the constitutional issues raised by impeachment, and had formed a position strongly critical of the theory upon which it was proceeding. For another, I felt, with characteristic immodesty, that my work on the origins of the Constitution offered a perspective on the Impeachment Clauses that only a handful of scholars were qualified to present. Legal scholars aplenty would be testifying, but they are used to adversarial argument, and cavalierly happy to deploy whatever materials serve the cause they favor without the historian’s due regard for the limits and ambiguities of the evidence. I had spent more than a decade developing a model or method for conducting inquiries into the original meaning of the Constitution, with due respect for the rules of using historical evidence, and this was too good an opportunity to pass up. Moreover, I felt then, and still believe now, that historians have a civic obligation to bring their knowledge to bear, even if it involves taking sides in a partisan dispute. Obviously it would be better to do so in a more balanced, less partisan forum. But if that is all that is available, why should we forego the opportunity, challenge, and obligation? The test has to be whether what one is prepared to argue in this public role is consistent with what one has written as a scholar. On this count, I had no qualms about my ability to present an originalist argument against the legitimacy of Clinton’s impeachment that would fully comport with the discussion of the presidency in my book, Original Meanings: Politics and Ideas in the Making of the Constitution (New York, 1996).

The hearings got off to a curious start. Although most of the full committee attended most of the day, the hearings were held under the auspices of the subcommittee on the Constitution, chaired by my fellow Haverford College alumnus, Charles Canady. I was naive enough to suppose that Canady might begin by thanking the witnesses for taking time out from their schedules to help enlighten the members on a truly difficult subject. Instead, his opening remarks seemed to amount to saying, we have a rope, yonder is the tree, and all we need is to catch the evildoer and string him up.

An even more curious interlude followed. Television monitors were turned on, and the members of the committee sat raptly watching a ten-minute video consisting primarily of earnest statements about the gravity of impeachment culled from the Watergate proceedings of 1974. This struck me as a strange way to get the members in the mood, but who was the historian to judge? Having been assigned to the afternoon panel, I sat back and prepared to watch the proceedings unfold.

One lesson became evident fairly quickly. The Judiciary Committee’s reputation as the most partisan and ideologically polarized committee on the Hill was well deserved. The tone of the questioning was sometimes amiable, especially when members were questioning friendly witnesses. But from Canady’s opening remarks on, it was difficult to ignore the intensely partisan character of the proceedings, or to resist the conclusion that the hearings were basically a sham because the members already knew (barring some unforeseen political contingency) how they would vote.

As a close student of the political debates of the Revolutionary era, I have always assumed that legislative debate must matter at some level–that there must be a point to deliberations. Yet it was sobering to observe and participate in a discussion where all the positions to be taken are preordained, where the rhetorical moves each side can make are sharply constrained by the structure of the dispute and the limits of the available evidence, where debate as such can therefore have little, if any, impact. Although “pre-commitments” were not possible for many of the issues that the revolutionaries faced in the 1770s and 1780s, the hearings were a useful reminder of the lesson that historians ignore context and circumstance only at their peril.

The second great lesson I took away from the impeachment proceedings came when I had to ask myself what, if anything, I had been able to add to the discussion–as nondeliberative as it turned out to be.

Here I have to begin by describing my substantive position on the merits of impeachment. In Original Meanings, I had argued that the establishment of the presidency proved to be the single most difficult and puzzling problem in institutional design that the Framers of the Constitution had confronted. There were no useful antecedents for the national republican executive the Framers contemplated and there were numerous perplexing uncertainties about the proper mode of election and the political dimensions of executive power. These considerations help to explain why the Framers literally looped around in their attempts to decide such interlocking questions as the mode of election, eligibility for re-election, length of term, and method of removal–including, of course, impeachment. The key decisions on the presidency emerged only during the final fortnight of debate, and even then, the key initiatives came out of the so-called Committee on Postponed Parts.

If any one factor best explained the eventual design of the presidency, I argued, it was the Framers’ desire to make the executive as politically independent of Congress as possible, while allowing Congress (or more specifically the House of Representatives) the residual right to elect the president should the electoral college fail to produce a majority. My testimony to the committee argued that that same principle should be applied to the interpretation of the Impeachment Clause.

In the case of President Clinton, the key problem was to determine whether the phrase “other high crimes and misdemeanors” should be read narrowly or expansively. A narrow reading would limit impeachment to offenses that amounted to a clear abuse of the public trust in the performance of official duties. A broad reading would leave much more to the discretion of Congress, and arguably embrace the kinds of nonofficial failings for which Clinton stood exposed. If one wanted to reason as an originalist, a narrow reading would be consistent with the idea that the Framers worried about leaving the president vulnerable to congressional pressure and manipulation–which is what the records of debate seemed to me to suggest. A broad reading of “high crimes and misdemeanors” carried the opposite implication–that the Framers wanted to make the president politically subservient to Congress–and that seemed incompatible with the evolution of the presidency through the course of the Federal Convention.

By the time my turn came to testify, the atmosphere in the committee room had eased considerably. The mood during the morning session had seemed quite charged, especially when Republican members took issue with Arthur Schlesinger Jr. for asserting that gentlemen always lied about sex. But by late afternoon, some of the committee members had absented themselves, and our circadian rhythms kicked in.

Reading my testimony and trying to gauge what sense the committee members could possibly make of it led to another insight. As much as members of Congress like to praise the Founders and cite useful passages from The Federalist to demonstrate their own learning, their sense of history is both underdeveloped, on the one hand, and distorted by their romantic attachments to the founding era, on the other. All of them, I am convinced, feel a deep kinship and sense of gratitude to the founding generation, for establishing the institutions and offices they now love to inhabit. All of them know how to wallow in the conventional trappings and expressions of American patriotism.

But few of them know much about how the Constitution was drafted, or have a good grasp of the political disputes and conceptual uncertainties of the Revolutionary era. They are much more inclined to think of the Framers imparting their collective wisdom to posterity than to realize that the decisions of 1787 were reached by processes not dissimilar to the ones in which they ordinarily engage. To offer, therefore, an account of the Impeachment Clause which emphasized George Mason’s idiosyncratic role in the debates, or the difficulty of defining “high crimes and misdemeanors,” or the deep uncertainty that clouded the entire discussion of the executive branch at Philadelphia in 1787, was (I sensed) to provide an unsettling lesson that they could neither assimilate nor easily apply. I believed that immersion in the historical evidence was essential to understanding the Impeachment Clause, but as I watched the bemused (or confused) look on Chairman Henry Hyde’s face, it occurred to me that I might just as well have been speaking Greek. My account of Mason’s and Madison’s respective concerns could hardly compete with his appeals to Thomas à Becket or the Normandy war dead.

But why should history matter at all to the members of the Judiciary Committee? With the sole exception of Mary Bono, whose presence on the committee was a bit of a mystery, they were all attorneys, and therein lay a clue to the workings and mentality of the committee. They might not know much about history, but they knew a lot about the workings of the legal system, the importance of witnesses telling the truth, and the likelihood that witnesses will try to shade their testimony in just the self-serving and indeed duplicitous way Clinton had shaded his. Their own substantial legal experience, in other words, readily shaped the way in which they thought about impeachment; even the most compelling account of the true historical origins and ambiguities of the relevant constitutional language could only be a quaint distraction.

So the unpleasant and messy truth is that the sense of nuance that historians bring to their work cannot be readily translated into the political sphere, especially in a controversy as bitterly partisan as the impeachment imbroglio. Does that mean that historians should refrain from engaging in such controversies, in part because they have little chance to influence them, and in part because they risk compromising their objectivity? I have already been attacked twice by the distinguished jurist and overly prolific legal writer, Richard Posner, for having signed the historians’ “October surprise” advertisement challenging the House impeachment hearings before the 1998 congressional elections and otherwise participating in an avowedly political debate under the bare pretense, as he sees it, of being scholarly. I take comfort, however, from the belief that everything I wrote during the impeachment mess was consistent with my prior scholarly writings. I still believe that historians have an obligation to inform public discussions as best we can, when the opportunity arises. And as citizens, we have the same rights to exercise as anyone else. But as historians we should also understand why our contributions, which rely on the nuanced feel for the past that we have to develop to ply our trade, are likely to have little effect. I accordingly no longer believe, as I did then, that if I could just be given forty-five minutes of prime time to present the equivalent of an undergraduate lecture on the origins of the Impeachment Clause, the country could have been spared the year wasted on the whole sordid affair.

Since September 11, 2001, I have entertained one further reflection about the impeachment imbroglio. At the time, nothing was more common than to hear opinions expressed on either side as to how history would judge either the president’s behavior or the passion with which his detractors hounded him even after they knew that he would remain in office until January 20, 2001. We now know, I think, what the truer judgment of history will really be. While politics dictated that the national government be paralyzed for a year with partisan foolishness, our enemies elsewhere were making other plans for us–plans that we perhaps could have been better prepared to confront. But of course Monica was more important.

 

This article originally appeared in issue 2.4 (July, 2002).


Jack Rakove is the Coe Professor of History and American Studies and professor of political science at Stanford University.




The Clinton Impeachment: Clinton Hating

As the hot glow of 1998-99’s impeachment crisis fades, and the Clinton presidency recedes into the past, we now know far more than we could have wanted to know about the former president’s personal life. We have also learned much that we should have known earlier about the right-wing agitators and propagandists who discovered, publicized, fomented, and sometimes simply manufactured scandalous accusations against him. Yet with all the ink spilt, strikingly little attention has been paid to the nature of the political passions underlying the crisis–the outsized and persistent contempt and resentment that the president himself inspired among a vocal minority of the American electorate.

Why did so many conservatives see the president not simply as a detested opponent but as a cheater, a deceiver, a beguiler, and a rogue? Why did many left-liberals regard him as a self-serving betrayer of their principles? And, perhaps most perplexingly, why did so many members of the cosmopolitan middle, what we might call the supercilious center–people who actually come very close to sharing the former president’s politics–hold him in such disdain? It won’t do simply to say that the accusations are true and thus the opprobrium justified; for one must then contend with the fact that the man was not only twice elected president, but maintained historically high levels of public approval through most of his presidency. Clinton hating was more than ordinary disaffection; it was aggravated and embittered, a phenomenon as much personal as political, and one that simply confounds conventional political analysis.

 

Fig. 1. First printing of the second draft of the Constitution from the Committee of Style. September 12, 1787. The Gilder Lehrman Collection, courtesy of the Gilder Lehrman Institute of American History, New York.

So how are this phenomenon, and the impeachment that was its logical culmination, to be understood in the context of the American constitutional order?

While the United States Constitution is a table of rules and procedures for organizing and running the national government, it was also devised–perhaps principally devised–as a structure to channel and break the tides of passion and political enthusiasm that are common to, and recurrently threaten, the existence of popular government. Impeachment had a narrow constitutional focus in the sense that the trial and attempted removal of the president followed the prescribed constitutional procedures. But it is perhaps more fruitfully understood as the culmination of a process that has several times recurred in American political history and is in some sense intrinsic to the American constitutional order: periods of turbulent political transition wherein the Constitution’s separation of powers prevents the resolution of basic political questions for an extended period of time. Parliamentary systems avoid this problem, providing the possibility of unified control of the levers of legislative and executive power even when substantial divisions in the electorate remain. But the separation of powers at the heart of the American governmental structure–along with the additional divided authorities created by federalism–creates too many redoubts and recesses of authority where committed oppositions can retrench, regroup, and stymie majorities.

The pattern of a two-term president who is widely popular but also deeply reviled in a period of rapid political, economic, and social change is not unprecedented in our history. In their own times, Franklin Roosevelt (1933-45) and Andrew Jackson (1829-37) engendered similar political polarization, with embitterment and contempt on the one hand, and a deep, intuitive identification with a broad mass of the population on the other. (The only other presidential impeachment, that of Andrew Johnson, originated similarly in a disjunction between the forces controlling the executive and legislative branches.) Franklin Roosevelt’s enemies vilified him as “that man”–a demagogue and class traitor who had seduced voters through a kind of illicit, hypnotic mass spell. Jackson was, in his own time, similarly reviled. Part aristocrat and part rough-hewn soldier, Jackson represented a new kind of politics and a new conception of the presidency. He too had a deep, intuitive connection with the American people that terrified his enemies and convinced them that he was a demagogue who threatened the very institutions of American government.

Both men’s presidencies had a transformative character. Each had a unique ability to connect and communicate with ordinary citizens, an ability that their enemies saw as phony, perverse, opportunistic, and ultimately dangerous. In each case the adversaries’ attacks upon the president only deepened the devotion of his supporters, in a circular and mutually reinforcing fashion. The antagonism over the man echoed deeper cultural and political rifts that remained inchoate, latent, or simply unspoken. The impeachment crisis of 1998 and ’99 had similar origins in unresolved political stalemate and the unrelieved passions and antagonisms this generated.

Over the years observers have posited a number of possible explanations for the enmity that grew up around the forty-second president. Early in his presidency the disaffection was often chalked up to generational transition: Clinton was the first president since John Kennedy to be well under fifty years of age; he was also the first president to have been fully washed over, and in many ways compromised, by the upheavals and experimentation of the 1960s. His very person, in this reading, became a battleground for a newly intensified version of the culture war that had been playing itself out in one form or another since the late 1960s. Yet another theory sees Clinton hating as rooted in a sort of baby-boomer self-loathing, a contempt born of the generation’s inability to reconcile its youthful indulgence with its middle-aged hypocrisy. Each of these explanations is partly true. But none is quite satisfactory.

To get a better purchase on these questions, let’s first distinguish among at least three kinds of Clinton hating–conservative, left-liberal, and cosmopolitan–which share common roots and predilections but remain nevertheless distinct.

The rhetoric of conservative Clinton hating is immediately familiar. Clinton is a liar, a phony, an immoral man, a deceiver. He can’t be trusted. He has “stolen” their issues. The feelings became all the more tortured and embittered because again and again Clinton won when he shouldn’t have been able to win.

Conservative Clinton hating echoes the McCarthyism of the 1950s, only not necessarily in the sense some of his supporters have argued. The subtlest historical interpretations of McCarthyism describe the movement as a product of two quite distinct forces–one crassly political and opportunistic, another deeply rooted in the insecurities of the early Cold War. In 1946 the Republicans won back the Congress for the first time in fourteen years, only to lose it again two years later, and be defeated in a presidential election they seemed certain to win. From what seemed like an expected restoration after Franklin Roosevelt’s death, the GOP now faced a fifth straight presidential loss and what seemed like it might be a near permanent exclusion from power in the national government.

This reverse made Republicans resentful; it also made them feel cheated. And they retaliated with an attitude that held no tactic or charge as beyond the pale. As Robert Taft, the respected leader of Senate Republicans, famously told McCarthy early in his crusade, “[K]eep talking, and if one case doesn’t work–proceed with another.” But partisan warfare was only half the story. It was a necessary, but not a sufficient, cause of what happened in the early 1950s. Only in a climate of deep-seated political uncertainty and fear could such concerted political attacks have had the truly explosive results they did. The early 1990s were not the early 1950s, of course, but in many respects the times were equally unsettled. The end of the Cold War, though immeasurably more benign than its onset, nevertheless created a similar disequilibrium in the nation’s politics, shaking free a swirling hatred of government and a search for internal enemies that had not been seen in so virulent a form since the McCarthy era. Journalists have described the partisan campaigns–open and covert–against Clinton, but why these efforts struck such a profound chord among a minority of the population still needs to be explained.

One clear reason for the outsized opposition to Clinton was how much his election–and even more his subsequent success–scotched the paradigm of historical and ideological transformation Republicans had been crafting for themselves during their twelve-year hold on the executive branch from 1980 to 1992. For partisan Republicans these three successive presidential victories were not simply the result of favorable times or quality candidates–for many Republicans, in fact, quite the opposite in the case of the first President Bush. They were the result of an epochal shift in the ideological complexion of the American electorate–a wholesale turn away from liberalism and the New Deal. Clinton’s election in 1992 might have been either an accident or simply a time-out in the Republican hegemony–à la Jimmy Carter. But his eventual success created a dissonance among partisan Republicans that was in its own way as galling as Truman’s unexpected victory in 1948, which had seemed to doom them to permanent executive-branch oblivion.

Much less visible to the general public was the equally charged antipathy toward the president among many liberals. The left-liberal Clinton hater found the president phony and inauthentic, willing to sacrifice any principle or precept not simply for expedience but for self-interest. At the same time, however (and in a partly contradictory fashion), these Clinton haters saw the president as providing Democratic cover for a complete surrender to Reaganism, with balanced budgets, welfare reform, and tax cuts. Like conservative Clinton haters, they despised him because he was something their map of the world didn’t account for: a Democrat who played to win, a Democrat who wasn’t afraid to play political hardball, cut necessary deals, or generally get his hands dirty in the inevitable back-and-forth of political warfare. Other similarities exist. Part of the depth of left-liberal disaffection with Clinton was that he had been successful when he should not have been able to be. In many cases he accomplished goals these critics had long espoused, by means that shouldn’t have worked. And perhaps most galling, Clinton was able to gain the support of constituencies left-liberals had long considered very much their own (women and African Americans particularly), even while eschewing their policies.

The third group, the cosmopolitan Clinton haters, is the most paradoxical, because its displeasure was not obviously rooted in specific ideological disagreement. For many, in fact, the level of disgust and disdain for the president appeared to be inversely related to ideological proximity. Political commentators and prominent press figures Howell Raines, Michael Kelly, Maureen Dowd, Joe Klein, Christopher Matthews, and most of the rest of Clinton’s most vituperative elite media critics were centrists of a vaguely liberal hue. This group includes much of establishment Washington, but it extends a good deal further, taking in an important slice of society up and down the Northeast corridor. With this group the element of class condescension and resentment runs deepest, and what seemed to cause the greatest irritation was that Clinton was both a “bubba” and a mandarin–two qualities that should not be able to coexist in the same person.

As in the cases of Roosevelt and Jackson, a group of journalists and intellectuals slipped into a pit of their own contempt for Clinton and somehow became unhinged by it. They became obsessed, and the obsession transformed them, in many cases leaving them damaged, certainly not the same. Among the prime examples are Stuart Taylor, Michael Kelly, Maureen Dowd, Christopher Hitchens, Nat Hentoff, and even Kenneth Starr. Every president has critics, and most of these began in a conventional enough way. But Clinton’s unwillingness to be defeated by conventional political means–typified by his refusal to resign after being impeached–undid them. The failure of ordinary means pushed them to extraordinary ones. Their failure to bring him down, paradoxically, magnified him in their eyes, leading these critics into an endlessly escalating series of polemics.

Clinton was different, of course, for at least two reasons. First, Jackson and Roosevelt each in his own way threatened important political and economic constituencies and interests. On the surface, at least, it is difficult to see how this can be said of Clinton. His policies were centrist and, after 1994 at least, cautious. His cabinets were liberally staffed with men and women who had made their careers on Wall Street. The stock market prospered mightily during his presidency. Many of the social pathologies that conservative politicians and social critics had railed against undeniably diminished during his tenure in office. Clinton’s policies did tack against the conservatizing course of his predecessors and cut against the pure celebration of the market that so typified the decade. But they were still generally friendly toward business and the market, and surely not nearly so leftist in complexion as the intensity of the opposition would imply. So–and I hasten to say again, on the surface at least–it is not immediately clear why Clinton’s presidency should have been so contentious and polarizing.

Second, as the president’s critics never tired of pointing out, while his public approval numbers were high by historical standards, Clinton always enjoyed more support than respect. His political strength was rooted in a politics of empathy, a fact which polling data, if scrutinized closely, bear out. Besides the normal horse-race polls we usually see, pollsters ask a variety of other basic questions, one of which is: Does politician X care about the needs of people like you? On many other questions Clinton’s numbers fluctuated drastically. Thus, for instance, according to the Gallup poll, from February 1995 to January 1999, the percentage of Americans who believed Clinton could “get things done” rose from 45 to 82 percent. Less favorably, over precisely the same period, the percentage of Americans who believed Clinton was “honest and trustworthy” dropped from 46 to 24 percent. But on the question of whether Clinton “cares about the needs of people like you,” his numbers remained virtually unchanged over the entire course of his presidency, averaging just over 60 percent. Without too much facetiousness, this might fairly be called the “feel your pain” index. And though he was roundly abused for that line, it was also the core of his political strength and resilience.

Many of those who opposed impeachment saw it at the time as an abuse of constitutional mechanisms provided for moments of extreme crisis and executive malfeasance. But the crisis itself–particularly the structure of government the Constitution prescribes, with its pronounced separation of powers, which frequently stalemates the resolution of major political divisions and questions–was just as clearly rooted in the Constitution. Separation of powers may benignly slow the workings of government, refining them through countless small revisions and the seasoning of radical reforms. But in an essentially democratic polity it also contains within itself the seeds of crises–crises for which the extreme solution of impeachment may have been virtually preordained.

 

This article originally appeared in issue 2.4 (July, 2002).


Joshua Micah Marshall, former Washington editor of the American Prospect, is author of Talking Points Memo. His articles have appeared in a wide variety of print and electronic publications including the American Prospect, the New Republic, the New York Times, Salon, Slate, Talk, and the Washington Monthly. He is currently finishing his doctoral dissertation in colonial American history at Brown University.




Turning Sexual Vice into Virtue

Thomas A. Foster, Sex and the Founding Fathers: The American Quest for a Relatable Past. Philadelphia: Temple University Press, 2014. 232 pp., $28.50.

Biographies are among the most popular forms of history because reading about individual lives humanizes historical moments. As with a new friend, we learn about the secrets, foibles, and successes of individuals and rediscover aspects of ourselves in the process. Historians have long suggested that objectivity and perspective are especially difficult to achieve (if ever possible) in biographical endeavors, because writers become attached to their subjects. In addition, the present has a way of creeping into our reading and writing, as our current code of morals, anxieties, and desires influences the types of questions we ask and the answers we seek. These problems are especially pronounced when dealing with the Founding Fathers, according to Thomas A. Foster, associate professor of history at DePaul University, who investigates the nature of biography and America’s relationship to sex in his book, Sex and the Founding Fathers. Foster’s work does not attempt to reconstruct the sexual lives of the Founding Fathers, but rather analyzes the biographies and historical memories related to six of the founders from the nineteenth through the twenty-first centuries. He finds that Americans’ contemporary sexual and gender mores consistently shape our narratives about the sex lives of the founders—often more than the historical record allows—and are important sources of our national and civic identities.

Sex and the Founding Fathers is not Foster’s first work positing the importance of sexuality and masculinity in understanding the United States’ national identity. In tackling the historical remembrance of George Washington, Thomas Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, and Gouverneur Morris, Foster brings his expertise to bear on the way Americans have interpreted the lives of men whose lives and histories have greatly affected Americans’ imaginations. Foster is careful not to offer his own analysis of each founder’s sexual past, and gives only short synopses of the historical documentation available to the biographers who crafted their narratives over three centuries. Nevertheless, some moments cry out for his own interpretation: How likely was it that Benjamin Franklin moved beyond flirtation in Paris, given the gender and sexual mores of the time? Given his expertise in the field, it is a loss not to have Foster’s full perspective and even more context for the sexual histories he provides. But this is not the task Foster set for himself in this book. He sought to reveal our perceptions of the founders, not to scour the historical record for evidence of sexual “truths” that do not exist.

As Foster writes, “we are stuck with the Founding Fathers,” and over the twentieth and twenty-first centuries, in particular, their romantic liaisons, adulterous affairs, and marriages have provided a way for new generations to assess the early nation (168). That is not to say that nineteenth-century Americans had no interest in the founders’ sexual lives. Indeed, they wrote biographies of the founders that touched on their sexual and personal lives. However, the general trend was to ignore indelicate information—such as Alexander Hamilton’s affair with Maria Reynolds, or Thomas Jefferson’s sexual relationship with Sally Hemings. Perhaps most compellingly, Foster shows that Gouverneur Morris’ frank and exuberant writings about his own sex life, which offer the clearest picture of any founder’s sexual encounters, were “whitewashed” (144) by his family and historians until the twentieth century. The turning point in discussions of the founders’ sexual lives occurred in the early twentieth century, when many Americans began to more openly embrace sex as a positive force in their lives. At this point, biographers began to consider Benjamin Franklin’s ribald humor and behavior in France, George Washington’s romantic past before his marriage, and Thomas Jefferson’s desirability, among other topics. By the twenty-first century, according to Foster, the founders became emblematic of a masculinity that valued successful marriages, as in the case of John Adams, while also being symbols of virility and a passionate sex life. In this way, the founders’ sexual vices became virtues in the hands of biographers. Understandings of Thomas Jefferson, for example, transformed from a “chaste widower” (46) in nineteenth-century biographies to a modern “multicultural hero” (75) in the twenty-first century as Americans increasingly accepted evidence of his affair with Sally Hemings and mostly ignored any possibility of sexual coercion on his part.

Each generation has contextualized the sexuality of the founders differently, but each has ultimately come to value them as role models of masculinity and sexuality. The desire to laud the founders has resulted in faulty histories more useful for nation building than for their innate historical worth. Foster writes that our quest for virile yet faultless founders reveals the importance of sexuality and gender in our national narratives and identities. Why, for example, Foster asks, has the marriage of John and Abigail Adams been idealized and covered in a “romantic gauze” (90), given John Adams’ prioritizing of politics and his career goals over his family? The couple’s long-term separations suggest their marriage was not the consistently romantic union it is characterized as today. The way we dismiss, embrace, and analyze information about each man suggests how caught up we are in the mythmaking of the founders—always finding ways of celebrating them as a contrast to the sexual and moral dilemmas of the day, such as the 1872 Beecher-Tilton affair or the 1998 Clinton-Lewinsky scandal. Even the capitalization of “Founding Fathers” suggests our deification of their historical contributions and personal lives, and our difficulty in treating the founders as actual men. “[A]nd so we rewrite and respin and remember them in various ways to present them in a positive light,” Foster concludes (168).

Foster’s book would have benefited from further in-depth analysis of the gendered constructs at work in the historical refashioning of the founders’ past. Expanding his discussion of the changing ideals of masculinity and male bodies would have added a fuller picture of the motives of writers, particularly during the re-imaginings of the 1950s and the twenty-first century, though Foster briefly details these issues with Gouverneur Morris and George Washington. Alexander Hamilton’s physicality and recent makeover are especially ripe for this line of inquiry. Some attention to the gender and sexual privileging of the founders is also necessary. Certainly the founders’ masculinity has been heralded, but what were the gendered implications of this in the writing of history? The discovery and celebration of the founders as ribald and virile depend upon an acceptance of patriarchal sexual ethics, and analysis of this would flesh out the interrelationship between writing, historical memory, and gender systems over time. Time and again, the adjustment that turned a founder’s previously unacknowledged sexual vice into a virtue was a reinterpretation of a woman’s sexual and moral nature. Thus, Maria Reynolds became a conniving lower-class woman who duped Alexander Hamilton; Martha Jefferson, a frigid woman who withheld love from Thomas; and several French women, lustful partners willing to forgo morality for a momentary sexual tryst. Studying the gender and sexual biases of authors would more fully reveal the way myths of the founders were created and how biases influence our history-making and nation-building.

Foster’s narrative is a thoughtful one that subtly challenges readers and historians to consider their motives in reading and writing history. Readers walk away with an understanding of how contemporary trends influence historical output and perceptions of the founders outside of academia. But Foster’s real contribution here is his evidence of the ways manliness and sexuality shape our understanding of the founders and subtly inform our mythmaking. Reading Foster’s narrative, one concludes that the discipline of history provides a starting point for understanding the human experience, and that it must self-consciously work toward creating histories true to the past while also relevant to our current moment. In this way, we will continue to build a truer past—full of vice and virtue.

 

This article originally appeared in issue 15.1 (Fall, 2014).


Kelly A. Ryan is an associate professor of history at Indiana University Southeast. She is the author of Regulating Passion: Sexuality and Patriarchal Rule in Massachusetts, 1700-1830 (2014), and is currently working on a project examining the implications of violence in the early national Northeast.




The Unbearable Taste: Early African American Foodways

I am the darker brother.
They send me to eat in the kitchen
When company comes,
But I laugh,
And eat well,
And grow strong.

From Langston Hughes, “I, Too, Sing America”

Setting the Table

As an interpreter, researcher, and educator in the subject area of enslaved people’s lives and foodways, I am often asked, “Where did you go to learn all of this?” The sincerity of the question enhances the awkwardness and anguish of making a humorous retort. My reply generally goes: “Well, I read. I cook on the hearth as much as possible. I listen to the elders of my past and those who are left, and I make mistakes and learn not to repeat them. I lose my eyebrows and arm hair to the fire, and often burn my hands and spill things. Sorry, there are no B.A., M.A., or Ph.D. programs in how to cook like an eighteenth- or nineteenth-century slave or how to work in a cotton, tobacco, corn, or rice field.” There is nothing sexy about what I do. And what little feeling of “comfort” people get from the smells, stories, and samples I have to offer is quickly turned bittersweet by the discomfort of the real meat of the meal—slavery.

 

Fig. 1. Michael W. Twitty cooking a holiday meal at Stratford Hall Plantation, Montross, Virginia, birthplace of Robert E. Lee, 2009. Photo by C. H. Weierke, courtesy of the author.

 

We don’t shy away from slavery as much as we once did. It always seems to raise eyebrows and get the discursive blood flowing. Most of this kinetic response comes from thinking about slavery as a challenge to democracy. The same could be said for the mythos surrounding the Underground Railroad, the Civil War, and the Civil Rights movement, in which what we debate is not what made this debate necessary in the first place—slavery—but rather how the debate was resolved in terms of protests, movements, politics, wars, great men and women. I am an iconoclast. I’m really not turned on by this lofty democracy talk. It doesn’t tell me how I got here, how we got here. In some ways it makes me angrier because slavery rolled into quasi-freedom in slow motion, while wars of independence and the birth of the personal computer seemed to revolutionize American life in minutes. I cannot understand my history looking through that lens. Slavery was not a complication; it was a civilization.

 

Fig. 2. Heirloom garden, Beall-Dawson House/Plantation, Rockville, Maryland, 2009. Photo by C. H. Weierke, courtesy of the author.

 

My hero is not the freedom seeker as much as the one who endured the institution out of sheer survival and the promise of better times, with freedom as an inheritance for posterity. My great questions do not revolve around the southern Founding Fathers and their issues with “the Peculiar Institution.” I care about the enslaved world they tried to shield themselves from with fences and hedgerows, columns and dumbwaiters, even while they lived in its midst. The smell, sound, feeling, and unbearable taste of being black and enslaved is where I find our common genesis. Slavery was central to the American economy and to the birth pangs of black America, and it served as the underpinning narrative for all the definitions of ethnicity in America to come; the chains are still on us.

There is no better way to encourage people to understand and empathize with the enslaved than through food. While these “notes from the field” are intended to create a picture of my process and findings, I should state outright that by recreating and preserving these traditions, my ultimate mission is to give humanity and dignity to my ancestors. Invariably, my audiences participate in some part of the food preparation. Children come to understand concretely what being enslaved was like. Elders testify to witnessing the vestiges of the nineteenth century in the dimness of their childhood, and people of various colors and cultures find themselves in my ingredients, methods, and narratives. Whatever poetry, truth, or wisdom people find in what I do or teach, they themselves bring to an already prepared table. What follows here is a sense of what they find at the feast.

Early African American foodways—toward a definition

Is slave food soul food? Is it born in Africa and transported to America, or is it inspired by Africa but essentially an American creation? Does it center on key staples, or is it the improvised brainchild of iron chefs in iron chains? Alternatively, aren’t early African American foodways really just the black side of a generic lower-class, early American palate?

Definitions are notorious for being the simplest way to be complex. It’s easier to say what the label “early African American food tradition” does not connote than what it does. It was not soul food, even though it’s clear there would have been no soul food without enslaved people’s food. Soul food is the moniker that the great-great-grandchildren of enslaved people gave to the accumulated culinary knowledge and flavors that they felt helped to define them as an ethnic group. However, soul food has a spice that enslaved food did not. The average enslaved person would only intermittently have enjoyed elements of the classic “Sunday dinner” of the soul tradition. Poor whites in early America shared some of the staple foods that enslaved people consumed, but they were not accountable to a “higher authority,” like a master or an overseer, when they sat down for their meals; and poor whites, if they acquired the financial means, could purchase or grow or raise whatever they wanted. Access to ingredients and luxuries, and accountability to white figures of authority, were navigated by the enslaved through patterns of resistance and both individual and communal empowerment. Nor should we forget that we have racialized certain foods and cooking methods over the past two and a half centuries for definite reasons—they were associated with Africans, African Americans, and enslaved people. To ignore this is to disregard historic perceptions of ethnicity as well as the ways that we simultaneously create parallel cultures and maintain distinctions between them. Neither modern soul food nor the food of the lower-class whites contemporary with American slavery serves as a useful model for how enslaved people ate.

 

Fig. 3. Mr. Duckett’s slave cabin and garden, Prince George’s County, Maryland, 2010. Photo by C. H. Weierke, courtesy of the author.

 

What we can say about the ultimate origins of early African American foodways is that they would not have taken the same shape had they not been born in a complex nexus known as the “African Atlantic.” The foodways of the West and the Americas were not first introduced to enslaved Africans in America; they were introduced in Africa, sometimes long before contact with Europeans. This pattern of exchange built on connections with the Islamic world and Southeast Asia; it drew African plants and foods north and east even as ingredients and dishes from these regions came into the African world. Commensurate with the creation of new Creole languages, spiritualities, and aesthetics in the Atlantic world was a flowing food tradition that drew on West and Central Africa for its deep structure, values, and “grammar,” as Charles Joyner once put it. This tradition then found in European (and, by extension, Native American) traditions the parallels, luxuries, limitations, and possibilities that created an edible jazz. In key regions—Senegambia, the Gold Coast (modern Ghana), southeastern Nigeria, and west-central Africa—the culinary negotiations were immediate and palpable. American crops parallel to African cultigens would come to flourish—think maize and cassava, tomatoes, peanuts, and cucurbits. Where classic African traditions, Creole innovations, and caste control met, there the foodways of early African America were born.

Early African American foodways were also the product of the interplay between ethnic groups from West and Central Africa and new “nations” that emerged out of the catalyst of the Middle Passage. People drawn from a 3,500-mile stretch of coast who had no contact with each other before the slave trade now found themselves having to work out a common Afri-Creole food tradition parallel with a common Afri-Creole tongue and culture. Wolof met Igbo, Mende met Mbundu, Akan met Fang. There was also some level of exchange and movement between enslaved communities in mainland North America and those of the Afro-Caribbean and Latin America. The story is made yet more complex by the fact that whomever enslaved and liberated Africans lived among—be they Dutch, Swedish, Scots Irish, German, Sephardic Jewish, Cherokee, Creek, French, or Spanish—would either enrich or limit the culinary realities of their exile in America. No enslaved people in the history of the world has exercised such an enormous influence on the national cuisines of the lands of its sojourn as the peoples of the African Atlantic and the African Americas.

Transitions of taste

It has occurred to me that if we were to transport enslaved African Americans through time to the present, they would be amused and amazed by the current trends in American food. Local, seasonal eating? Sure! More than anyone else, enslaved people had little choice but to rely on the immediate environment for their needs, since they were literally bound to the land. Eating offal, wild plants, and game as something chic? Perhaps not chic, but necessary and favorable? Absolutely. Comfort food prepared and served in unpretentious ways, eaten as a “mess”? Affirmative. Underground eating, food prepared and served guerrilla style apart from the prying eyes of authorities and judgmental eaters? Again, a winner. The conventions of enslaved people’s foodways from the colonial and antebellum periods are remarkably consistent with contemporary passions about the possibilities and power of food as a communal experience.

Enslaved Africans and their descendants clung to hot and spicy foods. They maintained their cravings for salt, grassy leafy greens and herbal tonics. They liked their soups and stews with gummy okra and other mucilaginous textures; they favored one-pot meals where flavors melded and rice cooked so that every grain was single and distinct. They punctuated heavy starchy fillers with the unctuous flavor of heavy oil and crispy outside skins of fried foods. They satisfied their desire for what foodies now call umami (a savory, meaty taste) with parboiled roasted meat, boiled peanuts, and the slightly aged and fermented taste of preserved fish and meat that lent savor to cowpeas.

But there was no single enslaved food culture.

One fundamental shift in what this cuisine looked and tasted like probably occurred about the same time that the majority of enslaved blacks were American born—a shift that occurred between 1740 and 1760. By the end of this period, the trade in enslaved peoples began to slow and was ended in the North and Chesapeake by the 1780s. In the Lower Mississippi, Gulf Coast and Lowcountry, successive waves of late importations would continue into the nineteenth century, well past the congressional close of the slave trade in 1808. The result was that the cuisine of the Lower South reflected ongoing influxes from Africa long into the nineteenth century. In the Chesapeake, influential to the Upper South and parts of the cotton belt, the cultural impact of earlier demographic patterns was key to the uniquely African American approach to Southern food. The foods, cooking methods, and manners of eating that appealed to, say, a man arriving from what is now southeastern Nigeria in the 1720s might not be so appealing to his great-granddaughter born in southeastern Virginia in the 1770s. Technology would have accounted for some of the differences in taste between this man and his great-granddaughter. He would have grown up in a household where the kitchen hearth was an outdoor or separate building with a fire pit that held three supporting stones on which a slim variety of clay or iron trade pots could be placed. Wooden spoons, large knives, mortars and pestles, brush whisks, baskets, grinding stones and gourds used as bowls were the primary culinary implements. Contrast this with the brick hearth, metal pots and pans, and multiple utensils that his great-granddaughter might have used as a cook at the home of a prominent Virginia master. In her world, food was to be served in courses and made into separate dishes rather than a single-pot entree. And the food she prepared would have included baked goods, a novelty to her great-grandfather, whose traditions did not include either wheat flour or long baking.

Africans living in the Americas, like their European counterparts, developed a sweet tooth. European and Islamic tastes for honey-flavored desserts lost ground to king sugarcane. The traditional African palate favored bitter, spicy and oily dishes over sweet ones. Raw honey, sugarcane or sorghum cane (native to Africa and brought to the South in the mid-nineteenth century) and fresh fruits were valued but not essential snacks. In the Americas, however, Africans would not only cut cane, but would come to develop new ways of their own to produce confectionery. Sucrose was one of the few comforts known to bondspeople. Great-grandfather may have looked askance at anything but fresh fruit and honey; but his great-granddaughter would already be fond of jumbles, biscuits, cakes, sweet bread, jams and other delights including a newly introduced preparation from grain—macaroni.

Both great-grandfather and great-granddaughter would have based their “vegivore” diet on a starchy preparation made from grains and tubers enhanced with leafy greens and bits of meat or fish and nuts or legumes as protein. While great-grandfather’s diet was based on yams, legumes, some millet and sorghum and leafy greens from tropical crops, great-granddaughter’s diet was based on corn first and foremost, supplemented with legumes, sweet potatoes and leafy greens from temperate brassicas. Both diets would have incorporated African ingredients or New World substitutions. Okra, Bambara groundnuts, sesame, cowpeas, sorghum, watermelons and muskmelons, hot peppers, and peanuts all provided a taste of an African home. Sweet potatoes, temperate and tropical pumpkins, Eurasian brassicas (like cabbages, kale, collards) and a variety of indigenous North American fruits—berries, persimmons and the like—were substitutions for West and Central African foods. At the same time, foods consumed by enslaved blacks in the Americas found their way to Africa. The corn that would have been so important to great-granddaughter was consumed on both sides of the Atlantic. Like other grains and tubers, it was mushed, popped, roasted, fried, cooked into flat cakes, and made into loaves. The sole difference between its African and American preparation was that nixtamalization (treating corn with lye or lime to loosen the hull, as in hominy) was not utilized in West and Central Africa although the process was endemic to the southeastern coast of North America. The result was that corn remained a poor addition to the diet in Africa, but served as a fairly nutritious food in America.

And meat? Great-grandfather would have enjoyed heads, feet, innards, tongues and eyes of domesticated livestock. Each part would have had its own spiritual and culinary meanings and savor. His great-granddaughter would relish what protein she could get but was tempted by the cuts her owner preferred. Long after slavery, her descendants would look on the cast-off parts with disgust and associate them with powerlessness rather than their equally important roots in West and Central Africa.

While Europeans (and Native Americans, for that matter) influenced the foodways of the enslaved community, enslaved women and men also influenced the food traditions of their neighbors and owners. Native Americans in the southeast began growing watermelons, sweet potatoes (read “yams”) and cowpeas. Europeans could not live without pepper vinegar and other condiments influenced by fiery African tastes. Native persimmons replaced the fruit of the ebony tree and pawpaws were substituted for bananas; cutting grass rat stewed with yam became possum roasted with sweet potatoes; violets became “wild okra.” Early receipt books give a hint of the transformation from European to Southern. We can see the shift with every recipe for fried chicken, turnip tops cooked “Virginia style,” chickens stewed with sweet potatoes, gumbo, hoppin’ John or cowpeas cooked with rice, black-eyed pea cakes, barbecue (a term with possible roots in the Hausa word “babbake/babbaku,” “to grill or toast”), sweet potatoes cooked in syrup, okra soup, and treats made from plants with African names like “fevi/ochra/gumbo” (okra), “benne” (sesame), “goober” (originally referring to the Bambara groundnut, later the peanut), “pindar” (peanut), “cala” (a fried rice ball), and fried plantains, the bananas cooked in New Orleans. We should not confuse written “receipts” with real dishes. Not all dishes had names, and the necessity of improvisation meant that a mess of wild greens eaten with a hoecake has only as much meaning as we assign it. But to West and Central Africans and their descendants, dinner was always a starchy or leafy dish eaten with a soup or stew or, much more rarely, a roasted or fried protein.

Research, interpretation, presentation and preservation

There is a lonely sense of merit in attempting to embrace and interpret the enslaved cooks of the eighteenth and nineteenth centuries. The audiences at my museum demonstrations and programs admire what I do, but few patrons express interest in joining in the reenactment. There are no legions lining up to participate as there are with Civil and Revolutionary War enthusiasts. Accuracy is hard to assess. And anyway, “cooking as the slaves did” should really be me serving a visitor or student a bowl of mush or rice or a corncake with no frills except perhaps some salted meat or fish, and water, for several meals a day. That might be the most jarring and authentic way to transfer the message: Enslaved foods were supposed to be filling and blandly satisfactory, not tasty and comforting.

 

Fig. 4. Picking cotton, Surry County Virginia, Chippokes Plantation State Park, 2007. Photo by C. H. Weierke, courtesy of the author.

 

The prospect I lay out for the visitor is quaint but unbearably tedious. I must rise before sunrise and, using the leftover coals, re-ignite the fire, often building a wooden tower from which large shovelfuls of heat-giving charcoal will rise. It’s important to know what type of wood to use, since each has its own burning qualities, taste, and levels of neutrality and savor. The menu is determined by which animal is in its prime and which garden truck is in season. To feed both Big House and slave cabin, I have to answer a flurry of questions:

What species of wild flora and fauna were a part of the diet and what is their present status as food—rare/endangered/threatened/common? (As much as I am responsible for accurately re-creating the past, I have a greater responsibility to the future.) What heritage breeds and heirloom vegetables and grains approximate the tastes known to the people of the past? When is poke safe to eat? What are the breeding habits of possums? What is the proper way to eat a biscuit? Which hand do I use? What is the color of a ripe persimmon? The folk knowledge that went into cooking for kitchens high and low is immense. One has to know blanc mange as well as ashcake. Is jambalaya au congri from New Orleans the same thing as hoppin’ john in Charleston?

The receipt books of the past are my tour guides; they get a vote but not a veto. One has to allow for “work presence,” as Karen Hess defined it—the moment a cook deviates from the formula, which in itself is only a distillation on paper. I allow my informants—the formerly enslaved and their observers—a say. I also focus on locally available foodstuffs. I cannot assume that because a food is available in one place it is available in another, or that what is common today was common in the past. I grow the herbs and heirlooms I need for the pot months in advance, source protein from farmers and hunters and fishermen, order rare and costly grains, and limit myself to the seasons’ produce. It is difficult but necessary work.

 

Fig. 5. Harvesting Charleston Gold, Clemson Agricultural Center, near Charleston, South Carolina, 2010. Photo courtesy of the author.

 

I also rely on written sources ranging from Hannah Glasse and Adam’s Luxury and Eve’s Cookery through Mary Randolph, Lettice Bryan, Sarah Rutledge, Elizabeth Lea, Mrs. Hill, Lafcadio Hearn, B.C. Howard and the Times-Picayune to Robert Roberts and Mrs. Abby Fisher. Historical receipts and cookbooks must be mastered even if they don’t have absolute power over me as an interpreter. One has to know how to subtract and add ingredients according to the needs and restrictions of the enslaved community. In a sense, my working goal is to master two or perhaps three cuisines—the meals of the British-French-American classics of the Big House, soup to nuts, patty shells to snow eggs; the foodways of the enslaved community in times of want and of plenty; and the outskirt, antique parent traditions of West and Central Africa, Brazil, and the Afro-Caribbean and Latin America. These traditions serve to inform and verify the work of re-creating and restoring.

Some might question my expansive/inclusive approach to documentation. The only universally acceptable form of evidence seems to be limited to observations made by white men living 200 years ago. I reject this as not only culturally biased but short-sighted. Evidence for the foodways of the enslaved can be gleaned from the remarks of slave traders, slave owners, and plantation visitors, and some of this evidence is amazingly specific. However, these observations have to be weighed against the cultural backgrounds of an enslaved community, the ethnicities they encountered, the available food in any given region, the accounts of enslaved people themselves, historic receipts, and not least the oral tradition, culinary traditions, and memories that have filtered down into the present. Culture bearers are important counter voices to academic pronouncements. To suggest that my 92-year-old grandfather has nothing valid to add to the discussion, despite his having lived in the rural South in an era not much different from the antebellum period or having been in the presence of his enslaved grandparents; or that the testimony of a West African immigrant or the ethnographic writings of the early colonial period are moot merely because they were not recorded during the eighteenth and nineteenth centuries, is ludicrous and arrogant. Checks and balances on the sources and accuracy of data are fairly intuitive. If nothing else I work with the dictum, “if you can’t prove it or prove it reasonable, don’t say it or cook it.”

There is of course a body of knowledge that only experience can help accrue. It is impossible to understand what it meant to be a cook if you have not experienced the fear of moccasins in a Carolina rice paddy or known the blindness that the sun casts on cotton on a hot early autumn day. Tobacco stains the hands and tasseling corn numbs them. The lower back and feet ache from hours on brick kitchen floors. You learn the wisdom of the recipes for salves and cures listed in the same receipt books that provide inspiration for menus. Your arms will teach you about the heft needed to carry 40- and 60-pound pots and the stamina it takes to cook a large meal by the standard dinner hour of two or three p.m.

These experiences are still not as instructive as the ones that time, place and fortune excuse me from. What if I were forced into the field for half a day to get the harvest in and made to leave the relative safety and privacy of the kitchen? Conversely, what if I had to feel the daily loneliness of the kitchen, being separated from much of the rest of the enslaved community? What if I were whipped for tasting too much, or for not tasting enough and sending out something burned? What if my accidents and tardiness were not forgiven? What if I had to wear the horse’s bit to keep me from eating though I was malnourished?

Free to fail in a way my ancestors never were, I spend my work scouring letters from slave traders, WPA slave narratives, and emancipatory narratives and recording my observations about the smell of game musk, gardening by pine-knot and moonlight, and trial and error with Spanish moss as support for baking dishes. I am now a veteran hog butcher, from dividing up the carcass to salting the meat down to each pore. I know the forearms it takes to pound corn and rice and the caution one should exercise when attempting to catch bullheads, crawfish or snappers. I’m working on having the “light touch” in baking, on expertly executing the multi-hour antebellum barbecue, and on designing the perfect roux. Frequently I find myself on sourcing safaris where I track down Senegalese cowpeas and bitter leaf, dried okra and calabaza and spiny gherkins, and hunt down Sieva beans and sesame seeds to plant. Though I am not bound by the colonial and antebellum calendars, I trace time in terms of tobacco and cotton seasons, acknowledging the 101 labors needed to produce food on the plantations of yore—from orchard work to slaughtering to salting fish to shifting the gardens and truck patches from spring to snow, and treating the treats of historic cookery with the respect of a starving man.

The other part of the work is preserving it for the future. Putting measurements to orally transmitted compositions for recipes is not enough. Some foods are dying out not because they are physically endangered but because they are culturally endangered. Reenactment is also reintroduction. You have to eat it to save it. You have to ensure that the tastes that make the history relevant and alive for the enthusiastic learner will endure. African Americans are still not well represented in the current drive to save our national and regional heritage foods. Yet these are the same foods that reverberate with deep and complex histories and that connect us to global legacies. My mission is to use this platform to reconnect my people with the culinary heritage many left behind in the transition from agrarian to urban realities. More than that, I believe these stories have the power to connect people across racial and ethnic lines, and to create a table where we are all at once welcome to finally sit and partake of the fruits of our ancestors.

 

This article originally appeared in issue 11.3 (April, 2011).


Just Add Sparkling Grape Juice: Toasting and the Historical Imagination in the Early Republic Classroom

 

“Everybody get ready—lift your glasses and sing
Everybody get ready to lift your glasses and sing
Well, I’m standin’ on the table, I’m proposing a toast to the King.”
—Bob Dylan, “Summer Days”

Today most people’s experience with toasting is limited to weddings. Toasts have long ceased to be a significant part of our political world, but in early America the practice of politics depended heavily on sociable drinking and toasting. At festive celebrations, men and women lifted their glasses to toast all sorts of subjects, from kings and presidents to military victories, legislation, principles, and historic events. Toasts variously praised, mocked, protested, or condemned their subjects. Their performance helped to create a convivial world of free-flowing conversation and song, food, and drink. Through toasting, individuals broadcast their political affiliations to the public. They also cemented social and political relationships, forging a sense of group belonging. By the end of the eighteenth century, toasts often appeared in newspapers, the expanding social media of the time. In this way, toasts not only reflected but potentially influenced public opinion.

 

“A Glass with the Squire” or “Aqua-Fortis,” etching by James David Smillie (1886) after painting by Eastman Johnson (1880). Courtesy of the American Antiquarian Society, Worcester, Massachusetts.

To gain insight into the historical experience of toasting, students in my early republic class write and perform a series of toasts from the vantage point of the Federalists and Republicans of the 1790s. For three years, I have experimented with this activity in the classroom, and the toasting exercise has proven popular with my students. Preparing the toasts pushes them to develop their historical imaginations. Students step back into the past. As they write the toasts, they attempt to think in the language of 1790s politics. When they perform the toasts, they experience toasting as a social performance, an approximation of the historical experience of toasting in the early republic.

Historians trace the origins of toasting to the ancient world. Think of Plato’s Symposium, with participants at that famous banquet performing toasts on the subject of love. However, during the seventeenth and eighteenth centuries, toasting achieved an unprecedented importance in European society. In eighteenth-century Britain, the practice complemented the rise of urban sociability. People ate and drank together in taverns and coffeehouses. They met in various societies with irresistibly fun names like the Kit-Cat Club, the Green Ribbon Club, the Red-Herring Club, the Easy Club of Edinburgh, the Beefsteak Club, and the Society of the Dilettanti. Clubbing was all the rage. In British North America, colonists emulated the sociability of the mother country. They established associations like the Junto in Philadelphia, the Tuesday Club of Annapolis, and the Old Colony Club of Plymouth. At these gatherings, participants shared laughter, made connections, devised self-improvement schemes, and advanced their reputations. These social occasions provided numerous opportunities to feast and toast.

 

Broadside, “The Following Patriotic Toasts were Drank on the 19th Instant, at Hampton-Hall” (New York, 1770). Courtesy of the American Antiquarian Society, Worcester, Massachusetts.

During the American Revolution, toasting became highly combative. Patriots and Loyalists each tried to convince the public that their toasts represented the popular will. Americans carried forward this contentious style of politics into their new republic. Historians of the so-called “new, new political history” such as Susan Branson, Simon Newman, Jeffrey Pasley, and David Waldstreicher have reconstructed that vibrant political culture, demonstrating how ordinary men and women participated in politics by reading newspapers, magazines, and pamphlets, by organizing meetings, parades, and protests, and by attending festive dinners and other celebrations.

Toasting played a crucial role in shaping that increasingly partisan and democratic culture. Politics, Newman has emphasized, were not confined to voting on election day—and toasting rites belonged to a political world beyond the legislature in which non-elite individuals participated. Such practices enabled those who did not possess the franchise—men without property, women from all ranks, African Americans, and others—to stake their claim to the republic. Branson has demonstrated that during the partisan battles of the 1790s, women offered toasts on political subjects, even in mixed company gatherings, while the toasts offered by men sometimes appealed to women for their approval. Seth Cotlar has illustrated how artisans, mechanics, and other working men used toasting rituals to express their devotion to the French Revolution, particularly to the principles of liberty and egalitarianism associated with transatlantic radicalism. Waldstreicher has stressed that republican celebrations in opposition to the Washington administration, including the printed toasts that derived from those gatherings, contributed to the formation of a national opposition party, while Pasley has suggested that the toasts printed in newspapers actually represented an early version of a party platform.

The writing and performance of toasts was a social experience. Sometimes toasts were impromptu, but more often they were planned for maximum publicity. Before a gathering, a committee of arrangements would be charged with composing a series of toasts. Written individually or collectively, the toasts would all be reviewed and often revised by the committee before their performance. Since celebrants would never toast a sentiment with which they disagreed, such advance planning was necessary to ensure a display of unity. At these celebrations, consensus meant everything, historian Peter Thompson has argued. When performed, toasts would be followed by cheers, applause, sometimes music, and often in the case of militia companies, a ceremonial burst of artillery.

After the celebration, local newspaper editors received copies of the toasts to publish in their papers. As these newspapers circulated beyond their point of original publication, editors in other locations reprinted the toasts. Whereas few toasts from the colonial era circulated beyond the places where they had originated, toasts began being transmitted across the extended republic. When complaining to his wife about the toasts drunk at a Republican “frolic” in Philadelphia, Vice President John Adams asked Abigail: “Have you read them [?]” By the 1790s, Adams naturally anticipated that Philadelphia toasts might be reprinted in Boston. The circulation of printed toasts, along with other materials, pushed many Americans to start thinking nationally, as citizens in one location became increasingly aware of what other people across the nation were doing.

When I began teaching, I conceived of a toasting exercise as a method of conveying to students the participatory and performative nature of early republic politics. This idea was prompted by my own experience of writing and performing a series of toasts on regicide at a Summer Institute hosted by the Jack Miller Center for Teaching America’s Founding Principles and History. After performing these toasts, I gained a new understanding of the political culture that I had been reading about in books for so long. I wondered if a toasting activity could have the same intellectual impact on my students.

In the weeks preceding the assignment, students become familiar with the larger narrative of 1790s politics through lectures and readings. They learn about how Americans over the course of that decade grew divided between the Federalist supporters of the Washington administration and its Republican opponents. In class, we cover specific episodes to illustrate this polarization, from the battle over Alexander Hamilton’s financial policies to the contrasting attitudes toward the French Revolution, the controversy over Jay’s Treaty, the Alien and Sedition Acts, and the election crisis of 1800.

Students read and discuss several journal articles on popular political culture to complement this larger narrative. In past years, my syllabus has included articles by Waldstreicher on the legacy of revolutionary-era popular protests and festivities, Newman on celebrations of Washington’s leadership, Albrecht Koschnick on the Democratic-Republican Societies, and Pasley on the newspaper wars of the 1790s. These articles introduce students to the concepts of the public sphere and print culture, which enhance their understanding of the social and cultural contexts in which toasting rituals took place.

Students also analyze primary sources such as correspondence, addresses, and newspaper editorials from the period. In reading these sources, students learn about the arguments that appeared in print to justify various political positions. They further gain insight into the distinct language in which people discussed politics during the 1790s. On one occasion, a student observed that a newspaper correspondent used the word “jealous” in a way that differed from how people use the term today. While the term is largely used now as a synonym for “envious,” in the eighteenth century people used it to mean vigilant, as in the citizens jealously guarded their liberties. This observation led to a productive conversation in class about political vocabulary and how the meanings of words change over time.


Finally, students read models of toasts culled from late eighteenth-century American newspapers. Toasts then fell into two categories. First, people drank healths to individuals like George Washington and to collective bodies like the Congress to demonstrate their admiration. Second, citizens drank sentiments to a wide variety of subjects. For example, at a joint celebration of the Democratic Society of Pennsylvania and the German-Republican Society of Philadelphia, participants drank to “the extinction of Monarchy,” adding: “May the next generation know kings only by the page of history, and wonder that such monsters were ever permitted to exist.” These examples give students a sense of the conventions of toasting that must shape their own historical toasts.

This combination of political narrative and readings in the secondary and primary sources prepares the students to work on their own toasts. For the exercise, the class divides into Federalists and Republicans. Working together, each team is charged with writing twenty toasts that reflect its politics. I advise students to begin by brainstorming the key issues of the 1790s and then figuring out their position on those subjects. Each team devises a list of topics such as the Bank of the United States or Jay’s Treaty, making sure that they adequately cover the decade’s political conflicts. The students proceed to experiment with writing their toasts, modeling them on the ones that they have read. They work together as I move around the classroom eavesdropping, advising, and reviewing their draft toasts. While I do not require students to meet outside of class to work on this assignment, the students invariably do this on their own initiative, either in person or virtually.

Typically, individual students take responsibility for writing three or four toasts. Even when they do this on their own, they interact with their peers and with me to revise their toasts. This activity encourages the writing process—brainstorming, drafting, and revising multiple times, which replicates the historical experience of writing toasts by committee. Students engage in this process in an intensive way that does not always happen when they write traditional essays. Maybe this particular assignment appeals to students who are immersed in the world of contemporary social media, like Twitter and Facebook. Perhaps also the knowledge that the students are going to perform these toasts in front of the class pushes them to perfect them. One student, for example, observed that “the pressure of sharing [the toasts] prompted me to do research beyond class notes.”

 

Title page, The Vocal Remembrancer; Being a Choice Selection of the Most Admired Songs, Including the Modern. To which are Added Favourite Toasts and Sentiments (Philadelphia, 1790). Courtesy of the American Antiquarian Society, Worcester, Massachusetts.

The major challenge students face in writing the toasts is putting their thoughts in apt historical language. One student quipped that “it was interesting and fun to use the lingo of the time,” but admitted that his group struggled to find the right words. After immersing themselves in the primary sources, students learn how keywords such as aristocrat and mob functioned as political insults during the 1790s. In one mock Republican sentiment, a student toasted: “To Mr. Hamilton’s Financial Plan: May the people come to see the true nature of the national debt as shackling the states to a federal aristocracy.” Another student, representing the Federalists, toasted the Democratic-Republican Societies “For showing us that the mob can be as tyrannical as the king.” Both toasts successfully employed the political vocabulary of the late eighteenth century, demonstrating that these students were thinking and writing in historical rather than contemporary terms. Often when students first approach the partisan conflicts of the 1790s, they assume that those struggles paralleled today’s two-party political system. By reading the sources and then writing toasts, students reach a deeper, more nuanced level of historical understanding. Students realize that in the early republic, Americans’ political vocabulary and assumptions about politics differed dramatically from their own, and that realization opens up a rich field for exploration.

I simulate the experience of historical toasting—and enhance the fun—by bringing plastic cups and a few bottles of sparkling grape juice, red and white, to class. A student who admitted to me that she was nervous before performing her toasts gained confidence when her team members responded to her toasts with several "Huzzas." Through that collective cheering, students experience first-hand the social energies unleashed through toasting rituals. Ideally, if students connect their own emotions to those of the early republic's citizens, they can make some valuable inferences about how those citizens must have felt when toasting. Some students thrive on this social aspect of the assignment. The toasting exercise "helped me get to know my classmates," commented one student, and theatre students particularly love the opportunity to connect history with performance.

 

“Toast and Sentiments” from The Vocal Remembrancer (Philadelphia, 1790). Courtesy of the American Antiquarian Society, Worcester, Massachusetts.

Ultimately, this toasting exercise allows me and my students to bring the politics of the 1790s to life in our history classroom. Students are typically familiar with the traditional narrative of early American history, particularly the heroic representation of the founding fathers. They are less familiar with the political and cultural environment in which men like Washington, Jefferson, and Adams lived, a world that they shared with other lesser-known Americans who have not made it into the history books. In taking on the political identities of Federalists and Republicans, students imaginatively enter into that historical world. Ideally, in writing toasts, they gain insight into how people in the 1790s thought, and, in performing their toasts, they experience the social power of the early republic’s festive politics.

Further Reading

John Adams’s comments on the toasts drunk at the republican “frolic” in Philadelphia appear in Letter from John Adams to Abigail Adams, February 8, 1794. Adams Family Papers: An Electronic Archive, Massachusetts Historical Society. The “extinction to monarchy” toast derived from The General Advertiser, May 3, 1793.

 

A good narrative of 1790s political developments can be found in James Roger Sharp, American Politics in the Early Republic: The New Nation in Crisis (New Haven, 1995). For an overview of the “new, new political history,” see Jeffrey L. Pasley et al., Beyond the Founders: New Approaches to the Political History of the Early American Republic (Chapel Hill, 2004). On the British context of sociability, see Peter Clark, British Clubs and Societies, 1580-1800: The Origins of an Associational World (Oxford, 2000).

The last study that focused exclusively on toasting rituals appeared over sixty years ago: Richard J. Hooker, “The American Revolution Seen Through a Wine Glass,” The William and Mary Quarterly 11:1 (January 1954): 52-77. For a discussion of toasting within the larger context of alcohol consumption in early America, see Peter Thompson, “‘The Friendly Glass’: Drink and Gentility in Colonial Philadelphia,” Pennsylvania Magazine of History and Biography 113:4 (October 1989): 549-573. Discussions of toasting also appear in the following works: Simon P. Newman, Parades and the Politics of the Street: Festive Culture in the Early American Republic (Philadelphia, 1999); Susan Branson, These Fiery Frenchified Dames: Women and Political Culture in Early National Philadelphia (Philadelphia, 2001); Seth Cotlar, Tom Paine’s America: The Rise and Fall of Transatlantic Radicalism in the Early Republic (Charlottesville, 2011); David Waldstreicher, In the Midst of Perpetual Fetes: The Making of American Nationalism, 1776-1820 (Chapel Hill, 1997); Jeffrey L. Pasley, “The Tyranny of Printers”: Newspaper Politics in the Early American Republic (Charlottesville, 2002).

 

This article originally appeared in issue 16.2 (Winter, 2016).


Michelle Orihel is an assistant professor in the Department of History, Sociology, and Anthropology at Southern Utah University. Her research focuses on politics and the print media in early American and Atlantic history. She is writing a book about the Democratic-Republican Societies and opposition politics in the trans-Appalachian West during the 1790s. She has published articles in The Historian and The New England Quarterly.