What’s “Sacred” about Violence in Early America?

Killing, and dying, in the name of God in the New World

One of the most chilling images in early American history is the deliberate firing of Fort Mystic during the Pequot War of 1637. Five hundred Indian men, women, and children died that day, burned alive along with their homes and possessions by a vengeful Puritan militia intent on doing God’s will. “We must burn them!” the militia captain famously insisted to his troops on the eve of the massacre, in words that echo the classic early modern response to heretics. Just five months before, a Puritan minister had exhorted his congregation in strikingly similar terms to destroy a more familiar enemy, Satan; “We must burne him,” John Wheelwright told his parishioners. Indians and devils may have been scarcely distinguishable to many a Puritan, but their rhetorical conflation in these two calls to arms raises a question: Was the burning of Fort Mystic a racial or a religious killing?

The simple, and no doubt right, answer is that it was both. In early modern Europe, people were defined as much by what they believed as by how they looked. The line between Christian and non-Christian was the one fundamental divide that separated people, communities, and kingdoms into hostile camps, and it certainly does not surprise us to see seventeenth-century Christians (not to mention latter-day ones) justifying bloodshed in the name of God. In the British North American colonies, where the “sacred” had a more tenuous material and institutional existence and where legitimacy of any kind was harder to come by, it is nearly impossible to disentangle religious violence from other forms of aggression. One could easily say that all actions undertaken by European settlers to defend themselves and their communities in the vast missionary field that was the New World bespoke religious anxieties and aspirations. But having said this, it is still possible to place some phenomenological and interpretive boundaries around the problem of “religious violence.” What forms of violence should we categorize as “sacred” violence, and how do we know when notions of the sacred are at stake in acts of violence, whatever form they might take? (I should note that, though I am using the terms sacred and religious interchangeably, there is a difference: as I’ve come to understand the terms, the sacred refers to all that which exists outside or transcends the human sphere, whereas religion refers to those practices and institutions that defend the sacred within the human sphere. Sacred is the more encompassing term and the less definable.)

A survey of the myriad ways Europe’s Christians found to kill and maim one another in the sixteenth and seventeenth centuries—the age of the wars of religion—would uncover formal and rhetorical similarities in narratives of martyrdom, massacre, iconoclasm, judicial torture, and orchestrated assaults (both popular and state-sponsored) on dissenting communities. The colonial Indian wars of the seventeenth century fit very comfortably in this historical landscape; in numerous ways, these wars may best be understood as continuations and extensions of Europe’s wars of religion. Nearly identical forms of violence marked all these events (burning of innocents, dismembering of combatants and posthumous violations, destruction of sacred objects such as Bibles, cannibalism and other ritualized acts of consumption of one’s enemies, and a central preoccupation with blood and its purgative and purifying properties). And, perhaps more tellingly, these atrocities were often described in identical language from one genre to the next; the tortures endured by martyrs at the stake in the Old World, so meticulously and lovingly described by Reformation and Counter-Reformation propagandists, are exactly replicated in narrative detail in the graphic accounts of the Indian wars published by sympathetic missionaries in the New World. (The parallels are even more pronounced in Spain’s conquest of its Indian population.)

 

"Various Methods of Massacreing the Protestants in the Vallies of Piedmont in Italy." This and subsequent illustrations are taken from The New and Complete Book of Martyrs, by Rev. Mr. John Foxe and revised by Paul Wright, DD (New York, 1794). Courtesy of the American Antiquarian Society.
“Various Methods of Massacreing the Protestants in the Vallies of Piedmont in Italy.” This and subsequent illustrations are taken from The New and Complete Book of Martyrs, by Rev. Mr. John Foxe and revised by Paul Wright, DD (New York, 1794). Courtesy of the American Antiquarian Society.

This is grim reading. And I will spare you the gory details. My point is simply that there are striking similarities between New World violence against indigenous peoples, perhaps the exemplary form of colonial violence, and the European wars of religion, which many historians consider the apex of human savagery in the early modern era. The formal and rhetorical similarities only take us so far, however, in understanding the role of the sacred in colonial violence. Ultimately, what seems to distinguish sacred violence from other acts of aggression is not its form but its intensity. Certain thresholds (emotional, ideological, and perceptual) had to be crossed in order for violence to be interpreted and sanctioned as serving religious ends rather than secular ones. Religious wars, by definition, seem to be more brutal, more zealous, and less tempered by regret or remorse than other forms of warfare. This is not a new or very interesting insight; we’re reminded daily of the ferocity and single-mindedness with which people defend their faith in the face of perceived threats. But it may be useful to explore just how, and where, thresholds of legitimacy come to be established in different cultures and why and how the sacred is invoked.

Many gifted thinkers have devoted considerable energy and creativity to decoding the relationship of violence to the sacred. Some have suggested that all violence is in some sense sacred, that it is rooted in the deepest human desire to defend what is most precious and transcendent. Others have reminded us just how central acts of violence (the crucifixion, for example) are to the core principles and symbols of the world’s great religions. From René Girard’s classic anthropological theory of sacrificial rites to Elaine Scarry’s imaginative reading of the imagery of wounds and wounding in the Old and New Testaments to Ariel Glucklich’s much more recent theory of the psychological roots of “sacred pain,” scholars from across the disciplines have tackled the ubiquitous association of religion with violence.

For the historian of colonial America, the question is not the ubiquity of religious violence but the apparent scarcity of it. The starkest and most brutal forms of persecution—the burning of heretics, wholesale destruction of sacred places and objects, the forced expulsion and enslavement of outsiders such as Jews and Huguenots—were noticeably absent from the British colonies. But the European periphery produced new and sometimes bizarre forms of sacred violence: the ritualized assaults by Puritans on witches and Indians, which some scholars consider a peculiar form of iconoclasm; the proliferation of martyr tales within the context of slavery and Indian captivity; and the emergence of a hyperbolic rhetoric of suffering and redemption that traveled easily from religious to secular genres. Colonial Americans seemed (in good Protestant fashion) particularly adept at vicarious forms of violence. Words and objects, not people, were their main targets.

 

Top, "St. Lawrence Burnt on a Gridiron by order of the Emperor Valerianus in the 8th Roman Persecution of the Christian Church"; Bottom, "Two Primitive Martyrs put into Copper of Boiling Oil by order of the Proconsul of Ephesus during the Reign of Nero. AD 69." Courtesy of the American Antiquarian Society.
Top, “St. Lawrence Burnt on a Gridiron by order of the Emperor Valerianus in the 8th Roman Persecution of the Christian Church”; Bottom, “Two Primitive Martyrs put into Copper of Boiling Oil by order of the Proconsul of Ephesus during the Reign of Nero. AD 69.” Courtesy of the American Antiquarian Society.

There is one obvious answer to this puzzling lack of traditional religious violence: the absence of the traditional targets of such violence—cathedrals, religious art, shrines, and, above all, significant urban concentrations of “outsiders” like Catholics and Jews. Furthermore, the structural weaknesses of the colonial governments meant that the religiously disaffected had neither the resources nor the institutional backing to mount a serious assault on the objects of their rage. It is no coincidence that the only instance of official martyrdom (the execution of four Quaker missionaries in Massachusetts between 1659 and 1661) occurred in a colony that boasted both a strong established church and a powerful magistracy. When we consider that the era of mass execution of heretics had largely ended by the time the British colonies were settled (the last Englishman burned at the stake for heresy was Edward Wightman in 1612), it is perhaps not surprising that so few colonists died for their beliefs. Moreover, the spectrum of popular sacred violence in Europe—which included such acts as cursing, blasphemous songs and jokes, and carnivalesque gestures aimed at reinforcing popular morality—also narrowed noticeably in the American colonies. The ecclesiastical landscape in North America simply didn’t offer the same opportunities for expressing sacred rage as the Old World, even if that rage continued to simmer in the hearts of European settlers who had so recently endured so much pain and suffering on account of their own religious beliefs.

A significant proportion if not an outright majority of immigrants throughout the seventeenth and eighteenth centuries were driven to the colonies by religious persecution at home—the Puritans, of course, but also the Quakers, Huguenots, Scots-Irish Presbyterians, German Pietists, Moravians, Shakers, and a smattering of minor sectaries such as the Dunkards and Muggletonians. For some of these migrants, the experience of persecution was a fresh wound, still bloody and raw, not a theological abstraction. (Some, most famously the Moravians, managed to ground both their collective historical memory and their theology in the image of blood and wounds.) And yet, the colonial settlements that mushroomed along the Atlantic coast in the late seventeenth and eighteenth centuries were, by and large, safe (or at least, non-lethal) environments for new generations of religious dissenters.

So much inflamed passion, so little actual persecution: such is the conventional story of colonial America’s religious history. Take martyrdom, for example. Martyrdom was the paradigmatic experience of sacred violence in the Reformation and Counter-Reformation eras—the ultimate expression of a community’s willingness to kill, and die, in the name of God. What immediately strikes the colonial historian is how few people were martyred for their faith in British North America. To provide some background: the most recent survey of early modern European martyrdom concludes that, as a conservative estimate, some 5000 people were judicially executed for religion in the sixteenth and seventeenth centuries, customarily by burning at the stake. If we expand this figure to include those whose deaths were the indirect result of legal prosecution or popular protest (those who died while imprisoned, for example, or who perished in religious riots), the number quickly climbs into the tens of thousands. Not every death produced a martyr, of course—martyrdom is an interpretive as well as a judicial category, the product of a particular believer’s conviction and a particular community’s need for heroes—and Europe’s Christians were assisted in the task of identifying and promoting martyrs by a well-oiled literary machine that poured thousands of martyr pamphlets from its presses. Post-reformation Europe was, in historian Brad Gregory’s words, “awash in martyrological literature.”

 

"The Burning of Mr. Julius Palmer (Fellow of the Magdalen College Oxford), Mr. John Givin, and Mr. Tho(ma)s Askiw at the Sandpits near Newbery Berks." Courtesy of the American Antiquarian Society.
“The Burning of Mr. Julius Palmer (Fellow of the Magdalen College Oxford), Mr. John Givin, and Mr. Tho(ma)s Askiw at the Sandpits near Newbery Berks.” Courtesy of the American Antiquarian Society.

Protestants proved, somewhat surprisingly, to be as adept at promoting the cult of martyrdom as the late medieval church had been—this despite their opposition to cults of any kind. The Protestant martyr was, in many respects, the successor to the Catholic saint: an exemplary figure whose spiritual heroics helped close the immense gulf separating God from man. Unlike saints, martyrs had no supernatural or intercessory powers, but, at the moment of a martyr’s death, believing witnesses could see the face of God shining through the flames. Martyrs’ lives and, especially, their deaths were told and retold, compiled into massive compendiums that served as important textual supplements to the Bible for a new generation of reformers. The most influential of these martyrologies was John Foxe’s eight-volume Acts and Monuments (known colloquially as the Book of Martyrs), first published in English in 1563 and widely available throughout the Anglophone world, including the American colonies. Colonial children were introduced to Foxe’s martyrs from an early age in the pages of the New England Primer, and the town of Concord helpfully purchased its own copy to be made available to all residents.

From the very beginning, then, narratives of martyrdom were as important as the experience of the stake itself in shaping the devotional style and religious subjectivity of English Protestants. Those who did not have to face the prospect of actual death could read about the deaths of others just like them. Martyrologies like Foxe’s were thus instructional manuals as well as commemorative histories: they showed faith to be an act of constant sacrifice, and they told ordinary believers how to defend their faith in the face of terrible pressures. The Protestant martyr was, first and foremost, a humble lay man or woman, not a religious superhero. The Quakers, formally the Society of Friends, best exemplify this ethos of universal martyrdom: a self-designated “suffering people,” Quakers understood suffering to be a collective not an individual experience and so made no distinction in their official records between the heroics of a select few and the more mundane deprivations endured by the many. All Friends were, in some sense, martyrs. “Narratives of suffering” were collected and published by the Society in the thousands to celebrate the whippings, imprisonments, financial hardships, and innumerable indignities inflicted on Friends by a tyrannical and unmerciful government. One such pamphlet “promiscuously” recorded the sufferings “as they were promiscuously inflicted,” and proceeded in an incantatory tone to enumerate “all the sufferings therein: the blood of those whom you had put to death, and the Ears that you had cut, and the Backs that you had torn, and the Limbs that you had endeavoured to starve, and the Bellies that you had kept empty, and the Houses and Estates that you have laid waste and devoured, and the necessities and straits you had put and exposed the People of the Lord unto . . . for their Conscience to God.”

 

The title page of volume I of The New and Complete Book of Martyrs. Courtesy of the American Antiquarian Society.

In the Quaker literature, martyrdom came perilously close to meaning persecution of any kind, no matter how mild or unspectacular. Other Protestants, especially their sectarian opponents, did not go this far, but all Reformers participated in this project to conflate suffering and death and to make narratives of suffering central to the community’s understanding of itself. There is a basic paradox here. Protestants came to experience martyrdom as both a normative category and an exemplary one: something that everyone might—even should—aspire to but that most would experience only vicariously as readers. As the historian of print David Hall points out, “to read about the martyrs was not the same thing as becoming one.” Protestants, it seems, wanted to have it both ways.

This was especially true in the colonies. If European settlers were relatively unfamiliar with the spectacle of executions for heresy, they proved avid consumers and producers of martyr tales—both imported and homegrown. Turning from legal records to pamphlet literature, we find martyrs everywhere in colonial America: Indian captives such as Mary Rowlandson who saw their ordeal as a testing of faith; Puritan and Anglican missionaries who loved to recount the hardships and deprivations they endured in their pursuit of souls in America’s wild backcountry; colonial soldiers in the many wars of empire fought on American soil in the late seventeenth and eighteenth centuries who believed they were redeeming the continent for God and King; long-suffering colonial magistrates and governors who felt persecuted by the ungrateful and importunate masses they were supposed to be governing.

Such martyr tales did not cease with the eighteenth-century dissemination of Enlightenment ideas about a rational and self-regulating universe that presumably demanded little heroic sacrifice and self-denial from its human subjects. As long as new sectarians continued to arrive from bloody Europe, as long as martyr tales continued to sell briskly, as long as the Protestant impulse to glorify suffering remained intact, the image of the martyr continued to exert a powerful pull on the colonial mind.

The British colonies were thus “awash” in martyr literature, a fact that returns us once again to the central paradox of colonial martyrdom: colonial martyrs were everywhere; religious violence (of the kind recognizable to early modern Christians or to historians of the Reformation era) was in short supply. What, then, are we to make of this oversized colonial martyr complex? Was this just a rhetorical hangover from a European past, or was it a symptom of deep-seated anxieties about the precarious nature of the “sacred” in a dangerous and alien land? And does it matter, or rather, how does it matter that martyrdom was primarily a textual experience in colonial America?

 

"A Bookseller Burnt at Avignon in France for selling Bibles in the French Tongue, with some of them tied around his Neck." Courtesy of the American Antiquarian Society.
“A Bookseller Burnt at Avignon in France for selling Bibles in the French Tongue, with some of them tied around his Neck.” Courtesy of the American Antiquarian Society.

This question, among others, has been at the heart of recent efforts to apply the insights of Michel Foucault and Norbert Elias to the historical experiences of Anglo-Americans in the seventeenth and eighteenth centuries, a period which saw enlightenment but also savagery: corporal punishment began to disappear from the judicial landscape, but the disciplining of slaves reached new heights of terror; blood feuds and shaming rituals were discredited by new generations of civil and political leaders, but successful new evangelical and pietistic movements restored the language of blood and sacrifice to religious practice; the sentimental novel captured the hearts of Anglo-America’s middle classes, but quasi-pornographic tales of sadistic cruelty and sexual violation moved the souls of abolitionists and other middle-class reformers. At the heart of this conundrum is the printed word’s capacity to (re)-create bodily experience—to capture for readers the physical sensations associated with a given activity (torture, enslavement, death, sex, birth).

The boldest interpretations of the early modern print revolution argue that texts and their readers became the predominant model for theories of human society: actors became spectators as the chaos of lived experience was sorted into clean and coherent linear narratives in which the particular was supplanted by the universal. In this interpretation, the stories people tell themselves about their lives in a modern world are truer, more consequential, than life itself. More cautious interpretations stress the circular, limiting quality of print, in which stories, however visceral and compelling, remain just that—stories. One conclusion is clear from all this: bodies and texts bear an imperfect and unstable semiotic relationship to one another. Neither can be a substitute for the other, yet, in a certain sense, neither exists without the other.

This is a pretty long-winded way of saying what David Hall said much more succinctly—reading about martyrs is not the same thing as becoming one. Still, I think we can make a case that something of the dread, terror, and ecstasy of martyrdom was available to readers. An interesting comparison is the literature of Indian-hating, which colonists produced in large quantities during the eighteenth century, especially during the French and Indian War. Though horrific in tone and graphic in content, much of this literature was written by men and women who had never encountered a real Indian or faced a real tomahawk. What the historian Peter Silver has called “the agreeable horror of Indian-hating” nicely captures the potent mixture of pain and pleasure such narratives produced in their readers. Reading about horror, after all, is never an entirely passive experience. It can have real emotional and ideological consequences: in the case of Silver’s war stories, hardened racial animosities and ethnic paranoia. In similar fashion, reading martyr tales may have been such a pervasive practice in the colonies precisely because of the absence of Old World sectarian persecution. Colonists had to find other, less immediate ways to fuel their faith. When we consider how quickly and vehemently anti-Catholic prejudices surfaced in the wake of the French and Indian War—at a time when the number of Catholics in the colonies was vanishingly small, and their institutional presence, entirely benign—we can see the political utility of keeping the memory of religious persecution alive in the form of martyr tales.

Of course, the ultimate irony is that, while the colonists were busy envisioning their own sufferings as a form of martyrdom, safe from the flames of religious hatred that still engulfed much of Europe, people were dying for their faith (or the faith of others) in the Americas. These people were not Europeans; many were not Christians or members of any faith recognized by Europeans. But they were clearly the victims of violence perpetrated in the name of God.

Should we extend the concept of martyrdom to include those who did not use or recognize the term? Should the five hundred Pequots who perished in the Fort Mystic massacre be considered “martyrs”? What about the Praying Indians who were herded onto a pestilential island in Boston Harbor during King Philip’s War and left to die while the Puritan militias burned Indian villages from Maine to Massachusetts? Or the peaceful Indians of the Moravian mission town of Gnadenhutten who were slaughtered by vengeful Scots-Irish farmers a century later in the Pennsylvania backcountry? We know that the German Moravians considered their Indian brethren at Gnadenhutten to be martyrs to the cause, and I suspect that New England’s Christian Indians had their own martyr tales to tell of King Philip’s War, even if they left almost no written accounts of their ordeal. To move further into the dark borderlands of the colonial “violence frontier,” how about the thousands of Africans who suffered (in Jon Butler’s provocative phrase) a “spiritual holocaust” when they were torn from their native villages and cosmologies and forced into slavery in the American South? Should the violence of renaming, the loss of African genealogical and spiritual roots, be compared to the violence of burning at the stake? And what of those slaves who were burned at the stake—the unfortunate men and women who fell victim to the southern slaveholders’ paranoia about fire and poison throughout the eighteenth century or, on a larger scale, the “conspirators” in the 1741 New York arson scare who formed a human bonfire at the hands of the city’s terrified citizens? How much of the ideological complex of European heresy hunting was recreated in the spectacle of slave malefactors or Indian villages being put to the flames?

These are questions not easily answered. And the answers depend in part on which perspective we wish to adopt—that of the victims or that of the aggressors. As Europe’s Protestants knew all too well, one person’s martyr was another’s heretic. From the perspective of the historian of the European colonial experience, it seems reasonable to suggest that the act of burning alive was an expression of religious anathema, whether reserved for heretics or racial others, and that those who suffered (and perpetrated) this horror were understood to be fulfilling religious roles. Whether construed as heathens, infidels, apostates, or devil worshippers, Indians and Africans occupied a position of spiritual significance for their European neighbors, and acts of violence directed against these religious and racial outsiders were, I would argue, always acts of sacred violence. In this sense, the terrible wars of religion that destroyed so much of Europe did not end in 1648 with the Treaty of Westphalia, or in 1689 with the Act of Toleration in Great Britain, or in 1710 with the final defeat of the Camisard Revolt in France but continued to gather victims well into the Age of Reason.

 

This article originally appeared in issue 6.1 (October, 2005).


Susan Juster is professor of history at the University of Michigan. Her recent publications include Disorderly Women: Sexual Politics and Evangelicalism in Revolutionary New England (1994) and Doomsayers: Anglo-American Prophecy in the Age of Revolution (2003). She is currently working on a cultural history of religious violence in the American colonies.




“Feed on Humane Flesh and Blood? Strang mess!”: A Puritan Communion Cup

Silver beaker by John Dixwell, 4 5/8 in. x 2 3/8 in. (c. 1715), Henry Needham Flynt Silver and Metalware Collection. Courtesy of Historic Deerfield, Massachusetts.

This silver communion cup, made around 1715 by Boston silversmith John Dixwell for the First Church of Deerfield, Massachusetts, was likely used in a Puritan communion service.

Communion in Deerfield would have looked something like this: the congregation gathered in the meetinghouse, sitting in pews facing the minister at the center of the room. Stepping down from the pulpit to the communion table, the minister blessed a flagon of wine, symbolizing the blood of Christ, and poured its contents into an array of silver vessels, including this two-handled cup. Deacons carried the cups of wine to the ends of the pews. One by one, those members of the congregation who had been declared fit to take communion passed the cup along the pews. They drank and meditated on the body of Jesus Christ and on the body of the congregation.

Behind this ritual lurked the shadow of another, the Catholic mass, in which a priest consecrated wafers and wine, transforming them into the body and blood of Christ. In the Puritan communion service, the substance that communicants ingested was merely a metaphor; in the Catholic mass, communicants consumed the product of a miracle called transubstantiation. Although Puritans designed their service in direct opposition to the mass (and similar Anglican practices back in England), their version of the Lord’s Supper had more in common with the mass than Puritans wanted to admit. The silver communion cup embodied the tensions that Puritans faced in their religious experiment in New England. The cup fostered community as it enabled bodily contact between communicants and Christ. At the same time, however, communion ritualistically excluded outsiders. These tensions took on a fierce urgency as the Puritans warred with their French Catholic neighbors to the north in Canada.

Taking Communion

The Dixwell communion cup forged ties between the bodies of communicants and the body of Christ. It fit easily in the hand, with two handles that allowed communicants to pass it along the pew. Other Puritan vessels of the time would have been more difficult to hand down the pew, as they had only one handle or none at all. The Dixwell cup was designed to be handled by large groups of people, without any spilling of wine; however, the sheer delicacy of the handles (likely replaced at a later date) still encouraged users to grasp it carefully. In the Catholic mass, the priest placed consecrated wafers on the tongues of believers. By contrast, the Puritan communion service, in a manifestation of the priesthood of all believers, granted the communicant direct contact with the blood of Christ: communicants raised the lip of the cup to their own lips and drank.

Made to be touched, the communion cup both facilitated contact between bodies and formed a body of its own. Bodies are vessels, containers of viscera, and the communion cup enclosed a substance that purported to be the essence of life itself. As a metal, silver warms quickly to the touch. Puritan communicants would have warmed the communion cup, and maybe the wine within, with their hands as they passed the cup down the pew. The cup and the wine sloshing within might have felt like a pulsing, living thing. Even so, the handles of the Dixwell cup prevented users from touching the body of the cup, much less the precious wine within; the contact between the body of the consumed and the body of the consumer was well-regulated.

Defining Communities

Just as the cup regulated contact between the body of the communicant and the body of the consumed, Puritans and Catholics drew different boundaries around their communities of communicants. In order to take communion at mass, Catholic communicants needed to have made confession of sins, fasted since midnight, and expressed faith in the miracle of transubstantiation. By contrast, many Puritan congregations “fenced” the communion table, allowing only certain laypeople to take part. Potential members had to complete an extensive devotional regimen before the congregation deemed them ready to take part in the Lord’s Supper. Early in the seventeenth century, only those who had publicly declared their conversion experiences were allowed to be baptized and take communion. By the late seventeenth century, church authorities began to allow people who had not announced their conversion but who lived godly lives to participate in the sacraments. A sometimes uneasy compromise, this Halfway Covenant lasted into the eighteenth century. Puritans and Catholics defined their bodies of communicants differently, with Puritans taking pride in their more restrictive Lord’s Supper.

Debating Communion and Cannibalism

Like the cup itself, the wine within held complex meanings for Puritans, who rejected the Catholic belief in transubstantiation. The most important division between the two religions was the question of what communicants ingested during the communion service—blood or wine, flesh or wafer. The resulting arguments—miracle versus metaphor—played out over and over again in Catholic and Protestant writing and practice.

The Catholic doctrine of the miracle of transubstantiation relied upon a literal interpretation of Jesus’s words in 1 Corinthians 11:24-25: “This is my body … This cup is the new testament in my blood.” A priest’s blessing transformed sacramental wafers and wine into Jesus’s actual flesh and blood. According to the Douay catechism, the Eucharist was “the Body and Blood of Jesus Christ … under the forms or appearances of Bread and Wine.” John Gother, an English convert to Catholicism, placed belief in transubstantiation at the forefront of Catholic faith. “My Saviour Jesus Christ,” he wrote, “I firmly believe Thou art really present in the Blessed Sacrament; I believe that it contains thy Body and Blood, accompanied with thy Soul and Divinity.”

Protestants disagreed, insisting that Christ’s words should be interpreted only as a metaphor. The Westminster catechism specified that communicants partook of Christ’s flesh and blood “not after a corporal and carnal manner, but by faith” alone. Nevertheless, Puritan devotional writings contained a hunger for communion that seemed to transcend figures of speech. Cotton Mather expounded upon the life-sustaining qualities of bread and wine, declaring that Christ’s love similarly fed the soul: “If Bread nourish & strengthen the Body, much more will the Lord Jesus do so, to the Souls of them, who draw near unto Him,” he wrote. Though Protestants were not consuming actual flesh and blood, Mather argued that a true believer would nevertheless be able to “Discern the Lords Body in the Lords Supper.” But the importance of the Lord’s Supper went beyond discerning the holy in the seemingly mundane. Communion satisfied a particular kind of spiritual appetite. Another Puritan minister, Thomas Doolittle, asked of communicants, “Do you love him, would you not desire to eat and drink at his Table, yea, to feast upon him? … Did you hunger after him, and thirst for him …?”

Edward Taylor certainly did. The Westfield, Massachusetts, minister wrote reams of devotional poetry, including several “Meditations” on John 6:53, “Except you eat the flesh of the Son of Man, and drink his blood, ye have no Life in you.” Unlike Mather, Taylor did not belabor the distinction between wine and blood. He described the Lord’s Supper in literal, visceral terms: “Thou, Lord, Envit’st me thus to eat thy Flesh / And drinke thy blood more Spiritfull than wine.” Communion provided a special kind of nourishment that secular food could not, nourishment without which believers would starve: “I must eate or be a witherd stem,” Taylor declared.

In spite of hungers like Taylor’s, most Puritans interpreted transubstantiation as nothing less than a lust for human flesh. The idea that Catholics consumed Christ’s real body and blood made them “so much worse than Canabals,” Mather declared. Taylor therefore recognized the tricky balance he had to strike between venerating the body of Christ and remaining a metaphorically minded Puritan. One of his meditations posed these very questions about communion: “What feed on Humane Flesh and Blood? Strang mess! / Nature exclaims. What Barbarousness is here?” Like a good Protestant, Taylor answered himself by arguing that Christ’s words were symbolic: “This Sense of this blesst Phrase is nonsense thus. / Some other Sense makes this a metaphor.”

Spilling Blood

The consequences of the debates between miracle and metaphor would be literally bloody, leading to centuries of religious warfare after the Reformation, in the Old World and the New. As New England’s Puritans defined their own beliefs and communities, they did so with an anxious eye toward the French Catholics in Canada just to their north. Between 1690 and 1763, New France and New England were at war more often than at peace.

The intimate act of communion incorporated the body of Christ into one’s own. Those Puritans who drank from the silver cup hungered after communion, a hunger that in many ways resembled the Catholic hunger for the host in the mass, though Puritans were loath to admit it. The body of the communion cup helped to bridge the space between the believer and God, but it also divided believers from one another. Puritans and Catholics drew the boundaries of their communities in blood, then went out to draw the blood of their enemies.

Further Reading

For more on New England communion silver, see New England Silver and Silversmithing 1620-1815, edited by Jeannine Falino and Gerald W.R. Ward (2001); and Barbara McLean Ward, “‘In a Feasting Posture’: Communion Vessels and Community Values in Seventeenth- and Eighteenth-Century New England,” Winterthur Portfolio 23:1 (Spring, 1988): 1-24. Observers rarely described Puritan worship services in detail, but much of what historians know about them is summarized in Philip D. Zimmerman, “The Lord’s Supper in Early New England: The Setting and the Service,” New England Meeting House and Church: 1630-1850, Dublin Seminar for New England Folklife Annual Proceedings 1979, edited by Peter Benes (1979): 124-134. On Puritan beliefs and practice, important works include David D. Hall, Worlds of Wonder, Days of Judgment: Popular Religious Belief in Early New England (1989); Charles E. Hambrick-Stowe, The Practice of Piety: Puritan Devotional Disciplines in Seventeenth-Century New England (1982); and Sally Promey, “Seeing the Self ‘In Frame’: Early New England Material Practice and Puritan Piety,” Material Religion 1:1 (2005): 10-46.

On Protestant-Catholic conflict and accusations of cannibalism, see Catalin Avramescu, An Intellectual History of Cannibalism, translated by Alistair Ian Blyth (2009); Karen Gordon-Grube, “Evidence of Medicinal Cannibalism in Puritan New England: ‘Mummy’ and Related Remedies in Edward Taylor’s ‘Dispensatory’,” Early American Literature 28:3 (1993): 185-221; Maggie Kilgour, From Communion to Cannibalism: An Anatomy of Metaphors of Incorporation (1990); and Richard Sugg, Mummies, Cannibals, and Vampires: The History of Corpse Medicine from the Renaissance to the Victorians (2011). On religious conflict and convergence in early New England and New France, see Laura M. Chmielewski, The Spice of Popery: Converging Christianities on an Early American Frontier (2011); and Linford Fisher, The Indian Great Awakening: Religion and the Shaping of Native Cultures in Early America (2012).

 

This article originally appeared in issue 16.3 (Summer, 2016).


Carla Cevasco is a doctoral candidate in the Program in American Studies at Harvard University. She is writing a history of hunger in colonial New England and New France. 

 

 

 




Holy Man, Holy Head: John Wesley’s Busts in the Atlantic World

Sponsored by the Chipstone Foundation

Busts are odd things. Heads free of their bodies, severed at the shoulders, often at the neck, and plopped on architectural bases. They emerge from the sides of buildings, arise from monuments, line libraries, occupy museums, attend grave markers, and greet us from dreary governmental buildings. Busts are, in essence, decapitated heads. And perhaps, in that sense, it isn’t a surprise that the Romans pioneered the form as they slashed their way across the Mediterranean world. Most cultures have fixated on the head in their efforts to represent people. After all, the head is where we do our thinking, speaking, listening, and where our emotions reveal themselves—the whole self in a small compass, as it were. But there is always a whiff of destruction in the desire to capture the body in stony substance. The body is not above decay, but in order to set it in stone, humans must cut, chisel, smother, cast, even fire—enact a series of small deaths to represent the body in perpetuity.


Despite the ambivalence, sometimes verging on antagonism, among Protestants towards the sacred image, this essay tracks a fixation on the severed head of a prominent Protestant divine, even saint: John Wesley, the preeminent founder and leader of the evangelical movement known as Methodism (fig. 1). Wesley’s head was not chiseled out of fine marble, but almost exclusively cast in pottery. Efforts to capture Wesley’s body and possess a part of his “true nature” participated in a mid- to late-eighteenth-century obsession with highly realistic portraiture, in sculpture as much as painting. But instead of a block of stone from which a head emerged, a modeled clay depiction of the holy man from “life” was made into a mold and mass-produced in a dizzying array of pottery forms across the Atlantic world. These countless molds were filled and fired with the clay of the small landlocked county of Staffordshire, England. In so doing, Wesley’s presence traveled in ways that transcended his printed words, carrying not only his bodily image but the actual land and labor of an area of England that held him in special regard.

At the center of Protestant devotion to eminent divines is mimesis—an urge to be cast in the mold of their preeminent saint. But it would be eighteenth-century Staffordshire potters who would do this work first. Through their massive production they were able to assert their “great man” as a means of connecting a rapidly expanding but scattered Methodist tradition, and as a means of applying the firm pressure of the past upon their tumultuous present.

 

1. A selection of “Wesleyana.” Courtesy Wesleyan University Library, Special Collections & Archives. Photo by the author.
2. John Wesley figurine, mid-19th century. Painted lead glazed pearlware. Possibly Minton (Minton design books contain similar designs). Molds were often copied and shared among Staffordshire potteries. Courtesy Wesleyan University Library, Special Collections & Archives. Photo by the author.

 

Objects are prone to legend. They have a tendency to accrue more myth than history, and more hearsay than labels. The material culture of this Protestant saint is no exception. In an unknown location in England, at an unknown time, an artist, Mr. Culy, invited a friend over for evening tea. After initial pleasantries, the two were soon circumnavigating the artist’s home gallery of portrait busts. One bust stood out to his visitor. Who is this? the friend asked. Why, it is a bust of the “Rev. John Wesley,” the artist replied. His visitor was enamored, and he was not alone. Culy told his visitor that it “struck Lord Shelburne in the same manner as it does you.” But when Lord Shelburne learned that it was John Wesley, he was aghast: “He—that race of fanatics!” Culy had assured Shelburne that Wesley was a very humble man, and had always refused to have his likeness taken, thinking it “nothing but vanity.” But after offering Wesley “10 Guineas” for every ten minutes he would sit for him—”knowing you value money for the means of doing good”—Wesley amazingly assented. So Wesley removed his coat and lay on the couch, and the artist prepared the plaster and laid the wet, cold substance on Wesley’s bare face. After eight minutes, Culy “had the most perfect bust” he had “ever taken.” Wesley washed his face, “counted the ten guineas in his hand,” and said, “I never till now earned money so speedily—but what shall we do with it?” For Wesley, it turns out, this wasn’t hard. On the way home he encountered the suffering dregs of British society: “a poor woman crying bitterly,” “a poor wretch who was greedily eating some potato skins,” a “man, or rather a skeleton,” a “young woman in the last stage of consumption,” an “infant, quite dead,” and a heavily indebted lawyer. Soon Wesley’s 10 guineas were consumed in his efforts to alleviate this deluge of suffering. After hearing this story, Lord Shelburne’s heart was softened against the leader of the “race of fanatics,” and in response, he “immediately ordered a dozen of the busts to embellish the grounds of his beautiful residence.” Shelburne’s transformation, from indignation to devotion, was a powerful story of how the bodily representation of John Wesley could overcome anti-evangelical prejudice, all with the help of a story attached to a thing.

This legend about the origin of John Wesley’s bust and its power was reprinted widely in America in nineteenth-century religious and secular periodicals. One reason for the dissemination of the story was the growing presence of John Wesley, not only in prints that could be tacked to walls, or on copper engravings printed in the frontispiece of books, or on the name of institutions, but in the growing ubiquity of his representation in pottery form. “Wesleyana” flooded middling and lower domestic spaces across the British Isles and North America from the late eighteenth century on. This has led some experts to claim, perhaps in a moment of exaggeration, that Wesley was among the most represented British persons in ceramics in the eighteenth and nineteenth centuries, surpassed probably only by Queen Victoria. Another reason for the popularity of the story was a desire to believe that the prized busts of Wesley were directly related to his actual face, an indexical portrait that circumvented the mediation of an artist. It wasn’t simply an image of Wesley, open to exaggeration or enhancement. Through the direct contact of plaster on his living face, it was Wesley, more than any other image of the man. This alleviated a lingering Protestant unease with the sacred image, by promising a representation “from nature,” of which God was the undisputed artist.

 

3. “The Founders and Pioneers of Methodism,” collotype print, original by Charles C. Goss (and Theodosia C. Goss, attributed), engraved (albertype process) by Edward Bierstadt (New York, 1873). Courtesy of the American Antiquarian Society, Worcester, Massachusetts.

 

Wesley appeared not only in busts, but also on plates, teapots, clocks, medallions, intaglio presses, door knockers, wax profiles, walking sticks, and figurines (fig. 2). Nearly all of this pottery came from Staffordshire. These objects were like action figures in a period when Methodism was exhibiting its otherworldly strength. In America, this strength was on full display. On the eve of the Declaration of Independence, there were only 69 Methodist congregations in the American Colonies. By 1850, there were almost 200,000. Whereas in the eighteenth century Methodists trailed nearly every other sect, by the Civil War they claimed a third of American church members (church attendance was even higher). This led President Ulysses S. Grant to quip in 1868 that “there were three great parties in the United States: The Republican, the Democratic, and the Methodist Church.” This expansive growth, forged on the reaches of the Atlantic world, has led one prominent historian to call Methodism an “empire of the Spirit.” In Methodism’s rapidly expanding solar system, John Wesley was the burning sun (fig. 3). Wesley chose the songs that Methodists sang; he selected (and mercilessly edited) the books Methodists read; he dictated Methodist religious practices; he expressed their first ethical positions; he set their theological doctrines; he established their organizational structure—he was the movement’s most powerful religious force.

Staffordshire potteries were clustered in the small towns of Burslem, Tunstall, Stoke, Hanley, Longton, and Fenton (fig. 4). These pottery factories maintained a stranglehold on the middling Atlantic pottery market from the late eighteenth century on. Most of the firms persist to this day—familiar names such as Wedgwood, Minton, and Spode—and enjoy avid collecting communities. The six pottery towns were landlocked, but they lay beside a turnpike that led to the ports in Liverpool. The completion of the Trent and Mersey Canal in 1771 gave the potteries port access to the world, and this accelerated north Staffordshire’s already well-established pottery industry. They had a near-perfect situation for Atlantic dominance. The towns’ historic reputation for pottery drew skilled artisans from across England. Coal and clay were abundant in the rolling hills, often at the very same sites. Transportation became easier and cheaper by the year. Many shrewd businessmen led the pottery firms, combining a razor-sharp sensitivity to shifts in “taste” with a tendency to embrace mechanical and scientific innovation. Living in the intimidating shadow of Chinese porcelain, Staffordshire potters discovered new methods of expression—from glossy glazes to colored enamels, and from elegant white creamwares to dark, durable redwares. The resulting products varied widely as well, from whimsical “toby jugs” to classical sculptures, from jasperware (a colored, mock porcelain) to transfer prints (a method of transferring inky text and design onto fired clay). By the nineteenth century, most middling Americans enjoyed their dinners and took their tea from ceramics made from the clay, thrown by the potters, and fired from the coal of Staffordshire.

John Wesley first came to the North Staffordshire town of Burslem in March of 1760, and he described it as a “scattered town on top of a hill, inhabited almost entirely by potters.” Methodism thrived in Staffordshire. Wesley stopped there annually, in his wide circuit of Methodist societies and chapels. Looking backward in 1784, he called Burslem “the first [Methodist] society in the country, and it is still the largest and the most in earnest.” In the years after Wesley began visiting Staffordshire regularly, the potters had begun to make portrait medallions of the great leader, a representational tradition that would be greatly expanded and perfected by Josiah Wedgwood (fig. 5). These small likenesses of the preacher were often given as gifts, from one Methodist to another. This practice inaugurated a long tradition of giving little Wesleys to establish bonds between Methodists near and far. This appears to be the way the first potted Wesleys made their way to American shores—gifts from traveling Staffordshire potters on business trips to their American sisters and brothers. Over this particular body, bonds could be made. The other means of American collecting was pilgrimage. As many prominent American Methodists traveled to their English holy land, they often brought Wesley relics back with them—pressed flowers from his childhood home, personal relics, like his glasses, or the velvet from his chair, and most commonly, pottery. They were souvenirs, for sure, but it was often more—a desire to possess the material of a very particular, venerated life. The busts thus often earned the title of “relic,” not only for being old, but also for their purported “exact representation” of the elderly religious leader. This was not simply a “likeness” of Wesley, but somehow carried a part of him. The legend of “Mr. Culy” making a bust from a life-mask became too enticing to deny. Despite the legendary tale with which this essay opened, it was not “Mr. Culy,” but rather Enoch Wood, who would sculpt with his fingers—rather than a life mask—the standard Wesley bust in 1781.

 

4. The pottery towns of Burslem, Tunstall, Stoke, Hanley, Longton, and Fenton were clustered in the north of Staffordshire. A New Physical, Historical & Political Map of England and Wales, by John Andrews (London, 1786). Courtesy of the American Antiquarian Society, Worcester, Massachusetts.
5. Jasperware cameo portrait medallion of John Wesley in gilt frame. Wedgwood Manufactory (late 18th century). Courtesy Wesleyan University Library, Special Collections & Archives. Photo by the author.

 

Enoch Wood grew up among the distinctive bottle kilns of Burslem, in a potter’s family that specialized in modeled figurines. Young Enoch demonstrated considerable talent, using spare pieces of abandoned clay to try his hand at sculpting. When Enoch was fourteen, some traveling artists came to town, with a box in tow. Beneath the mahogany encasing and a velvet lining was a wax crucifix. Wood watched his fellow townspeople’s astonishment in seeing the crucifix, “how it seemed to soften their hearts and open their purses.” He knew he could do better. So he endeavored to make a bigger, better crucifix, hoping to take it on the road, and with his earnings, see the world. Four years later, he amazed his fellow potters with an even more impressive production, a three-dimensional bas-relief of Rubens’s Descent from the Cross, made in the famous blue and white of jasperware. It is unclear if these early artistic efforts helped Enoch see the world (he at least made it to Liverpool to study art and anatomy), but his pottery would travel far. And he never forgot the lesson of marshalling his talent for economic gain. He became a master potter in his early twenties and led a long career of pottery production that specialized in ceramic products for the American market. If Josiah Wedgwood was the recognized “prince” of the Staffordshire potteries, Enoch Wood, by the end of his life, would be remembered as their “father.”

In 1781, at the age of twenty-two, Enoch Wood had the sculpting opportunity of his lifetime. Through a fellow potter, he arranged to “take” Wesley’s head. In five separate sittings, Wood used modeling clay to sculpt the holy man as he was hunched over, catching up on correspondence. Enoch’s wife, Ann, tried to help, attempting to divert Mr. Wesley with some polite conversation. But she couldn’t get him to look up from his letters for long. As a result, Enoch’s sculpture “from life” was a bit too accurate. Wesley was impressed, saying it was the best likeness that had ever been taken of him, but Wood had copied the concentration on his brow with a bit too much fidelity. He looked stern and harsh. And Wood had embarrassed Wesley’s manservant by copying his rumpled clerical gowns, which had been crushed in his travel bags. So Wesley sat back down for a few minutes, and Wood gave him a lift (though his clothes would remain crushed in the early editions of the bust) (fig. 6). Wesley was pleased and told Wood not to touch it, lest he “mar” it. Wood had a tradition of placing small medallions on the rear of his busts that gave the name, age of the sitter, and an additional note about the person’s life—”any remarkable occurrence.” Without the “smallest hesitation,” Wesley told Wood his remarkable moment. For Wesley it was the story of being saved from the flames of his family home in Epworth as a child, and that Wood should write on the medallion, “Is not this a brand plucked from the fire.”

 

6. John Wesley by Enoch Wood (ca. 1781), painted terracotta, with an open back. Tipple Collection, Object 203, from the Methodist Collection at Drew University, Madison, New Jersey. Photo by the author.

 

Wood went home and made a cast of his sculpture, copying the realistic loose jowls, wrinkles, raised veins, scars, and dimpled chin. When he decided where to crop the body in order to place the copies on bases, he cut the arms and left the head and the heart. This was an appropriate choice for the man, given that scholars have understood Wesley as what one of his biographers calls a “reasonable enthusiast,” a combination of enlightenment intellect and experiential feeling. Soon Wood was selling copies of the work, in painted terracotta with an open back, to some of Wesley’s closest admirers. He exhibited the bust to a gathering of Methodist ministers later that year, all of whom marveled at the stunning accuracy of “so exact a resemblance of that great man.” A Christian lesson was never far behind these exhibitions. As Wood was leaving the meeting, one of the ministers, John Fletcher, chased the sculptor down in the churchyard. Fletcher grilled Wood on his technique, then stood on one of the nearby graves and preached about God as the potter of human souls, using Wood’s process as a metaphor for the Holy Spirit’s work in reproducing God’s image on soft, pliable human hearts.

After Wesley’s death in 1791, Wood began mass-producing Wesley busts in a wide variety of media. Just as hagiographers smoothed over Wesley’s rough personal imperfections—such as his abysmal home life—so a gradual loss of fidelity crept into the Staffordshire production of Wesley’s body. The molds began to wear out and break; new, derivative molds were made. On the subsequent busts Wesley’s gaze moved gradually heavenward. His wrinkles were smoothed. His jowls became taut, and the decoration grew more fanciful and amateurish—in part because children took over the work of decoration in the factories (fig. 7). Around the 1830s, some factories even took custom decoration orders from consumers and made diverse castings to expand merchants’ offerings. Would you like your Wesley with red or black hair? Young or old? A marbled base or a colorful swirl? Would you like him to look like Lord Byron, or would you prefer him perched upon a fake clock? And so Wesley’s admirers gained the power to shape his presence in their homes.

But even as Wesleyana moved away from the realm of respectable sculpture toward bric-à-brac, Wood’s original composition remained the standard representation of the man, against which all prior and subsequent images were measured. Sculptors looked to Enoch Wood’s uncanny likeness for the public monuments to the man that appeared in Britain and America in the nineteenth century. The bust was so central to Wood’s career that he asked to be buried with an early copy, installed in the wall of his family vault in Burslem. And when a print was made after Wood’s death to honor his career as a potter, his friends depicted not Wood himself but John Wesley, in a two-dimensional copper engraving with the head copied from the bust. The bust even came to represent its region of origin. In the early twentieth century, when the Chairman Cigarettes Company ran a promotional cigarette card series highlighting English pottery, the image it chose to represent Staffordshire was Enoch Wood’s bust of Wesley—an ironic choice given that one of Wesley’s rules for his class meetings was, “To use no needless self-indulgence, such as taking snuff or tobacco, unless prescribed by a Physician” (fig. 8).

 

8. Silk cigarette card with paper back, Chairman Cigarettes Company (ca. 1910). Private collection. Photo by the author.
9. Essays on Physiognomy: For the Promotion of the Knowledge and the Love of Mankind; Written in the German Language by J. C. Lavatar, Abridged from Mr. Holcrofts Translation, by Johann Caspar Lavater. First American edition, printed for William Spotswood, & David West (Boston, 1794). Private collection. Photo by the author.

 

What did Wesley’s head mean for those who invited him into their domestic spaces? Enoch Wood, like his contemporary and sometime employer Josiah Wedgwood, was enamored of the classical past. In ceramics they determinedly spread the revival of antique designs across English pottery and, in turn, across the Atlantic world. And with these new arrangements came a certain view of history. Staffordshire potters like Enoch Wood, Josiah Wedgwood, and the countless artisans who contracted with them put modern people—in modern dress—on antique forms, from George Washington to General Wolfe. In so doing they elevated the modern era to the import of the classical, much like the history painting of eighteenth-century masters such as John Singleton Copley and Benjamin West and the wax sculpture of Patience Wright. They asserted that modern events and people could match sacred and classical history. For those who looked up to Wesley, Staffordshire potters sought to insert the man into the pantheon of great men understood to have formed the age (and to capitalize on Methodism’s rapid growth in the process). Wesley’s great historical consequence, his followers believed, was to save Protestantism from the brink of decay and “infidelity”—a deliverance they increasingly came to call the “Second Reformation.”

The profile and the bust also carried other ideas with them from antiquity, namely physiognomy—the reading of physical features for insight into personal character. Eighteenth-century people were increasingly attracted to the idea that the mind, even the heart, could be read upon the face and along the curvature of the skull. Here admirers of Wesley could actually learn things about the man from close examination of his bust. The science of reading heads gained immense popularity on the European continent, in the British Isles, and, later, in America through the works of Johann Caspar Lavater, best expressed in his richly illustrated Essays on Physiognomy. Deeply religious, Lavater argued that close attention to physiognomic features could open up divine truth about the self and others. Profiles and busts (and, if possible, skulls) were the ideal objects for this kind of study, for they resisted the potential for deception in a live face (fig. 9). They froze features that were otherwise in flux and, if “true to nature,” were able to exhibit “God’s line.”

It should be noted that Methodists were a humble people; their homes were not materially ostentatious. Simplicity and discipline were hallmarks of early Methodist life as much as deep feeling and ecstatic singing. Even though the visual field within their homes was limited, it awarded Wesley disproportionate space. Above a cabinet, desk, or door, or on a thinly populated mantel, Wesley oversaw the domestic life of his followers. This visual concentration summons a deeper Christian tradition, that of the departed saint and the living devotee: Wesley as icon, Wesley as body, Wesley as saint, here erected in a kind of devotional statuary. Protestants had no illusions that Wesley would weep tears, become semi-animated through his representation, or begin to speak to lonely seekers in dark rooms—nor would he respond to prayer or intercede on their behalf (though he did visit some followers in dreams). They pursued their devotion to Wesley in distinctively Protestant ways that reflected Protestant beliefs about the relationship of materiality and spirit. They sought a vision, an embodiment of the holy man, to connect to his words. And the association of Wesley with holiness was natural, given that he argued forcefully for the possibility of Christian perfection, the human heart cleansed of the inclination toward sin through total devotion to God. In bust form, his upturned eyes, not quite surveillance, were a reminder of where to fix the gaze—upward. His aquiline nose lent a classical import to his evangelical innovations. His serene countenance evinced a calm, pleasant affirmation—a welcome vision for a group that eagerly sought assurance of their salvation in intimate class meetings, prayers, journals, and hymns. His bust was a holy life perched on a pedestal (fig. 10).

 

10. John Wesley by Enoch Wood (after 1791). Solid back with medallion, painted lead glazed creamware. Tipple Collection, Object 204, from the Methodist Collection at Drew University, Madison, New Jersey. Photo by the author.

11. “Follow after … ,” unknown (ca. 1840). John Wesley transfer print earthenware plate. Lake Collection, from the Methodist Collection at Drew University, Madison, New Jersey. Photo by the author.

 

 

In the century after Wesley’s death, Methodists across the Atlantic world eagerly sought to possess Wesley’s mantle, to own the title of “the true followers of Wesley,” especially in the face of the social and political challenges they encountered. They asserted in proceedings, periodicals, pamphlets, and on the title pages of books that “He, being dead, still speaketh.” Their saint could indeed speak, but not without print. And true to form, Wesley’s words would transfer their way onto pottery, onto teapots and plates, reminding respectable men and women to keep their conversation chaste and to direct their lives toward his triumphant death (fig. 11). As Methodists began acquiring potted Wesleyana to fill their homes, they also sorted through his papery remains to see what he might say to the challenges of their present. Should women preach? Should communion be given to slaveholders? Should slavery be tolerated at all? Should bishops have authority over ministers and congregations in a democratic age? Should alcohol be prohibited? Should Methodists support radical politics (especially in light of so many decapitated heads in France)? His presence within homes, in prints, medallions, plates, teapots, books, and busts reasserted individuals’ choice of leader—their emperor of the spirit. But it also placed the firm pressure of his example upon everyday life. What would Wesley do? What would they do? Could their lives fill Wesley’s mold? Or would the times require Methodists to break it?

The most seismic debate in nineteenth-century American Methodism was over slavery. Wesley did not mince words on the issue, and neither did his early American followers—slavery was evil, through and through. Wesley published a radical and influential tract on the topic, Thoughts Upon Slavery, in 1774. But a parallel memory also circulated: Wesley had not barred early slaveholding Methodists in the West Indies from communion. American Methodists backed away from their founder during the American Revolution because of Wesley’s support for the Crown and outspoken opposition to the Continental Congress. Southern Methodists knew that they would forfeit their success in winning souls if slavery were made antithetical to Methodism, and so an opening was made for slavery in southern Methodism that turned to outright acceptance after the American Methodist schism of 1844. Yet Northern Methodists and abolitionists would not let their southern counterparts forget Wesley’s antislavery convictions, even in the realm of Wesleyana. The abolitionist newspaper the National Era, whose corresponding editor was the Quaker abolitionist poet John Greenleaf Whittier, reported that a “small telescope and electrical machine” owned by Wesley had been procured by a wealthy Methodist in North Carolina. The editor was “surprised” that anyone from the North Carolina Conference, which had recently “formally, repudiated Mr. Wesley’s Anti-Slavery Creed…as heretical, fanatical, and for aught we know, diabolical,” should ever “venture to cherish anything as a memento of the great abolitionist father of the church.” The author averred that Wesley had used the devices “in his researches into physical nature.” But the North Carolinian would be unable to rely on the devices, as Wesley had, “to be the moral instrument through which he read the heavenly laws, and proclaimed to the world that ‘Slavery is the sum of all villanies.'”

Modern American Methodism has recently been divided over another institution, and the affections that prompt it: the debate has come in waves over marriage for same-sex couples in particular and sexuality in general. It is currently the most explosive issue in the church, and one that still evinces a regional divide. In this new debate, Wesley is marshalled, most often in quotations, for his characteristic ethical force. The bust isn’t far behind, of course, along with an open Bible, a communion chalice, and the Book of Discipline, the ominously titled book of guidelines and rules for the denomination (fig. 12). Methodists are still struggling to unite these objects, maintain the ethical tradition they inherited from Wesley, keep the communion (connection) together, revere the Bible, and uphold their rules for social and personal “holiness.” Knowing what Wesley would do in this situation is a concern for many, one of those perennial quandaries of history and memory. But it is a very real question for those who seek a usable past. Wesley will not solve the debate from his grave two centuries later. But the bust still appears, as in this image, a visual statement of efforts to resurrect his presence in a new, ongoing American confrontation.

 

12. Courtesy of Andy Oliver, Reconciling Ministries Network, and Jeremy Smith of hackingchristianity.net.

Further Reading

The best modern biography of John Wesley is still Henry Rack, Reasonable Enthusiast (1989), and the best work on transatlantic Methodism is David Hempton’s Methodism: Empire of the Spirit (2005). Studies of English Methodism abound, in part stimulated by E.P. Thompson’s provocative chapter in The Making of the English Working Class (1966). A particular standout in terms of the body, and how it intersects with gender and emotion, is Phyllis Mack, Heart Religion in the British Enlightenment (2008). The list shrinks considerably in America. Important work has been done by Dee Andrews, Richard Carwardine, Christine Leigh Heyrman, and Lynn Lyerly, the latter two with a focus on the American South.

Staffordshire pottery and potters in America are covered well in the beautiful volumes of the annual American Ceramics, sponsored by the Chipstone Foundation; Patricia Halfpenny has done important work in tracing the work and influence of British potters in America. Much of Enoch Wood’s archive has thankfully been preserved and transcribed by collectors from private collections; much of it can be read in Arthur Cummings’ Portrait in Pottery (1963). On highly accurate representation of the body and its meanings, see Wendy Bellion’s Citizen Spectator (2011), and Marcia Pointon’s recent “Casts, Imprints, and the Deathliness of Things,” in Art Bulletin (2014). On the intense, intimate practice of viewing busts, and their use as indexes of the mind and soul of the depicted, see the important Malcolm Baker, The Marble Index (2014). The relationship between materiality and American Protestantism has been understudied, but important work has been done by Sally Promey on Puritan gravestones, David Morgan in Protestants and Pictures (1999), Colleen McDannell in Material Christianity (1999), and several essays in the recent compilation edited by Promey, Sensational Religion (2014).

 

This article originally appeared in issue 15.3 (Spring, 2015).


Christopher M.B. Allison is a PhD candidate in the program in American Studies at Harvard University. He is writing a dissertation on early American Protestant material culture.




Before 1822: Anti-Black Attacks on Charleston Methodist Churches from 1786 to Denmark Vesey’s Execution

Following the June 17, 2015, murders of nine worshipers in the Emanuel African Methodist Episcopal Church in Charleston, South Carolina, many commentators pinpointed 1822 as the most significant year in the history of violence directed against Charleston black churchgoers and their institutions. In 1822, Denmark Vesey was hanged after a guilty verdict in court, and an independent black meeting house was razed by a mob of Charleston whites. Less understood, however, is a generation of vicious attacks on Charleston black Methodists and the churches they attended before 1822. The destruction of the African Church in 1822 culminated a series of events that was in full force in Charleston no later than 1786. Almost from the end of the War of Independence, black Methodists in Charleston were subject to harassment. Instead of describing the destruction of the black meeting house merely as part of the anger and paranoia revolving around Vesey and his accused co-conspirators, we should see it as a collision between two strands of violence. One strand was the decades of abuse directed at Charleston black Methodists who worshiped in mixed-race congregations in which they were the majority; the other was a precipitate response to rumors of an imminent slave insurrection.

In this essay, I analyze the decades of abuse and violence prior to 1822. Nineteenth-century commentators disturbed by the mistreatment understood it as a response to the large number of Charleston black Methodists as well as to the enthusiastic worship practices of those churches. This understanding was correct but inadequate on two counts. First, it never took a true measure of the significance of the number of Charleston black Methodists. They constituted the greatest number and the densest concentration of black Protestants in the world around 1820. No other center of black Protestantism—for example, Boston, New York, Philadelphia, St. John’s (Antigua), Kingston (Jamaica)—came even close to Charleston for numbers of black worshipers. Second, it gave black worshipers little credit for enthusiasm, conversions, and revivals among whites as well as among blacks. Yet the commentary of white contemporaries suggested that African Americans were at the forefront of enthusiasm. We gain a better comprehension not only of this history of violence but also of Charleston black Methodists’ self-understanding if we gauge the significance of their numbers and if we understand them as principal actors in the Methodist Episcopal Church and, later, in an independent congregation.

 

1. Cumberland Methodist Episcopal Church, Charleston, South Carolina. Also known as the Blue Meeting House, this was the first church built by Charleston Methodists, almost certainly with some black laborers. It stood from 1787 to 1839, with an initial cost of 1,300 pounds. Page 36, F. A. Mood, Methodism in Charleston: A Narrative (Nashville, 1856).

From the moment of the first Methodist presence in Charleston, black worshipers were present. In 1736, John and Charles Wesley visited the town, where they held services in the Protestant Episcopal Church. The minister at this church was Alexander Garden, namesake of the gardenia and, later, scourge of Methodist revivalist George Whitefield. The black worshipers at the service made an impression on John Wesley with their numbers and their piety. Curious about the black members, he tried to explore their faith. He recorded a telling conversation with a black woman in which he informed her that the body and the soul were discontinuous. She disagreed. In Wesley’s words, “I asked her if she knew what a soul was. She answered, ‘No.’ I said, ‘Do you not know there is something in you different from your body?—something you cannot see or feel?’ She replied, ‘I never heard so much before.’” Wesley reported on the conversation but seems not to have comprehended it. He perceived only a need for religious tutelage of black congregants, but in retrospect we see this exchange as a prophecy of the enthusiastic worship that would characterize Charleston Methodist congregations. The black woman (whom Wesley never named) had brought body and soul together in piety in a way that the Methodist theologian seems not to have been able to countenance. Worshipers, black and white, came to express the integration of body and soul, not their distinction, as black worshipers became, within two years of the founding of the first church, a majority among Charleston Methodists. In these congregations, black worshipers came to be, in effect, evangelists and revivalists, no matter who occupied the pulpit.


The Wesleys’ 1736 visit also established two continuing themes in Charleston Methodism: first, black Christians responded to white preachers who took risks in their efforts to spread the Word; and, second, itinerants ignited fires of religious zeal that spread through England, mainland North America, and the West Indies, encompassing blacks and whites equally. For example, the Wesleys were caught in a dangerous storm in St. Helena’s Sound on the voyage to spread the word into the South Carolina Lowcountry. Such risks would be undertaken later by George Whitefield and other Methodist itinerants, and the memories of the dangers inherent in ministry would become part of the collective memory of Charleston Methodists. Such willingness to face danger in the Lord’s service stirred the hearts of black men and black women. And although white Methodists rarely recognized all the egalitarian and antislavery implications inherent in their church practices, black Methodists did not miss the message. Charleston Methodist churches of the late eighteenth and early nineteenth centuries provided one of the clearest examples of equality dawning in congregations in which both blacks and whites contributed to revivals.

The Wesleys were soon followed, in 1739, by the greatest Methodist evangelist of the eighteenth century, George Whitefield. Like other Methodist itinerants, Whitefield preached in the open air or in dissenting churches when he was unwelcome in the pulpits of the established church. In 1740, during an evangelical tour in Charleston, Whitefield was challenged by Alexander Garden for “field-preaching” and extemporaneous prayer. Although Whitefield was suspended by a colonial ecclesiastical court from preaching in any Protestant Episcopal Church, such spiritistic practices were embraced by black worshipers, drawing the racist ire of those white Charlestonians suspicious of such enthusiasm. After his suspension, Whitefield simply sought alternative venues in the Charleston Independent (i.e., Congregational) Church and among the Huguenot congregation. Both congregations were Calvinist in theology (Whitefield and the Wesleys had not yet separated over their theological differences).

Braving storms, Whitefield returned to Charleston several times. During one of Whitefield’s stays, in the 1760s, John Marrant, one of the most famous early African American Methodists, was converted during a revival meeting in which Whitefield was exhorting. Like the appreciation of the daring of itinerants and the egalitarian implications of mixed-race congregations, revival meetings would later figure in a vast increase in the number of Charleston black Methodists. In 1775, even before a Methodist church existed in Charleston, white Methodists were, as Philip D. Morgan notes, “reprimanded for sponsoring black preachers who proclaimed radical messages.”

The success of the Methodists in gaining black as well as white adherents in places like Charleston is well explained by John Walsh. “It was the crucial determination of Wesley and Whitefield to launch into itinerancy, making the world their parish and not the parish their world, that turned Methodism from a small awakening to a full-scale revival,” Walsh writes. “The spread of early Methodism depended on its ability to integrate diverse constituent groups into the associational network of its societies. . . . Methodism was as much a missionary movement as a revival.” One of the greatest American missionaries was Francis Asbury, who was born in England in 1745 and migrated to the North American colonies in 1771. Beginning in 1780, Asbury took a special interest in black Christians and traveled with “Black Harry” Hosier, who drove their carriage, preached with Asbury, and, perhaps, shared interpretations of Scripture with the white itinerant. Soon Asbury turned his attention to Charleston.

 

2. Old Bethel Methodist Episcopal Church, Charleston, South Carolina. A cutaway engraving shows the galleries, where black worshipers sat. This church was preferred by black Methodists because its less central location in Charleston made it less prone to attack by white mobs. Page 82, F. A. Mood, Methodism in Charleston: A Narrative (Nashville, 1856).

In 1785, Asbury arrived in the city, procuring, with the help of a local white sympathizer, an abandoned Baptist meeting house between Water and Tradd Streets. The Methodist Episcopal Church had just separated from the Episcopal Church in December 1784, and Asbury had just been ordained and named superintendent (later, over the objections of English Methodists, bishop). Its Baptist congregation having been scattered by the War of Independence, the building became Charleston’s first Methodist meeting house. In Charleston, the Arminian Asbury felt beset by Calvinist Independents, Huguenots, and Scots Presbyterians on the one side and by the wicked population of the major slave-trading port of North America on the other. In a series of visits to Charleston, Asbury realized that black churchgoers were a mainstay of his support. In December 1785, the Charleston Methodist Church reported thirty-five white and twenty-three black members—the only year in its first half century of existence in which white members were a majority.

The year 1786, in which blacks became a majority, saw the first attack on the church. The “success” of the Methodists—their increase in numbers, which was almost all due to new black members—led to vandalism. In the words of their nineteenth-century historian, Francis Asbury Mood, “When the congregation assembled one Sabbath morning, they found the benches helter-skelter in the street, and the doors and windows barred against them. This was taken as a hint that they were desired to change their quarters.” They found a new place of worship in a private home.

Nineteenth-century white Methodists like Mood understood that those who attacked their churches were angered by both spiritistic worship and the presence of black worshipers. Religious enthusiasm was, of course, a feature of revivalistic religion, black or white. Yet white Methodists rarely credited blacks with fostering spirited worship and religious conversions. Today we need to emphasize that the spiritism of black worshipers—including the experience of the black woman who refused to disjoin body and soul—helped form the very environment in which conversions occurred. Our challenge as scholars is that white authors almost never named black converts or assumed that their experiences should be inscribed in print. Today scholars recognize that black worshipers were a crucial part of the context in which white conversions took place. Conversions during church services or revival meetings were communal rites. An enthusiastic preacher, an emotional crowd with its own cries, songs, and movements, and some members gripped by their own mix of Christian beliefs and deeply felt regrets over past sins added up to emotional conversions. Records of Methodist Charleston include a terrible paradox: black members created the matrix in which conversions occurred and of course they converted themselves, yet when the chroniclers of the churches named converts, they chose white ones, not black.

White Charlestonian George Airs was, for example, identified when not one black member of his black-majority church was named. Yet his conversion occurred in the context of a congregation in which white men were a minority. “We may mention the conversion of George Airs,” Mood wrote:

He was a man of impulsive, ardent temperament, and had been long confirmed in sinful habits. He was seeking religion for some days under poignant grief for his sins. Light at length broke in upon his darkness, his captive soul was freed, and, as we might expect, the demonstration he made was not a little boisterous. After strongly assuring all present of the wondrous change which had passed upon him, he rushed from the building, anxious to tell the world what a merciful Saviour he had found. He ran towards East Bay, ‘Hallelujah!’ bursting from his strong lungs at every step. This produced a great sensation in the neighborhood, and quite a crowd took after the supposed maniac, who had been rendered so at the Methodist meeting. And ranging around several squares, much to the horror of the people living thereabout, what was their surprise to see him quietly return to the house, the big tears streaming down his face! Instead of finding a maniac, they had in truth fallen upon one who had been just clothed and put in his right mind, as his subsequent life of piety abundantly proved.

It is likely—though never stated—that one trigger of the attacks on the churches was black members prompting such dramatic conversions of white members. Sadly, the rich detail concerning a white convert contrasted with a parsimony of words when a black Methodist became the subject. Mood everywhere recorded the presence and the spirit of black worshipers, but nowhere gave them credit for conversions. Yet it seems unlikely that the attackers of the church missed the role black worshipers played in conversions among the church membership at large.

 

3. A poppy bud, apparently an emblem of the church, engraved to emphasize black and white. The poppy is a traditional symbol of Christ’s passion. In 1856, Mood placed this illustration after a passage describing the mixed-race membership of Charleston Methodist congregations, projecting the colors of two races onto the Passion. Page 124, F. A. Mood, Methodism in Charleston: A Narrative (Nashville, 1856).

There were other slights of blacks in print—an absence in presence based on an assumption that white worshipers would be dignified with names but black ones would not. When the Charleston Methodists finally built their own meeting house, which came to be known as the Cumberland Methodist Episcopal Church, it required not only “galleries for the accommodation of the colored people” but also, of course, labor and materials for construction. The white men who provided materials were named, along with the prices paid them, in the 1856 history of the church. The lowest price was 1 pound, once for nails and once for stones, the highest 10 pounds, for labor, possibly that of slaves owned by the member. In any case, there were clearly unnamed laborers. For instance, some men were paid 5 shillings for carrying boards, while 10 shillings 6 pence was expended for “corn for workmen” (in a time when corn mush was a common food for Southern slaves). Emma Hart has noted the presence of enslaved workers in the building trades: “Up to 1800, Charleston building artisans owned at least 250 slave carpenters, bricklayers, painters, and plasterers.” Similarly, once the building was open for services, its “floor was always covered with a layer of clean white sand.” Who hauled the boards? Who spread the sand? No word was recorded about the race of the lowest-paid laborers, yet the pattern of the attestations suggests that white men would be named while black men would not be. It seems likely that heavy or relatively poorly compensated labor was performed by black men.

By 1787, the Charleston Methodist Church had thirty-five white and fifty-three black members. February 1788 saw an attack on worshipers themselves (as opposed to their place of worship). A crowded church was “greeted with the first open demonstration of hostility from the inhabitants. There was a riot raised at the door. A general panic seized the audience. . . . At night, while the Bishop [Asbury] was preaching, the house again crowded to overflowing, it was assailed on all sides with stones and brickbats.” Asbury stood his ground as the mixed-race congregation was besieged. He recalled his sermon on Isaiah 52:7 as one of the most inspired of his long career. “I have had,” he reported, “more liberty to speak in Charleston than ever before, and I am of the opinion that God will work here.”

In 1789, the presence of Thomas Coke stirred another riot. A great apostle of Wesleyanism, Coke had evangelized in the West Indies, criticized enslavement, and helped establish in St. John’s, Antigua, one of the world’s great centers of black Protestantism. It was under his guidance that two memoirists of black or colored Antigua—Anne Hart and Elizabeth Hart—were converted. Although hardly antislavery, Mood’s 1856 history, in touching upon Coke’s visit, condemned “the illegal and cowardly assaults made by the ‘young chivalry’ of Charleston upon unoffending women and children while worshipping their God.”

Again in 1790, during a conference in which Methodist ministers and leaders proposed “Sunday-schools for poor children, black and white,” there were disruptions and insults directed at Charleston Methodists. The schools never opened. By 1792, the Charleston Methodist Church enrolled sixty-six white and 119 black members. By this time, slurs directed against the Charleston Methodists included that they fostered “negro-churches” and listened to “negro preachers.” Yet its members were grateful for their “season of revival” without acknowledging that the emotional and theological life of the congregation was conditioned by the mix of races within it. At the end of 1793, after a year of revivals, the congregation comprised sixty-five white and 280 black members.

In 1795, again the meeting house was attacked as “a crowd assailed the church, beating open the doors, and breaking down the windows.” Returning to Charleston, Asbury held prayer meetings for black worshipers. In 1797, the congregation “was still called to suffer much annoyance from rioters and mobs.” A grand jury of the Charleston district declined to recommend charges against the malefactors. Thus, “every night the services were interrupted by riotous proceedings outside; and the congregation, while in-doors, and especially when dispersing, were grossly insulted, because their cowardly assailants felt they could do it with impunity.”

Also in 1797, the congregation began construction of a second building, Bethel Methodist Episcopal Church, possibly to counter schismatic members who had seceded to form a Primitive Methodist Church. With a less central location, the Bethel meeting house proved attractive to black members seeking to avoid white mobs. This was a time of a “large increase of colored members.” Mob violence occurred again in 1800, worsening when it became known that a Methodist minister assigned to Charleston for the year had received a packet of seemingly antislavery writings from the North. A crowd of “patriotic bullies” mobbed the minister as he left church and sought to “dunk” him under a nearby water pump. After he escaped, the mob grabbed another Methodist and dunked him to near-drowning.

Asbury preached in the Bethel Church in 1798, reporting that in the parsonage he received “visitors, ministers, and people, white, and black, and yellow. It was a paradise to me and some others.” Again the nineteenth-century historian omitted the name of a significant black man. Asbury had arrived to find the parsonage building completed but unfurnished. He “gravely sat down on the door-step, no one knowing of his arrival. A negro man passing observed him sitting there, and recognizing him to be the Bishop, stopped, and told him no one lived there. ‘I know that,’ said the Bishop. ‘Where do you want to go, sir? I will show you the way.’ ‘I want to go nowhere,’ said the Bishop: ‘I will spend the night here.’” After this exchange, the black man informed some other church members of Asbury’s arrival and intentions. They pressed him to go with them instead of staying on the street. When he refused, they carried furnishings to the parsonage to make him comfortable. Once again, the language leaves open the possibility that these church members were themselves black. It is certain, in any event, that black believers populated Asbury’s “paradise.” “He was able there, untrammelled by forms or customs, to manage things his own way, and, as far as possible, make a paradise below, by constant communion with his God.” He convened “family worship,” attended “by a number of colored persons . . . so that often at family prayer at the parsonage, there would be an assembly of forty or fifty persons.”

Another glimpse of the type of white minister who served these congregations appeared in commentary on one of Asbury’s itinerant colleagues who died in 1803, possibly of yellow fever. Bennet Kendrick “was a close student, and a skilful eloquent preacher; and, with it all, perhaps his highest eulogy is, ‘The poor Africans repeated his name and death with tears. He was a willing servant to slaves for the sake of Christ.’”

No one could better evoke the link between violence and church demographics than Mood. From 1794 to 1804, “there was a decrease of three white members; and, as it includes the period of the most violent open hostility to the church, this should go far toward convincing those who think that persecution is the time most favorable for the growth of the Church, that they may be mistaken. The colored membership, however, continued to increase with a steady growth. They averaged, during this decade, a yearly increase of sixty-two; so that at the close of the year 1804, they numbered nine hundred and three.”

 

4. This illustration of a class paper, kept by a Methodist class leader, appeared on page 225 in Jonathan Crowther, A true and complete portraiture of Methodism: or, The history of the Wesleyan Methodists: including their rise, progress, and present state: the lives and characters of divers of their ministers: the doctrines the Methodists believe and teach, fully and explicitly stated: with the whole plan of their discipline. The different collections made among them, and the application of the monies raised thereby; and a description of class-meetings, bands, love-feasts, &c. Also, a defence of Methodism, &c. Published by Daniel Hitt and Thomas Ware, for the Methodist connexion in the United States, J. C. Totten, printer (New York, 1813).

Disruptions of religious services occurred again in 1804 and 1807. Black worshipers gravitated toward the Bethel Church since its location seemed to attract less attention from the mobs. “The blacks had become so subject to annoyance at Cumberland, that they preferred to attend Bethel, which thus so far had not seemed to attract much attention from the rioters. The church, as was always the case on Sabbath afternoon, was crowded with blacks.” On one such Sabbath afternoon, the captain of the city guard marched up the aisle and ordered the mixed-race congregation to disperse. Black churchgoers “emerged into the street and graveyard only to find themselves captured. Then, in a hollow square, as felons or incendiaries, they were deposited en masse in what was then known popularly as the ‘Sugar House.’ Singular to state, no reason was ever assigned for this outrage, nor any explanation given for this extraordinary procedure.” An 1838 narrative by a runaway South Carolina slave described this house of correction: “I have heard a great deal said about hell, and wicked places, but I don’t think there is any worse hell than that sugar house. It’s as bad a place as can be.”

Revivals surged through the congregations in 1807 and 1808. In 1811, a “powerful religious influence rested upon the congregations during the year, and at its close an increase was reported of eighty-one whites and four hundred and fifteen colored members.” Church membership swelled, with black membership increasing at a rate five times that of the white membership. By this time, there were established classes led by black laypersons for prayer and religious conversation among black believers. The classes were an important institution insofar as they kept believers close to Methodist beliefs and allowed some black church members leadership roles as class leaders. At “the Conference of 1815, a membership was reported of two hundred and eighty-two whites, and three thousand seven hundred and ninety-three colored.”

However, the stage was being set for an exodus of black members. First, in 1815, the white minister then in charge, Anthony Senter, initiated an investigation into the use of funds collected from black Methodists, which had remained with black class leaders. The inquiry revealed what appeared to at least some white Methodists to be “much corruption.” A modern account mentions that some donations were used to purchase the freedom of slaves who were to be sold away, but there was no indication in the primary documentation that Methodist preachers objected to that as corrupt. Donations from black churchgoers were ordered to be transmitted to the stewards of the Methodist Episcopal Church, no longer retained by black class leaders. Second, white members announced that part of the church’s black burial ground was to hold a storage building for hearses. Third, a rapid increase in black members led to a total of 5,690 by 1818. The black worshipers almost certainly felt that they had enough strength in numbers to form their own congregation. The question inherited by early twenty-first-century scholars of American religion is what the worshipers thought their congregation might be. Most scholars have assumed that the goal was an African Methodist Episcopal church, but the evidence from Charleston suggests that it was more likely to have been an independent Methodist church.

Two Charleston black Methodists, Morris Brown and Henry Drayton, traveled to Philadelphia in 1816 and then again in 1818. Brown had been attending the Charleston Methodist Bethel meeting house. In Philadelphia, in May 1818, Brown was ordained an elder and Drayton was elected a deacon. In 1817, a Methodist minister, Solomon Bryan, wrote to the Charleston city council with his concerns about black worshipers forming their own congregation. Then, in Charleston, in 1818, “at one fell swoop nearly every leader delivered up his class-papers, and four thousand three hundred and sixty-seven members withdrew.” Class papers were documents maintained by class leaders for recording the activities of church members as well as keeping track of those who were trial members of the congregation. Black class leaders in other locations, such as Antigua, similarly maintained their own class papers. Since the class leader not only kept the documents but also inquired into the religious experience of his class members, possession of the papers indicated a leadership role. Yet class papers also served as a record of the attendance of members at prayer meetings. For instance, class leaders marked each person as “p” (present), “d” (distant), “s” (sick), “n” (neglectful), “b” (away on legitimate business), or “a” (absent), records that might well have been used against black people in Charleston’s racist, slave-holding environment. In surrendering their papers, the class leaders of black Charleston were removing themselves from white authority.

Although 1,323 black churchgoers remained within their traditional church home, the white Methodists were stunned by the loss of almost eighty percent of their black brothers and black sisters. Mood confirmed that piety had been corporeally felt during Charleston Methodist Episcopal services. “None but those who are accustomed to attend the churches in Charleston, with their crowded galleries, can well appreciate the effect of such an immense withdrawal. The galleries, hitherto crowded, were almost completely deserted, and it was a vacancy that could be felt. The absence of their responses and hearty songs was really felt to be a loss to those so long accustomed to hear them.” These lines were the closest Mood ever came to acknowledging the value of Charleston black Methodists in any way other than their numbers or their assistance to Asbury.

The 4,367 who left united into an African Church, according to Mood. Modern scholars have often called this an “African Methodist Episcopal Church,” but it was not clearly described that way at its inception, notwithstanding Brown’s and Drayton’s attendance at A.M.E. conferences. The evidence from around 1820 more likely suggests that its members viewed it as an independent black Methodist church, a congregation unaffiliated with any regional conference. Indeed a petition to the Charleston city council of January 27, 1817, for permission to purchase a plot for a cemetery was submitted by a black “Independent Religious Congregation,” while an 1818 petition to the House of Representatives described the petitioners simply as “Methodists, in the City of Charleston.” An 1830 letter by a white observer, James Osgood Andrew, insisted that the majority of the breakaway black believers desired to remain Methodists, albeit in a church with both black congregants and a black minister. In Andrew’s estimation, Brown initially hid his plan for an A.M.E. church from his followers.

There had been previous breakaway independent Methodist congregations in Charleston, such as the Primitive Methodists, so there were local models in place for the black congregants. In effect these were models of congregations with revivalistic devotion (possibly planned to be missionary devotion at some future date), theology culled from Methodists with whom Americans were familiar (Whitefield, the Wesleys, Coke, Asbury), and freedom from a superordinating conference that would have enforced discipline, provided itinerants or missionaries, collected its share of church donations, and possibly owned church buildings or grounds. It is at least as likely that the breakaway black Methodists followed local models as that they affiliated themselves to the A.M.E. Church. Indeed, another schism, involving breakaway white Methodists, occurred in 1834, when a Methodist Protestant Church was formed in Charleston by whites who objected to blacks sitting outside the galleries in the Bethel Church. In short, there was a history of independent Methodist congregations in Charleston before and after 1822.

It is also possible that between 1815 and 1822, Charleston black Methodists envisioned becoming the apex of black Protestantism. That would have implied a congregation independent of outside entities other than God. An 1820 petition to the South Carolina legislature, subscribed by twenty-six black Methodists (including Brown and Drayton) and by thirty-four whites, described the black church as “the African Episcopal Church, in Charleston, called Zion.” Only a later insertion in the petition—the only revision in the document—added a superscript word, “Methodist.” The clerk who received the petition and initiated an account of its procedural history never described the black subscribers as African Methodist Episcopal. His terminology was “certain free persons of color . . . praying to be permitted to worship in a building created by them in the suburbs of Charleston.” The petition was denied.

 

5. A reenactment of an emblem of the Charleston Methodist Episcopal Church, with a poppy pod, made by Joanne Pope Melish and Clémence, Théophile, and John Saillant, July 2015, Wakefield, Rhode Island. Courtesy of John Saillant.

Had the breakaway black Methodists united into an A.M.E. congregation in Charleston around 1820, they would have instantly established the demographic center of the A.M.E. Church as Charleston, not Philadelphia. Based on membership figures given at the 1822 A.M.E. conference, if those who had hived off from the Charleston Methodist Episcopal Church had formed an A.M.E. congregation, they would have outnumbered Philadelphia A.M.E. congregants by thirty percent and they would have comprised forty-five percent of all A.M.E. members in North America. Even at the time of Richard Allen’s death in 1831, the reported membership of the A.M.E. denomination was about 10,000. It is impossible to know what would have happened had Charleston’s breakaway black Methodists joined the A.M.E. Church, but the figures suggest that they would have been between thirty and forty-five percent of the entire denomination. Furthermore, the word “Episcopal” in the name of the church could have implied a black bishop in Charleston—highly unlikely in the structure of the A.M.E. Church around 1820. Today the church historian might wonder how a black bishopric would sit in an ecclesiastical hierarchy, but it is possible that a black bishop implied independence and leadership in Charleston around 1820.

Connections between the Charleston African Church and the Philadelphia A.M.E. Conference were not strong around 1820. The only use of the phrase “African Methodist Episcopal” to describe the congregation derived from the emendation in the group’s petition. The reputable white ministers who signed the ex parte petition were local Congregationalists or Presbyterians, who were closer to the Calvinism of George Whitefield than to the Arminianism of the A.M.E. Church. Indeed, the presence of those white subscribers suggests that Charleston black Methodists were returning to their roots in Whitefield. Once again, Mood perceived an essential truth about Charleston Methodism. Despite the conflict with Wesley over free grace and predestination, Whitefield earned Mood’s praise because “it was, no doubt, of advantage to the future establishment of Methodism, that ‘justification by faith’ was fearlessly and powerfully proclaimed in Charleston.” Charleston black Methodists’ theological self-understanding around 1820 could have been Calvinist, not Arminian.

One tie between the Charleston black Methodists and the A.M.E. Church was the ordination of the two Charleston black men in Philadelphia. Moreover, Brown traveled to the 1819 and 1822 annual conferences. Richard Allen was qualified as a bishop to ordain other men—as he had been ordained by Asbury—but their ordination neither made them members of the A.M.E. Church nor created a congregation of the A.M.E. Church. For example, Asbury also ordained Allen’s friend Absalom Jones, who joined the Episcopal Church and then led the formation of the African Episcopal Church of St. Thomas. When Brown reported Charleston black membership at the 1819 conference, he gave a figure of only 1,848—about forty percent of the black Methodists who had left the Methodist Episcopal Church and about thirty percent of Charleston blacks who identified as Methodists. The number Brown reported for Charleston in May 1822, a month before Denmark Vesey was arrested, was 1,400, seventy-five percent of what he had reported in 1819. Even these relatively low figures—only about thirty percent of the breakaway black Methodists—were never corroborated for a Charleston A.M.E. congregation after Brown’s attestation. Yet if we credit Brown’s figures, they mean that at most thirty percent of the breakaway black Methodists identified enough with the A.M.E. Church for him to count them.

The nineteenth-century source most likely to have mentioned an A.M.E. church in Charleston—had there been one—never did so. The leading historian of the A.M.E. Church, Bishop Daniel Alexander Payne, who was himself born in Charleston in 1811, published an autobiography in 1888. One chapter treated his childhood and his experiences in the 1820s in the Charleston Cumberland Street Church. Another treated the formation of the A.M.E. Church. Neither mentioned a Charleston A.M.E. congregation. It is difficult to imagine that Payne’s autobiography would have failed to mention an A.M.E. congregation in Charleston around 1820 had there been one. However, another Charlestonian, Charles Cotesworth Pinckney, scion of a local planter family, did believe by 1829 that some black members had “seceded from the regular Methodist Church in 1817, and formed a separate establishment, in connexion with the African Methodist Society, in Philadelphia: whose Bishop, a coloured man, named Allen, had assumed that Office, being himself a seceder from the Methodist Church of Pennsylvania.”

The available evidence does not lead to an indisputable conclusion, but a possible scenario of events between 1815 and 1822 can be constructed. Most Charleston black Methodists espoused Whitefieldian enthusiastic religion and kept it alive in their services. The A.M.E. Church was at the edge of their horizon. They yearned for independence, not subordination to other Charleston churches or to any larger denomination. Insofar as they seemed still to consider themselves Episcopal, they hankered for a local black bishop. Brown may have added “Methodist” after all the subscribers had signed—an unflattering possibility, since it would mean that Brown duped the subscribers. Richard Allen himself was by 1820 notorious for high-handed tactics, which in the eyes of some of his congregation included deception and self-aggrandizement. Some, perhaps including Brown, might have had an inkling that Charleston could become the sun of black Protestantism, whether A.M.E. or another form—a central role that would have made Philadelphia a satellite. Ambition or a sense of mission or both seem to have been at high tide in Charleston around 1820. Until fissures broke open around 1820, the largest number and the densest concentration of black Protestants in the world were in Charleston Methodist churches. Indeed, almost from the end of the War of Independence, it was these black Protestants, not only the congregants of the African Church, who were magnets for mob violence. The African Church was harassed by city officials beginning in June 1818, when 143 worshipers were arrested at Sunday services. That was the next step in the evolution of the mob violence that had begun in 1786.

Charleston black Methodists were crucially important apart from any connection (whether strong or weak) to the A.M.E. Church. It seems unlikely that believers this numerous were willing either to continue in a subservient role in the Methodist Episcopal Church or to affiliate themselves with a new black church in Philadelphia, some 700 miles away. In fact, in these very years, Allen’s Philadelphia A.M.E. Bethel congregation underwent a schism, and Allen himself was publicly reviled, even spat upon, by black worshipers who had left his church when he tried to force his way into their pulpit. Brown’s later ascent in the A.M.E. Church—he was indeed consecrated a bishop in 1828—should not be read backward as an endorsement of the A.M.E. Church by Charleston black Methodists. If Charleston’s black worshipers were envisioning that their church might become the sun in the galaxy of black Protestantism, that vision was based on growth in numbers and success in revivals that had run unabated since the end of the War of Independence. White Methodists may have taken all the credit for the post-Revolutionary victories, but it seems unlikely that Charleston black Methodists were fooled. Ironically, it seems unlikely that the part of white Charleston that harassed the Methodists was fooled either.

In 1822, however, a collision occurred between the continuing violence directed at Charleston black Methodists and the precipitate response to fears of a slave insurrection. A white mob destroyed the meeting house soon after Denmark Vesey was hanged. Their church destroyed, some members of the African Church returned to the Charleston Methodist Episcopal Church and others joined the Calvinist Scots Presbyterian Church. Unfortunately, no numbers are available, but Mood suggested the appeal of Calvinism among these believers by writing, “Large numbers connected themselves with the Scotch Presbyterian Church.” And “the rest were peeled and scattered.” Brown fled to Philadelphia with a number of followers before the year closed, and there he began his ascent in the A.M.E. Church. The “peeled and scattered” black believers could have created a small nucleus—probably with a fluid membership—that kept independent Charleston African Methodism spiritually alive in the coming decades. The first true A.M.E. congregation in Charleston seems to have coalesced in 1865 and dedicated its first meeting house in 1872. The history of the Emanuel African Methodist Episcopal Church—as well as the history of violence against Charleston black Methodists—is more complicated than most modern commentators have imagined.

Our understanding of 1822 in Charleston as the culmination of a generation of violence directed against black Methodists, whether in the Methodist Episcopal Church or in an independent evangelical congregation, is vastly different from an understanding of 1822 as an outburst of terror aimed at a new A.M.E. church and at Vesey and his supposed co-conspirators. After 1822, attacks continued to plague the lives of black worshipers in Charleston Methodist Episcopal churches, but by the 1830s the violence had moved inside the congregations. Once again, young white men were the vanguard, forcing black churchgoers out of seats they had traditionally occupied. Thus 1822 was not an originating year for violence against Charleston Methodists—the origin was much earlier—but it was perhaps a pivotal year in which cruelty and abuse migrated from outside the meeting house door to the benches and aisles inside.

Acknowledgments

The author benefitted from the comments of several excellent colleagues as he wrote this essay in the summer and fall of 2015: Doug Egerton, Peter Hinks, Joanne Pope Melish, and Rich Newman. It was at the joint conference of the Omohundro Institute of Early American History and Culture and the Society of Early Americanists, June 18-21, 2015, that Nathan Jérémie-Brink, Eric Slauter, and David Waldstreicher urged him to put his ideas on Charleston down on paper. Many thanks.

Further Reading

The handwritten petition registered as “Ex parte certain free persons of color praying to be permitted to worship in a building created by them in the suburbs of Charleston” is held at the South Carolina Department of Archives and History. It shows that only an emendation in the document identifies the black church as African Methodist Episcopal. The 1817 and 1818 petitions are reprinted in, respectively, Designs against Charleston: The Trial Record of the Denmark Vesey Slave Conspiracy of 1822, edited by Edward A. Pearson (Chapel Hill, N.C., 1999) and Court of Death: A Documentary History of the Denmark Vesey Affair, edited by Douglas R. Egerton and Robert L. Paquette (Gainesville, Fla., 2016). The latter reprints much crucial source material. A discussion of Methodist class papers (with a reproduction of one) appears in Jonathan Crowther, A True and Complete Portraiture of Methodism (New York, 1813). A succinct overview of early Antiguan Methodism, along with a comment that black or colored class leaders kept their own class papers, appears in Robert Glen, “The History of Early Methodism in Antigua: A Critique of Sylvia R. Frey and Betty Wood’s Come Shouting to Zion: African American Protestantism in the American South and British Caribbean to 1830,” The Journal of Caribbean History 35:2 (2001). Early histories of the African Methodist Episcopal Church by its own adherents never described the African Church of Charleston as part of the A.M.E. connection. An example is Christopher Rush, A Short Account of the Rise and Progress of the African Methodist Episcopal Church in America (New York, 1843). Another example is Daniel A. Payne, History of the African Methodist Episcopal Church (Nashville, 1892), which assumed that any black Methodist in Charleston who wanted to join the A.M.E. Church after about 1815 had to migrate to Philadelphia. Payne also listed the numbers of the African Church that Morris Brown provided to the A.M.E. conference. Payne’s autobiography is Recollections of Seventy Years (Nashville, 1888). Wesley J. Gaines, in African Methodism in the South; Or Twenty-Five Years of Freedom (Atlanta, 1890), described the organization of the first A.M.E. church in Charleston in 1865. The same date for the first Charleston A.M.E. church was given in Richard R. Wright Jr., Centennial Encyclopaedia of the African Methodist Episcopal Church Containing Principally the Biographies of the Men and Women, Both Ministers and Laymen, Whose Labors during a Hundred Years, Helped Make the A.M.E. Church What It Is (Philadelphia, 1916). Richard C. Wade, “The Vesey Plot: A Reconsideration,” The Journal of Southern History 30:2 (May 1964), follows the nineteenth-century sources in describing the Charleston African Church as an independent Methodist church. The most detailed account of Charleston Methodists, black and white, before 1822 remains Francis Asbury Mood, Methodism in Charleston: A Narrative of the Chief Events Relating to the Rise and Progress of the Methodist Episcopal Church in Charleston, S.C. (Nashville, 1856). Albert Deems Betts, History of South Carolina Methodism (Columbia, S.C., 1952), is apparently the earliest to claim, without citing sources, that around 1815 the use of funds collected from black members to purchase slaves was objectionable to the standing white minister. Betts never mentions the existence of an A.M.E. church in Charleston, although he does mention Richard Allen and Morris Brown as A.M.E. bishops as well as schisms that occurred among Charleston white Methodists.
James Osgood Andrew’s letter appeared in Methodist Magazine and Quarterly Review 12 (1830). John Marrant’s 1785 Narrative, which recorded his encounter with George Whitefield in Charleston, is available in “Face Zion Forward”: First Writers of the Black Atlantic, edited by Joanna Brooks and John Saillant (Boston, 2002). Francis Asbury’s accounts of his visits to South Carolina are available in The Journal and Letters of Francis Asbury, edited by Elmer T. Clark, et al. (Nashville, 1958; three volumes). Emma Hart, Building Charleston: Town and Society in the Eighteenth-Century British Atlantic World (Charlottesville, 2009), describes black laborers and tradesmen in Charleston. Philip D. Morgan treats South Carolina Afro-Christianity in Slave Counterpoint: Black Culture in the Eighteenth-Century Chesapeake and Lowcountry (Chapel Hill, N.C., 1998). John Walsh, “‘Methodism’ and the Origins of English-Speaking Evangelicalism,” which emphasizes the missionary dimension of early Methodism, appears in Evangelicalism: Comparative Studies of Popular Protestantism in North America, the British Isles, and Beyond, 1700-1990, edited by Mark A. Noll, David W. Bebbington, and George A. Rawlyk (New York, 1994). Originally serialized in The Emancipator in 1838, the recollections by a runaway slave of the Charleston “Sugar House” are available in I Belong to South Carolina: South Carolina Slave Narratives, edited by Susanna Ashton (Columbia, S.C., 2010). The most distinguished commentary on slavery in Charleston is Lacy K. Ford, Deliver Us from Evil: The Slavery Question in the Old South (New York, 2009). Morris Brown’s and Richard Allen’s relationship is well described in Richard S. Newman, Freedom’s Prophet: Bishop Richard Allen, the AME Church, and the Black Founding Fathers (New York, 2008). Both Newman, Freedom’s Prophet, and Carol V. R. George, Segregated Sabbaths: Richard Allen and the Emergence of Independent Black Churches, 1760-1840 (New York, 1973), describe discord within the Philadelphia Bethel A.M.E. church.

This essay has argued, in part, for the importance of Charleston black Methodists apart from any connection (whether strong or weak) to the A.M.E. Church. Some leading works have assumed a strong connection. The earliest to assume such a connection was apparently Marina Wikramanayake, A World in Shadow: The Free Black in Antebellum South Carolina (Columbia, S.C., 1973). Although none of the primary sources cited in her book mention a Charleston A.M.E. church, she was followed by Kenneth K. Bailey, “Protestantism and Afro-Americans in the Old South: Another Look,” The Journal of Southern History 41:4 (November 1975), Robert L. Harris Jr., “Charleston’s Free Afro-American Elite: The Brown Fellowship Society and the Humane Brotherhood,” The South Carolina Historical Magazine 82:4 (October 1981), Bernard E. Powers Jr., Black Charlestonians: A Social History, 1822-1865 (Fayetteville, Ark., 1994), and Douglas R. Egerton, He Shall Go Out Free: The Lives of Denmark Vesey (Madison, Wis., 1999). Most scholarship since the mid-1990s repeats the claim that the Charleston African Church was A.M.E. A recent example, of many, is James O’Neil Spady, “Power and Confession: On the Credibility of the Earliest Reports of the Denmark Vesey Slave Conspiracy,” The William and Mary Quarterly, third series, 68:2 (April 2011). An exception, which uses Methodist sources but avoids asserting that there was a Charleston A.M.E. church, is Robert L. Paquette, “Jacobins of the Low Country: The Vesey Plot on Trial,” The William and Mary Quarterly, third series, 59:1 (January 2002).

 

This article originally appeared in issue 16.2 (Winter, 2016).


John Saillant is a professor of English and history at Western Michigan University. He was awarded degrees from Brown University in American Civilization. His recent and in-progress works concern the earliest black Baptists in North America and Jamaica and ways that early black-authored documents were revised by white handlers.




Bibles, American Style

John Fea, The Bible Cause: A History of the American Bible Society. New York: Oxford University Press, 2016. 384 pp., $29.95.

John Fea’s The Bible Cause is an ambitious bicentennial history of the American Bible Society, the evangelical benevolent organization founded in New York in 1816. The American Bible Society was massive. Fueled by a can-do millennial spirit, its leaders sought to place a Bible with every family in the country, and eventually, the world. New technologies allowed the society to produce Bibles at a scale never before seen: after installing steam-powered presses in the late 1820s, the publisher produced 600,000 Bibles a year (31). A vast network of auxiliaries worked to circulate the sacred texts across the expanding geography of the United States. The society continued to produce and distribute Bibles in such astronomical numbers that by the late twentieth century, leaders resorted to measuring their impact in “tonnage” (301).

The argument of The Bible Cause is that the American Bible Society was an eminently American institution that sought to build a Christian nation. Throughout its history the society worked to forge a country unified “around Protestantism and the social virtues that logically flowed from its teachings” (23). Like other voluntary associations of the antebellum era, the American Bible Society aspired to nonsectarianism and to universal appeal. The society’s leaders, though mostly belonging to Calvinist denominations, regarded themselves as doing the work of reinforcing a “Protestant consensus” that served both God and country (28).

Fea moves steadily through two centuries of the society’s religious activities and their relation to political events and cultural movements in American history. These include the society’s devotion to federalism at its founding; the spreading of Bibles abroad on the coattails of imperialism in the western Mediterranean and China; Reconstruction-era domestic missions to African Americans; patriotism and troop outreach during the World Wars; and the society’s alliance with the Protestant mainline and later embrace of the Christian right. Fea ends with an image of the society’s delivery of scripture booklets to Ground Zero in the aftermath of the Sept. 11 attacks. He vividly draws these and other sections by focusing on the individuals involved. For instance, he introduces us to S. Brooks McLane, affectionately known as “the Bible guy” to troops in Texas during the First World War, his Ford Roadster packed with Bibles. We learn of John Percy Wragg, the African American Methodist pastor who ran the society’s “Colored Agency” from 1901 and hired the society’s first black colporteurs.

The extent of the American Bible Society’s claims on national identity may be seen in one episode in 1818. That year the society administrator Samuel Bayard petitioned Congress for an exemption from domestic postage and import taxes on paper, on the basis that the society’s work was salutary for “national character” (24). If vaccinations were tax-exempt, then surely Bibles ought to be as well. Such an exemption, Bayard noted, would not violate the First Amendment, which forbade a specific religious establishment—it was never meant to apply to the universal, nonsectarian doctrines of the Bible. After all, biblical principles were American principles. A greater availability of Bibles in fact promoted the exercise of religious liberty. Though Bayard’s petition did not succeed, the episode speaks to the society’s confidence in speaking as a religious, cultural, and political authority for the American people.

One minor quibble is that there are few physical Bibles in The Bible Cause. Fea describes how Bibles were supposed to be simple and represent the pure word of God (who spoke most purely in the King James Version). We are told frequently that the society’s Bibles were published “without note or comment” (13). By design the society did not include “images, illustrations, or other curiosities” (32) and ignored the luxury market. But what did these millions of Bibles—whose production and distribution, Fea writes, was the “sole object” of the American Bible Society (22)—look and feel like? How were Bibles made and used? Fea is not a book historian or bibliographer, and certainly the methods of those fields are not the only gainful ways to examine the publisher’s archives. But where Fea does consider publishing activities and the Bibles themselves in greater detail, questions remain. In a discussion of its early production, Fea notes that the society sold three kinds of Bibles, which varied in price from 60 cents to $3, depending on the quality of paper (31). However, a $3 book was not cheap circa 1820, likely equivalent to about a week’s wages of one of the society’s female folders and stitchers who worked long hours in the bindery. What were the differences between these editions, and for whom were they designed?

This expansion of Bible formats continued throughout the antebellum period. Flipping open the society’s Thirty-Sixth Annual Report (1852) to its publication list, one finds that fifty-two editions of the Bible were available in the English language alone, ranging from 25 cents to $10. The most expensive of these was a large quarto, bound in gilded morocco. Even the policy of “without note or comment” seems to have been fluid when it came to these more elegant Bibles; the list notes that many Bibles contain “references.” The American Antiquarian Society preserves many of these finer sorts of American Bible Society Bibles in its collections. One example boasts silk headbands and registers, side references, and an embossed cover signed by Alexander C. Morin, a binder specializing in elegant gift books and albums. The absence of pictorial engravings in these Bibles does not indicate that the society was unconcerned with aesthetics, middle-class tastes, or the luxury market. Even though in its rhetoric the society professed to produce plain versions of the sacred word, in practice, by midcentury it was producing over fifty different formats of the same text. Those formats were deliberately designed for different classes of consumers. With some humble editions, the society undersold the market. With its fine editions, the society entered the market boldly.

More Bibles appear in the later sections of the book devoted to the twentieth century, and many of these examples are fascinating. The society supplied khaki-covered Bibles picturing an American flag to troops in World War I. A 1960s-era new Bible translation called Good News for Modern Man featured line drawings and a cover designed to evoke newsprint. According to Fea, this cheap paperback was designed to be “used, marked, and scuffed up.” We hear the most from actual readers in this section: as Rick, a member of a “Jesus folk-rock” group in the late sixties, put it, “Good News for Modern Man was never meant to look good on a bookshelf. It was at its best in disorderly stacks with its paperback cover bent and wrinkled” (259). The American Bible Society was attuned to the importance of the materiality of its Bibles in the twenty-first century as much as in the nineteenth. Leaders are now in the midst of developing a “wear-able Bible” for a watchband (314). The device will not only provide the Bible digitally on a small screen, but will also detect biometrics such as anxiety levels and offer the appropriate scriptural remedy. The Bible Cause is at its best in its telling of captivating granular details such as these.

Fea wrote The Bible Cause at the invitation of the American Bible Society. Other reviewers have suspected that their warm relationship explains the adulatory tone that crops up in the book. I wonder if an examination of the material record might have forced a more critical posture toward some of the textual sources as well. A material examination of the American Bible Society’s nineteenth-century Bibles would have shown that for all its professed emphasis on Bibles “without note or comment” and its promotional self-image as a Bible depository for the common people, the society was also intimately concerned with appealing to the tastes of the wealthy and the emerging middle class. Consumerism and class are part of the American story as well.

In the opening pages of The Bible Cause, Fea announces that he takes to heart the words of the retired general secretary of the American Bible Society: “The Bible Cause is about people” (1). Fea beautifully brings to life the people of this organization. It may be selfish to ask for more from a book replete with richly drawn portraits of individuals and careful accounting of their efforts. But surely for one of the country’s leading publishers—whose goal was to produce and distribute printed objects—the American Bible Society was about things as well as people.

 

This article originally appeared in issue 17.2.5 (Winter, 2017).


Sonia Hazard received her PhD from Duke University in Religion.




Kidnapped!: Tracking down a ripping good Irish-American tale

In London on St. Valentine’s Day in 1945, the aspiring young novelist Patrick O’Brian, today regarded as one of the great twentieth-century writers of historical fiction, received as a gift an early volume of the Gentleman’s Magazine. Published in 1744, it briefly recounted the sensational ordeal of an orphaned child named James Annesley. The presumptive heir to five aristocratic titles and sprawling estates in Ireland, England, and Wales, Annesley was kidnapped from Dublin in 1728 at the age of twelve and shipped by his Uncle Richard to America. Only after twelve more years, as an indentured servant in the backwoods of northern Delaware, did he successfully escape, ultimately returning to Dublin to bring his blood rival, now the earl of Anglesea, to justice in one of the epic legal struggles of the eighteenth century. “Wicked uncle, kidnapped heir, bastards, sudden death. Very gratifying,” O’Brian later recorded in his diary.

No saga of personal hardship and aristocratic skullduggery so captivated the British public in the eighteenth century as Annesley’s turbulent life. Following his return from America, it quickly became the “common conversation” of coffeehouses and sitting-rooms on both sides of the Irish Sea. With slight exaggeration, a London writer later reflected, “Starting from the low and ignominious state of a slave, he…at once engrossed the attention of the three kingdoms, more, I believe, than any private man ever did.”

This extraordinary tale inspired as many as five nineteenth-century novels. Set either in Ireland or Scotland, each revolved around the dramatic kidnapping of a young heir for the purpose not of extorting ransom but of usurping the lad’s patrimony. Sir Walter Scott’s Guy Mannering (1815) was the first to adopt this formula, but far and away the most famous novel to draw from Annesley’s life was the classic boy’s adventure, Kidnapped (1886), by Robert Louis Stevenson, which recounted the abduction of young David Balfour by his greedy Uncle Ebenezer. Like Annesley, Balfour is consigned to servitude in the American colonies, though he manages to escape after his ship wrecks off the coast of western Scotland. In the end, after a series of adventures in the Highlands, he at last succeeds in obtaining his inheritance.

Although Stevenson evidently never acknowledged his indebtedness, he was a voracious reader, especially of history, literature, and the law, and, in fact, was intimately familiar with Thomas Bayly Howell’s Complete Collection of State Trials…(London, 1809-28), which included the Dublin trial of 1743 in which Annesley attempted to reclaim his birthright. Stevenson was also well-acquainted with Scott’s Guy Mannering, of which he owned a personal copy, and he would have been familiar with Charles Reade’s popular novel, The Wandering Heir (1873), based largely on Annesley’s life.

At the time of Kidnapped’s publication, a critic wrote in the Athenaeum, London’s prominent literary magazine, “Of both ‘Guy Mannering’ and ‘Kidnapped’ the main action was suggested by the Annesley case, that marvelous romance of real life which, in ‘The Wandering Heir’, not even Charles Reade could effectually vulgarize and spoil for future use. And no doubt it may be said that in Balfour’s struggle with old Ebenezer there is nothing so improbable as the real struggle of Annesley with his wicked uncle, and that Annesley’s adventures in the plantations…surpass in wonderfulness any of the chances, escapes, and disasters that befell Balfour.” More recently, the legal scholar David Luban has written of Annesley’s life, “Surely this is melodrama and not history.” The truth is that it is both, as I discovered in the course of writing a book about James Annesley with the aid of trial transcripts, newspaper accounts, and nearly 400 legal depositions located in the National Library of Ireland in Dublin and the National Archives outside London.

 

Upon his arrival with other indentured servants in the port town of Newcastle in 1728, Annesley was sold to Duncan Drummond, a small-time merchant-farmer who resided in northwestern Delaware. Early Map of the Province of Pennsylvania, David Humphreys, 1730, page 92 of An Illustrated History of the Commonwealth of Pennsylvania, William H. Egle (Harrisburg, Pennsylvania, 1872). Courtesy of the American Antiquarian Society, Worcester, Massachusetts.

Certainly no historian could wish for a more dramatic tale or a more remarkable set of characters, from the king of England, George II, to a brave Dublin butcher and a wily Scottish merchant, both of whom befriended James. If his supporting cast is more credible than the dramatis personae of a novel by Defoe or Fielding, the members are every bit as colorful. A personal favorite is an elderly widow named Anstace O’Connor, who in 1742 confronted the earl of Anglesea for having kidnapped his nephew. Years earlier she had peddled oysters at Dunmain, the country seat in County Wexford, seventy miles southwest of Dublin, where James was born in 1715. “The people are moaning because a great many here say you transported him [overseas],” protested O’Connor. Startled, Anglesea snapped, “No, you fool…hold your tongue!” After regaining his composure, the earl offered £40 if she’d swear that Annesley was the illegitimate son of a servant maid, which O’Connor adamantly refused, adding that she would not “take such an oath for all the world.”

But, of course, center stage is occupied by members of the house of Annesley, an English family who during the course of the 1600s achieved wealth and fame in Ireland on a grand scale. Apart from personal tenacity and ambition, their rise owed much to Ireland’s dramatically transformed social order. Following a new wave of Elizabethan conquest in the late sixteenth century, Protestant adventurers like Capt. Robert Annesley, the younger son of a gentry family in Buckinghamshire, laid claim to confiscated estates, displacing the native Irish with English and Scottish tenants in the years preceding England’s colonization of North America. Annesley’s grant in 1589 of 2,600 acres in County Limerick was considerably smaller than Sir Walter Raleigh’s, but for an ambitious squire, it was a promising start.

The Annesleys were charter members of an Anglo-Irish upper class that historians have labeled the Protestant Ascendancy. Subsequent members of the family included Robert’s son Francis, a favorite of James I, who became vice treasurer of Ireland in 1625, and a grandson, Arthur Annesley, whom Charles II in 1682 made the earl of Anglesea, one of two English peerages that he received to accompany a pair of Irish titles. Appointed lord privy seal in 1673, the earl was a figure of considerable learning and culture, adopting the motto, from Horace, of Virtutis Amore (With Love of Virtue) for the family’s coat of arms. An acquaintance of John Locke and patron of the poet Andrew Marvell, Anglesea boasted the largest private library in Britain, not to mention vast estates in Ireland and England by the time of his death in 1686.

By any measure—honor, wealth, power—the first generations of Annesleys laid a formidable foundation for the family’s future success. No aristocratic house at the end of the seventeenth century could have expected more from its forebears. Equally important, there was no apparent shortage of male heirs, at least, that is, until the fifth generation. High rates of infant mortality were common during the late seventeenth and early eighteenth centuries, and noble houses like the Annesleys experienced their share of mourning and loss. This, in itself, did not threaten to disrupt the male line, which ordinarily would have descended to the eldest son of the fifth earl of Anglesea, a leading Tory politician who twice served as privy councilor, first to William and Mary and later to George I. But the inability of the earl and his wife to conceive children meant that the family’s honors and estates, rather than descending directly on the main stem, would instead fork sideways, upon his death, to a remote bough of the genealogical tree inhabited by a younger cousin, Arthur Lord Altham, an impoverished baron and the future father of James Annesley.

In 1708, at nineteen years of age, Altham had abandoned his wife, Mary, in London and absconded to Ireland, where he owned land in Wexford, including much of the town of New Ross. Politics was not his passion as it had been for earlier generations of the family. A short, homely man of slight build with gray eyes and black eyebrows, he delighted in low pleasures, best expressed perhaps upon offering a servant employment. “If you come to live with me,” promised the baron, “you shall never want a shilling in your pocket, a gun to fowl, a horse to ride, or a whore” (the offer was accepted).

In time, Altham reconciled with his wife, long enough, at least, to guarantee the birth of a male heir, James, in 1715. Less than three years later, the baron falsely charged Mary with consorting in their bedchamber at Dunmain with a young country squire. “It’s true she bore him [James], and that’s all the pleasure she shall have of him,” he later stormed, having sent her off in a coach, never again to return. Afterward, having taken a commoner named Sally Gregory for a mistress, the baron became deeply indebted to her, among numerous other creditors. Not only was Altham shortly charged with corruption by the Irish House of Lords, but he turned James out at eight years of age to fend for himself in the streets of Dublin, which the youth successfully did for three years as a shoeblack and errand boy. When taken to task for abandoning his son, the baron blamed Miss Gregory: “That bitch,” he protested, “will not suffer me to do any thing for him.” Despite the suddenness of Altham’s death in 1727 at age thirty-eight, no one felt sufficiently concerned to investigate the circumstances. No coroner’s jury convened, and no autopsy was performed on the corpse before it was interred on the night of his passing in Dublin’s Christ Church Cathedral.

 

It is not known whether Annesley posed for this portrait, which originally appeared in one of the published Court of Exchequer trial transcripts. Note the ship to the left, with an infant [Annesley was actually twelve at the time] held aloft in a stern window, and to the right an American scene replete with a palm tree, a half-naked youth with a horn, a pack of hounds, and a pair of beavers. James Annesley by George Bickham the Younger, after Kings line engraving (1744). Courtesy of the National Portrait Gallery, London, England.

The protagonists of the saga, of course, are James, the orphaned son, and his Uncle Richard, Lord Altham’s estranged brother, who, in addition to being a serial bigamist, was if anything more rapacious, the consequence perhaps of being a younger son with neither rank nor financial security. As the brother of a lowly baron, he was not even permitted to use the coveted title of “Lord.” By one account, Richard was of middling height and had the “clumsy manner of a country farmer.” Unfortunately, no known portrait has survived, nor was he the sort to sit for one. Owing to both his appearance and character, a female acquaintance later volunteered, “I would not have had him if I was young, no, not [even] to be a countess.”

Just three weeks before Baron Altham’s death, Richard had paid James, or Jemmy as he was still called, a visit at the home of John Purcell, a warm-hearted butcher who, together with his wife, had taken the boy into their home just north of the River Liffey that bisected Dublin. Do you, Richard asked the butcher, have a boy in the house named James Annesley? Yes, replied Purcell, calling the lad, hesitantly, from the fireside, once and then a second time. Stricken with fright, his eyes beginning to moisten, Jemmy whispered to “Mammy,” Purcell’s wife, “That is my Uncle Dick.”

The boy’s fear was well founded. Following his mother’s banishment ten years earlier from Dunmain, Richard had cursed Altham for not turning out the boy as well. “Damn my blood,” he’d railed, “I would have let her have him, and she might carry him to the devil, for I would keep none of the breed of her.” Circumstantial evidence, in fact, strongly suggests that Richard had his brother poisoned, thereby leaving James as the final obstacle to his acquiring Altham’s barony, along with the four other peerages that his nephew stood to inherit. On April 30, 1728, James was grabbed in Dublin’s Ormond Market, bundled into a coach, transported to George’s Quay alongside the Liffey, and carried aboard a vessel, brimming with servants, that was bound for Newcastle, Delaware. Unlike Balfour in Kidnapped, Annesley was less fortunate, ultimately being forced to toil for twelve years after being sold to a merchant-farmer in Newcastle County named Duncan Drummond.

That Annesley survived for such a length of time as an indentured servant in northern Delaware was a testament to his resilience. In contrast to immigrants who sought to make new lives in the colonies, he held fast to his identity. Not only did he proclaim his rightful origins in hopes of finding a sympathetic ear, but he was said to shun the company of other servants. During the boy’s early years, there had been flashes of pride in his lineage, even as a street waif in Dublin. At the same time, he had an outward ability to adapt to different settings and new hardships. Instability had long been a way of life for him; if anything, it may have eased his adjustment to the life of a servant, however punishing the ordeal or wrenching his separation from Ireland.

After James fled in 1740—succeeding on his third attempt to run away—first to Philadelphia, then to Jamaica, to London, and finally to Ireland, his Uncle Richard, now the sixth earl of Anglesea, repeatedly tried to have him killed. In November 1743, the two commenced a titanic struggle at the Court of Exchequer in Dublin in what newspapers later declared the longest trial (nearly two weeks) ever heard by a British jury. A total of twenty-eight barristers participated. Never before had five peerages and such an immense estate been at the disposal of the legal system. By one account, the property was worth at least £50,000 a year.

No less momentous in the public mind at the time of the trial were allegations of aristocratic wrongdoing, which, for James’s following, had robbed a peer of the realm of his birthright and broken the chain of succession within the house of Annesley. Richard’s circle, by contrast, bewailed the prospect of elevating a false claimant to the family’s hereditary honors, not unlike Jacobite efforts to place a Stuart heir on the throne. Opponents commonly derided James as a “pretender,” the slur attached to successive generations of Jacobite claimants. Annesley’s contest, claimed Viscount Perceval in a letter to his father, was “perhaps of greater importance than any tryall ever known in this or any other kingdom.”

The jury found in Annesley’s favor, but his uncle’s appeal set the verdict aside for eight years, even as new fronts opened in and out of court on both sides of the Irish Sea. As to whether the young claimant finally achieved justice, it is a tale full of unforeseen twists and turns that persisted well after the deaths of both James and his uncle in the early 1760s. “Revenge,” as the saying goes, “is a dish that is best served cold”—even, as events turned out, from the grave.

Although it might seem odd that historians have largely ignored the tribulations of James Annesley, reasons for scholarly indifference are not difficult to find. To begin with, the best-known source associated with Annesley’s abduction is a volume, first published in 1743, titled Memoirs of an Unfortunate Young Nobleman, Return’d from a Thirteen Years Slavery in America, Where he had been sent by the Wicked Contrivances of his Cruel Uncle. Reprinted innumerable times, it is easily dismissed as sentimental fiction, written at a time when overblown stories of high adventure were a popular literary genre. Were the protagonist, modeled loosely after Annesley, of humble origins and a roguish bent, the style might be labeled picaresque, similar to that of Moll Flanders (1722) and Tom Jones (1749).

I first encountered Annesley’s purported memoirs nearly thirty years ago when researching a book, Bound for America (1987), about the banishment of British convicts to North America during the 1700s. My instinctive reaction was much the same, I suspect, as that of most others: to dismiss the volume out of hand as fiction, and bad fiction at that. In 1975, in fact, a reprint had been put out as part of Garland Publishing’s “Flowering of the Novel” series.

In retrospect, like other eighteenth-century narratives set in British North America such as The Infortunate, written by the former servant William Moraley (1743), it is now clear that the Memoirs blended fact with fiction. Although Annesley himself was not the writer (the book’s florid style alone discounts that possibility), only he could have been the author’s principal source of information. Once stripped of their romantic hyperbole, events in the Memoirs for the most part ring true, and references to names and places usually prove accurate.

That James Annesley’s story has gone untold cannot, however, be attributed solely to the false scent left by a Georgian narrative—especially if we consider, beginning in the 1960s, the publication in academic journals of several short articles devoted to different aspects of Annesley’s life by the late Lillian de la Torre, a prolific mystery writer. Those articles notwithstanding, the larger story of James Annesley has invariably failed to draw scholarly attention.

A second explanation lies, perhaps, in the problematic nature of the legal evidence, particularly a transcript of the climactic trial in Dublin between James and his uncle that was published with a long introduction in 1912. The principal dilemma in using this document and, for that matter, other legal sources, including depositions, was the contradictory nature of much of the testimony. Perjury, I quickly realized, whether in court or in a sworn affidavit, was a persistent problem in eighteenth-century Ireland. In the historical novel Castle Rackrent (1800), Maria Edgeworth contrasts the “Englishman who expects justice” at court with the “Irishman who hopes for partiality.” The extent of perjury during the trial in 1743 was unusually flagrant. Although disagreements between witnesses arose over dates, places, and personalities, invariably the underlying point of contention was the same: whether or not James Annesley was the legitimate son of Lord and Lady Altham and, in turn, the rightful heir to the honors and property of the house of Annesley.

How, then, best to determine the facts of the controversy? As much as possible, I tried to shed any presuppositions, despite a natural inclination to sympathize with Annesley’s plight (bastard or not, he was abducted by his uncle). Grasping heirs and bogus claimants are common enough in British history, if rarely on the scale alleged by Annesley’s enemies. And there was always the chance, I recognized, that James himself, however sincere his protestations, might not have known the true details of his birth.

In the end, my task turned out to be less tortuous than I first imagined. Despite its inconsistencies, the totality of the evidence veered strongly in Annesley’s favor. Independent sources, when available, almost always corroborated the claims of James and his followers. With few exceptions, their statements were more apt to be accurate. Testimony favoring Annesley during the course of his lawsuits typically seemed more credible than that of his opponents. If his witnesses were less artful, their responses were less contrived.

The earl, of course, alone commanded the resources and authority to suborn legal testimony, particularly in County Wexford where he resided. Not only was he the most prominent member of the county’s landed class, but his assets dwarfed those of his nephew. In stark contrast, when James returned to Ireland in 1742, he enjoyed neither personal connections nor political influence, much less the wherewithal to sway witnesses. It was, I concluded, the merits of his cause that drew such large numbers of supporters, many of whom still recalled the young lord, Jemmy Annesley.

Finally, I decided that no explanation other than a desire to deny James’s birthright could account for Richard’s relentless persecution, which his legal counsel never successfully disputed. Neither Anglesea nor his lawyers ever offered a consistent explanation for Jemmy’s disappearance in 1728. Initially, Richard reported his nephew’s death from smallpox. Afterward, to a circle of intimates, he related “in an easy manner” that the boy “was gone.” Still later, he stated to a friend that Jemmy had died in the West Indies.

These very real methodological problems help to explain scholarly inattention to the Annesley story. But the full answer, I suspect, also lies in the predilections of academic historians. For social historians, the principal appeal of Annesley’s saga would seem restricted to the backdrop of eighteenth-century Ireland. What might this tale have to say about Irish society, especially the roles played by class, gender, religion, and ethnicity? Meanwhile, cultural historians might be drawn, if at all, to Annesley’s representation in literature and the press. The sensational events of his life would have less appeal in their own right.

More surprising perhaps is that popular historians have not been attracted to a story peopled by larger-than-life figures, complete with a plot overflowing with venality and violence. After all, the tale was sufficiently dramatic to capture the interests of Scott and Stevenson, among others. How could five novelists be wrong, not to mention the studios of Walt Disney, which employed the same formula by pitting an orphaned lion cub, Simba, against his wicked uncle, Scar, in the enormously successful movie, “The Lion King,” followed soon by an even more successful Broadway production?

Alas, for most popular historians, Annesley’s ordeal possesses two fatal flaws. Not only are the antagonists lacking in intrinsic importance, but James’s life, however extraordinary, left little lasting imprint on the historical landscape. Although his abduction by Uncle Richard represented a breathtaking act of treachery in a family drama that unfolded on two continents over seven decades—to the rapt fascination of the British public—the fate of the house of Annesley had no enduring impact upon the past, unlike the aftershocks of revolutions, battlefield heroics, and presidential elections. Nor does this story throw new light on the momentous events of the era, be they wars for empire, scientific discoveries, or the origins of the Industrial Revolution.

Following the return of young Annesley to Dublin in 1743, the Irish Parliament did pass legislation designed to impede the kidnapping of paupers, but the bill was overruled at the behest of officials in London. Kidnapping was destined to remain for many years, under the common law, a misdemeanor, whereas horse theft, a felony, was punishable by death.

Other than providing fodder for novelists, the only tangible legacy lies in Annesley’s courtroom contests—two trials in particular established legal landmarks, one in the pioneering use of forensic evidence, whereas the other all but gutted the principle of attorney-client privilege, a ruling that would dominate British courts for another half century.

Such, then, is the thin gruel that these events offer two different sets of historians—those, on the one hand, who fill the shelves of Barnes and Noble, and others of a more academic bent, prone periodically to deriding popular historians for chronicling the deeds of “dead white men on horseback,” including presumably the shenanigans of British aristocrats.

Fourteen years ago, the novelist Margaret Atwood described her goals in writing historical fiction: “Such stories are not about this or that slice of the past, or this or that political or social event, or this or that city or country or nationality, although of course all of these may enter into the picture, and often do. They are about human nature, which usually means, they are about pride, envy, avarice, lust, sloth, gluttony, and anger”—but also such qualities as love, forgiveness, and charity. Not long afterward, John Demos urged fellow scholars to set a similar task in their own studies, albeit by drawing directly upon the hard evidence of the past—in short, to compose compelling narratives, enriched by a wealth of detail, that reveal human nature in all of its complexity. The critical difference between this and fiction is that for a majority of historians, the past still enjoys the virtue of being more authentic.

In truth, the strategy favored by Atwood and Demos incorporates the strengths of both popular and academic historians. To be sure, many of today’s scholars, in their efforts to analyze patterns of human behavior, more often look at bodies of people, be they ethnic groups, voting blocs, consumers, or social classes. By contrast, Demos and Atwood, in their desire to fathom “the foundations” of “the human condition,” are more concerned with individuals, a tack that may or may not involve the writing of a biography. But this difference in approach, while important, verges on being a subtle, dare one say academic, distinction. The broader point is that Atwood and Demos share with most scholarly historians a desire to plumb the depths of humanity in whatever guise or form. If their methodology adopts, in part, the apparatus of popular authors by favoring well-told stories, replete with memorable events and individuals, their objective, still and all, is to foster a deeper level of understanding.

One of my aims in resurrecting the saga of James Annesley was to illuminate eighteenth-century Irish society, thanks largely to the vast trove of legal documents that members of the Annesley family left in their wake. The sheer density of the depositions, many containing richly detailed recollections of rural Ireland, is stunning. They speak not only of the minutiae of everyday existence—the clothing, furnishings, and customs of lords and peasants—but also of the cadences of Irish life.

But I set out primarily to write a narrative, not unlike, in purpose at least, Atwood’s novels—a story that would hopefully capture the imagination of readers, much as James Annesley’s ordeal had drawn the attention of a young Patrick O’Brian and, sixty years earlier, had inspired Robert Louis Stevenson. Although the Annesleys were scarcely the only noble house in the British Isles racked by internal discord in the eighteenth century, they fought the bulk of their battles at uncommonly close quarters. The ferocity of the family’s clashes over rank and wealth cut to its very core, pitting husband against wife, father against son, and brother against brother. James’s abduction by his Uncle Richard was only the most brazen act of treachery, neither the first nor, surely, the final breach of trust in this violent family drama. Ultimately, then, besides whatever light it sheds on eighteenth-century Ireland, the ordeal of James Annesley is a tale of betrayal and loss—but also, I like to think, of endurance, survival, and redemption.

Further reading

For more on the Annesley family saga, see A. Roger Ekirch’s Birthright: The True Story that Inspired “Kidnapped” (New York, 2010) and “Robert Louis Stevenson’s ‘Kidnapped’. . . The True Story,” The Scotsman (Edinburgh), (March 6, 2010). Originally delivered as the Bronfman Lecture in November 1996 in Ottawa, Margaret Atwood’s remarks on historical fiction (“In Search of Alias Grace: On Writing Canadian Historical Fiction”) were printed in the American Historical Review 103:5 (December 1998): 1503-1516 as was, in the same issue, John Demos’s essay, “In Search of Reasons for Historians to Read Novels” (1526-1529).

 

This article originally appeared in issue 11.1 (October, 2010).


A. Roger Ekirch is a professor of history at Virginia Tech.

 



Our Mayflower Bible

The Harry Ransom Center at the University of Texas at Austin is a large rare book, manuscript, film, and photography collection. It is now best known as a superb repository of twentieth-century literary books and archives, although it contains many noteworthy collections of earlier books and manuscripts. It evolved from a rather modest rare book collection, and still retains some of the remnants of its humble origins. This is an account of one of them.

For many years we kept, near the reference desk, a small book truck full of treasures. These were to be shown to the casual visitor, who, it was hoped, would be intrigued, astounded, and delighted. There were locks of hair, a silver binding, first editions and manuscripts by authors known even to the imperfectly schooled, odd plates by Blake, and other bibliographic flotsam.

About ten years ago we decided to retire this truck and return its cargo to the stacks. Perhaps we had become too sophisticated; perhaps we no longer had very many casual visitors; or perhaps we thought that our vigorous exhibition program provided enough items to see–I doubt if much thought went into the retirement: it just happened. As the items were being put away, one caught the eye of a reference librarian who decided to investigate its claim to treasuredom: our Mayflower Bible.

Sez who? he harrumphed, and I’m sure that his suspicions arose from his perception that the book was too good to be true. Here was a Bible that had not only come over on the Mayflower in 1620, but had belonged to some of the most prominent early settlers, according to annotations on its pages. And there was nothing obscure about its claims: arrivals, marriages, births, and deaths were not only recorded, but were illustrated. There was a small full-length portrait of Peregrine White, the first white child born in New England; Indians with bows and arrows; the first houses; and even the Mayflower itself–all in pen-and-ink drawings in margins and other blank spaces. Could it be true?

Our skeptical librarian began an investigation, and I was soon enlisted to help. Where to begin? I decided to look at how we got it, its history as a Mayflower Bible, and its provenance in general. This information was obtained easily by looking at annotations in the book, and by reading newspaper clippings and exhibit labels found at the back of the book. The Bible had been unearthed in 1892 by Charles M. Taintor, a bookseller from Manchester, Connecticut, and was purchased the same year by S. W. Cowles of Hartford. The price was twelve dollars. Mr. Cowles allowed it to be written up and encouraged the curious to see it in his home. The Bible was inherited by a Mrs. S. W. Cowles, presumably his widow or daughter, who took it with her when she migrated to Los Angeles. There it was shown and concurrently ballyhooed, with reproductions of some of the drawings, in a 1912 newspaper article. Sometime between then and the mid-1920s Miriam Lutcher Stark, of Orange, Texas, bought it, and in the late 1920s it came to us, with the rest of Mrs. Stark’s library, as a gift.

Besides the early Mayflower-related inscriptions, and the later paraphernalia related to its New England and West Coast surfacings, there were a few annotations in the margins of the text that showed that the Bible had been in England during a portion of the eighteenth century. This was all of its history that the book itself was going to surrender. The gaps in its ownership and its three Atlantic crossings were troubling but did not disallow its authenticity.

I next looked for articles about any Mayflower Bible. I found a single citation–to a 1959 article in the Texas Quarterly–which turned out to be about our own copy, complete with the over-familiar illustrations.

I grew subtle. Maybe current scholarship demonstrated conclusively that some widely accepted fact about the Pilgrims was untrue–perhaps someone died a year earlier or a year later or a marriage may have never taken place. If our Bible had been tarted up in the nineteenth century there might be an irrefutable historical error, then believed to be true, incorporated into the process. Unfortunately, after carefully checking the events, dates, and historical figures found in the annotations against a great deal of current writing on the Pilgrims and the Mayflower, I was unable to find such an error.

My final sally captured the prize–certainty as to whether or not this Bible had actually come over on the Mayflower–although not at all in the way that I thought it would. I proceeded again to examine the book itself. What I was trying to see was whether or not this edition of Scripture was a likely one for a Pilgrim to take on this voyage, even though I knew that the answer would not be conclusive in determining whether or not we had a manufactured rarity among our treasures.

 

 

Marginal drawing of the Mayflower

The Bible was the Geneva version of 1588, a satisfactory edition. Bound in before it was a substantial fragment of a Book of Common Prayer. This was somewhat incongruous: why flee religious persecution with a Bible which was accompanied by a prayer book sanctioned by the official church of the country you were fleeing? But some Pilgrims may have considered themselves adherents of England’s established church even as they sought to reform it, and there was always the possibility that this Bible had been purchased without much thought given to the fragmentary material bound up with it. But I needed to know more to complete this final aspect of my investigation: which edition of the Book of Common Prayer was bound in?

As I said, the prayer book was a fragment, and was missing its title page. To even begin to establish the edition I would need to do a collation, so that I could identify it, based on its format, pagination, typography, and the like, by comparing it with online and printed bibliographic sources. Actually, it had no pagination, and I could see right away that it was in black letter, two columns to a page. A brief inspection revealed that it was in a quarto format, gathered in eights. All signatures present were complete in eight leaves, except signature B, which had, no matter how many times I recounted, eleven leaves. Of course, it is not possible to have eleven leaves in one gathering of a quarto book. A very careful examination revealed how this came to be: the first eight leaves of gathering B were from one edition of the prayer book, and continued the text from the preceding A gathering; leaves nine through eleven of gathering B were actually the last three leaves of signature B of another quarto edition of the prayer book. That is, in what had been a seriously damaged book, two fragmentary texts of the Book of Common Prayer had been joined together to make the best possible whole. I knew that the only hope of identifying these fragments was to look at the state prayers they contained: the prayers for the head of the Church of England, the reigning king or queen at the time the fragments were printed. These prayers in the second fragment were for Queen Elizabeth. I was unable to date this edition with certainty, but it is one of five or six printed between 1587 and 1601, which fits in nicely with the 1588 date of the Bible.

The first fragment also had a section of state prayers. To my great surprise, these were for King Charles, who ascended the throne in 1625; Queen Mary; and Prince Charles, who was born in 1630. Since significant early inscriptions–notably, “William White his booke 1619” and “This booke to Mr. William Brewster his booke from Susana White 1623”–appear in this first fragment, which I later established as a 1634 edition, our Mayflower Bible is a fake.

I did not have mixed feelings when I made this discovery. I was elated. It is very seldom that you can prove that something is a fake–usually even strong suspicions remain in mental coffins without the final nail. But I was sorry that we had shown off this clever pastiche, as the real thing, to generations of school children and other awed visitors, and sorry too that we had lost one of the most intriguing out-of-scope artifacts in our collections.

In wondering how so many could have been fooled for so long, I came up with the moral for my tale:

LOOK AT WHAT’S IN FRONT OF YOU.

 

This article originally appeared in issue 1.3 (March, 2001).


John B. Thomas III is the curator of the Pforzheimer Library of Early English Literature, and Chief, Rare Book Cataloguing, at the Harry Ransom Humanities Research Center, University of Texas at Austin. In private life he is an avid gardener and collector of books on atomic energy.




Accept No Imitations: The campaign against counterfeits, past and present

Don’t look now, but the country’s money is changing. Really changing. After decades of consistency, the greenback has begun a startling metamorphosis in its appearance. It all began in 1996. Out went the modest busts of the dead, replaced by enormous heads with impossibly high foreheads and hair straight out of an advertisement for Rogaine. The new notes made a fetish of asymmetry. The presidential portraits sidled leftward, and strange and shiny numbers made their appearance on the lower right-hand side of several of the high-denomination bills, printed in a green—or is it black?—ink. And no sooner had we come to terms with the fresh look than the Bureau of Engraving and Printing let loose a new twenty-dollar bill that featured, of all things, a light-blue eagle floating to the left of Andrew Jackson’s head. A pale peach stripe now runs through the center of the bill, and the little iridescent “20” has changed from green to gold. Even the back of the note, which had up until then been left unchanged, was given a sprinkle of tiny yellow “20s,” making the White House look as though it has been encircled by a swarm of angry bees. Any user of the once-staid United States currency would be entitled to ask: What’s going on here?

While it is tempting to ascribe our money’s makeover to American envy about the new Euro notes, the threat of counterfeiting was the real impetus for the change. After some seventy years in which the look of the greenback changed very little if at all, the country is adopting a novel look for its currency in the hopes it will deter a new and technologically savvy generation of criminals. But if history is any guide, the Treasury Department has an uphill fight ahead of it; counterfeiters have a knack for circumventing almost any obstacle put in their way. That said, the challenges the government now faces pale in comparison to the monetary misery of an earlier epoch, when counterfeiting assumed epidemic proportions, eventually becoming symbolic of a crisis of confidence in the nation’s currency, and perhaps in its emerging economic culture as well.

One historian has dubbed it the “golden age of counterfeiting.” Between the Revolution and the Civil War, counterfeiters operated with impunity throughout the United States. Many became folk heroes for their exploits, and more than a few observers in the fledgling republic feared that the economy would drown in a flood of bogus currency. Newspapers of the day published breathless warnings of counterfeits circulating throughout the country. An issue of Niles’ Weekly Register from 1818 warned of a single fraudulent emission of notes, telling its readers that “more, much more, perhaps, than a million of dollars in counterfeit and altered notes, have very recently been manufactured.” The warnings only intensified as the decades passed, and by the early 1860s, the New York Times concluded that “there are very few persons, if any, in the United States, who can truthfully declare their ability to detect at a glance any fraudulent paper money . . . In spite of all precautions,” the paper observed, “every merchant has his pile of counterfeit money, and his hourly fear of having it increased.”

The antebellum era’s counterfeiting problem was a consequence of the nature of the money supply at this time. There is a tendency to assume that the greenback is a timeless creation, that the nation-state has always taken the lead in issuing and safeguarding the currency. Nothing could be further from the truth. Prior to the Civil War, the United States exercised little control over the money that circulated within its borders, having abdicated that responsibility decades earlier. 

In fact, the roots of the problem date back at least to the previous century, when the colonists began issuing paper money contrary to the wishes of the imperial authorities. They had good reasons: in a specie-poor economy, it was absolutely necessary to have some circulating medium with which one could transact business. The British passed laws banning the practice, but to no avail. And a curious North American tradition of monetary democracy—the right to “make money,” literally—was born, one that reached its apotheosis during the American Revolution, when the fledgling nation financed its independence with a flood of paper money.

 

Fig. 1. Front of the three-dollar bill, printed in Philadelphia, May 10, 1775. Courtesy of the American Antiquarian Society.

The Constitution marked an attempt to reverse this trend in that it forbade individual states from issuing “bills of credit.” Yet at the time, those three words had a very specific meaning: the paper debt of governments (and occasionally individuals) issued as legal tender. Paper money issued by state-chartered banks, or “bank notes,” did not have the same pretensions, being nothing more than surrogates of money, slips of paper that could, in theory, be converted to real money (specie) when presented at the counter of the issuing bank. Within a few years of the ratification of the Constitution, a growing number of states had chartered banks and other corporations that could issue their own money. The upshot was not, perhaps, what the framers of that document had in mind when they attempted to “shut and bar the door against paper money,” in the words of one delegate. While only a handful of corporations issued their own notes in the 1790s, approximately 250 did by 1815, and by 1830, the number climbed to 330. Ten years later that number jumped again to 901, dipped in the early 1840s, and then skyrocketed again in the 1850s. By 1860, some 1,562 banks, or “rag manufactories,” as one critic called them, churned out a dizzying stream of colorful bits of paper.

Banks, left to their own devices, did not issue their notes in concert, nor did they subscribe to a uniform design. As a consequence, the look of an individual bank’s notes depended on criteria as disparate as the personal preferences of a corporation’s board of directors, the regional or commercial allegiances of the institution, and the relative cost of engraving the pictures, or vignettes, on the bills. Antebellum bank notes thus portrayed a bewildering array of individuals and events drawn from history, mythology, and fiction: Lafayette, Martha Washington, Saint George and the Dragon, Poseidon, Penn’s Treaty with the Indians, Archimedes, Santa Claus—even contemporary figures like P. T. Barnum, Lord Byron, Jenny Lind, Daniel Webster, and yes, Andrew Jackson. Other notes depicted allegorical figures representing commerce and industry, or stock figures such as slaves, farmers, tradesmen, and sailors. Still others showed ships, railroads, canals, wharves, shops, and other symbols of commerce. With every bank commissioning money of its own design (and in denominations, sizes, and colors of its choosing) more than ten thousand different kinds of notes bobbed up and down in the streams of commerce by the late 1850s, continually changing hands and baffling the uninitiated. Even the phrenologist George Coombe, no stranger to reading appearances, marveled in 1841 that “it has become a science nearly as extensive and difficult as Entomology or Conchology, to know the value of the currency.”

 

Fig. 2. Back of the three-dollar bill, printed in Philadelphia, May 10, 1775. Courtesy of the American Antiquarian Society.

As Coombe recognized, the simple act of reading these notes posed a substantial challenge, one that grew more acute with every passing year. Early on, when only a few banks issued notes, it was relatively easy to remember the different designs, which made detecting counterfeits—or at least poorly rendered counterfeits—a relatively simple task. But as the decades passed, the market economy took root in the most remote corners of the new nation. Where the market went, banks and bank notes followed. And within the widening compass of capitalist relations, these monetary hieroglyphs drifted ever further from the institutions that issued them, making it increasingly difficult to keep track of the currencies in circulation, much less spot a fake.

It staggers the imagination to comprehend the extent and ubiquity of counterfeiting during the antebellum years. One estimate in 1862 observed that “out of 1,300 bank note issues, but 100 are not counterfeited,” and counted some 5,902 different kinds of bogus bills. Others claimed that fraudulent bills accounted for upward of a tenth, a quarter, or even a half of the paper money in circulation. Many of these fakes went beyond simple imitations. Instead, counterfeiters exploited people’s unfamiliarity with the currency by issuing notes that bore no resemblance whatsoever to the genuine article (spurious notes). Others produced notes with their title, locality, or denomination extracted and a new one put in its place (altered or raised notes). Still others dropped all pretense of authenticity, and arrogated the banking function, producing notes that sounded plausible (from the Merchants’ Bank of Utica, for instance), but which had no parallel outside the counterfeit economy. Such notes, while deemed counterfeit, blurred imperceptibly into yet another category of fraud, the notes of so-called “wild-cat” banks—institutions founded by unscrupulous financiers in remote areas for the express purpose of making it difficult, if not impossible, for the notes to be exchanged for gold and silver. Counterfeiting thus existed on a continuum of fraud where the dividing line between the solid and the sham vanished upon close examination.

While the problem of counterfeiting at this time grew out of the diversity of the money supply (something that is no longer an issue), there are more than a few echoes of the past in the present battle against counterfeiting. Take, for example, the growing availability of technologies that can be turned to the counterfeiters’ ends. In the early nineteenth century, new engraving and printing techniques enabled bank-note engravers to produce infinite copies of the plates and dies used in the manufacture of notes, a process known in the bank-note engraving trade as siderography. All the elements of a bank note—the border, the pictures, or vignettes, and the denominations of the bills or names of the banks—could be copied endlessly with perfect fidelity. More than a few counterfeiters never went to the bother of engraving imitations—they could often get copies of the real thing. Add to that the discovery of chemicals and compounds capable of erasing and altering notes, and perhaps most important of all, the invention of photography in 1839, and the ease with which notes could be copied, altered, and otherwise forged grew exponentially between 1800 and 1850.

 

Fig. 3. Front of a three-dollar bill privately issued by the City Bank of Worcester, Massachusetts, circa late 1850s. Courtesy of the American Antiquarian Society.

And now? A similar wave of cheap and easy-to-use computer hardware and software—color photocopiers and printers, digital scanners, and image manipulation software such as Photoshop—has flooded the market, enabling anyone with a bit of computer expertise and a criminal mindset to make their own money. Like the technological innovations of the past century, this equipment does not require extensive training to use, and is cheap and widely available for use at home, schools, printers’ shops, and a host of other venues.

The government’s response has been swift, if predictable. Seeing the writing on the wall—and fearing the printing in the wallet—the Treasury Department funded a National Research Council study in 1993 to investigate possible counterfeiting deterrents. The research team put safety features through countless tests, probing for weaknesses, trying to find ways to outwit the latest technology. The ongoing makeover of our money is a product of that first study, and will add about two cents to the cost of producing each note, a cost, the Bureau of Engraving and Printing assures us, that is to be defrayed by interest on government bonds held by the Federal Reserve. Fear not, taxpayers!

 

Fig. 4. Front of a three-dollar bill privately issued by the Asiatic Bank of Salem, Massachusetts, November 1, 1864. Courtesy of the American Antiquarian Society.

While the new designs may seem exotic and strange, there is nothing particularly new about any of them. The use of special inks, complicated watermarks, complex designs, and denomination-specific safeguards (such as printing “20” dozens of times on a bill) has a long and illustrious history. In the early republic, bank-note engravers and mechanics filed scores of patents designed to frustrate counterfeiters using precisely these devices. Want some anticounterfeiting paper? A proposal from 1822 that calls for the use of paper dyed with blue indigo might be of help. Or would special inks be of interest? Any of the different two-toned black and green inks developed in the 1850s would be of use. Watermarks? They went into widespread use in the early nineteenth century, with the manufacturers of bank note paper taking the lead. All of these anticounterfeiting measures have a history, and a rather long one at that. And in the past, counterfeiters have always managed to circumvent these obstacles. Indeed, they have an incentive to do so. The most dangerous counterfeit—and the one that is most likely to pass without much trouble—is one that perfectly imitates some safeguard the public believes to be inimitable.

None of this bodes well for the Treasury Department. Yet one thing our government has going for it is that it need only protect a limited number of designs. Indeed, the strength of today’s money supply lies not with its diversity, but with its simplicity. With only six different types of bills in circulation, it is relatively easy to remember what face goes with what denomination, though more than a few people will struggle to remember when posed that question. Their amnesia is less a function of cultural illiteracy (or poverty) than a testament to just how secure they feel about the currency and how little they need to question the underlying value of these scraps of paper. We do not much read money any more. The bill is in our hands, it is green, and it has a number on it: that is all we need to know. Its virtue is its familiarity. Which is why the government has introduced the new anticounterfeiting measures over the space of close to a decade, and in a series of very slow, staged steps. “The currency still has a familiar American look,” states the Bureau of Engraving and Printing on its Website. “The size of the notes, basic colors, historical figures and national symbols are not changing,” the Bureau notes reassuringly. “New features were evaluated for their compatibility with the traditional design of U.S. currency.”

 

Fig. 5. Front of the one silver-dollar bill, series of 1896. Courtesy of the American Antiquarian Society.

That is debatable, given how they have tarted up Andrew Jackson. But the intent is clear: do not make radical changes or you risk shaking people’s faith in the paper in their wallets. The greenback has become so synonymous with the financial strength of the United States both at home and abroad that radically altering the design is dangerous. Such changes are only welcome when a nation wants a new start (Iraq, for example), or in the case of the European Union, when an entire region wants to carve out a new identity. But in general, preventing a few counterfeits is not worth the erosion of confidence that accompanies the wholesale revision of the symbols and signs that give our money its meaning. After all, in the absence of a gold standard, it is all based on confidence.

By contrast, the banks that issued notes prior to the Civil War worried little about what a change in the design of their notes would mean. If anything, a more expensive and artfully engraved note was taken to be symptomatic of the bank’s financial well-being, while a poorly engraved or simple note could indicate a lack of resources and commitment. Individual banks and other note-issuing corporations did not have to shoulder the burden of national sovereignty; they had only to worry about their own interests and their own profit.

Despite all the counterfeiting, that system worked relatively well: the nation had a sufficient circulating medium to meet its insatiable need for credit. And while the system collapsed with some regularity—in 1818 and 1837 most dramatically—resulting in the suspension of specie payments if not national bankruptcy, it does not appear to have slowed down the pace of growth. Indeed, if anything, the sprawling system of state-chartered banks and the money they issued contributed to the nation’s economic ascent. The federal government played little role in any of this in the early nineteenth century, minting some coins and chartering the Bank of the United States, but otherwise steering clear of direct involvement in the monetary system. That process of disengagement only intensified after Jackson’s “Bank War” in 1832-33, which effectively transferred control of the money supply from the Bank of the United States to corporations chartered by the individual states. What prevailed in the early United States was not the most dignified monetary system, but it did work, in part because so many people were willing to suspend disbelief and accept otherwise worthless pieces of paper in the course of business. It was an era in which the distinctions between the real and the counterfeit had yet to coalesce.

 

Fig. 6. Back of the one silver-dollar bill, series of 1896. Courtesy of the American Antiquarian Society.

But eventually they did coalesce, which brings us full circle back to the efforts of the federal government to protect the currency from counterfeiters. Short of funds during the Civil War, the North turned to the printing presses to finance the war, issuing the money that quickly became known as the greenbacks. Within a few short years, the convergence of the country and the currency was complete, and the older system of the state-chartered banks and their notes was swept away in a flurry of nationalist legislation, replaced by a uniform currency issued by the federal government and a select number of so-called “national banks.” The nation-state now had a vested interest in protecting the currency, and a new national policing agency was established to prosecute counterfeiters to the fullest extent of the law. Indeed, before the Secret Service began protecting the president, its members spent most of their time protecting the money supply from fraud, imposture, and insult. There is something telling about the fact that the greenback was considered a national symbol more deserving of protection than the head of state through much of the Gilded Age. That eventually changed, but even today, the job of protecting the money supply is central to the mission of the Secret Service.

By the early twentieth century, the Secret Service had largely succeeded in eradicating counterfeiting, and an era of almost unquestioned confidence in the greenback began. Little could those officers have imagined the present crisis of authenticity triggered by the proliferation of digital imaging. And so now the government has apparently come to the conclusion that a police force alone cannot protect the currency. It must harness technology as well. Today, as in the early republic, there is the hope that a splash of color, some watermarks, and a new look will frustrate the counterfeiting community. Perhaps, but as the bankers of the early republic could attest, it is one thing to make money more difficult to copy; it is altogether another matter to make it impossible to imitate.

Further Reading:

There is no serious history of counterfeiting in the United States, though Lynn Glaser, Counterfeiting in America: The History of an American Way to Wealth (New York, 1968) is not without its merits. Also helpful, if a bit earlier in focus, is Kenneth Scott, Counterfeiting in Colonial America (New York, 1957). On counterfeiting and the rise of the Secret Service, see David R. Johnson, Illegal Tender: Counterfeiting and the Secret Service in Nineteenth-Century America (Washington, D.C., 1995). For more on the rise of national monetary systems, consult Eric Helleiner, The Making of National Money: Territorial Currencies in Historical Perspective (Ithaca, 2003).

 

This article originally appeared in issue 4.4 (July, 2004).


Stephen Mihm is an assistant professor of history at the University of Georgia. He is presently writing a book about counterfeiting and capitalism in the nineteenth-century United States.




Brother, Can You Buy a Salem Witch Death Warrant?: A story of forgery in the Great Depression

A. B. Macdonald, who should have known better, fell for the homesick veteran’s tale of woe. A self-described collector of Americana, Macdonald was a feature writer for the Kansas City Star, where Ernest Hemingway had learned to hard-boil his prose. On a stormy Sunday morning in October 1932, a man calling himself Captain E. Newman Bradley knocked on his door with an offer to sell a rare document at a bargain price. Bradley, Macdonald recalled, was “a tall, slim man, with sunken cheeks and an expression of sorrow and weariness in his face,” wearing a thin brown overcoat and holding a well-worn cap in his hand. “I have not washed, nor combed my hair, nor eaten for fifteen hours,” he said. “I have been too much troubled to do so. I have come to tell you the hardest hard-luck story you ever heard.” He told Macdonald that he was from Houston, a veteran of the World War, “formerly of the air service, U.S.A.” (where, he claimed, he had taught Lindbergh to fly), showed him his discharge papers (specially signed by “General Pershing” in recognition of his bravery as “a flying ace”), and opened his coat to give Macdonald “a glimpse of a shiny metal shield pinned to the breast of his flannel shirt.” He had been an antiques dealer, “ruined” by the Depression. The promise of a job in Kansas City was “a life-saver, for my wife was sick.” He loaded some rare books and documents into his jalopy, “to sell in case of emergency,” and drove north, but he arrived to find that the job had been given to someone else.  “[D]ropping his voice to a monotone, filled with sadness,” he read Macdonald a telegram he had just received “that has broken my heart.” The telegram informed him that his wife was dead.

It would be an act of charity, as well as a shrewd investment, if Macdonald could find it in his heart to buy the death warrant of Elizabeth How, hanged as a witch at Salem in 1692, that Bradley had in his car. “Let me tell you how this excessively rare document came into my possession,” he explained. “Five years ago, in my prosperous days, I bought the furniture, books, pictures and other things in a very ancient mansion on a large plantation in Mississippi and shipped them to my store in Houston. Among the things was a very old Bible. One day, as I turned its leaves, I found this old document, folded between its pages where it had probably lain for a century or two.” The “yellowed old document, the ink faded, its edges gnawed by the tooth of time,” was accompanied by an official certificate of authenticity from the South Carolina Historical Society in Columbia, bearing a file number and the signature of curator J. A. Skaggs, who attested that the Massachusetts Historical Society and an expert firm in New York joined him in pronouncing it genuine. And Bradley wanted only twenty dollars for it. “That will take my car to Houston,” he said. “Do me this favor, please, and you may have whatever profit is in it for you.”

Macdonald immediately tried to parlay his good deed into a small fortune. He offered to sell the warrant to Charles Tuttle, a Rutland, Vermont, collector of “Americana,” “First Editions,” and “Vermontiana,” who enthusiastically responded that before the crash, such a rare document would have fetched $10,000, and even in these hard times ought to go for $5,000 ($65,000 today). Tuttle was not willing to buy it himself but tried to sell it, “at a small commission,” to Howard Corning, secretary of the Essex Institute in Salem. “The Document is faded and foxed a little,” Tuttle told Corning, and the “bottom edge . . . is worn off so that a name signed there is illegible,” but “[t]he balance is as plain as the day it was written.” He sent Corning a typewritten copy of the warrant and followed up ten days later with the “real” thing.

 

A forged death warrant for the accused witch Martha Corey, sold by “Captain Bradley” to a resident of Sandwich, Illinois. Used by permission of the Phillips Library at the Peabody Essex Museum, Salem, Mass.

“The thing is a fake from beginning to end,” Corning replied. The modern ink alone was “enough to discredit it,” and the stains on the paper obviously came from a forger rather than age. He advised Tuttle to take a look at the facsimile of the only known Salem death warrant—for Bridget Bishop—in Charles Upham’s 1867 history of the witchcraft affair; “[Y]ou will see that [the] wording and the writing are entirely different.” Tuttle was embarrassed. “I confess at first glance it fooled me. Not so much from the looks of it, but from the fact I did not see how the guaranty got on there.” Of course the ink and the stains were suspect, but the South Carolina Historical Society’s authentication seemed authentic. They ought to look into this Macdonald character, though Tuttle’s hunch was that he was “an innocent party”; it seemed unlikely that a forger would be on Tuttle’s mailing list as a collector.

In their embarrassment, Tuttle and Macdonald reassured each other that the forgery had been “cleverly executed.” Corning tactfully agreed, but after Macdonald had confirmed his innocence and decided to write an article about the scam, Corning suggested that the warrant really was not very clever at all. Besides the “suspiciously modern” ink, paper, and handwriting, and the “clumsy” staining, he explained, “the internal evidence was also damning.” The forgery was “signed” by a panoply of Puritan celebrities, including Increase and Cotton Mather, John Winthrop, and Samuel Sewall. The Bishop death warrant was signed only by William Stoughton, the chief justice of the court of oyer and terminer at Salem. Governor William Phips never spelled his name with two p’s; he was “William Phipps” on the forgery. Corning neglected to point out to Macdonald the document’s most obviously faked signature: the “mark” of the Wampanoag leader “King Philip.” J. A. Skaggs noted the “famous mark” on his certification of authenticity. But Philip had been killed in 1676. Skaggs, it turned out, did not exist, and there was a state historical commission in Columbia, but the South Carolina Historical Society was in Charleston.

Macdonald was not alone in getting played for a sap by “Captain Bradley.” Bradley had taken several other Kansas City collectors for as much as $100 ($1300 today) before he “nicked” Macdonald, and he got the city’s Grolier Society for $50 just after. Corning first learned of the forgeries in August, when the editor of the local newspaper in Whitewater, Wisconsin, wrote the Essex Institute on behalf of his fellow villager S. Hoyum, a schoolteacher who had quite remarkably come into possession of a death warrant for the accused witch Sarah Good. As the Whitewater Register breathlessly reported it, “A few days ago a man from Tulsa, Okla., drove into Whitewater. He was a combination interior decorator and collector of antiques. More than that, he was broke. Hunting up a local resident who he figured might be interested he offered to pawn a valuable relic he carried for a loan sufficient to get him home. He said he would repay the loan because he wanted the relic and insisted on an agreement to that effect in writing.” The editor requested a letter from the Essex Institute certifying the document’s authenticity. Since Hoyum was holding the document as security for a loan, he was not yet prepared to offer it for sale—”although the period of the loan has expired and the manuscript is obviously his.” The poor traveler must have been so badly off that he could not come up with the cash to reclaim his beloved relic. He had told Hoyum that he needed the loan to get home to his sick wife. Maybe she had taken a turn for the worse, and the money for repayment had gone instead to her doctors in Tulsa. Meanwhile, the Register boasted, “Whitewater houses an antique which historical museums throughout the country would covet.”

Quincy, Illinois, was equally proud of its new antique: the death warrant of Rebekah Nurse, who, according to the Quincy Herald-Whig, “was sentenced to hang by Phillip, king of the Indians,” on June 10, 1692. (A curious elision of “villains” here, as if Philip, even if he had still been alive, could have shared legal authority with Puritan judges.) The “rare” document, “valuable, both from a monetary and historic standpoint,” was now in the proud possession of Mrs. Don Hoover. Mrs. Hoover had generously allowed it to be put on display in the window of Odell’s department store.

The “original” Nurse death warrant was simultaneously in the possession of W. A. Thompson, an antiques collector in Greensboro, North Carolina, who offered to sell it to the Smithsonian in September, and of Mrs. J. E. Eckdall of Emporia, Kansas. Mrs. Eckdall, reported her nephew Joseph Kellogg, “is a member of patriotic societies and is interested in such things.” Kellogg, a professor of architecture at the University of Kansas, was highly dubious about his aunt’s document. “What on earth could King Philip have to do with Salem witchcraft matters, twenty years or so after his death!” he asked Corning in February 1933. Corning still took no notice of the strange Philip business when he replied to Kellogg to confirm that Bradley had sold his patriotic aunt a “rank forgery.” Corning also fielded inquiries from Sandwich, Illinois, (about a Martha Corey death warrant), and Marshall, Texas.

By November 1932, Corning had seen enough of the forgeries to launch a public effort to warn collectors. “Beware of a faker who is traversing the Southern and Western states endeavoring to sell some skillfully forged death warrants for female witches of Salem, Mass., 1692,” advised the Boston Evening Transcript’s antiques column. “The warrants are cleverly and convincingly authenticated.” Publishers’ Weekly’s warnings were not so delicate in sparing the feelings of those who were swindled; the forgeries were “obvious.” In January 1933, PW carried an item reporting that a “person calling himself ‘Bradley’” had been “canvassing the middle western libraries” with his fakes: “This person usually represents himself as ill and stranded and attempting to get home to Texas.” The South Carolina Historical Society issued its own warning through the Charleston News and Courier.

Why did Bradley choose to forge Salem death warrants? Why not, say, letters from obscure signers of the Declaration of Independence? (A November 1932 article in the American Book Collector on “fakes, forgeries, and frauds” mentioned fishy Button Gwinnett documents as the kind of thing collectors might encounter.) Corning speculated that it had to do with geography. “Apparently the man has been working almost entirely in the Middle West and South,” he reasoned to Macdonald. “I suppose on the principle that the people out there are not likely to be quite so familiar with this type of document as we would be here in the East.” But if midwesterners and southerners were unfamiliar with the documents of Puritan New England, they were all too familiar with the contemporary discourse of Puritanism. Eastern critics and intellectuals—H. L. Mencken most conspicuously and nastily—had lambasted the Midwest and South throughout the 1920s for their superstitiousness, repressiveness, narrow-mindedness, and general benightedness. Mencken railed against “the Puritan’s utter lack of aesthetic sense, his distrust of all romantic emotion, his unmatchable intolerance of opposition, his unbreakable belief in his own bleak and narrow views, his savage cruelty of attack, his lust for relentless and barbarous persecution”—all of which were revealed in the Scopes Trial in Dayton, Tennessee, in 1925. With beautifully controlled irony, Esther Forbes’s 1928 historical novel A Mirror for Witches located Salem’s “lust” for persecuting the bewitching heroine Doll Bilby in sexual repression.

When Bradley, in the guise of the South Carolina curator J. A. Skaggs, “authenticated” his forged death warrants by observing, “This is the only shameful delusion in early Americana [sic] history,” was he suggesting to his midwestern and southern collectors that Puritanism was not that bad after all? (“The only shameful delusion,” “only 22 executions in Mass.,” he wrote.) Reminding them that the site of a truly relentless and barbarous persecution was Massachusetts? Exploiting their desire to deflect the charge of benightedness back in time and toward the East and North?

Or maybe Bradley had in mind a much more recent instance of persecution and scapegoating by rigid authorities. He gave his victims a false name, but his age and appearance made it likely that he really was a veteran of the World War. Whether or not he had a sick wife or wanted to get home to Oklahoma or Texas, his hard luck was probably genuine. He certainly would have heard and read about the World War vets who hitched, rode the rails, or walked to Washington in the late spring and summer of 1932 to demand early payment of their bonuses to relieve some of the burdens of unemployment. The Bonus Expeditionary Force (BEF)—Bonus Army, as it came to be known—started out as about 250 men from Portland, Oregon, in May, and swelled to 25,000 from across the country in June and July. Camped for a time in twenty-three spots in Washington, and later gathered in one main camp on Anacostia Flats, the BEF included a Texas unit of about thirteen hundred, “headed by a woman—‘a Joan of Arc in overalls,’ the press called her, and with picturesque livestock—a burro named Patman (after the Texas congressman who sponsored early payment) and a goat named Hoover.” On July 28, recalled the BEF’s leader, “a military column moved down Pennsylvania Avenue,” under the command of General Douglas MacArthur. “Here were tanks and cavalry, the men wearing gas masks and carrying drawn sabers. In open trucks machine guns were stripped for action.” After clearing the remaining BEF billets in the city proper, they crossed the bridge to Anacostia, set fire to the tents and shacks in the encampment there, and drove the veterans and their families out of the capital. The routed Bonus marchers, vilified now by the Hoover administration as “communists and persons with criminal records,” straggled through Maryland, reassembled briefly in Johnstown, Pennsylvania, and then dispersed back to the roads and box cars. Maybe Bradley was even among them. Maybe he had passed the long periods of boredom on Anacostia Flats, the chances of getting his bonus increasingly remote, scrawling and staining Salem death warrants. From the torched bivouac at Anacostia, he could have made his way to the upper Midwest in time to start peddling his wares by the middle of August.

Or maybe Bradley was a con man, pure and simple, who shrewdly exploited a burgeoning desire for authentic Americana. Surveying popular attitudes toward the nation’s past, the historian Michael Kammen lists several impulses behind Americana collecting in this period: patriotism; antimodernism; the excitements of bargain hunting; the collegiality of a shared hobby; the internalization of a craft-based “American aesthetic.” It is hard to imagine death warrants tapping directly into love of country or nostalgia, and Bradley’s work did not share much with “the solid, simple, dignified and lovingly wrought craftsmanship” of early American tools, textiles, pottery, or furniture. But for a moment in the worst year of the Depression, A. B. Macdonald, S. Hoyum, Mrs. Don Hoover, W. A. Thompson, Mrs. J. E. Eckdall, and Bradley’s other marks experienced the thrill of possessing something unique (“very rare,” said Skaggs’s certificate), “real” (“pronounced genuine by Mass. Historical Society”), and “valuable.” Bradley’s cleverness was to recognize that the more “honestly” he performed his desperation, the more certain his marks would be that his bargain price signified real worth.

Further Reading:

A. B. Macdonald told the story of how he was bilked by Bradley in “Texas Collector is Forced to Part with a Rare Document,” Kansas City Star, February 19, 1933, 3C. Correspondence and clippings related to the forgeries can be found in Witchcraft Documents, Family Mss., Box 2, Folder 1, Phillips Library, Peabody Essex Museum, Salem, Massachusetts. I am indebted to Bryan F. LeBeau’s article on the Carey document, a forged Salem death warrant from Nebraska that its owners tried to sell in the 1980s, for pointing me to many of my sources. See Bryan F. LeBeau, “The Carey Document: On the Trail of a Salem Death Warrant,” Early America Review (Summer 1997). The Mencken quotation, which originally appeared in “Puritanism as a Literary Force” (1917), is from Fred Hobson, Mencken: A Life (New York, 1994), 192. For the Bonus March discussion, I drew on Roger Daniels, The Bonus March: An Episode of the Great Depression (Westport, Conn., 1971), 82, 84; and W. W. Waters as told to William C. White, B.E.F.: The Complete Story of the Bonus Army (New York, 1933), 222-23, 257. Michael Kammen’s discussion of Americana collecting is in Mystic Chords of Memory: The Transformation of Tradition in American Culture (New York, 1991), 322-25.

 

This article originally appeared in issue 5.3 (April, 2005).


Steven Biel is director of the history and literature program at Harvard University and the author of American Gothic: A Life of America’s Most Famous Painting, which will be published by W.W. Norton in June.




Finding Barnum on the Internet

An antebellum museum in cyberspace

On July 13, 1865, one of the most celebrated institutions in the United States, the American Museum, burned to the ground. But thanks to the wonders of technology, it has been rebuilt—sort of—on a Website called The Lost Museum, sponsored by the American Social History Project of the CUNY Graduate Center, in collaboration with the Center for History and New Media at George Mason University. Managed by Phineas T. Barnum, the original American Museum was located in lower Manhattan and presented an ever-growing collection of wonders across five floors, ranging from “cosmoramas” and wax figures, to aquariums and live-animal specimens, to “moral representations” in the Lecture Room.

This new virtual version uses the resources of the Web to present artifacts relating to Barnum’s establishment and to recreate historically particular spaces of an antebellum dime museum. Here, a central room links to three “floors.” The first contains “Barnum’s Office,” which houses, among other items, a desk adorned with a photograph of Barnum with Tom Thumb, an 1850 Illustrated Guide Book to the Museum, an 1844 image of Bowery B’hoys, and a handwritten letter expressing concern over cruelty to animals. The second floor contains a portrait gallery, while “The Lecture Room” on the third floor contains a screen, framed by a theatrical proscenium, on which the viewer can see magic-lantern slide shows about great fires or nineteenth-century etiquette. Perhaps in deference to the shorter attention span of today’s technology-saturated museum-goers, the designers have made a selective version of Barnum’s museum. If antebellum visitors made an entire day of their trips to the real thing, you can be in and out of “The Lost Museum” in about half an hour.

Despite the verisimilitude with which it is portrayed, “The Lost Museum” does not exist in the physical world; it is, rather, an ingenious visual hoax. A virtual museum of oddities and artifacts, it offers a wide-ranging, interactive encounter with the cultural history of nineteenth-century America. In addition to electronic reproductions of historical objects and sources—daguerreotypes, illustrations, broadsides, printed books, manuscripts, and the other detritus that would have filled Barnum’s collections—the site includes a series of illuminating short essays on the museum’s social and cultural contexts. Unlike other pioneering virtual archives, such as the ground-breaking American Memory project of the Library of Congress, “The Lost Museum” recreates the experience of a museum from another era, encouraging us to get lost in the artifice of history. For here we are not just viewing a series of documents. We are exploring the particular, sometimes arbitrary ways institutions and scholars link artifacts to larger stories. More than a historical resource, “The Lost Museum” invites us, then, to ponder the narratives with which we stage authenticity, the material objects and practices with which every generation reimagines the kinship of truth and fiction.

 

The virtual “central room” of “The Lost Museum.” Courtesy American Social History Productions, Inc.

Public Leisure and the Print Medium

 

Barnum remains our most reliable reference point for understanding the struggle between commerce and philanthropy in nineteenth-century America. In her 1997 study of dime museums, for example, the scholar Andrea Dennett baldly asserts, “The answer is clear: Barnum conceived his extraordinary museum for the purpose of entertainment—not education—and with profit as his central concern.” Dennett treats “education” and “entertainment” as timeless, discrete categories of experience, rather than as concepts developed in the later nineteenth century to justify the mission of civic philanthropy and the modern university. She likewise holds the pursuit of profit to be antithetical to a disinterested acquisition of knowledge. Like so many scholars of the American Museum, Dennett makes the outsized personality of Barnum himself—the very image that he so carefully cultivated in print media, in order to market and brand his exhibitions—exhibit A for achieving clarity about the past. In the nineteenth century, however, both producers and consumers of popular knowledge spoke about forms of “rational amusement” in ways that defy simple distinctions between education and entertainment. If Barnum’s name is now synonymous with the tricks of modern advertising, his entertainment succeeded by combining commerce and culture, challenging the simplistic meanings (passive vs. active, imagination vs. utility) that moralists—conservative politicians and evangelical ministers of the early nineteenth century, no less than cultural critics of the twentieth century—would attribute to these concepts.

So how did Barnum fashion American taste? As Neil Harris famously argued, Barnum developed an “operational aesthetic” that made learning about how he pulled off his stunts, hoaxes, and humbugs a major part of their appeal. Amidst rising scientific literacy and populist skepticism of expert learning, Barnum “trained Americans to absorb knowledge” by marketing his museum as an experience of “process,” of problem solving, and of information gathering. Or as another historian, Miles Orvell, puts it, “Learning to tell the true from the false, the lie from the truth, learning trust and mistrust, was part of a process of an acculturation” in a rapidly commercializing America. Another way to think about the value of Barnum’s museum, however, is as an introduction to mass communication, one that engaged patrons in a distinctive kind of collective imagination made possible by the confluence of cheap print technology and novel spaces of urban leisure.

To further understand Barnum’s innovations, it is worth attending to his own aesthetic and moral design. To begin with, as he explained in his second autobiography, Struggles and Triumphs, Barnum wanted to attract the largest crowds possible by giving “them abundant and wholesome attractions for a small sum of money.” Like the huge theaters, concert halls, and hotels that competed for antebellum pleasure seekers and tourists, the American Museum sold recreation to “truly mass audiences” that numbered in the thousands. These were, as Harris notes, “heterogeneous groups of men willing to take off their hats and open their pocketbooks for art, but demanding entertainment at the same time.” Although scholars have tended to romanticize these audiences for their raucous, lowbrow, and socially varied character, Barnum suggested that there was no more “truly popular place of amusement” because it was “wholesome” and “attractive” to diverse customers. He “abolished all vulgarity and profanity from the stage,” for example, so that “parents and children could attend the dramatic performances in the so-called Lecture Room, and not be shocked or offended by anything they might see or hear.” And unlike most museums and concert halls, he banned the sale and consumption of alcohol from his galleries. He even hired private detectives to eject men who were drunk or women suspected of prostitution. Barnum made urban leisure safe for everyone.

Using print in innovative ways, Barnum’s promotions turned a visit to his museum into a new kind of imaginative experience. Barnum’s own repeated avowal of a profit motive—so often used to impugn techniques of modern advertising as amoral—underscores the fact that the ends of publicity were more instrumental than symbolic: to mobilize the actions of large groups of people. With commercial signage, broadsides, sandwich boards, and printed money, “a vast spectacle of writing and print [became] part of everyday life in the city by the 1840s and 50s,” the historian David Henkin writes, and Barnum’s “bold lettering and eye-catching graphics” made him, “quite literally, an urban man of letters.” Barnum understood that he owed his success to the “means of printer’s ink, which I have always used freely,” and that advertising was integral to the museum spectacle. Like modern marketing gurus, Barnum understood that he was not just selling particular goods or services. He was selling stories that heightened expectations and speculations about those goods and services. In short, Barnum beat his competitors by telling much better stories about his attractions. “My ‘puffing’ was more persistent, my advertising more audacious, my posters more glaring, my pictures more exaggerated, my flags more patriotic and my transparencies [or illuminated banners] more brilliant.”

Barnum repeatedly justified any particular fabrication that he advertised by pointing to the cumulative experience afforded by the museum. While extending the museum walls outward into the heterogeneous social and physical urban environment, the images and words of his printed advertising and planted “news” were always linked to a collection of bona fide objects and attractions, “a wilderness of wonderful, instructive and amusing realities.” Customers always got their money’s worth because they were only paying twenty-five cents, and there was always more to see. In his first autobiography, published in 1855, Barnum declared, “If a sight of my ‘Niagara Falls’ was not worth twenty-five cents, the privilege of seeing the most extensive and valuable museum on this continent was worth double that sum to any one who was enticed into seeing it by the advertisements.” If the exhibits fell short of what was promised, how much would you pay for another chance to trade in incredulity and wonder? Barnum shrewdly perceived the value of irrationality in a country that, as he put it, was overwhelmed by “a severe and drudging practicalness . . . [that] concentrates itself on dry and technical ideas of duty, and upon a sordid love of acquisition—leaving entirely out of view all those needful and proper relaxations and enjoyments which are interwoven through even the most humble conditions in other countries.” Barnum made sure his customers saw past the objects themselves to become participants in his public theater of the imagination.

Imagination and Virtual Experience

 

No less than Barnum’s museum, contemporary Web design entails the management of crowds. Because it is interactive and open ended, the user-friendly desktop computer has made our encounter with virtual space a matter of the hand-eye coordination we bring to moving cursors and clicking mice. For the rising generation of digerati, weaned on the ever-more seamless simulacrum of live-action and video animation, the intensive manipulation of on-screen images can itself seem awkward, an oddly old-fashioned way of encountering the past through artifacts that challenge the sophistication and flexibility one complacently assumes in the seemingly personal control of technology. For those unused to the rapid-response demands of Gameboys or the hair-trigger sensitivity of Xbox toggle sticks (or who can barely handle the remote control for home video and audio systems), what one finds in “The Lost Museum” often entails getting lost on the Internet. Accidentally pushing the wrong button or scrolling too haphazardly will—like following the strategic (mis)direction of Barnum’s famous “To the Egress” sign, which led some of his less literate (often Irish) visitors out a back door in search of a potential curiosity—land you on the sidewalk.

 

Courtesy of the Brown University Library Special Collections.

As with “The Lost Museum,” the navigation of Barnum’s collection in the nineteenth century assumed literacy in modern rituals and modern communication forms. Consider, for example, a broadside reprinted in the 1850s from a newspaper called the Eastonian, taken from the work of the American folk humorist Benjamin Penhallow Shillaber. The broadside narrates a visit to Barnum’s traveling museum in the voice of a censorious older woman, Mrs. Partington, who in a series of books had become a popular comic figure known for malapropisms and homespun wisdom. Throughout the first half of the text, this woman’s approach to the museum is clothed in elaborate, if not fanciful, words and phrases—”retrocession into the place,” “reproaching the tent,” “money I’d given him to pay his dismission”—that strain the particularly spoken coherence of common sense and the simple poetry of marketplace doggerel. But once inside the museum, Mrs. Partington is seduced by the visual wonder of the collection: “I truly declare, I was all struck in a heap with what I saw there.” These awkward verbal formulations give way to what Miles Orvell calls the “omnibus form” of descriptive detail, in which the arbitrary juxtaposition of objects is made natural by the merely rhythmic cadence of nouns and names:

Suits of armor, coats of mail
Together with the end of a comet’s tail,—
An Indian canoe, with a curious paddle,
A Mexican uniform, bridle and saddle,
The Point of an argument, wonderful shells,
And a Chinese pagoder all covered with bells.

If Partington’s entrance in the broadside, like her approach to the collection, is skeptical—”‘Whoever hear the like of that!’/Said Mrs. P . . .”—she leaves it in silent reverie:

The old lady, lost in admiration
Here cut the thread of her narration,
And spreading her handkerchief over her face,
And replacing her needles, in her tin netting case,
Settled to sleep, and to dream of Tom Thumb
And the wonders of Barnum’s great museum.

As a byproduct of the notoriety Barnum’s showmanship had achieved, the broadside suggests how fully Barnum’s exhibitions were already touring an information highway. Partington’s visit begins with her “gazing at a bill, which showed in letters bold the wonders Barnum would unfold . . .” Here, as in so much of the publicity that surrounded the American Museum, Barnum’s own reputation for promotion—and especially his facility with the “bold letters” of printer’s ink—is literally built into one’s encounter with the museum, providing the printed frame within which the retailing of goods and leisure “unfold” as “wonders.” But by animating its depiction of Barnum’s museum with extended quotation of Mrs. Partington, the broadside situates our understanding within the local networks of vernacular speech. What one knows in such a world is always filtered through the homely, colloquial social matrices of gossip, conversation, controversy, and argument that spread the popular taste for antebellum entertainment (no less than politics and literary sentiment) by word of mouth. As Barnum himself so often put it, the key to success in the business of culture was “to make people talk and wonder.”

What separates Mrs. Partington’s virtual experience of imagination from our own are the material frameworks through which audiences see and interpret objects. As recent scholarship by Benjamin Reiss, James Cook, and Bluford Adams has demonstrated, nineteenth-century audiences made sense of exhibitions such as the elderly former slave Joice Heth (billed as “Washington’s Nurse”) and other curiosities both living and dead by drawing on contemporary narratives about race, politics, and the exotic. But situated within the motley collection of the American Museum, objects continued to appeal to the traditional cult of the marvelous. As the literary critic Stephen Greenblatt has characterized that pervasive phenomenon, it is the “power of the displayed object to stop the viewer in his or her tracks, to convey an arresting sense of uniqueness, to evoke an exalted attention.” Much like its ancestor, the Renaissance cabinet of curiosity, the American Museum made the beautiful and the strange kindred forms. In doing so, it affirmed the power of man, nature, or the divine to defy the routine patterns and familiar assumptions of reality.

Objects displayed in museums today, by contrast, evoke the original contexts from which they were taken: the church wall, the royal burial chamber, the archeological dig, the vanished habitat, the ceremonial rituals of pre-modern tribes. They carry a resonance of otherness, differences of context and use that can only be partially recovered by catalogue descriptions and exhibition labels, whose very muteness endows objects with transcendent value as historical and aesthetic artifacts. We are expected to know their meaning mainly by silencing the informal narratives within which untrained viewers such as Mrs. Partington unfolded and circulated the social sense of Barnum’s wonders. Narrated within galleries organized by historical period, national context, artistic provenance, or stylistic tradition, objects become artifacts, whose authenticity and value are framed by omniscient scholarly narratives. As they came to be lit by boutique lighting pioneered in shop windows and department stores, these objects demanded the viewer’s deference to symbolic, immanent values, which—whether animated by the spirit of civilization, history, or an emerging anthropological sense of “culture”—divorced an artifact’s meaning from the vernacular networks through which Mrs. Partington would have grasped Barnum’s wonders.

The modern infrastructure of public culture thus gives us wonder without talk, specialized kinds of historical and aesthetic education mediated less by the operations of vernacular speech and popular taste than by the methods and institutions of professional expertise. Museums of natural history, for example, introduced formal taxonomies into the preservation and display of collections. “It is not the objects placed in a museum” that constitute its value, as William Henry Flower, a director of the British Museum, argued. It is “the method in which they are displayed and the use made of them for the purpose of instruction.” Similarly, as historians and librarians, such as Harvard’s Justin Winsor, imported German methods to the American academy, a new “scientific history” displaced romantic narratives with documentary evidence that, as historian Steven Conn notes in his intellectual history of late nineteenth-century American museums, “emphasized detail over values, facts over ultimate meanings.”

Converted into such documentary evidence, objects not only present arguments about the past but become totems of professional authority, offerings at the altar of positivist objectivity, licensed and guarded by professionals and institutions of higher education. Throughout much of the twentieth century, ordinary people thus learned to see objects not through the screaming letters of broadsides but through the silent reading of books, exhibit labels, collection catalogues, and the like. The apparatus of professional expertise promoted textual truth over the merely antiquarian, the spiritual, and the just plain popular.

Digital Humbug

 

Seeking “to startle, to make people talk and wonder,” Barnum transformed the marketplace for leisure into a stage for “wonderful, instructive, and amusing realities.” The door of the museum, no less than the two-dimensional broadsides that kept it before a mass reading public, was a portal to “realities” that engaged viewers in multiple processes of representation—visual, verbal, architectural, and performative. The speculations that Barnum puffed in print were only consummated when people were motivated to pay the price of admission—the visit by which otherwise standard offerings of mass leisure were made into personal experience. Countless people would leave the museum to write about what they saw in their diaries or letters or simply to talk about their visit with friends and strangers. No matter what they saw, patrons took self-conscious pleasure in the entrepreneurial making and unmaking of meaning.

The digital age has allowed countless museums and libraries to mount exhibitions that travel the information highway but in ways that make the event of a “visit” increasingly passive. Where “The Lost Museum” offers Internet access to a virtual environment, Mrs. Partington traveled to a place:

To see it, and wondered, and marveled
From the first to the last—from morning to night,
And vowed for no money would have missed the great sight.

Today, we can have experiences without actually going anywhere, through increasingly inert and imaginary forms of mobility pioneered in Barnum’s broadsides. Indeed, a New York Times article from July 1, 2000, described “The Lost Museum” as “A Museum to Visit from an Arm Chair.”

 

The "notepad" at "The Lost Museum." Courtesy American Social History Productions, Inc.

Much like the cacophonous nineteenth-century city, every Website assumes the viewer’s ability to navigate particular symbolic codes. In place of the typographic carnival of nineteenth-century broadsides, Websites trade in the contemporary conventions of the home page, arrayed with icons and lists. They also assume—as does “The Lost Museum”—our facility with the visual and verbal clichés of movies and broadcasting; the faint sound of horses and carriage wheels on cobblestones tells us that we are deep into “history” (any time and place before the automobile), while black darkness signals its vaguely sinister elusiveness (can we ever fully see the past with clarity and objectivity?).

Like Caleb Carr’s historical fiction The Alienist (New York, 1994) or Patricia Cline Cohen’s non-fictional history The Murder of Helen Jewett (New York, 1998), the site seeks to popularize the taste for nineteenth-century America by framing its cultural insights and artifacts in the narrative devices of an unsolved mystery: who might have burned down Barnum’s museum? The viewer is encouraged to navigate the documents and artifacts “housed” on the site by gathering clues, which suitably enough are indicated when the screen cursor becomes a question mark. In a clever touch, a notepad, done up in the antiquated guise of nineteenth-century stationery, is furnished for viewers to record and store their clues, creating a personal narrative about what might have happened and why. With these tools in hand, the viewer becomes his or her own historian. For students, the device offers an elegant education in scholarly method.

If the virtual realities of mass communication have removed the social dimension from leisure, Internet technology has, in such inspired sites as “The Lost Museum,” allowed us to regain the sense of wonder that once accompanied a trip to the museum. The site insists that we confront artifacts not as disembodied texts and images—the transcription of handwritten documents or photographic reproductions—but as three-dimensional objects. By moving their cursors across the screen, viewers can reproduce for themselves the increasingly old-fashioned optical effects of walking through a gallery, scanning a wall or room crowded with stuff, zeroing in on a particular object or image, focusing one’s attention on a detail. So too, even with its radically reduced number of artifacts, the site recreates the eclectic visual experience of collection, the wonder that Partington attaches to the omnibus form of Barnum’s museum—the sheer quantity of objects linked to one another by the seemingly arbitrary logic of physical proximity, commercial expedience, and connoisseurship.

 

"Who Burned Down the Museum?" Courtesy American Social History Productions, Inc.

By blurring the line between entertainment and education, “The Lost Museum” allows us to recover an innovative space in the history of public leisure and popular taste. Ironically, the interactive breakthroughs developed for the Internet have given visitors new tools that succeed as education precisely because they trade in tricks of “rational amusement,” adapting the narrative frameworks and media literacy of modern commercial entertainment. If Barnum’s museum was lost to history, its spirit moves again in the humbugs of digital technology.

Further Reading:

 

Barnum’s comments can be found in his two autobiographies: P. T. Barnum, Struggles and Triumphs; or, Forty Years’ Recollections of P. T. Barnum, Written by Himself (Buffalo, [1876] 1883) and P. T. Barnum, The Life of P. T. Barnum, Written by Himself (New York, 1855). For scholarly assessments of Barnum in American culture, see Andrea Stulman Dennett, Weird and Wonderful: The Dime Museum in America (New York, 1997); Neil Harris, Humbug: The Art of P. T. Barnum (Boston, 1973); Miles Orvell, The Real Thing: Imitation and Authenticity in American Culture, 1880-1940 (Chapel Hill, 1989); David Henkin, City Reading: Written Words and Public Spaces in Antebellum New York (New York, 1998); Benjamin Reiss, The Showman and the Slave: Race, Death and Memory in Barnum’s America (Cambridge, Mass., 2001); James W. Cook, The Arts of Deception: Playing with Fraud in the Age of Barnum (Cambridge, Mass., 2001); and Bluford Adams, E Pluribus Barnum: The Great Showman and the Making of U.S. Popular Culture (Minneapolis, 1997). On museum history, see Ivan Karp and Steven D. Lavine, eds., Exhibiting Cultures: The Poetics and Politics of Museum Display (Washington, 1991) and Steven Conn, Museums and American Intellectual Life, 1876-1926 (Chicago, 1998).

 

This article originally appeared in issue 6.1 (October, 2001).


Thomas Augst is associate professor of English at the University of Minnesota. He is the author of The Clerk’s Tale: Young Men and Moral Life in Nineteenth-Century America (Chicago, 2003) and is currently writing a book about temperance reform and mass culture.