Mary Kelley, Learning to Stand and Speak: Women, Education, and Public Life in America’s Republic. Chapel Hill: The University of North Carolina Press for the Omohundro Institute of Early American History and Culture, 2006. 294 pp., cloth, $39.95.
In the early 1790s, Annis Boudinot Stockton wrote to her daughter that “the Empire of reason is not monopolized by man” (22). That daughter, Julia Stockton Rush, was in a position to understand; her husband, Benjamin Rush, was a founder of the Young Ladies’ Academy of Philadelphia. A century later, women’s rights activist Lucy Stone recalled in a letter to fellow reformer Antoinette Brown Blackwell that she had “learned to stand and speak” in literary societies at Mount Holyoke Seminary and Oberlin College (275). These two stories bookend Mary Kelley’s Learning to Stand and Speak: Women, Education, and Public Life in America’s Republic, which explains the entry of privileged white women into American civil society by examining their education in all-female schools, academies, and reading circles in the seven decades before the Civil War. Kelley argues that “although women also worked with men in parallel literary societies and reform associations, these exclusively female institutions were at least equally if not more important sites at which women gathered locally, regionally, and nationally to chart the nation’s course” (54).
Kelley begins with schools, noting that “approximately the same number of women and men were enrolled in institutions of higher learning” in early nineteenth-century America (41), a remarkable finding in itself. And these women were not studying merely ornamental subjects like embroidery and dancing; instead, Kelley’s research shows, women pursued a curriculum very like the one offered at male colleges, especially after 1820. Although middling as well as elite women attended seminaries and academies, those at single-sex schools were more likely to come from “families with significantly more access to economic, social, and cultural capital” (32). An education in the “values and vocabularies of civil society” prepared such women for both leadership in benevolence and social-reform organizations and “entry into the market economy as educators, writers, and editors” (32). And enter they did. However, although they positioned themselves “at the center, and crucial to the success, of the republican experiment” (27), they continued to couch their influence in the traditional language of family and home. Thus, these women struck a compromise: educational equality in Enlightenment terms but a still-circumscribed—though widened—sphere of influence.
Kelley’s extensive research makes clear that the educated women of the early nineteenth century reveled in their learning; the book is packed with quotations from letters and diaries that celebrate the joys of knowledge. Their dedication is also evident from their participation in literary societies and in the reading and writing they did as adult women, the subjects of the book’s second half. Literary societies at female academies, where members circulated their own writings and where they read and discussed books, “acted as schools within schools,” teaching members “to read critically, to write lucidly, and to speak persuasively” (117, 118). After graduation, many of these women replicated such circles among adult women. Engaging with a wide range of books in conversation and in writing, these women “sanctioned and supported intellectual productivity” (117). Crucially, Kelley argues, these mutual endeavors also gave women the confidence to step into civil society and work actively to shape public opinion.
Although Kelley’s interest is in the roots of women’s activism rather than in its fulfillment, her discussion of reading circles includes an extensive treatment of the connections between literary education and civic action. She shows that debates about the content and form of women’s education stood in for and reflected larger debates about women’s role in society. She traces the extensive networks that women built through literary societies and the ways that such networks built women’s influence within and beyond their communities. In particular, such groups focused on creating broader roles for women in civil society; Kelley locates the political development of such suffragists as Blackwell and Stone in precisely these circles.
Although the book focuses on white women either already members or aspiring to be members of the elite, Kelley is also attentive to the experiences of African American women. Her discussion of African American literary societies, drawing mostly on scarce clues in antebellum newspapers, is particularly thorough. As slavery was gradually eradicated in the North in the first three decades of the nineteenth century, African Americans found themselves subjected to increased racial hostility. For them, literary societies “were ideal vehicles for developing the arguments for, and the strategies of, resistance” (141). In Boston, Philadelphia, New York, and Rochester, African American women gathered, like their white counterparts, to read and write together and to develop what Kelley calls “avowedly political subjectivities,” using meetings to sharpen their arguments against racism and slavery and as a prelude to publishing those arguments in the antislavery press (142).
A subgroup of literary women chose the writing of women’s history as their contribution to civil society. Kelley examines Judith Sargent Murray, Lydia Maria Child, Sarah Josepha Hale, Margaret Fuller, and a handful of other women who published works of history focused on women. Kelley’s careful analysis of these writers’ choices shows that their narratives were designed to claim for women a gender-inflected moral authority that justified their participation with men in creating the nation as a beacon to the world. Establishing the public voices of their female subjects, these writers simultaneously made themselves exemplars of civic actors for their readers.
Because Kelley proceeds thematically rather than chronologically, it is sometimes difficult for a reader to pick out the threads of change. With evidence from the 1790s joined in a single paragraph with stories from five decades later, shifts over time are sometimes buried under the wealth of detail. Yet, the thematic focus is also a strength, allowing Kelley to explore in great depth each of her topics and to make connections across region and time.
The book’s greatest strength is its archival depth and breadth. Drawing on dozens of archives in many states, Kelley recovers the experiences of well-known women like Blackwell and Stone but also of their little-known schoolmates. Learning to Stand and Speak presents an impressive number of examples drawn from the experiences of women across seven decades and at least a dozen states. These anecdotes and stories give Kelley’s analysis weight as well as color, peopling her schools and reading circles, private homes and social libraries with vividly present women and girls. Kelley’s painstaking research convincingly places education, especially reading in the context of mutual improvement, at the center of elite women’s antebellum experience. Two dozen illustrations are an outstanding addition to the text. Learning to Stand and Speak will be an important resource for all historians of gender, education, or print culture in early republic and antebellum America.
This article originally appeared in issue 8.1 (October, 2007).
Lynda Yankaskas is a Ph.D. candidate in American history at Brandeis University, where she is completing a dissertation entitled “Borrowing Culture: Social Libraries and the Shaping of American Civic Life, 1731-1851.”
Gangs, the Five Points, and the American Public
Tyler Anbinder, Five Points. New York: Free Press, 2001. 544 pp., cloth $30.00; $16.00 paper.
Herbert Asbury, The Gangs of New York: An Informal History of the Underworld. New York: Thunder’s Mouth Press, 2001. 420 pp., paper, $14.95.
How can we teach Americans about our violent and sordid urban past? Certainly the movie Gangs of New York introduced countless Americans to the idea that the nineteenth-century city was an inhospitable place with crime, slums, and a core of violence that boggles the imagination. Gangs of New York was visually enthralling, with a complicated love triangle and an overly simplistic plot that owed more to The Godfather, Braveheart, and Gunfight at the OK Corral than to historical reality. The problem with the movie was that while it was evocative, with at times carefully detailed language and costume, it was wrongheaded in its historical narrative. The American movie public got no real sense of what it meant to live in the Five Points or what life was really like for any immigrant in the first half of the nineteenth century.
Martin Scorsese based his movie in part on Herbert Asbury’s The Gangs of New York, a sensationalist book that was originally published in 1927 and rushed into a new printing to capitalize upon the release of the movie. In fact, Thunder’s Mouth Press also published Asbury’s The Gangs of Chicago and The Barbary Coast, hoping to build on the interest generated by the movie. These books are much the same: each is a compendium of horror stories from the American past highlighting the urban underworld. Asbury was a popular journalist who did a great deal of research for these books, but who did not always carefully separate fact from fiction. In one incredible section of Gangs of New York, Asbury recites the “history” of the mythical Mose, the Bowery B’hoy who existed on the stage and in the popular imagination, but not in reality. Facts, however, do not get in the way of Asbury’s prose. He declared that Mose’s favorite weapon was the butcher’s cleaver (reminiscent of Bill the Butcher of the movie), that he was eight feet tall, and that “his hands were as large as the hams of a Virginia hog.” Mose once pulled an oak tree from the earth and flailed out at the Dead Rabbits Gang of the Five Points “even as Samson smote the Philistines.” He could lift a horse car over his shoulders, swim the Hudson in two strokes and around Manhattan Island in six (32-33). Perhaps the reader is supposed to know that all of this description is fiction. But Asbury does not distinguish it as such, and he so frequently gets information slightly awry or exaggerates events that his books are virtually useless as a guide to the past. All three books are packed with gory details of a poverty that drove men and women into crime. He relished telling us about this gang or that gang, describing great fights and chronicling the activities of prostitutes and the many dives that existed in New York, Chicago, and San Francisco. But we are left incredulous, wondering where reality based in sources ended and fiction began.
Neither the movie nor Asbury’s books help historians in getting Americans to understand the true nature of nineteenth-century urban America. Tyler Anbinder’s Five Points, however, offers us a better opportunity to introduce the general public to how historians understand the past. Based on prodigious research, Anbinder’s book is written for the all-too-elusive “educated reader.” Published by a commercial press, Anbinder’s book opens each chapter with a prologue that focuses on some story, usually about an individual, that highlights the thematic nature of the chapter that follows. The idea is to draw readers in and get them thinking about thematic issues centered on America’s first great slum, the Five Points. The book begins with violence, always a nice hook for a popular audience, opening with a prologue on the 1834 race riot. Anbinder goes on to describe how Five Points became associated with crime and disorder. Anbinder builds a detailed socio-economic portrait of the Five Points as it developed before the Civil War. He describes the nature of Irish immigration, housing, work, politics, leisure, crime, and vice, and then religion and reform. He culminates this section of the book, which comprises the meat of his research and three quarters of the pages, with a return to violence in the 1850s and during the Civil War. The remaining one hundred pages describe the postbellum transformation of the Five Points with the influx of new immigrant groups, Italians and Chinese, and the initiation of urban renewal at the instigation of Jacob Riis.
As an academic I admire Anbinder’s achievement. Having used some of the same sources, I know how hard Anbinder has had to work to patch together a vivid picture of life in the Five Points. I wonder, however, if the effort has been worth it. Anbinder has filled his book with great stories of bare-knuckle fighting, barrooms, corrupt politics, and mayhem. There is also a host of great illustrations, including a color cover of an original painting of the Five Points that served as the model for the more frequently republished print that appeared on the cover of my first book. The detail and attractiveness of Anbinder’s book may or may not carry the general reader through more than 441 pages of text. But as an academic I am interested in what I learn that is new from a book, and I want to see an author’s thesis that makes me think, “Gee, I wish I had thought of that.” Anbinder states in his preface that he hopes “to set the record straight” about the Five Points. In this goal he succeeds both in the small and the large picture. For example, Anbinder convincingly argues, contrary to the movie Gangs of New York and to Herbert Asbury, that there never was a Dead Rabbits Gang. More importantly, he demonstrates that while poverty and crime permeated much of the life of the Five Points, the immigrant groups who lived there strove for their piece of the American dream and ultimately succeeded in leaving the slum behind them. While this thesis offers a corrective to the movie and to Asbury, it is not a new idea for anyone familiar with the history of nineteenth-century urban America. Immigrant groups repeatedly crammed into city slums only to have the second generation succeed economically and move to better neighborhoods. In other words, Anbinder has only provided a case study for something we, the scholarly community, already knew.
Has Anbinder been successful in his other goal of reaching a popular audience? On April 15, 2003, I visited Amazon.com to get some means of answering this question. Anbinder’s hardcover book ranked 17,555, but his paperback stood at 7,822; Asbury’s Gangs of New York, meanwhile, stood at 9,048. These rankings represent success by academic standards (and are much better than the six-figure rankings of my own books). Unfortunately, millions have had their image of the Five Points warped by the cinematography of Martin Scorsese.
This article originally appeared in issue 3.4 (July, 2003).
Paul A. Gilje teaches at the University of Oklahoma. He is the author of The Road to Mobocracy: Popular Disorder in New York City, 1763-1834 (Chapel Hill, 1987), Rioting in America (Bloomington, 1996), and Liberty on the Waterfront: American Maritime Culture in the Age of Revolution, which will appear in fall 2003.
Lincoln/Net: Abraham Lincoln Historical Digitization Project
The Common-place Web Library reviews and lists online resources and Websites likely to be of interest to our viewers. Each quarterly issue will feature one or more brief site reviews. The library itself will be an ongoing enterprise with regular new additions and amendments. So we encourage you to check it frequently. At the moment, the library is small, but with your help we expect it to grow rapidly. If you have suggestions for the Web Library, or for site reviews, please forward them to the Administrative Editor.
With an emphasis on Abraham Lincoln’s years in Illinois, 1830-1861, and a wealth of material outside that period, “Lincoln/Net” is a multi-media, multi-purpose Website. Created and maintained at Northern Illinois University, “Lincoln/Net” represents a collaborative effort by educational institutions, archives, and museums, including the Newberry Library, the Chicago Historical Society, the Illinois State Archives, and the University of Chicago. Designed to reach a general audience, which the site’s creators believe “historians have largely abandoned in recent decades,” this online archive guides visitors through the sources in a focused way by bringing the “findings and debates of American historians” to the Internet. The goal is to help users think historically and to ask questions that make full use of the site’s databases. Accordingly, the materials have been grouped into eight thematic sections: frontier settlement; Native American relations; economic development; women’s experience and gender roles; African-Americans’ experience and American racial attitudes; law and society; religion and culture; and political development. There is, of course, significant attention devoted to Lincoln’s biography.
For each of the major themes, there is a guide to accompanying primary sources, with image galleries, maps, and audio and video clips. Slide shows offer good visuals and helpful narration, and they succeed in providing overall background context for major historical developments of the period. In the “Lincoln’s Biography” section, there is a similar organization in place, with a summary, primary sources, and visual images. To help viewers through this material, the site provides video clips of historical commentary by prominent historians, including Eric Foner.
Educators especially may appreciate the contents and organization of the site. The “Teacher’s Parlor” link takes visitors to well-designed and engaging lesson plans. For example, in a lesson on alcohol and temperance in the nineteenth century, the lyrics and downloadable rendition of “King Alcohol” lend immediacy to the subject. The “lesson plans” section is divided into materials for teaching the antebellum era, the Civil War era, and the Gilded Age. With detailed objectives, clear instructions for background reading, links to primary sources, and exercises based on the site, the lessons are ready to use and linked to state standards.
Much of the site is easily adaptable to classroom purposes. Indeed, a laudable feature of “Lincoln/Net” is how user-friendly it is. Easily downloadable audio and video links can be readily integrated into the lessons on the site or used for creating new ones. Thirteen video clips make up the “women and gender” video section. For the history of the Black Hawk Indian War of 1832, in which Lincoln served as a member of the Illinois militia, the site documents his activities but more importantly provides the larger historical context, with maps, first-person battlefield accounts, and government records. In the section on “Native American Relations,” the selections break down a full video, “Lincoln and Black Hawk,” into many discrete, briefer subsections. Other video clips include eminent scholars such as John Mack Faragher and Kathryn Kish Sklar presenting brief overviews of topics such as “Singing on the Illinois Frontier” and “Education, Culture, and the Patterns of Frontier Settlement.”
In the interactive resources section, visitors will find informative maps that can be manipulated to highlight topics of interest, such as the distribution of churches in 1850 and 1860, which can then be broken down sect by sect. Other maps detail voting patterns in presidential elections from 1840 through 1864. Clicking on two interesting maps showing U.S. population distribution in 1850 and 1860 allows users to view the figures by state and by numbers of free blacks, slaves, and whites.
“Lincoln/Net” seeks to foster deep historical understanding while reaching multiple audiences, including history buffs and educators. With a site that locates its central figure, Abraham Lincoln, within the context of numerous complex historical developments, “Lincoln/Net” promises to go a long way toward succeeding in its goals.
This article originally appeared in issue 8.2 (January, 2008).
American Shores: Maps of the Middle Atlantic Region to 1850
Wayne Bodle, author of The Valley Forge Winter: Civilians and Soldiers in War (2002), teaches history at Indiana University of Pennsylvania. He is finishing a book on the Middle Colonies from first settlement until shortly after the American Revolution.
The New York Public Library is one of the most distinguished research institutions in the world, and its renowned Map Room contributes to that reputation. American Shores is a searchable Website created by the library with a grant from the National Endowment for the Humanities to bring a portion of its cartographic resources to your own reading room, no matter how far you are from the library’s famed lions on Fifth Avenue. The site’s focus, the Middle Atlantic, is a less-celebrated descriptor for a piece of the geographical pie that was first the Dutch commercial outpost of New Netherland, then colonial British North America, and finally the early national United States.
Definitions of that region are legion. The project’s choice—”from New York south to Virginia”—more closely follows the administrative usage of modern federal agencies than recent scholarly usage, which generally excludes Maryland and Virginia. Reflecting its title, the site’s definition more problematically places the region “east of the Appalachian Mountains,” without saying whether that includes their easternmost ridges, which parallel the Hudson, Delaware, and Susquehanna rivers and bisect the states in question. The project is not about definitions—its scope may reflect the contents of the maps in its constituent collections more than a definition of the region—but academic users should be aware of the implications of these organizational protocols.
The project itself seems designed to accomplish two objectives: to instruct visitors on the fundamentals and applications of cartographic scholarship and to let oriented users roam at will in search of early American spatial knowledge. This review takes the first objective on its own terms and the second one more literally, by testing it on an actual, if embryonic, research project.
The iconic image of “American Shores,” a colorful strip of Atlantic littoral running from south to north (and left to right) from the central Carolina coast to near Riverhead on Long Island, suggests this double objective. Four clickable headers across the top of the site—”Overview,” “Basics of Maps,” “Maps Through History,” and “Geographical Areas”—summon a series of subheads running down the left side that preview the “Overview” section’s stated design to “highlight and explain a few of the hundreds of maps that have been digitized, and offer suggestions for using them in the study of historical topics or geographic areas.” With that opportunity either taken or declined, a large button—”Search Map Collection”—turns you loose into the database.
The clarity and utility of these preliminary elements vary considerably. The marginal subbars on the left border of the “Overview” section include the “Mid-Atlantic Region and [its] Wider World” and a group of narrower spatial frames from the “World” to the “Atlantic Ocean” to the “United States.” Each of the latter brings up a map or two, with commentary and an interactive “catalogue record.” Use of these maps provides a kind of tutorial on cartographic practice. The header “Basics of Maps” offers definitions used by specialists in the field and a “Note on the History of Cartography.” “Maps Through History” is both intriguing and strange. When clicked, it lays out an asymmetrical and somewhat fragmentary categorical catalogue of the Website as a whole (one presumes), under the subheads “Nautical New York,” “Coastal and Oceanic Nautical Charts,” “Early Transportation,” and, most oddly, “American Revolution Battle Sites.” The specific rationale for selecting these topics is never articulated, nor is it explained why other, potentially equally useful subjects, such as “Religious Denominations” or “Newspaper Circulation Zones,” are not included alongside these idiosyncratic headings. “Geographical Areas” leads to a more predictable, conventional survey of largely political jurisdictions into which the larger region has been divided, each with its selection of illustrated starter maps.
This arrangement serves as a noncoercive introduction to the map collection itself, with no requirement to linger, and an early opportunity to actually interact with the maps. Clicking on thumbnail sample maps enlarges them but also discloses that running the pan and zoom features “require[s] MrSid plug-in.” A link takes you to a proprietary Website with free software downloads, but nothing clarifies what you are to do next. Once inside the larger map database, the efficacy of the interactive features, including pan and zoom, seems to vary considerably.
Rather than just browse randomly, I wanted to go into “Search Map Collection” with an actual research project. By chance, the invitation to do this review coincided with an embryonic project suited to the region: an inquiry into a migration and settlement salient in the 1720s and 1730s southwest from Esopus (Kingston), New York, on the Hudson River, to the Delaware Valley at Mahackamack (Port Jervis), and then down both sides of the Delaware through the “Minisink Country” to the famous Water Gap. The settlement process involved multiple jurisdictions, ordinary farmers, mysterious folkloric ancient miners, and alarmed royal and proprietary officials in Philadelphia and elsewhere wondering why it was happening. Mouse in hand, I felt the possibility of discovery beckon.
The resulting trip was both bumpy and intriguing. Under “Search Map Collection,” the subhead “search maps” offered a clickable “New Search Page (Digital Gallery),” which led into an ambiguous new space. Titled the “NYPL Digital Gallery,” the new space left it unclear whether it was part of “American Shores” itself or an auxiliary New York Public Library (NYPL) domain. The new page offered two search boxes, one of which seemed connected to the “Lawrence H. Slaughter Collection and Others” and the second of which linked to the “Digital Gallery.” It presented some background information on the NYPL, the history of the Slaughter Collection, and a bibliography on cartographic history.
Wanting to get on the ground in the Minisink as quickly as possible, I tried keyword terms in both boxes. It was unclear if either was connected to a database defined by information in the maps in question or just to their titles, but trial and error suggested the latter. Queries directed to names on the ground in adjacent parts of New York, New Jersey, and Pennsylvania mainly went nowhere. Esopus produced no entries and Kingston only a few. Minisink, Wallkill, Shawangunk, and Mahackamack all failed. Queries aimed at counties in the area did better. Ulster brought up some maps, but figuring out how to manage and control the pan and zoom features was difficult. I wasn’t sure whether I was even in the “Digital Gallery” or was instead in the “Slaughter Collection,” which reminded me that early American regions were themselves layered and contested, with royal and proprietary holdings, overlays from New England’s “sea-to-sea” charters, and other murky phenomena. Several queries into the “Digital Gallery” search box returned large numbers of photographs, which suggested that I had wandered out of the Map Room. My sense of challenge intertwined with frustration.
Flailing about digitally for orientation did not help. A click on “Browse: Names A-Z” disclosed an “alphabetical list of 13,636 names [of] artists, authors, publishers, collectors, and others responsible for the creation of items found on the site.” “Browse: Subjects A-Z” brought an even more unwieldy list of “over 58,000 subject terms [which] includes people, places and topics . . . ,” along with the unsettling news that “many gallery records do not have subject descriptions.” It was starting to get dark, and I wanted to get back to “American Shores,” or to any shores, for that matter. This page at least had a tool, “Explore Subjects,” with an “Exact Match” and an “Expanded Search” option, which turned up intriguing items. It was at least the third search box I had encountered since leaving digital Manhattan. It bore the disclaiming title “BETA,” but it gave some sense of at least partial closure, albeit hardly one of real completion.
“BETA” sounds like an appropriate grade for the project as a whole at this early point. Any inquirer using this site will by definition learn some things about cartography, cartographic history, and cartographic approaches to historical interpretation. They may very well come away with useful insights into their own specific project or object of research. Whether the site, in its current configuration, is recommendable over a straight Google search—under “Images” and with the keyword “maps” and their topic—may be a matter of some debate. But, recalling my own first research expedition between the actual stone lions on Fifth Avenue, as a high schooler—when a foray for a paper on Mao Tse-Tung revealed thousands of interesting sources but most of them in Chinese!—I would suggest patience, resourcefulness, and regular return trips. The site will doubtless be fine-tuned, and your own instincts for virtual navigation will mature.
This article originally appeared in issue 8.3 (April, 2008).
Touchstone
The Sesquicentennial, the National Park Service, and a Changing Nation
It’s a great irony of our history: the places of such fierce combat during the U.S. Civil War became, in the decades that followed, quiet places of reflection and reconciliation, where veterans gathered to heal rather than cause wounds, where the nation looked for regeneration. For most of its history, the National Park Service facilitated this healing process, encouraging Americans to derive from these places of conflict common values and virtues that would bind rather than divide. But more recently, the role of the National Park Service and the Civil War sites it manages has become more complicated, reflecting evolving scholarship and the varied demands of a public that does not see the Civil War in monolithic terms. For the National Park Service and the nation at large, the sesquicentennial of the Civil War is an important touchstone on an evolutionary journey that has provoked both praise and censure in a nation still struggling to reckon with its most tumultuous, destructive, and transformative epoch.
No historic event has a more complicated place in American culture than the Civil War. We can’t even agree on its name, re-phrased variously depending on one’s perspective: the War Between the States, the War of Northern Aggression, the War of the Rebellion (once the official U.S. government name), the War for the Union, the War for Southern Independence, the Second American Revolution. Born of conflict, the memory of the war has a conflicted history of its own. In the immediate post-war years, an abiding sectional hostility simmered—personal and deep, it was rooted in the immense personal loss suffered by American families and communities. Later, as the quest for reconciliation reigned, a narrative of mutual virtue evolved, statues of Confederate heroes went up in the U.S. Capitol, and federal tax dollars funded the memorialization of Confederate graves. Some protested angrily at the absence of sectional hostility, at the seemingly easy acceptance back into the cultural fold of a people and section that had been bent on the Union’s destruction.
Many white southerners rallied around the memory of the Confederacy as they constructed a post-war society akin to apartheid. To many Americans—especially African Americans—the Confederate battle flag (indeed, the Confederacy itself) became not a symbol of courage and sacrifice, but an emblem of oppression.
A people’s view of their own history always reflects the views of those who have power. In the decades following the Civil War, Southerners quickly regained political voice, and thus our American narrative has somewhat happily and uncommonly incorporated into its collective story the view of the ostensible losers in a national rebellion. Since the centennial of the war in 1961, the women’s rights movement and the subsequent upsurge in American women’s history have produced new work on the role of civilians in the war. Likewise, the Civil Rights movement gave voice to African Americans and other minorities, who have in turn sought (rightly so) to tell stories that reflect the immense and complicated role played by slaves and slavery in the evolution of the nation. Political power has expanded among all classes of Americans—rich and poor—and so history has delved beyond the Great Men of the past to reflect the experiences of everyday people. We are in a constant process of taking second looks at our past.
This process of re-examination has threatened the cherished view held by some Americans that our nation should have, as battlefield preservationist Jerry Russell has written, a singular, “shared understanding of American history,” a “culture that unites us, not one which divides us.” The Civil War would seem to be an obvious point of friction in that quest, but early in the twentieth century the war’s battlefields were places where sectional animosities and lingering resentments could be laid aside. They became (and remain) places where the common virtues of Americans North and South were celebrated, where by focusing on American “good,” the ugly blemishes of history could be painted over in the name of national unity. For more than a century, the war’s battlefields became something of a refuge for a nation still wounded by war.
The modern 69th New York Infantry leads a procession of more than 2,000 through the streets of Fredericksburg toward the Sunken Road, accompanied by church bells tolling. Photograph courtesy of the National Military Parks.
It is almost impossible to overstate the importance of the Civil War’s battlefields in the process of national reckoning with our most deadly national epoch. It is likely that the United States preserves more acres of battlefield land in its borders than the rest of the world combined. For nearly a century—from the creation of the first federally owned battlefield site at Antietam in 1890—Americans demanded of their battle sites a congenial neutrality. At Manassas, the deed conveying Henry Hill from the Sons of Confederate Veterans to the National Park Service (NPS) stipulated that the government would “care for and preserve this battlefield without prejudice to either the North or the South” and not detract from “the glory due Confederate heroes.” At Fredericksburg in the 1930s, when someone objected to the NPS exhibits in the new visitor center on the Sunken Road, arguing that they ought to provide greater context for the battles, he received a rather terse reply from the NPS. “To what end?” the exhibit planner wrote. “The consequences of a major war are infinite … and these things shift with the bias of every writer.” Yet, he declared, “one result is simple, striking and indisputable … Death admits of no argument.”
In the long history of our battle sites, there have been few hard questions and little discussion of the larger issues that either gave rise to or were resolved by the war. Instead, the most intense debates raged about remarkably specific questions: was Sickles (not Lee!) wrong at Gettysburg? What if McClellan had committed all his men at once at Antietam? The battlefields became places of reverence, engines of empathy, platforms for national reconciliation (none of which are bad things). Visitors and NPS historians alike engaged in a rhetoric of affection and nostalgia that still persists. At the dedication of Fredericksburg and Spotsylvania National Military Park in 1928, the keynote speaker from Massachusetts declared, “We do more than to dedicate these fields in memory of things which have passed. We consecrate them, in the spirit of Robert E. Lee and of Abraham Lincoln, to a more perfect understanding between the South and the North, and to an abundant increase in brotherly love.” A slim 1930s volume of regulations that governed the work of rangers at the nation’s various military parks admonished, “The story of the guides shall be limited to the historical outlines approved by the superintendent and shall be free from praise or censure.” This language still exists in the federal regulation governing licensed guides at NPS sites.
To be sure, interpretation at NPS sites has evolved over the decades, mirroring academia’s progression from a focus on great leaders to increased attention to the experiences of the common soldiers and civilians amidst war. This trend found its greatest expression in the proliferation of living history programs at NPS sites in the 1960s and 1970s. To the details of battle, the NPS added the details of the human experience. A 1978 publicity photo for a living history camp at Chancellorsville proclaimed, “Here at Chancellorsville the National Park Service has attempted to recreate in every possible detail the camp of a Confederate ordnance detachment.” But this trend only perpetuated the intense interest in the details of war—rations, equipment, uniforms, the fabric and rhythms of camp and battle—without reference to the war’s larger issues.
Interpretation of the battles themselves reflected change, too. The National Park Service sought to understand its battles and landscapes better, and so in the 1960s commenced an intensive effort to document battles through minutely detailed battle maps. Later, on-site historians gave increased, often singular attention to the experience of men in combat. The use of quotes from soldiers’ letters and diaries, carefully related to the specific site of a certain event, made for a powerful combination. While these efforts surely told us important things about the war on the ground, they did not challenge the concept of battlefields as a place of national refuge. The focus on shared experiences, shared sacrifice—the commonalities rather than the differences between soldiers—reinforced the traditional (and rather ironic) role of battlefields as places of congenial neutrality, healing empathy, and patriotic expression.
Of course, historians and the more learned fringes of the American public continued to explore the war’s many complexities in academic journals and thick books. New scholarship exploded myths, corrected long-cherished historical misperceptions, and provoked public discourse about the cause, purpose, nature, and significance of the war itself. But into the 1980s, the traditional role of Civil War battlefields as sanctuaries within our society remained largely unchanged: they were places of commemoration, places of reflection, sites whose stories reflect larger American virtues and honor most participants. Indeed, the dream of a singular, uncontested memory of the Civil War was a reality for a century on America’s battlefields. Historians working at these sites continued to focus on narrow themes of history and commemoration, largely avoiding controversy—and largely ignoring the swirl of new thought that engulfed Civil War historiography in the 1960s, 1970s, and early 1980s. Their cautious, non-controversial practice of history sustained the public perception of NPS historians as memorialists.
Re-enactment of Civil War scene at Gettysburg in July 2013. Photograph courtesy of the National Military Parks.
Many Americans have found comfort in this image of NPS sites and staff. But not all. In February 2011, my colleague Steward Henderson and I gave a tour, “Forgotten: Slavery and Slave Places in Fredericksburg.” We had given the tour before, but that day’s audience consisted of about 70 members from three historically black churches in Fredericksburg. The tour went well, with a high energy level all around. In the midst of it, an older gentleman pulled me aside and said, “Are you going to get in trouble for doing this?”
I said, “I’m sorry. I don’t know what you mean.”
“You know,” he said. “Your bosses. I didn’t think you guys were allowed to do things like this.”
During the day, I received a number of comments along the same lines, expressing surprise that we, NPS staff at a battlefield site, would create a tour dealing with slavery. Clearly, this group of people perceived me and my colleague as part of an organization bound by rigid (if unspoken) limits of inquiry and interpretation, an organization at best ambivalent and at worst hostile to an interpretation of the war that strayed beyond traditional topics or sites.
But more importantly, the question highlighted a great irony: while the traditional role of Civil War battle sites as sanctuaries offers comfort to some, for others it is a barrier to their engagement with both the history of the war and the National Park Service. As one man explained at a community forum just months after the tour, sustaining a positive image of the war meant sustaining a positive image of the “white-supremacist Confederacy.” The American tradition of “celebrating” the war through its battlefields—re-enactments, pageants, concerts, idolatry, and even commemorative ceremonies—has become, to some, offensive.
Something else renders the National Park Service’s relationship with the Civil War and its battlefields more complicated than most. Tens of millions of Americans have a blood relationship with a Civil War soldier, one of the men whose deeds the battlefields were set aside to remember. These Americans often see the war not with the dispassion of a historian (even an amateur historian), but through the intensified lens of a family connection. Many visitors to NPS sites understand the war in a way that reflects generations of conventional wisdom rather than historical knowledge acquired through formal study. Unlike any other event beyond our direct memory, the Civil War has constituent groups that patrol the intellectual universe, intent on protecting and advocating a specific memory of the war—usually one that reflects positively on their ancestors, communities, or regions. Historians have demonstrated that many aspects of this “true history” (as it is often called by heritage groups) are at best incomplete and at worst not true at all. Still, the beliefs endure in parts of the general public—and most commonly in those members of the public who visit National Park Service battlefield sites.
This personal connection to the past has helped shape our nation’s relationship with and understanding of the war. At least as it relates to the Civil War, we as a nation have permitted the personal motivations of soldiers (often imperfectly remembered or revised over time) to define the cause and purpose of war for the public. If you work at a Civil War site any amount of time—say, more than a week—you will hear something like this from a visitor: “My great-great-grandfather didn’t own slaves. He sure as hell didn’t fight to preserve slavery. He fought to defend his home, the way of life of his community and state. The Civil War wasn’t about slavery, and you are wrong to tell people it was.”
Visitors place flowers atop the famous stone wall at Fredericksburg, in tribute. Photograph courtesy of the National Military Parks.
We have heard such assertions so often they qualify as a mantra. Of course, virtually every credentialed historian in America accepts a connection between slavery and the Civil War, and most of them see the connection as central to its cause, its progress, and its outcome. But to acknowledge, for example, that the South formed the Confederacy largely to protect the institution of slavery is to suggest to the millions of Confederate descendants that their ancestors fought to sustain what by any measure was a vile institution—perhaps the darkest stain on America’s national fabric. Many remain vehemently opposed to scholarly arguments about the war and slavery, and don’t hesitate to tell you. It was this vehemence—first articulated by the founders of the United Confederate Veterans and United Daughters of the Confederacy more than a century ago—that inspired the nation to simply avoid the topic and focus on the shared virtues of men fighting for life and principle (whatever they might have been) on our nation’s battlefields.
Since the 1980s—as scholarship from earlier decades started to take root in the American mind, and as scholars started exploring the role of historical memory in American culture—Americans have increasingly seen the Civil War not through the lens of personal connection, but through the prism of national purpose. This is by far the most important change in the cultural landscape of Civil War history in the last three decades, and it is one that portends dramatic change to come. Among those changes: America’s battlefields will likely no longer provide the quiet and happy historical refuge where history is neatly compartmentalized to provide comfort for Americans struggling to understand and reckon with their past.
The sesquicentennial is a touchstone in a process of change that began in the 1980s and will continue for years to come. The scholarship of the 1970s and 1980s demonstrated clearly that the Civil War constituted far more than just a confrontation between men in uniform on battlefields, and the turn to studies of historical memory and historic places helped launch the National Park Service into a new era. Edward Linenthal’s Sacred Ground: Americans and Their Battlefields (1991) gave NPS public history professionals important context on the evolution of the industry in which they worked. Later studies by David Blight and many others illuminated the conscious manipulation of memory in the name of national reconciliation, and its consequences—including, notably, the alienation of the African American community from the history of the Civil War.
The first recognizable hints of change came in 1991, when Congressional staffer Heather Huyck (who holds a PhD in history and was formerly an NPS employee) inserted language into new boundary legislation for Fredericksburg and Spotsylvania National Military Park that directed the park to interpret not just military events, but the impact of the war on civilians. Similar language followed in other bills related to Civil War sites. Throughout the 1990s, NPS battle sites responded in various ways to the emerging scholarship and greater understanding of the foibles and virtues of seventy years of practicing public history on Civil War battlefields. In a new General Management Plan, Antietam National Battlefield placed increased emphasis on the relationship between the battle and the Emancipation Proclamation. Monocacy National Battlefield embraced themes that viewed that site through the lens of the civilians who worked and shaped the land. At Manassas, archeological investigations illuminated not just the battle, but also the lives of slaves and free blacks who lived in the area. By mid-decade, close observers could see change happening at many NPS battle sites.
In 1998, superintendents of Civil War sites across the country met in Nashville with an eye toward formalizing the changes already appearing at battle sites across the land. While the conference generated agreement for collective action on issues like recreational use, managing layers of historic resources, and road expansion in parks, the issue of interpretation clearly emerged as the headline. The Nashville conference commenced a process that would result in a service-wide interpretive plan, called Holding the High Ground. It was this plan that, a decade later, would become the basis for NPS involvement in the sesquicentennial of the Civil War. In their introduction, the authors of Holding the High Ground stated:
The challenge that faces the National Park Service today is a huge one: to convey the significance and relevance of the Civil War in all its aspects while at the same time sustaining the Service’s invaluable tradition of resource-based interpretation (a concept that is at the very foundation of the National Park Service’s mission). … This plan urges a broader approach to interpreting the Civil War—it seeks to have parks challenge people with ideas, challenge them to not just understand the nature and horrid expanse of the bloodshed, but the reasons for it, and the consequences of its aftermath.
The proximity of time and place matters. Hundreds, sometimes thousands, have attended real-time programs on the original site at Fredericksburg, 150 years removed. Photograph courtesy of the National Military Parks.
The plan acknowledged the inherent limits of battlefields as venues for interpreting the Civil War and urged an expanded definition of “Civil War sites” to include those that can vividly address “causes, politics, social change, the military experience, civilian experience, and the legacy.” Holding the High Ground also urged managers of Civil War sites to re-examine and expand how they interpret events and sites by giving voice to observers with perspectives beyond the military: civilians, slaves, and observers on the homefront. And finally, the superintendents embraced a broader set of themes that addressed everything from causes to the war’s evolution to emancipation to industrialization and the civilian experience to consequences and legacy. These themes constituted not a mandate, but an option, allowing each site to embrace those that most closely fit its story and resources. The superintendents realized that not every site can effectively interpret every theme, but collectively NPS Civil War sites can convey the immensity, complexity, and enduring relevance of the Civil War.
Holding the High Ground was a working document rather than a public proclamation. Though it received little notice outside the NPS, its vision for interpreting Civil War sites—as evidenced in new exhibits and interpretive programs—provoked an intense public debate that especially riled traditionalists. NPS Chief Historian Dwight Pitcaithley took to the road to argue in favor of a new vision for Civil War sites. Congressman Jesse Jackson Jr. weighed in, inserting language in a bill that directed NPS sites to interpret “the unique role that the institution of slavery played in causing the Civil War.” Traditionalists took to their computers and microphones in response. Given the historically gentle relationship between most white Americans and Confederate history, it is not difficult to understand why.
For decades, NPS battlefield sites had been placidly neutral places, where forgetting and remembering sometimes competed for ascendance. Staff at NPS sites had practiced history diligently and well, but usually played the role of memorialists. In the decade before the sesquicentennial, some feared that the NPS was abandoning its traditional role of honoring the men of both sides—often to the detriment of the Confederacy. A Confederate heritage advocate saw the NPS in harsh terms: “Not every ranger or guide exhibits hostility to all things Confederate,” he wrote, “but, the National Park Service, as a governmental agency, is avowedly hostile, and plans to present the story of the War Between the States as a simple conflict between good and evil.”
When in the early 2000s the NPS placed an interpretive panel in the museum at Manassas that discussed the nexus of slavery and the war, some members of the Sons of Confederate Veterans pondered a legal challenge. The SCV had once owned the heart of the battlefield at Manassas, and in conveying the land to the National Park Service in the 1930s had included a condition in the deed that required the federal government to manage and interpret the site in a way that would not detract from “the glory due Confederate heroes.” To some in the SCV, the new panel on slavery in the museum did exactly that; the Park Service had “become defamatory to the memory of our ancestors,” and in so doing had violated the 1936 deed restriction. Ultimately, talk of a lawsuit faded, but the episode highlighted the sensitivities of some organizations to a more scholarly interpretation of the war. The offending panel (by most measures mild in its interpretation) still stands in the museum.
The protests of heritage groups and a few individuals matter not because they threaten to derail efforts to broaden NPS interpretation at Civil War sites, but because they signal just how important the changes have been. Which brings us to the sesquicentennial itself. For much of the public, the 150th anniversary of the war has been the first time they have encountered this broader approach to interpretation at NPS sites. Harpers Ferry commenced the sesquicentennial in October 2009 with thoughtful, popular programs related to John Brown’s raid. Later events focused on Lincoln’s 1861 journey to Washington, fugitive slavery in Fredericksburg, emancipation at Antietam, secession at Fort Sumter, slaves at Lee’s Arlington House, mobilization at the Boston Harbor Islands, and civilians at Richmond. The NPS has published new booklets on slavery as a cause of the war and explored the role of Native Americans and other groups commonly ignored in traditional narratives.
Events during the sesquicentennial have demonstrated that the evolution of interpretation at NPS sites has largely been a process of addition, not subtraction. Events at battle sites continue to focus on the military conflict and to offer traditional interpretive and commemorative moments. At the heart of these events are the “real-time” programs, conducted on the precise ground where a battle took place, exactly 150 years after the event. The proximity of time and place remains a powerful attraction to visitors to NPS sites, who have attended these programs by the thousands. But they also offer more. At Fredericksburg, “Ten Thousand Lights to Freedom” remembered the more than 10,000 slaves who crossed the Rappahannock River to freedom behind Union lines during the spring and summer of 1862. On the battle’s anniversary in December, more than 2,000 visitors, surrounded by tolling bells, joined a slow procession through the streets of Fredericksburg—a program intended to connect the story of the town to the story of the battle. In 2014, the culminating commemorative event at Spotsylvania Court House will include a procession that reflects on the experience of slaves and civilians before concluding with a remembrance of the fighting men and the immense cost of war at the Bloody Angle.
At “Ten Thousand Lights to Freedom” in Fredericksburg, visitors carried, then shed, stones, symbolic of slavery. Photograph courtesy of the National Military Parks.
What has been the public response to these activities? There has been hardly a complaint, and most often the programs have been met with overwhelming praise. With few exceptions, programs have been at or near their capacity. More than 200,000 visitors attended 150th anniversary events at Gettysburg, and tens of thousands more flocked to Manassas, Richmond, Antietam, Fredericksburg, and Chancellorsville. Abetted by the incredible reach of social media, millions of people around the world have engaged with the National Park Service during the sesquicentennial of the Civil War.
Some in the field of public history have seen the sesquicentennial as an intellectual destination for the National Park Service. Once we are done with the 150th, they say, it is time to declare victory and move to the next big thing (notably the centennial of the NPS itself in 2016). But for those working at sites related to the Civil War, the 150th is a chance to gauge where the nation and the National Park Service stand in an interpretive process that will continue beyond our lifetimes. It seems clear that the vast majority of the interested public has embraced the more comprehensive and just approach to Civil War history reflected in NPS programming and media over the last twenty years. Louder than the complaints from traditionalists that the NPS has done too much are complaints from some circles—notably academics—that the NPS has not done enough. In public history, the intellectual winds rarely wane, even if they do change direction.
The changing place of the Civil War in American culture presents the National Park Service with some profound and fascinating challenges. At their root are two competing phenomena: the Park Service’s traditional role as memorialists, and the increasing inclination of Americans to view the Civil War through the lens of national purpose, to lay claim to a national, not merely personal, narrative of the Civil War. As advocates for Confederate heritage clearly understand, seeing the Confederacy in terms of its purpose as a nation makes embracing the Confederacy—a nation founded in a quest to perpetuate slavery—a difficult proposition. Not long ago, the “chief of heritage defense” for the Sons of Confederate Veterans argued, “We don’t need to give visitors an entire history of the antebellum South so they come away with the idea that one side was the villain.”
Still, millions of Americans are descended from Confederate soldiers. Can the nation and the NPS continue to ignore or downplay the national purpose for which Confederates fought? Or should we simply help visitors distinguish between the stated purpose of the Confederacy and the myriad personal motivations that compel men to wage war for a nation? Is the Park Service’s traditional role as the nation’s non-partisan, bi-sectional facilitator of honor and reflection incompatible with its charge to practice robust, just history, which is often rejected as “politically correct” or “revisionist” by traditionalists? In thirty years, will the nation permit the National Park Service to manage a place called the “Stonewall Jackson Shrine?”
Here is another perspective on the same questions: can the National Park Service honor and memorialize Confederate soldiers (and by implication the Confederacy) and still hope to engage the nation’s African American community in the history of the Civil War and its legacy of freedom? For all its expanded programming, the sesquicentennial has failed to alter the basic reality that African Americans largely continue to avoid events or sites associated directly with the military experience of the war. Given the recent past, it’s not difficult to understand why. Clearly this is an issue that goes beyond simple programming; it might take a generation for the vast chasm between the African American community and the legacy of the Civil War to be bridged. But a start surely has been made during the sesquicentennial.
The National Park Service serves all Americans, with the charge to preserve places central to the nation’s identity and experience. The organization, however, invariably reflects rather than leads society in its exploration of our past. When the nation demanded it in the last century, the NPS emphasized themes of shared sacrifice, courage, and reconciliation. Until the 1980s the organization gave little thought to its narrow interpretation of the war. In response to the women’s rights and Civil Rights movements, the NPS has incorporated new themes in its interpretation and has expanded the number of sites deemed worthy of National Park status. Today, the National Park Service engages in a more diverse history than it did fifty years ago because our society is more diverse and demands a telling of history that reflects the experiences of its own communities and ancestors. The programming of the National Park Service will continue to evolve over time, pulled along by the demands of the society it serves.
We are, without question, in a period of historic change as it relates to America’s understanding of the Civil War. It is a messy and often painful process, especially in a nation with an aversion to cultural controversy and a preference for constancy. The sesquicentennial is not a turning point in that process, but a touchstone—a time to step back, to see and understand the progress made, and to ponder the profound challenges that lie just ahead.
Further reading
David Blight, Race and Reunion: The Civil War in American Memory (Cambridge, Mass., 2001); Edward T. Linenthal, Sacred Ground: Americans and Their Battlefields (Champaign, Ill., 1991); J. Christian Spielvogel, Interpreting Sacred Ground: The Rhetoric of National Civil War Parks and Battlefields (Tuscaloosa, Ala., 2013); Kevin M. Levin, Remembering the Battle of the Crater: War as Murder (Lexington, Ky., 2012).
This article originally appeared in issue 14.2 (Winter, 2014).
John Hennessy is the author of three books and dozens of articles on the Civil War and preservation. He presently serves as the Chief Historian/Chief of Interpretation at Fredericksburg and Spotsylvania National Military Park in Virginia.
The Second Amendment: Infringement
A well-regulated Militia being necessary to the security of a free State, the right of the people to keep and bear Arms shall not be infringed.
–Amendment II, United States Constitution
In April 1995, I joined three other scholars testifying before the U.S. House Judiciary Committee’s Subcommittee on Crime about our research into the meaning of the Second Amendment. As we presented evidence that the Second Amendment guaranteed an individual right to be armed and explained why the Founders believed it essential, the Republican members of the committee listened politely and with interest. Every Democrat on the committee, however, turned upon us with outrage and disdain. I felt startled and dismayed. The meaning of the amendment, at least for these representatives, seemed less a matter of evidence than of party politics. Sitting opposite us, arguing against an individual right, Dennis Henigan, general counsel for Handgun Control, Inc., presented the committee with a full-page advertisement from the New York Times signed by scores of scholars denying that a right to be armed existed. At this juncture one of my copanelists, Daniel Polsby, then a professor at Northwestern School of Law, pointed out that one signer, a colleague of his, was no expert on constitutional law, let alone the Second Amendment, and that to his knowledge none of the other signers had ever conducted research into the issue. For the scholars who put their names to that testimonial, the conviction that there was no individual right to be armed was an article of faith. The attitudes of both the politicians and the scholars are regrettable. We are all the losers when constitutional interpretation becomes so politicized that otherwise reasonable people are neither willing to accept, nor interested in, historical truth.
Fig. 1. Edmund Randolph, “Objections to the Constitution as far as it has advanced . . . ” August 30, 1787. The Gilder Lehrman Collection, courtesy of the Gilder Lehrman Institute of American History, New York.
Political wrangles over the limits of constitutional guarantees are common, proper, and even necessary. The battle over the Second Amendment, however, is being waged at a more basic level: the very meaning of the amendment. This too is understandable where there is doubt about the Framers’ intent. But once evidence of that intent is clear, as it now is, further argument, even in the service of a worthy political agenda, is reprehensible. It becomes an attempt to revise the Constitution by misreading rather than amending it, a precedent that puts all our rights at risk. The argument over the Second Amendment has now reached that stage. But first, some background.
Two important points should be kept in mind as we briefly review this history. First, the debate over the meaning of the Second Amendment is surprisingly recent. Second, many of those who question or disparage the right do so because they believe that guns, in and of themselves, cause crime. Until the end of the nineteenth century, few Americans doubted their right to be armed. The Founders believed privately owned weapons were necessary to protect the three great and primary rights, “personal security, personal liberty, and private property.” An armed people could protect themselves and their neighbors against crime and their liberties against tyranny. Madison and his colleagues converted their English right to “have Armes for their defence Suitable to their Condition, and as allowed by Law,” into a broader protection that took no account of status and forbade “infringement.” “As civil rulers, not having their duty to the people duly before them, may attempt to tyrannize,” the Philadelphia Federal Gazette explained when the proposed amendment was first publicized, “and as the military forces which must be occasionally raised to defend our country, might pervert their power to the injury of their fellow-citizens, the people are confirmed . . . in their right to keep and bear their private arms.” In the 1820s William Rawle, who had been offered the post of attorney general by George Washington, found, “No clause in the constitution could by any rule of construction be conceived to give Congress a power to disarm the people. Such a flagitious attempt could only be made under some general pretence by a state legislature. But if in any blind pursuit of inordinate power, either should attempt it, this amendment may be appealed to as a restraint on both.” Supreme Court Justice Joseph Story, writing in 1840, agreed that the right of the people to keep and bear arms had “justly been considered, as the palladium of the liberties of a republic.” And after the Civil War, the charge Southern whites were depriving blacks of their right to be armed was instrumental in convincing Congress to pass the Fourteenth Amendment.
Fig. 2. The Concord Minuteman by Daniel Chester French. This sculpture can be found today at the Old North Bridge in Concord, Massachusetts. Photo by Douglas Yeo, www.yeodoug.com.
Then politics intervened. Early in the twentieth century when American whites, fearful of blacks in the South and the millions of foreign immigrants in the North, wanted to restrict access to firearms, alternative readings of the amendment gained credence. In the absence of serious scholarship, constructions that reduced or eliminated the individual right to be armed seemed plausible, especially in light of the awkward construction of the Second Amendment and the sparse congressional debates during its drafting, both of which relied upon common understandings of the value of a society of armed individuals that had faded over time. These new interpretations emphasized the dependent clause referring to the militia, to the neglect of the main clause’s guarantee to the people. The theory developed that the Second Amendment was merely intended to enhance state control over state militia; that it embodied a “collective right” for members of a “well-regulated” militia–today’s National Guard–to be armed, not a personal right for members of a militia of the whole people, let alone for any individual. Even when an individual right was conceded, the amendment was proclaimed a useless anachronism. After all, twentieth-century Americans had the police to protect them while armed individuals would be helpless against a government bent on oppression.
Beset by fears and armed with alternative readings of the Second Amendment, legislators turned to restrictive gun laws. In 1911 New York State passed the Sullivan Law that made it a felony to carry a concealed weapon without a license, or to own or purchase a handgun without obtaining a certificate. Discriminatory laws in the South kept blacks disarmed. The first federal gun legislation, the National Firearms Act of 1934, introduced controls on automatic weapons, sawed-off rifles and shotguns, and silencers, weapons popular with gangsters. It was more than thirty years before rising crime rates, urban riots, and three political assassinations again led to demands for stricter federal firearms legislation. The resulting Gun Control Act of 1968 limited mail-order sales, the purchase of firearms by felons, and the importation of military weapons. Professor Robert Cottrol finds this statute “something of a watershed” for, since its passage, the debates over gun control and the right to be armed have become “semi-permanent features” of late twentieth-century American life. And “semi-permanent” the debate remains as we enter the twenty-first century.
The argument over the Second Amendment became and remains intense and highly political because the stakes are so great. Americans suffer from a high rate of armed crime that many insist is caused, or made worse, by easy access to firearms. Eliminate these, the thinking goes, and streets will be safer. Thousands of federal, state, and local firearms regulations adorn statute books, but a Second Amendment guarantee of the right to be armed blocks the dramatic reduction or banning of firearms that gun-control groups seek. There is a deep desire on their part to believe no individual right exists. On the other side, the traditional belief that guns protect the innocent and deter offenders is even more widely accepted. Studies show the majority of Americans have always believed the Constitution guarantees them a right to be armed. Approximately half of America’s households have at least one gun, an estimated arsenal of some 200-240 million weapons, kept for sport and, more crucially, for personal protection. Every new threat to regulate weapons provokes thousands of additional purchases.
Both sides seek a safer nation. But whether one believes guns cause crime or prevent it, the Second Amendment figures in the political solution at every level. National elections turn on a candidate’s position on the right to be armed. A small Illinois town bans handguns completely; a small Georgia town requires a gun in every home. The state of Vermont, with no gun restrictions at all, boasts the lowest crime rate in the nation. In the name of public safety, the cities of New York, Chicago, Boston, and Washington, D.C., impose ever tighter gun restrictions. In the name of public safety, thirty-three states–some two-thirds–now allow every law-abiding citizen to carry a concealed weapon, and other states are considering this option. Is an individual right to be armed an anachronism? Not in their opinion.
In this clash of strategies, political gestures and competing claims abound. The Clinton administration allocated millions of dollars for gun buy-back programs, knowing a Justice Department study found this approach ineffective. Flushed with the success of lawsuits against tobacco companies, public officials in thirty-one municipalities sued gun manufacturers claiming millions in damages for gun crimes. In response, twenty-six states passed legislation forbidding such suits. Philanthropic foundations finance research that favors gun control, some even establishing whole institutes for “the prevention of violence.” Notwithstanding plummeting rates of gun homicides, leading medical journals publish articles that proclaim guns a health emergency. They print seriously flawed research that purports to demonstrate that the presence of a firearm transforms peaceful citizens into killers, although studies of police records show the great majority of murderers are individuals with a long history of violence.
Nor has the popular press been shy in broadcasting its preferences. For seventy-seven consecutive days in the fall of 1989, the Washington Post published editorials calling for stricter gun controls. This was something of a record, but it is indicative of a national media in which three-quarters of the newspapers and most of the periodical press have advocated severe curbs on gun ownership and have denied a right to be armed exists. The press is entitled to its opinions, but unfortunately this bias has often affected and distorted news coverage. Every gun accident or shooting, every study that supports gun restrictions, is intensively reported, while defensive uses of firearms are downplayed along with scholarly investigations that tabulate these or that call into question the notion that legally owned firearms increase violent crime.
As a result, much conventional wisdom about the use and abuse of guns is simply wrong. Such reporting, for example, gives the impression that gun accidents involving young children are common and increasing when, in fact, they are happily rare and declining. The same is true of gun violence in schools. Do guns cause violence? In the thirty-year period from 1968 through 1997, as the stock of civilian firearms rose by 262%, fatal gun accidents dropped by 68.9%. Numerous surveys have shown that far more lives are saved than lost by privately owned guns. And John Lott’s meticulous study of the impact of statutes permitting citizens to carry concealed weapons found them of value in reducing armed crime. Yet convinced advocates are unwilling to examine the evidence of the constitutional protection or studies that contradict their view of the danger of private gun use.
All this has taken its toll. Alone among the articles comprising the Bill of Rights, the Second Amendment has, in recent years, come very near to being eliminated from the Constitution, not through the prescribed process of amendment, but through interpretations that reduced it to a meaningless anachronism. The low point came in 1975 when a committee of the American Bar Association was so befuddled by competing interpretations that members concluded, “It is doubtful that the Founding Fathers had any intent in mind with regard to the meaning of this Amendment.” Leading textbooks on constitutional law, such as that by Lawrence Tribe, had literally relegated the Second Amendment to a footnote. Yet the American people remained convinced of their right to be armed despite textbooks and newspaper advertisements to the contrary.
Now scholarship has come to the rescue. The past twenty-five years have witnessed a growing number of studies of the Second Amendment, and these have found overwhelming evidence that it was meant to guarantee an individual right to be armed. In 1997, Supreme Court Justice Clarence Thomas, in Printz v. United States, noting that the Court “has not had recent occasion to consider the nature of the substantive right safeguarded by the Second Amendment,” hoped, “Perhaps, at some future date, this Court will have the opportunity to determine whether Justice Story was correct when he wrote that the right to bear arms ‘has justly been considered, as the palladium of the liberties of a republic.’” Thomas added, “[A]n impressive array of historical evidence, a growing body of scholarly commentary indicates that the ‘right to keep and bear arms’ is, as the Amendment’s text suggests, a personal right.” Such evidence includes the individual right to be armed inherited from England; Madison’s intent to list the right to be armed with other individual rights, rather than in the article dealing with the militia; his reference to his proposed rights as “guards for private rights”; the Senate’s rejection of an amendment to tack the phrase “for the common defense” to the “right of the people to keep and bear arms”; and numerous contemporary comments. By contrast, no contemporary evidence has been found that only a collective right for members of a militia was intended. The evidence has convinced our leading constitutional scholars, among them Lawrence Tribe, Akhil Amar, and Leonard Levy, that the Second Amendment protects an individual right. In March 1999, Judge Sam Cummings of the federal district court in Texas, in the case of United States v. Timothy Joe Emerson, found that a federal statute violated an individual’s Second Amendment rights. The Fifth Circuit Court of Appeals, in a meticulously researched opinion, agreed that the Second Amendment protected an individual right to keep arms. As the Court of Appeals in Ohio pointed out when, in April 2002, it found Ohio’s prohibition against carrying a concealed weapon unconstitutional, “We are not a country where power is maintained by people with guns over people without guns.”
Since the evidence clearly shows an individual right was intended, we should now move on to discuss the prudent limits of that right. Yet that discussion can’t take place because denials of that right continue, along with ever more tenuous theories to refute it: claims that the phrase “bear arms” was used exclusively in a military context; that the amendment resulted from a conspiracy between Northern and Southern states to control slaves; and that since the phrase “the right of the people to keep and bear arms” is set off by a comma it can be eliminated. But in early American discourse, as today, “bear arms” often meant simply carrying a weapon; there is no direct evidence of any conspiracy; and the elimination of every phrase set off by commas would play havoc with constitutional interpretation. Michael Bellesiles claimed to have evidence that there were few guns in early America, that Americans were uninterested in owning them, and that therefore no individual right to be armed could have been intended. However, his results seriously underestimate the number of weapons and distort attitudes toward them. Other scholars looking through some of the same evidence have found widespread ownership of guns.
Why does the debate over original intent continue? Lawrence Tribe, who concluded there is an individual right after considering the new evidence, points to the “true poignancy,” “the inescapable tension, for many people on both sides of this policy divide, between the reading of the Second Amendment that would advance the policies they favor and the reading of the Second Amendment to which intellectual honesty, and their own theories of constitutional interpretation, would drive them if they could bring themselves to set their policy convictions aside.” The time has come for those who deny an individual right exists to set policy convictions aside in favor of intellectual honesty–and a more productive discussion.
Further Reading: On the origins and constitutional interpretation of the Second Amendment see Joyce Lee Malcolm, To Keep and Bear Arms: The Origins of an Anglo-American Right (Cambridge, Mass., 1994); Leonard W. Levy, Origins of the Bill of Rights (New Haven, 1999), chap. 6; Lawrence Tribe, American Constitutional Law, 3d ed. (New York, 2000), 894-903; Sanford Levinson, “The Embarrassing Second Amendment,” The Yale Law Journal 99 (1989); Don B. Kates Jr., “Handgun Prohibition and the Original Meaning of the Second Amendment,” Michigan Law Review 82 (December 1982); Robert Shalhope, “The Ideological Origins of the Second Amendment,” The Journal of American History 69 (December 1982); and Stephen P. Halbrook, That Every Man Be Armed: The Evolution of a Constitutional Right (Albuquerque, N.M., 1984). The recent, landmark opinions in the Emerson case by Federal Judge Sam Cummings and the Fifth Circuit Court of Appeals provide excellent and readable historical treatment and analysis of the intent of the Second Amendment and its legal history in the courts. See United States v. Timothy Joe Emerson, 46 F.Supp.2d 598 (1999) and United States v. Emerson, 270 F.3d 203 (5th Cir. 2001). For up-to-date scholarship on the efficacy of gun control legislation and a review of the scholarly studies on the subject see John Lott Jr., More Guns, Less Crime: Understanding Crime and Gun Control Laws, 2d ed. (Chicago, 2000). Gary Kleck & Don B. Kates, Armed: New Perspectives on Gun Control (Amherst, N.Y., 2001) is an excellent source of information on numerous aspects of the issue. Also see an older but important study by James D. Wright and Peter H. Rossi, Armed and Considered Dangerous: A Survey of Felons and Their Firearms (New York, 1986).
This article originally appeared in issue 2.4 (July, 2002).
Joyce Malcolm is a professor of history at Bentley College and senior fellow, MIT Security Studies Program. Her sixth book, Guns and Violence: The English Experience (Cambridge, Mass., 2002) was published in May.
The Awful Truth
Has a historian solved the mystery of Charles Brockden Brown?
The terrorists not only murdered but also mutilated their victims. They brained infants and burned men alive in front of their wives. In response, some on our side also began to kill and soon convinced the government to declare war. Anyone killing a terrorist would receive a bounty from the state. Only a small subset of our community remained calm enough to wonder why the terrorists were angry. To the consternation of their compatriots, this subset arranged a conference, where the terrorists explained why they had turned to violence: twenty years earlier, at the behest of a family whose role in our government was all but dynastic, we had taken their land by fraud.
In 1737, an agent of the Penn family, proprietors of the Pennsylvania colony, took advantage of a contract’s ambiguous wording to seize twelve hundred square miles of land from the Delaware Indians. Alan Taylor’s recent history of colonial America calls the act “perhaps the most notorious land swindle in colonial history.” The Walking Purchase, as the swindle has come to be known, got its name from the way it was perpetrated. Pennsylvania officials convinced Delaware leaders to sign away a tract of land along the Delaware River that could be walked in thirty-six hours. The Indians seem to have expected the walk to be roughly twenty miles, but one of the colony’s walkers managed to go roughly three times that far, because officials had arranged to clear the path ahead of time. Two decades later, when the once friendly Delawares attacked white colonists during the French and Indian War, only the most pious Quakers in the colony had the presence of mind to ask why.
The independent scholar Peter Kafer believes that the Walking Purchase is the dark but true history behind one of the first Gothic novels ever written in America, Edgar Huntly, by Charles Brockden Brown. He sees references to the fraud in the 1799 novel’s geographic details and in its plot, which involves Delawares who have turned inexplicably violent. Once Kafer makes his case, it is hard to disagree. The incredible thing is that it took two centuries for Brown’s fiction to yield its secret.
Fig. 1. Charles Brockden Brown, engraved by L. B. Forrest from a miniature by William Dunlap in 1806. Taken from the frontispiece, Wieland or The Transformation, vol. 1 of a six-volume set of Brown’s novels published by David McKay, 1887. Courtesy of the American Antiquarian Society.
I.
Charles Brockden Brown is the undisputed father of American horror. He imported Gothic novels into our literary tradition by writing and publishing four of them in 1798 and 1799. I remember when I first heard of him, as distinctly as I remember my first cigarette. A friend of mine called to tell me that he had just read a novel where one character spontaneously combusts, and another receives a vision from God telling him to kill his family. “You’d like it,” my friend said. “It’s crazy.” This was a compliment in our vocabulary. It described an artist who knew how to go to pieces, a feat we admired more or less the way that New Critics had once admired poets who knew how to put things artfully together. I took this friend’s advice about everything. He had told me to listen to Big Star and Yo La Tengo, and he had been right about them.
His suggestion proved fateful. I quit smoking years ago, but I became a Charles Brockden Brown geek irrevocably. I always feel as though I have to apologize for this. Not one but two chapters of my book concern him, and when nonscholar friends volunteer that they have started to read it, those are the chapters I warn them about. It may be more than you want to know, I say.
But it is not more than I want to know. Brown’s prose style is clumsy, and his plots are ill formed, but my curiosity about him has seemed at times to be insatiable. He provokes it much the way Melville does: his novels are thick with allusions to the events of his day–glancing allusions that make you wish you were as familiar with them as he is. Like Melville, he seems to have lived out the tragic myth of the American writer, the bold experimenter who flies too high and is punished by a breakdown of some kind–possibly psychological, almost certainly financial–which leads, in turn, to a dramatic shift in style. There is the lure of esoteric meaning. Is Melville suggesting that Ahab was a Gnostic? Is Brown suggesting that Carwin belonged to the Illuminati? Maybe if you read the novels one more time you will figure it out. And most enticingly, Brown, like Melville, gives the reader the impression that he is playing with masks. You are not sure which mask is hiding his true intention, or whether he has any true intentions at all. There must be a reason the tree is an elm. There must be a reason the summerhouse has twelve columns. If only you knew a little more . . .
In 1799, it was already conventional for a Gothic novel to have lurid and bloody episodes, a plot full of abrupt dislocations, and characters mysterious even to themselves. As Brown himself admitted, he merely updated the genre for the New World by replacing such European devices as “castles and chimeras” with American ones like “incidents of Indian hostility, and the perils of the western wilderness.” And in 1799, it was already understood that Gothic novels were written with mysteries because puzzling over them was one of the pleasures they offered. Whether by design or shoddy workmanship, Brown’s Gothic novels have a generous number of loose ends and ragged seams, and ever since Brown was added to the canon, sometime in the last couple of decades, scholars have taken up his puzzles with a vengeance. Recently he has been examined in light of Federalist-era aesthetic theory, eighteenth-century nosology and etiology, the significance of land in America’s national myth, Philadelphia’s German community, and a well-documented 1782 murder. But although the circles of interpretation around Brown have been widening steadily, he himself has been strangely neglected. Surprisingly few researchers have looked in archives for documentary evidence about his life. There is a Library of America edition of his novels, but there has never been an adequate biography.
There still is not one, exactly. Peter Kafer has chosen to write something less conventional, though just as exhaustively researched. Beneath each of Brown’s fictions, Kafer believes, lies an episode of history at odds with national myth. “Brown embellished, exaggerated, transposed, fantasized,” he writes. “But what he wrote was grounded in verifiable experience. And in memory.” Kafer aims to unearth these memories, and so he has written not a chronological narrative of Brown’s life but, fittingly, a series of detective stories.
II.
For his alternative American history, Brown drew on what Kafer calls an “underworld of tribal knowledge.” The tribe in question was the Pennsylvania Quakers.
According to a family history, the first Brown to convert to the Society of Friends was a farmer in Northamptonshire, England, in the middle of the seventeenth century. He was convinced by a traveling Quaker who began his testimony with the words, “O Earth! Earth! hear the word of the Lord.” Later the same man prophesied prosperity in America, and between 1677 and 1684, two of the farmer’s sons emigrated.
The Society of Friends was not the only new and unorthodox religion in late-seventeenth-century Pennsylvania. As Kafer explains, the colony was a hotbed of mystics and seekers. While living in Chichester in the 1690s, the two Brown brothers nearly fell out when one was tempted to join a new sect led by George Keith. Keith had been converted to the Quaker faith by the same man as their father but had then moved on to Kabbalism and a belief in reincarnation. Keith’s group was allied with the Brethren in America, which also contained ex-Quakers and was led by Henry Koster, another student of the Kabbalah. Koster, in turn, had previously belonged to the Hermits of the Wissahickon, a group of mostly German pietists led by a mystic named Johannes Kelpius, who had come to Pennsylvania by way of London. If the Brown brothers attended the 1696 yearly meeting of the Society of Friends in Burlington–and Kafer suspects they did–then they heard Henry Koster defiantly assert that his sect was true to the teachings of Jesus Christ and the Society of Friends was not. Most in the audience were annoyed rather than persuaded, and in the end the Brown brothers remained Quakers.
To any reader of Brown’s first novel, Wieland, these stories of eccentric schisms and self-taught seekers will sound familiar. The novel’s narrator and heroine, Clara Wieland, explains that her family was German in origin. Like Johannes Kelpius, her father lived in London before coming to America. His more or less accidental reading of the words “Seek and ye shall find” in a history of the Camisards, a French Protestant sect, led to his religious awakening, which took the form of sedulous, idiosyncratic study of this Camisard history and the Bible. He emigrated to Pennsylvania because he came to believe that he had a religious duty to convert the American Indians, but he did not succeed as a proselytizer. In his retirement, he built a temple where he worshiped alone, guided by his strict construction of the text of Matthew 6:6 (“But when you pray, go into your room and shut the door”). As Clara explains, “He rigidly interpreted that precept which enjoins us, when we worship, to retire into solitude, and shut out every species of society.” His retreat thus paralleled that of the Hermits of the Wissahickon, who recorded that they felt called “to live apart from the vices and temptations of the world, and to be prepared for some immediate and strange revelations which could not be communicated amid scenes of worldly life, strife and dissipation, but would be imparted in the silence and solitude of the wilderness.”
Here Kafer does some sleuthing that will take your breath away if you are a Charles Brockden Brown geek. In Brown’s novel, Father Wieland’s private temple is said to resemble a summerhouse. It is twelve feet in diameter, “edged by twelve Tuscan columns, and covered by an undulating dome,” and it sits on a rock beside a sixty-foot cliff overlooking the Schuylkill River, five miles outside Philadelphia. Kafer has discovered that in the nonfiction world, this location “turns out to be where the Schuykill meets the Wissahickon, and is where the Kelpius community lived.” Kelpius’s Hermits of the Wissahickon built a tabernacle on the site, with an architectural plan determined by Rosicrucian numerology. In the late eighteenth century, Brown could have toured the ruins. The portion still standing then was circular. Kafer prints a photograph of the riverside landscape as it appeared in 1999, eerily similar to Brown’s description of Father Wieland’s temple grounds. In a footnote, Kafer observes that Brown could have walked to the spot from the country home of a lawyer he studied with and that the adjacent property belonged to the family of one of Brown’s closest friends.
To a Brown fan, this is heady. It is equivalent to telling a Melville fan that the Pequod has been located and raised from the deep. Wieland’s temple is crucial to Brown’s first novel. It is where “a cloud impregnated with light” seized on Clara’s father and scorched him, in an episode of either spontaneous combustion or divine punishment. And it is where, years later, Clara’s brother bantered with his friends about Cicero, until he too saw a mysterious light, on the night he began to hear the voices that would drive him mad.
III.
In 1702 the two sons of the first Quaker Brown moved to Nottingham, Pennsylvania, near the Maryland border, where the land was fertile and the corruption of the city safely distant. In Nottingham, many of the Browns would become ministers, which was not a formal office among Quakers, but indicated a person who heard with some regularity the inner voice of truth and felt called to share it with others.
The following generation of Browns married into another devout family, the Churchmans, and by the middle of the eighteenth century, they were highly respected in Nottingham. Into this pious, rural community Elijah Brown, the novelist’s father, was born in 1740. At the time, the most celebrated person in the family was Elijah’s uncle, the minister John Churchman. So great was Churchman’s moral prestige that in 1748, when the Pennsylvania Assembly was debating whether to pay for a warship, they listened at length to his opinion even though he held no governmental office and spoke merely as “a country man . . . having something to communicate.” He told them, of course, that a colony founded on Quaker principles should not arm itself.
During the French and Indian War, Churchman was one of the Quakers who tried to restore peace with the Delawares, whom the Walking Purchase had embittered. The crisis so engaged him that it brought on a vision. Riding to a treaty conference in 1756, he saw a light as bright as a rainbow, in “a human form about seven feet high,” which he took to be an angel. (Shades of Wieland.) At a November 1756 conference, Churchman heard the leader of the Delawares declare that “this very Ground that is under me . . . was my Land and Inheritance, and is taken from me by Fraud.” According to Kafer, the money for the conference came in part from another of Charles Brockden Brown’s great-uncles and from the father of one of the novelist’s closest friends, both of whom probably also attended. In other words, to Charles Brockden Brown the Walking Purchase was not just an important piece of recent history but one that had touched his family particularly.
Fig. 2. A map of the province of Pennsylvania, taken from Charles Thomson, ed., An Enquiry into the Causes of the Alienation of the Delaware and Shawanese Indians (London, 1759). Courtesy of the American Antiquarian Society.
The fraud had originally come about this way. In 1737, to secure the rights to the southeast portion of Pennsylvania and to raise money for one of William Penn’s sons, Chief Justice James Logan pressured the Delawares to confirm what he said was a 1686 deed of land already sold to William Penn. In fact it was either the rough draft of a deed never executed or an outright forgery. The Delawares signed it nonetheless, perhaps because it seemed to involve land that had already been given to the Penn family by other treaties. It specified a tract extending north by northwest from present-day Wrightstown as far as could be walked in “a day and a half.” The Delawares were led to believe that this ambiguously worded distance was about twenty miles. That would have brought the whites not much further north than Tohickon Creek, which runs through present-day Point Pleasant and which approximated the Delawares’ sense of their southern boundary anyway.
James Logan intended to stretch the tract much farther, however. After the treaty was signed and before the land was surveyed, Logan had a path through it cleared. During the official “day and a half” that the land was measured out, two of the three men hired by Logan as “walkers” quit from exhaustion, but a third managed to go sixty-four miles before collapsing–some forty-seven miles beyond Tohickon Creek. When the Delawares protested that they had been cheated, Logan made a military alliance with the Iroquois, who drove the Delawares out.
IV.
Brown’s last novel, Edgar Huntly, is not, at first glance, about land fraud. Instead it seems to concern a young man in mourning who lives in the town of Solebury, Pennsylvania.
Edgar Huntly is distraught by the recent murder of a virtuous and wealthy friend. One night he visits the elm where his dying friend was discovered, and by the light of the moon he sees a naked sleepwalker digging and weeping. Naturally he suspects the digger of murder and decides to investigate. After watching a second night, he follows the sleepwalker into a wilderness area called Norwalk, described as “rugged, picturesque and wild,” where he loses him. Confronted by day, the sleepwalker confesses, but not to the crime of which Huntly suspects him. Then the man vanishes. In pursuit, Huntly returns to Norwalk, where he again sights his suspect and again loses him. Frustrated and agitated, he dreams of the “inquietude and anger” of his murdered friend. One night soon after, he goes to bed in Solebury and wakes up in a cave in Norwalk. He too has become a sleepwalker, and he has woken up in the middle of a bitter, violent war with the Indians. He picks up a tomahawk and begins to kill.
In suggesting a link between the Walking Purchase and Edgar Huntly, Kafer is following up a hint dropped by the amateur Brown scholar Daniel Edwards Kennedy, which was first mentioned in print by Sydney J. Krause in his historical note to the Kent State edition of the novel published in 1984. Kafer makes a compelling case. Some of the clues in the novel are explicit. When Huntly is surrounded by hostile Indians, he admits to the reader that “a long course of injuries and encroachments had lately exasperated the Indian tribes.” The bloodiest scenes in the book occur near the hut of Old Deb, an Indian who is said to have remained in the area after the rest of her tribe had departed and to have “originally belonged to the tribe of Delawares or Lennilennapee.”
But it is geography that clinches it. Solebury, the hometown of Edgar Huntly, lies about five miles southeast of Tohickon Creek, well within the region that William Penn had legally purchased long before the 1737 fraud. (When I consulted a modern-day map, DeLorme’s Pennsylvania Atlas & Gazetteer, I came up with mileage slightly different from Kafer’s. Tohickon Creek bends and twists dramatically, which may account for the discrepancy.) Although Brown never lived in Solebury himself, he may have felt some kinship to it; Kafer reports that among the earliest landowners in Solebury were the family of Charles Brockden Brown’s mother and the father of the Charles Brockden for whom he was named. Huntly leaves Solebury, however, to pursue his murder suspect into Norwalk, where most of the action takes place. Brown consistently locates Norwalk to the north of Solebury, but he gives conflicting descriptions of its extent. He first describes it as “a space, somewhat circular, about six miles in diameter.” But later he writes, “Norwalk is the termination of a sterile and narrow tract, which begins in the Indian country. It forms a sort of rugged and rocky vein, and continues upwards of fifty miles.”
Norwalk is, therefore, a wilderness that either reaches six miles north of Solebury, which would put its northern edge roughly at Tohickon Creek, or reaches fifty miles north of Solebury, which would put it some forty-five miles beyond the creek. Sound familiar? As Kafer writes, succinctly, “‘Norwalk.’ North Walk.” On their way north in 1737, James Logan’s walkers would have passed by not only Solebury but every setting in Edgar Huntly.
V.
The Quakers’ exposure of the Walking Purchase fraud altered nothing for the Delawares. The treaty remained in force. Back in Nottingham in December 1756, a dismayed John Churchman told his fellow Quakers that he heard a voice saying, “I will bow the inhabitants of the earth, and particularly of this land, and I will make them fear and reverence me, either in mercy or in judgment.” The gloom of Churchman’s ministry must have told on his nephew, because two months later, Elijah Brown left small-town Nottingham for Philadelphia, where he hoped to become a merchant. Exactly nine months after that, Kafer reports, “his father and stepmother had another son; and at this point they did a most peculiar thing. They named their new son Elijah–as if the other Elijah, a healthy seventeen-year-old . . . were dead.”
It was not easy to move from pious Nottingham to worldly Philadelphia, and Elijah Brown never really succeeded at the transition. The one asset Nottingham might have been expected to provide, a sturdy moral compass, seems to have gone missing in his case. Kafer thinks the novelist may have been painting a portrait of his father in his third novel, Arthur Mervyn. The hero claims to be a nice boy from the country who has fallen in with bad company through no fault of his own, but he may, in fact, be as unscrupulous as Welbeck, the evil mastermind who is supposed to have misled him.
Some such ambiguity beclouded Elijah. He copied out extracts from the works of the atheist anarchist William Godwin and the deist feminist Mary Wollstonecraft in his commonplace books, which survive in the Historical Society of Pennsylvania, and so scholars have long known that his intellectual interests were not those of an altogether orthodox Quaker. Kafer has discovered that his career was not quite orthodox, either. His 1761 marriage to Mary Armitt brought him access to the good credit of her brother-in-law, Richard Waln, and he soon set himself up as a merchant. But in 1768 his fellow Quakers disowned him for failing to pay his debts and for misleading creditors about the extent of them, and he was never restored to their good graces. In the spring of 1770, in a desperate attempt to repay his brother-in-law, he went to the West Indies and smuggled tea. “Devious deliveries to one ‘Richd Somers’ at an out-of-the-way harbor in New Jersey would seem a long, long way from the Quaker principles of John Churchman and Nottingham,” Kafer comments. Elijah gave up on his ambition to be a merchant. Waln’s subsequent loans to his brother-in-law were acts of charity rather than business, and Elijah never repaid them. He continued to sell as a retailer until July 1784, when he was jailed for debt. The debt was of a “conciderable amount,” according to Waln, and it ended Elijah’s shopkeeping. After 1784 he worked as a conveyancer, copying legal documents and writing out the paperwork for real-estate transactions.
Charles Brockden Brown was born into this struggling family in January 1771. In Wieland, when Clara recalls the mysterious light and explosion that burned her father in his temple, she notes, “I was at this time a child of six years of age. The impressions that were then made upon me, can never be effaced.” Kafer notes that the novelist was six years old during the most frightening of his father’s travails, which came during the Revolutionary War. To a child, its sights and sounds may have been just as inexplicable and traumatic.
Quakers were sidelined from the Revolution by their pacifism and their refusal to swear oaths. It was not a sudden displacement, however, but the culmination of a decades-long process. The Quakers’ peaceable attitude toward the Indians had long ago wrong-footed them not only with the lordly Penn family and its allies but also with many of the common people who settled on the Pennsylvania frontier. In 1763 and 1764 a group of frontiersmen known as the Paxton Boys rioted. They murdered a score of Indians–in his Cultural History of the American Revolution (New York, 1976), the scholar Kenneth Silverman likened the massacre’s impact to My Lai’s–and then marched to Philadelphia, where they threatened to hang the colony’s most prominent Quaker politician.
The Paxton Boys, as Quakers noted at the time, were “principally of Irish extraction.” That is, they were Scots-Irish Presbyterians, an ethnic group that was immigrating to Pennsylvania in droves. Quaker politicians retaliated with what historian Patricia U. Bonomi has called “a virulent anti-Presbyterian campaign.” In the long term, the campaign did not serve Quakers well. The Paxton Boys were the sort of people to whom power flowed, in the form of committees and militias, when the Revolution came. When John Adams compiled a list of potential traitors in Philadelphia in 1777, he probably got his information from people like them; everyone on his list was a Quaker. Many of those he named had helped to fund the 1756 conference with the Delawares where the Walking Purchase fraud was brought out–an overlap that, in Kafer’s opinion, “is not a historical coincidence.”
Fig. 3. The Hermits of the Wissahickon tabernacle, which stood on this site, may have been Charles Brockden Brown’s inspiration for Father Wieland’s private temple. From Charles Brockden Brown’s Revolution and the Birth of American Gothic. Photograph by Peter Kafer, courtesy the University of Pennsylvania Press.
Kafer believes that by the time of the Revolution, the Quakers, although usually too couth to spell it out, understood Scots-Irish Presbyterians as their tribal enemies, more or less. In fact, when the British army occupied Philadelphia from September 26, 1777, to June 18, 1778, some Quakers appreciated their presence. “We may expect some great suffering when the Englis Americans again get possession,” one Quaker confided to her diary. (Not all Quakers dreaded American troops, however. While waiting out the occupation in rural Gwynedd, Sarah Wister flirted with American soldiers gaily; when she reported in her diary that the British had abandoned the capital, she brought out her best faux French and called the news “charmonte.”)
After Adams’s 1777 list of traitors was expanded and refined, the people on it were rounded up. Quakers privately recorded the names of the men who made the arrests. Almost all were Scots-Irish. For example, at noon on September 5, 1777, Elijah Brown was arrested by James Loughead and James Kerr. (I can vouch for Kerr’s Scots-Irish genealogy, because I happen to be descended from him.) On September 11, 1777, Elijah Brown, sixteen other Quakers, and three outright Tories were driven out of Philadelphia in wagons. There had been no trials or hearings. When the prisoners applied for writs of habeas corpus, the Pennsylvania Assembly passed an act retroactively suspending the right to habeas corpus in their cases only. They were taken to Winchester, Virginia, where they were overcharged for their room and board. Over the winter, two Tories ran away and two Quakers died. Four of the exiles’ wives appealed personally to George Washington at Valley Forge, and at the end of April 1778, the exiles were finally allowed to go home.
The Revolution did not, therefore, mean to Charles Brockden Brown what it may have meant to other boys who were then six years old. It took away his father. In Edgar Huntly, the narrator, just before he begins to kill Indians, explains, “Most men are haunted by some species of terror or antipathy, which they are, for the most part, able to trace to some incident which befell them in their early years.” Huntly claims that he is haunted by the slaughter of his parents by Indians. But as Kafer points out, from the perspective of the Brown family, “the native Indians weren’t the ‘savages.’ The Revolutionaries were.” The Indians in the novel seem to be standing in for the Scots-Irish Presbyterians who were, in real life, the deadly enemies of both Indians and Quakers. Kafer finds a further clue in the high number of Brown’s villains who are “Irish,” from Jackey Cooke, the wife-beater in Brown’s earliest attempt at fiction, to the sleepdigger of Edgar Huntly’s opening scenes, who was born to peasants living “in the county of Armagh.”
In his novels, the son remembered the violence that had harmed his family when he was a child–violence that the rest of America celebrated every July 4 with cannons and fireworks. He did not take up his father’s quarrels, however, or his religion’s. If Arthur Mervyn, for example, is a portrait of Elijah, it expresses a deep ambivalence. It is a serious betrayal of Quaker ethics for Edgar Huntly to kill Indians. And it is perverse of Brown to have signaled the onset of madness in Clara Wieland’s brother with a vision of light like that seen by John Churchman in 1756.
Kafer may have solved these mysteries, too. Charles may not have been able to represent his father as a hero because his father was not one. It turns out that Elijah was not exiled on account of his Quaker principles. He was arrested because even though the Revolutionary authorities had fixed the price of flour and were regulating its sale, he sold it, illegally, on the open market. And after they warned him not to, he continued selling it. He was, in other words, a war profiteer–though a very unskilled one, unable to survive his exile without borrowing even more money from the brother-in-law who had once sent him smuggling.
Standing up for Quaker ideals would not have avenged Charles Brockden Brown’s father, and standing up for Revolutionary ideals would have betrayed him. Such was the genealogy of a new kind of American writing, which gave voice to confusion and terror.
Further Reading:
No Brown aficionado can afford to be without Peter Kafer’s Charles Brockden Brown’s Revolution and the Birth of American Gothic, published in 2004 by the University of Pennsylvania Press. Some of Kafer’s discoveries were revealed in two earlier articles: “Charles Brockden Brown and Revolutionary Philadelphia: An Imagination in Context,” Pennsylvania Magazine of History & Biography 116 (October 1992): 467-98, and “Charles Brockden Brown and the Pleasures of ‘Unsanctified Imagination,’ 1787-1793,” William & Mary Quarterly, 3rd ser., 57 (July 2000): 543-68.
Online, the Pennsylvania State Archives offers a transcription and a digital photograph of the Walking Purchase Treaty, and the Pennsylvania Historical and Museum Commission reprints a 1972 essay on the fraud by William A. Hunter. The fraud is also discussed in Anthony F. C. Wallace’s King of the Delawares: Teedyuscung, 1700-1763 (Philadelphia, 1949; rpt. Syracuse, 1990) and in Appendix B of Francis Jennings’s The Ambiguous Iroquois Empire: The Covenant Chain Confederation of Indian Tribes with English Colonies from Its Beginnings to the Lancaster Treaty of 1744 (New York, 1984).
The authoritative text of Brown’s novels is the Bicentennial Edition, published by Kent State University between 1977 and 1987 and overseen by Sydney J. Krause and S. W. Reid. The historical essays in each volume are in many ways unsurpassed, and Krause’s annotations for Ormond offer an excellent opportunity to those who wish to dive down the rabbit hole and into Brown’s idiosyncratic intellectual world. Penguin, the Modern Library, and the Broadview Press publish paperbacks of several Brown novels, and the Library of America sells a volume containing three of them.
Brown’s nonfiction is harder to find, but Mark L. Kamrath, Fritz Fleischmann, and Wil Verhoeven are directing a supplementary edition of Brown’s reviews, essays, short fiction, letters, and poetry, to be published electronically and in six bound volumes. Scholarship on Brown has exploded in the last few years, and Bryan Waterman, a leading practitioner, offers an insightful overview of the field in “Charles Brockden Brown, Revised and Expanded,” Early American Literature 40 (2005): 173-91.
This article originally appeared in issue 5.4 (July, 2005).
Caleb Crain is the author of American Sympathy: Men, Friendship, and Literature in the New Nation (New Haven, 2001). He is at work on a history of the 1851 divorce of the theatrical couple Edwin and Catharine Forrest.
The Architect of Colonial Desires
Peter C. Mancall
Richard Hakluyt and the English in America
Common-place asks Peter Mancall, whose Hakluyt’s Promise: An Elizabethan’s Obsession for an English America was recently published by Yale University Press, “What led you to turn from social history to biography in your study of English encounters in North America?”
At the start of my career I was concerned with the ways Englishmen and other Europeans established control over eastern North America and its indigenous peoples. Few commodities had a more lasting influence on this process than alcohol, and yet no historian had ever studied the interplay of alcohol and empire in any depth. I never could have imagined that I would follow my own attempt to understand that subject with the biography of a man who helped launch the English colonial enterprise but who never set foot in America.
Richard Hakluyt the younger, as he is usually known, became the most important promoter of the English colonization of North America in the late sixteenth century. Often confused with his older cousin of the same name, the younger Hakluyt rose to prominence not through any great act of heroism or by braving an Atlantic crossing. His life was marked instead by more prosaic achievements. He was a scholarship student at Westminster School, near the edge of the walled city of London, who then attended Christ Church, Oxford, where he received his B.A. in 1574 and an M.A. in 1577. He spent most of his adult life in London, Oxford, and Wetheringsett, a small village in rural Suffolk. He also lived in Paris during the 1580s, as a chaplain assigned to the English ambassador. Hakluyt had two opportunities to go to the Western Hemisphere, once to Newfoundland in 1583 and again as one of the first colonists bound for Jamestown. He declined both.
I found myself oddly drawn to this landlubbing Englishman. Part of what attracted me was the possibility that biography—even the biography of an Englishman who had never been to America—would help me better understand the initial encounter between the English and Native Americans. I thought that a focus on one person’s history might shed new light on the emergence of what was, as historians James Merrell and Colin Calloway have noted, a new world for both Indians and English.
So I turned to Hakluyt, one of Shakespeare’s contemporaries. He is best known now as the person who so inspired a group of Victorian imperial enthusiasts that they named a learned society after him. The Hakluyt Society, which remains a vigorous organization, seeks to emulate what Hakluyt himself did: to publish travel accounts for an English-reading audience. Most of the narratives printed by the society recount journeys that occurred after Hakluyt’s death in 1616, but all the texts are in the spirit of his original enterprise. He sought to inform an English-speaking audience about what existed far from Britain’s shores, in part because he hoped to convince his countrymen to begin building faraway colonies.
I went into this project not to explain why the English colonized North America—a subject I had treated in a brief edition of documents published in 1995 entitled Envisioning America—but instead hoping to show how the idea of colonization took hold in European minds during the late sixteenth century. I wanted to continue exploring the origins of one of the great tragedies in modern history—the demographic, economic, and social catastrophes that rocked Indian Country—from a new perspective. Some historians, such as the late Francis Jennings and David Stannard, have argued that Europeans always intended to dominate the Western Hemisphere. Many of my students tend to agree with such a position: the disasters that befell indigenous nations would not have taken place had not unwelcome newcomers intended to dispossess them; any benevolent claims by Europeans about their intentions with respect to Native peoples were either cant, as Jennings had powerfully put it, or a tool to further the conquest of the Western Hemisphere.
It is unarguable that Europeans’ actions caused unspeakable trauma for Americans, as Europeans often called the indigenous peoples of what they labeled a “new world.” When Europeans arrived in the Americas, the immediate and long-term effects were horrendous. Many Americans succumbed to Old World infectious diseases; Europeans enslaved others and forced many to pay tribute; still others witnessed the expansion of European colonists across lands earlier dominated by indigenous peoples. The threats in Indian Country came in different forms: a smallpox microbe, a herd of unrestrained ungulates (hoofed mammals), a missionary bent on eradicating traditional beliefs, a group of soldiers willing to use violence to impose the terms of a treaty, a surveyor marking trees in advance of colonial settlers. Few indigenous communities escaped these threats.
But did the tragic consequences of colonialism spring from the original intent of Europeans or were they largely unexpected? Were they at the heart of colonizers’ plans or an unfortunate by-product? In seeking an answer to that question I began to pursue the life of Hakluyt, a man—as scholars have long argued—who played a central role in convincing the English to begin colonizing eastern North America.
In able hands, a biography can reveal much about a period that remains invisible in other kinds of historical works. Many of the greatest practitioners of our craft have used the lives of the famous, the infamous, or the obscure to reveal central tensions in specific eras. Set into their larger context, these individuals’ experiences show us how larger events unfolded and what they meant for participants. Historians of the Native American experience such as Merrell and Richard White have demonstrated that one promising way to understand the encounter between Natives and newcomers in eastern North America is to focus on questions of agency and contingency. Some imaginative scholars who reached the same conclusion have turned their attention to recreating the life stories of Natives; Camilla Townsend’s recent Pocahontas and the Powhatan Dilemma (2004), for example, offers a thoughtful study of perhaps the most famous North American of the seventeenth century. (As one measure of Pocahontas’s lasting celebrity try to imagine a Disney musical based on the life of Cotton Mather.)
I hoped that Hakluyt’s experience would enable me to understand and explain how the English came to embrace colonization. I was not the first person to recognize his importance. Though no one had produced a full-scale biography of Hakluyt—nor did I write one—others had investigated his writings with extraordinary care. Among them was the literary geographer George Bruner Parks, who in 1928 produced Richard Hakluyt and the English Voyages. Parks combed through Hakluyt’s major works, recognizing the deep intellectual connections between this Elizabethan and the Venetian geographer Giambattista Ramusio, and teased out of Hakluyt’s books a narrative about English overseas explorations. Even more important were the efforts of the historian David Beers Quinn. All early Americanists know of Quinn’s astonishing output; no other historian has written so widely on the English colonial experience in the sixteenth century. Working in an age before computers had revolutionized historians’ access to documentary evidence, Quinn apparently read everything related to the origins of English colonialism. Among his books is the two-volume The Hakluyt Handbook. Much of that work, published by the Hakluyt Society in 1974, summarized existing knowledge about Hakluyt and his intellectual community. At its heart was a year-by-year chronology of Hakluyt’s known activities. It is, in essence, the blueprint for a biography that Quinn never wrote.
This scholarship led me deep into Hakluyt’s own writings, which convinced me that the English never originally intended to eliminate Native Americans. To be sure, Elizabeth’s subjects, like other Europeans, wanted to reshape the landscape of North America. They planned to ship home abundant furs and minerals extracted from the American interior. They heard reports of Americans’ religious ideas and presumed that these would quickly give way to superior Christian beliefs. They argued that Americans would become eager consumers of English manufactured goods. They looked at maps and believed that laying claim to North America would bring glory to Queen Elizabeth and monarchs who succeeded her. English explorers assumed they would find the Northwest Passage and a quick route to the commercial opportunities of East Asia and the Spice Islands. Arrogance drove this vision. Hakluyt did not believe that colonists should emulate Indians nor did he entertain the idea that Europeans would find Americans’ ideas or beliefs superior to those of the Old World.
But Hakluyt never advocated the destruction of Native Americans. When he received reports of the epidemics that swept through Native societies, he believed what most Englishmen believed: God had afflicted Americans with diseases, and nothing could prevent divine will. This fact was something to be noted, not celebrated. Hakluyt did presume that the English would use force if Americans resisted the newcomers, but he also believed such resistance was unlikely. In contrast to the Spanish, whose alleged atrocities were widely documented in England, the English—so Hakluyt thought—would be civil and agreeable neighbors. Hakluyt did not live long enough to hear about the violence that eventually poisoned relations between Americans and colonists in eastern North America. It is impossible to know what he would have thought about the conflicts that erupted there during the seventeenth century.
Like the vast majority of indigenous peoples in the Americas, Hakluyt left only imperfect traces in the historical record. We know more about him than about many people of his age, of course. But basic details of his life—when and where he was born; what he looked like; what he thought about the loss of his first wife and his marriage to a second; why (as I suggest in the book) he shifted his attention late in life away from America and toward the Spice Islands; why he never went to North America when he had the chance—remain beyond our grasp, at least until someone finds a trove of new materials that has so far eluded detection.
Still, the traces Hakluyt left are compelling. The title of my book derives from one of Hakluyt’s most famous stories: his recreation in 1589 of the day he gained the inspiration for his life’s work. As he told the story, his older cousin waved a wand over a map and pointed out the parts of the world and then sent Hakluyt to Psalm 107 where he learned about the benefits of exploring the seas. At that moment Hakluyt resolved to study geography and to use that information for the benefit of the realm. I took Hakluyt’s resolution as a promise, which he made to himself and to the English-reading world. It was a promise about knowledge and a promise about the benefits of expansion. And it was a promise that Hakluyt largely fulfilled.
Hakluyt is not someone we usually think of as being part of the encounter between the peoples of the Atlantic basin. He may never have met a Native American face-to-face. But he was the architect of English colonial desires. Understanding his world enables us to know what colonizers had in mind when they first crossed the Atlantic and helps to explain the complicated origins of English America.
This article originally appeared in issue 7.4 (July, 2007).
Peter C. Mancall, professor of history and anthropology at the University of Southern California, is the director of the USC-Huntington Early Modern Studies Institute. In addition to Hakluyt’s Promise, his recent books include the edited volumes Travel Narratives from the Age of Discovery (2006) and The Atlantic World and Virginia, 1550-1624 (forthcoming in September 2007). He is also guest curator for the Huntington Library exhibit, “Jamestown at 400,” which is on view from July to December 2007.
National Character
Daniel Day-Lewis, American historian
Anyone trying to construct a U.S. history syllabus—or, for that matter, anyone trying to follow a prescribed U.S. history curriculum—must contend with a number of pedagogical questions: What primary sources are available? What is a reasonable workload and pace for both students and teachers? But there is another question, perhaps the most important for any teacher of history: how does one confront the challenge of boredom, the default setting for most students? Those of us in the profession may savor shared enterprises like producing scholarship, attending conferences, or reading publications like this one. But as far as many of our employers are concerned, the principal justification for our livelihoods lies in making sense of the past for young people who do not necessarily know—or care—about the things we cherish.
So while it’s all well and good to try to encourage students to think like historians, a successful teacher is going to have to be a historian who thinks like an adolescent. And in my experience, an adolescent would much rather watch a movie than read a book. That’s one reason why films have become a staple of my survey course ever since I left academe to become a high school teacher a half dozen years ago. (Other reasons include the simple fact that I meet with my students four times a week for seventeen weeks a semester; gone are the days when I could assign a book in a seminar and convene a week later to talk about it.) At its best, such an approach allows me to achieve a number of objectives at the same time. For example, I can demonstrate what the early film industry was like while illustrating the problems of immigration broached in Charlie Chaplin’s perennially entertaining The Immigrant (1917).
Daniel Day-Lewis
This kind of film-centered pedagogy is all well and good for the twentieth century, when films are bona fide primary sources. But what about early American history? Here there are more difficulties, ranging from a dearth of truly good films to a responsibility to point out the seemingly inevitable distortions that accompany even the richest historical recreations. Still, over the course of the past few years, I’ve settled on a core battery of relatively recent movies that effectively fill that void: The Crucible (1996), Last of the Mohicans (1992), Gangs of New York (2002), and The Age of Innocence (1993). If indeed a picture is worth a thousand words, and a few thousand words is more than my lexically challenged students can comfortably handle each night for homework, this strikes me as a relatively efficient, even if imperfect, method for cultivating an informed imagination.
It was only after I had settled on my first-semester slate of films that I realized they all had something in common: Daniel Day-Lewis. (I’ve now taken to telling my students that I run an annual Daniel Day-Lewis film festival.) The British-born, adoptively Irish actor has been justly celebrated as the finest performer of his generation, in no small measure for the sheer variety of performances he has given over the course of his distinguished career. He first burst into international prominence in 1985 by playing two dazzlingly diverse characters: Johnny, the punk East London homosexual of My Beautiful Laundrette, and Cecil Vyse, the priggish aristocrat of E. M. Forster’s A Room with a View. He won an Academy Award for his 1989 portrayal of the Irish poet Christy Brown in My Left Foot and also played Irish characters in In the Name of the Father (1993) and The Boxer (1997). Yet over the course of the last two decades the highly selective Day-Lewis has portrayed a gallery of American characters, beginning with a now obscure role as a contemporary art collector in Stars and Bars (1988) and ending most recently with the wildcat oil prospector in There Will Be Blood (slated for release later this year). To some extent, this American accent surely reflects the historical realities of the film industry: the U.S. market still dominates to a now-rare degree, and Americans are notorious for their cultural provincialism. So international actors tend to go where the action is.
Taken as a whole, however, Day-Lewis’s choices seem to reflect more than just industry realities or access to the juicy roles someone of his stature commands. Actually, his recent body of work shows a remarkably textured, yet consistent, vision of American history. If, as even the most die-hard academics would now concede, history is too important to be left to the professors, Day-Lewis would have to be regarded as one of its most prominent, and even influential, practitioners. That influence may be all the more striking for the way it effectively sneaks under our intellectual radar, viscerally shaping emotions that more often than not are the source of the most powerful, and durable, ideas. But what is he showing us?
We begin our survey with Day-Lewis’s portrayal of the doomed John Proctor in Nicholas Hytner’s 1996 film version of Arthur Miller’s 1953 play The Crucible (for which the late Miller, who was Day-Lewis’s father-in-law, wrote the screenplay). The Crucible, of course, is one of the true canonical texts of American literary education, right up there with Uncle Tom’s Cabin and The Adventures of Huckleberry Finn. It has become exactly what Miller intended: a cautionary tale in which the literal witch trials of the Puritan era foretell the figurative witch trials of the McCarthy era. I must confess, however, that I’m not particularly interested in this angle, not only because I use the film to illustrate life in the colonial period, but also because I regard the deeply ingrained perception of the Puritans as hypocritical prudes—even people who have no idea who H. L. Mencken was have thoroughly imbibed his version of them—as too easy a cheap shot in the age of Paris Hilton and Cam’ron. Instead (and here I know I’m swimming against the tide, which is fine because I don’t insist on a particular reading of the story), I hope to convey the richness of Puritan life: the emotional intensity of a world in which living spirits are taken for granted; in which class and racial conflict jostle with religious obligation; in which a harshly beautiful landscape reflects the jagged longings of the characters.
Those characters are the key. There are any number of objections one can make about The Crucible, ranging from the way it creates composites out of the original players in the Salem drama to anachronistic interior sets that are far more grand than any interiors the Puritans could have or would have made. But looking beyond these failings, the film brings to its characters a depth of feeling rarely acknowledged among the Puritans. Even more than in the play, for example, the film forces us to see the sexual intensity and jealousy of which the Puritans were so capable. These feelings are particularly evident in Winona Ryder’s portrayal of Abigail Williams. On a more historically specific note, Joan Allen’s Elizabeth Proctor dramatizes the characteristically Puritan struggle with the sin of pride. She says she forgives her husband’s transgressions—she may even want to forgive her husband’s transgressions—but for most of the story her persistent self-righteousness blocks the better angels of her nature. Allen’s performance illustrates the human cost for those who struggled to uphold impossibly high standards of piety.
Ultimately, though, this is John Proctor’s story and Daniel Day-Lewis’s movie. Literally and figuratively, his Proctor is a man on the edge, living on the outskirts of town, maintaining an initial stance of detachment about the accusations of witchcraft, and then refusing to compromise his good name and personal integrity by confessing his alleged “guilt.” Yet—and to me this is the key to the movie, the reason why I like to use it—Proctor’s fierce moral energy and his ultimate sacrifice of his own life are at once deeply personal and deeply communitarian. His insistent individualism (and that of the motley compatriots who also refuse to confess) is finally what saves Salem. As such, he is as much a reflection of Puritan life as the hysteria that surrounds him. One thing I try to mention in this context is the role of Samuel Sewall, who has a bit part in the movie and whose later apology for his role in the Salem affair is a useful reminder that many of the reasons the Puritans are condemned (such as their naïve moral intransigence) simply do not fit the facts.
This notion of the marginal but righteous communitarian is also important in understanding Last of the Mohicans. Much more than The Crucible, this is a movie that adapts and updates its source material. Part of a five-part series featuring a variously named character generally known as Natty Bumppo, Last of the Mohicans, set during the French and Indian War, was a wildly popular novel at the time of its publication in 1826, but it has been considered almost unreadable since Mark Twain’s now-legendary 1895 swipe at “James Fenimore Cooper’s Literary Offenses.” The novel’s protagonist in particular comes off as a ridiculous country bumpkin, a portrayal that carries over into a number of film adaptations of the book. Director Michael Mann, who also co-revised the screenplay with Christopher Crowe, overhauls the character as Nathaniel Poe, a much more formidable figure. Yet it’s Day-Lewis—whose notorious obsession with developing and inhabiting characters even when offscreen has impressed and irritated crews of many productions—who transforms the character into a riveting embodiment of understated competence (a transformation all the more impressive given the nerdy Day-Lewis’s conversion into a magnificent specimen of masculine power). He is, in short, a babe magnet.
The babe in the woods he attracts is Cora Munro, daughter of a British army colonel, ably played by Madeleine Stowe. This too represents a significant change from the original novel, as Cooper’s protagonist was the ultimate loner, the frontiersman who embodied Frederick Jackson Turner’s archetype of the restless wanderer who keeps pushing west to stay ahead of civilization (the final installment of the series is The Prairie, with Natty Bumppo out on the Great Plains). In traditional versions of the story, Cora dies, and amid the various romantic permutations between her, her sister Alice, and various Britons and Indians, Nathaniel is not an available partner. In this movie, however, the two form an unshakeable bond and end side-by-side with Nathaniel’s adoptive father, Chingachgook, the last Mohican of the title, about to lay the foundation for a distinctively new American society.
That society, we’re given to understand, will be multiracial. This of course is a stretch—while the notion of a white person being adopted and raised by Natives was hardly surprising in an eighteenth-century context, the character of Nathaniel in the movie comports nicely with a twenty-first-century social constructionist model of race (pigment be damned, Day-Lewis is extravagantly comfortable negotiating as a fellow Native with a Huron sachem for the release of the Munro sisters, at one point offering the wampum belt of “my” people as ransom). In those cases where the facts seem to get in the way, the filmmakers have few compunctions about rearranging them. As a number of historians, among them Richard White and Ian K. Steele, have noted, the filmmakers take great liberties with the actual events surrounding the capture of Fort William Henry in 1757. For example, they make the massacre that followed the English evacuation of the fort appear to be bloodier than it really was. Moreover, the ethnic composition and orientation of the Indians are seriously jumbled, making the Hurons in particular seem to be much more numerous and decisive French allies than they ever were.
I will confess that many of these inaccuracies escaped me until I began researching this article. (It has made me wonder how much false information I have disseminated over the course of my career through ignorance, unconscious mistakes, or accurately reporting the work of historians who subsequently proved to be incorrect. I shudder at the thought.) But I can’t say I regret showing the film. Indeed, I intend to do so again and again. Actually, the very messiness and confusion of the story—of colonists arguing with the British government; colonists arguing among themselves; Indians fighting the colonists and the British and each other; the French fighting them all and yet having more in common with their British opponents than their Huron allies (and French as the lingua franca for Indians and Anglos alike)—is a truth more important than any particular fact and the one students consistently offer unbidden as the message they take from the film. At the center of it all, and yet standing apart, is Day-Lewis’s Nathaniel, who embodies the prototypical American in a way that would probably turn James Fenimore Cooper’s (and many an Algonquian’s) stomach. But as Jefferson told us long ago, the earth belongs to the living. And history is very much an earthly thing.
Of course the living are constantly buried. This is the topic of Chingachgook’s final soliloquy in Last of the Mohicans—and it’s the core subject of the next film in the Daniel Day-Lewis film festival: Gangs of New York, directed by the legendary Martin Scorsese. A bowdlerized version of the bowdlerized 1928 history of New York street culture of the same name by Herbert Asbury, Gangs makes Mohicans look positively fastidious by comparison. Conflating time periods and gangs, relying on a creaky patricidal plot, and lingering over stylized violence that is brutal yet sentimentalized, the movie was widely criticized by historians and reviewers alike.
There are two things, however, that make Gangs unforgettable. The first is its tremendous visual impact—I remember gasping back in 2002 when I first saw the trailer, which showed the downtown docks (actually recreated on a set in Italy). The other is Day-Lewis’s performance as the brutally charismatic William Cutting, a.k.a. Bill the Butcher, Bowery B’hoy extraordinaire.
In an important sense, the Butcher, as he is called (only in part because of his profession), is the heir of Nathaniel Poe; Nathaniel is the man the Butcher might have become if, instead of being locked inside an urban jungle and forced to turn his fierce energy inward, he were restlessly moving toward new frontiers. Both consumed and sustained by his hatreds—among them the Irish immigrants who swarm into the Five Points neighborhood he runs with an iron fist and celebrates in a patois of profane poetry (inspired by the angry white hip-hop artist Eminem, to whom Day-Lewis listened as he prepared for the role)—Cutting is a man out of time. The pressure is coming from a variety of directions: from those immigrants, whose numbers will soon render his nativism obsolete; from a new breed of politicians, represented by William (soon to be Boss) Tweed, who sees possibilities for power in the ballot box, even if it has to be bought; and from the gathering force of the federal government—temporarily enmeshed in the Civil War—which will break the power of the clannish local lords, whether on the plantation or in the ghetto. Cutting sees the walls closing in; there’s a terrific scene of him wrapped in an American flag, fondly remembering the vanquished Irish foe whose son he has unwittingly adopted as protégé. And it’s a virtual death wish that sends him into a final self-destructive spiral of violence against the backdrop of the New York City draft riots.
Superficially, at least, there could not be a man in New York more different from Bill Cutting than his contemporary Newland Archer, protagonist of The Age of Innocence, Scorsese’s 1993 adaptation of Edith Wharton’s novel of the same name. One man is a lawbreaker; the other is a lawyer. One is quintessentially downtown; the other, part of the emerging scene uptown (there’s a great shot of a huge estate on a cavernously empty Fifth Avenue). The Civil War looms and finally overtakes Cutting, but for Archer, who comes of age in the 1870s, it barely registers, except marginally, perhaps, in the wobbly fortunes of his buccaneering peers, the Beauforts of South Carolina. Cutting is a thug who revels in his provincialism; Archer, ever the gentleman, is an instinctive cosmopolitan.
Yet beneath these obvious differences are surprising core affinities. Both men navigate their way through their respective societies with a firm social compass—one pointing west, the other east—but are perfectly willing to buck convention when they consider it necessary. Further, the very personal vision they pursue also engenders self-imposed limits, limits that are mistakenly perceived by their peers as a form of social conformity. Archer’s passion for the beautiful Countess Olenska (Michelle Pfeiffer) pulls him into open rebellion. That this rebellion is never fully consummated is less a matter of social custom than of honor and a commitment to the deceptively simple-minded May Welland (Winona Ryder, who would team up with Day-Lewis three years later in The Crucible). This sense of honor is seen by some as a form of capitulation, though more sophisticated observers (alas, relatively few of my students among them) recognize it as a form of inner strength. And yet even as he maintains a connection to the society in which he came of age, Archer, as his name suggests, quietly insists on going his own way in the books he reads, paintings he views, and travels he (belatedly) takes.
Nevertheless, the role marks a turning point in Day-Lewis’s vision of American history. For even as Archer extends the line of restless communitarians in the actor’s body of work, he also suggests its descent. One can only go so far in plausibly describing him as a maverick. Compared to John Proctor, Nathaniel Poe, or Bill Cutting, he is a smaller, even diminished figure. (Newland’s precocious taste in art marks him as a proto-modernist, but he is finally too thoroughly the Victorian to cross over to that promising symbolist land.) Though one could argue this is simply one role in one movie, his character seems to suggest a larger point: centennial America is just not as big as it used to be, and its protagonists are smaller. This assertion seems to get more emphatic reinforcement when one considers Day-Lewis’s performance in the 2005 film The Ballad of Jack and Rose, directed by his wife, Rebecca Miller. As Jack Slavin, Day-Lewis portrays an aging hippie who, cut off from his humanitarian past, retreats into incestuous isolation. Proctor, Poe, Cutting, and Archer are impressively tragic figures. Slavin, by contrast, veers uncomfortably close to pathos. His beautiful daughter carries his memory forward—or is it back?—to a Vermont commune, but one has a strong sense of a narrative terminus. Four centuries from the roiling furies of the Puritans, at the end of a hundred years since Newland visited the Old World, a frontier has gone, its archetypal character now trapped on an island off the Down East coast of Maine.
Assigning the title of “American historian” to a Hollywood actor might seem like an implausible act for any number of reasons, among them the simple fact that the creation of history is generally a matter of generating words, and an actor’s job is typically a matter of speaking someone else’s, even if, as Day-Lewis surely does, that actor collaborates on his lines. In most of the cases discussed here, those words are effectively twice removed, as they were adapted—make that invented—by a series of screenwriters from a play, a novel, and a popular history, each itself derivative in one way or another.
But like all the arts, history is above all a matter of choices—of subjects, of sources, of shadings of fact and strokes of imagination. When one considers the body of work of this particular individual, one is left with a surprisingly suggestive interpretive arc. The engine of American history, Daniel Day-Lewis tells us, is a restless individualist who strains against an inherited culture, an individual as likely to look back as to look forward, but an individual who, in that very restlessness, also paves the way for a new generation, one that will ultimately produce a new rebellion for a new age.
Of course, there’s nothing uniquely American about this; Day-Lewis himself has portrayed similar Czechs and Irishmen, for instance. But the American settings in which such dramas unfold are distinctive. They’re settings in which people are repeatedly told, as a matter of birthright, that dreams are valid and realizable—though not all people realize those dreams, and for those who do (or those in the proximity of those who do) the dreams often turn into nightmares. This friction is what gives so many of Day-Lewis’s characters a tragic dimension. It’s also what gives them a powerful sense of relevance. There are few places where such questions and issues are more relevant than high schools—settings that, whatever their specific deficiencies, are veritable workshops of dreams.
In its broadest outlines, the Daniel Day-Lewis school of American history is not particularly complex, and it’s certainly debatable. That’s precisely the point. Day-Lewis may or may not be right that restless individuals are the engine of American history, and if he is, one can argue about whether this is a good thing or not—or whether, for example, those individuals can be women as easily as men. This is the stuff of which good classroom discussions are made.
Further, the Day-Lewis message is not a particularly fashionable one in the academy. In the long, ongoing argument about whether the heroic individual or the impersonal process shapes history, the pendulum has long lingered on the latter. It’s a little surprising to realize that in some respects the argument that Day-Lewis is making is not all that unlike the one John Wayne made in his body of work—or, broadening the frame of reference a bit to bring Alan Ladd into the picture, one might dub it the Shane school of history, where misfits with good hearts redeem and renew a country. I believe Day-Lewis’s interpretation is a bit more textured than that; his characters have a richer and more reciprocal relationship with their communities than these other examples might suggest. But the heritage is there nonetheless.
There is one other aspect of Day-Lewis’s vision of American history that distinguishes it from others propagated by popular media: it is a vision, a sweeping interpretation that takes in the American past as a whole. Not many professional historians (Sean Wilentz comes to mind as an exception) consider it appropriate to even try. In this regard, Day-Lewis harkens back to earlier generations of American historians: Hofstadter, Parrington, and, especially, Turner, and maybe a few modern descendants such as Patricia Limerick. For a variety of structural and ideological reasons, the contemporary professional vision of the past is fractured, slivered into shards that are constantly being recombined into often compelling new arrangements. A postmodern playhouse. That’s fine for graduate students, maybe. But that’s not what the kids I see need right now.
They need to grapple with a frontiersman in the woods.
Further Reading:
Arthur Miller’s screenplay version of The Crucible (New York, 1997) includes valuable pieces by the playwright and director Nicholas Hytner on the making of the film. Richard White analyzes Last of the Mohicans in Past Imperfect: History According to the Movies (New York, 1995); Ian K. Steele reviewed the film in The Journal of American History 80:3 (December 1993). The major inspiration for Martin Scorsese’s Gangs of New York (though not an important source for much beyond the names of the characters) is Herbert Asbury, The Gangs of New York: An Informal History of the Underworld (1927; New York, 2001). A shooting script of The Age of Innocence screenplay by Jay Cocks and Scorsese is available through “The Newmarket Shooting Script Series” (New York, 1991).
This article originally appeared in issue 7.4 (July, 2007).
Jim Cullen teaches history at the Ethical Culture Fieldston School in New York, where he serves on the board of trustees. He is the author of The American Dream: A Short History of an Idea that Shaped a Nation (Oxford University Press, 2003) and of Imperfect Presidents: Tales of Misadventure and Triumphs, recently published by Palgrave Macmillan, among other books.
Why I Can’t Visit the National Museum of the American Indian
Reflections of an accidental privileged insider, 1989-1994
I am often asked what I think of the National Museum of the American Indian. That I have nothing to say surprises the people who ask the question because usually they know that I worked for the museum for the first four years of its existence. The fact is, I have never visited the National Museum of the American Indian, and I declined the invitation to attend its opening. In her “Why I Cannot Read Wallace Stegner” (1996), an essay in a collection by the same name, Elizabeth Cook-Lynn expresses her rejection of Stegner’s autobiography Wolf Willow: A History, a Story, and a Memory of the Last Plains Frontier (1962) and his Beyond the Hundredth Meridian: John Wesley Powell and the Second Opening of the West (1954). Cook-Lynn protests the colonial privilege and ideology that inspired Stegner’s romanticized view of the American West, with its tragically vanished American Indian. Such works have aided the disappearance of Native people from history. My inability to visit the National Museum of the American Indian stems from a similar sense about its mission and its exhibits. To me, the museum represents a lost opportunity to integrate American Indians into the national consciousness.
“We’ve been trying to educate the visitors for five hundred years; how long will it take to educate the visitors?” asked an elderly Native woman at one of several community-based consultations I organized for the National Museum of the American Indian (NMAI) between 1989 and 1994. Her words—strong, angry, and impatient—formed a response to the question we carried to each consultation: what should the museum say about Native America? Her agitated comeback affected the remainder of my experience as one of the museum’s early planners and has remained with me for the past fourteen or fifteen years. Smithsonian representatives had no response for the woman then; today, the finished museum stands as a reminder of how the small-but-growing museum staff failed to find, in that tense moment of public scolding, inspiration and encouragement to tell the story that we know and the nation denies.
The museum began before the arrival of the director Richard West. Following the passage of the legislation that established the National Museum of the American Indian in late 1989, the secretary of the Smithsonian, Robert McCormick Adams, requested that an internal committee be formed. Undersecretary Dean Anderson headed the internal committee composed of seven to ten staff members, including me, the sole Native to sit on the committee. A number of well-known Native scholars and others sat on the newly formed NMAI board. The internal committee, however, made decisions regarding daily museum work that would begin to shape the character of the museum.
This was a troubling experience for a junior Native staff member. Non-Native persons (everyone on the committee save me) were beginning to direct the course of the museum, a development that ran counter to the idea that this was to be the “museum different” (translation: a museum by Indians, not just about Indians). Until this point, the Smithsonian National Museum of Natural History and museums of similar scale with large Native object collections had exemplified the colonial practice of taking possession of Native cultural patrimony and human remains and then, without consultation with Native communities, creating exhibits that falsely represented Native people. I looked at the internal committee and wondered, “What’s so different here?” and I offered my resignation. The undersecretary asked what would make me stay. My naive and earnest response centered on a plan to take “the suits” from Washington to Indian Country.
We held the first of numerous consultations with Native communities in the summer of 1990 shortly after Secretary Adams’s announcement naming Rick West as the museum’s founding director.
For the next four years I organized similar consultations throughout the United States from my home in Oklahoma, where I had moved to undertake a graduate program in history. The staff in Washington commenced consultations around specific issues such as “traditional care and handling” of objects (March 1992). Much was at stake. Together, the community consultations and topical meetings in Washington informed the architectural program for the Suitland, Maryland, housing facility, which came to hold the George Gustav Heye collection of some one million objects of indigenous origin from throughout the Western Hemisphere, then located at 155th and Broadway, in New York City. Reflecting the desire of many Native communities, the Suitland facility was built to provide an appropriate home for tribal cultural patrimony and religious objects. The building’s design was informed by the expertise of numerous Native scholars and cultural practitioners educated in the history, purpose, and care of tribal objects. The Philadelphia-based architectural firm Venturi, Scott Brown, and Associates translated the direction they received from Native consultants into an architectural plan for a structure worthy of some of the most precious of Native cultural and religious material objects.
The work of coordinating consultations provided a window into the formative years of the National Museum of the American Indian, a period during which the new director assumed his duties. This was also the period during which the museum hired most of its staff and developed its unique institutional culture. In hindsight, red flags were everywhere, and I have since come to question whether Native people should ever look to the state for solutions to the destructive outcomes of colonialism and hegemony.
To a large extent white staff were in charge of the real nitty-gritty stuff like budgets, administration, exhibition coordination, and publication. Indians, mainly male, were in charge of translating and defining Indianness, which, in addition to contributing to planning for the Suitland collection facility, also informed exhibition content, community relations, public programming, and museum policies. The division of work along white and red lines was especially significant in the area of exhibition development. Until 2001 when Jim Volkert, a non-Native, stepped down to assume other responsibilities, he had acted as project coordinator and had final approval of all exhibitions. Each exhibition was planned by a team that included a Native curator. This early absence of Native control challenged the promises implicit in the language of the “museum different.”
The racialized and gendered division of labor required multiple translations between team coordinator and Native curator. The NMAI project coordinator’s role required command of the curator’s language: tribal cultural language, Native pedagogical practices, and Native epistemologies. In addition, the coordinator, and if not the coordinator then the curator, needed to successfully translate Native ways of knowing and practicing to people unfamiliar with the Native world and its history. The absence of Native knowledge and the consequent inability to effect the required translation undermined exhibitions. Were the principal players held accountable? Not so much in the early days, which might help explain the disastrous opening. From my observations, exhibition team members were accountable to a project coordinator, who knew nothing about Native history and culture. He was dependent on the smattering of selected Native men, and few women, for his view of Native America. Lacking cultural knowledge and capital, he consequently lacked the authority to hold curators and others accountable or even to lead effectively. His important contribution was limited to creating a productive division of labor within the team.
Until recently the museum has been awash in money. Travel, meals, consultation, research, and more travel filled the exhibition team’s calendar. Consultants were flown to Washington and paid for their services. Money bought much information and advice through contractual consultation, but it was also a corrupting and distracting force, more so without stringent accountability. Amazingly, a few very hardworking staff members resisted the party and imposed accountability on themselves—both for how they used the museum’s money and how they understood its mission. Too few staff members followed suit. I look back at them now with great admiration and appreciation.
The dominant presence of male Native artists in the early museum years has left a lasting stamp on the museum’s work environment and on its exhibitions. Art and material culture were the preferred media for transferring knowledge about Native America to an unknowing audience. Why art and culture? For many artists, Native creative expression is a presumed window on Native inner life and culture. The exhibit teams have thus relied on art and material culture, the ultimate expressions of Native inner life, as a vehicle for teaching unfamiliar visitors about Indianness. But such thinking represented precisely the problem with the museum: it had become an elite enclave, divorced from the reality of most Native people, where explaining Indians to museum visitors assumed primacy. Moreover, the museum early on made the decision that it would eschew the historical context from which modern Native America has sprung. This meant, astonishingly, no treatment of the history of genocide and colonialism, then and now, or even of the basis of tribal sovereignty.
Jolene Rickard, an NMAI contractor, is quoted as saying, “There are other places where you can learn the exact dates of the Trail of Tears. It’s less important to me that someone leave this museum knowing all about Wounded Knee than that they leave knowing what it takes to survive that kind of tragedy.” As much as I admire Jolene Rickard for her artistic achievements, I wince at her easy dismissal of historical context as an essential prerequisite for understanding “what it takes to survive that kind of tragedy.” Rickard’s statement reflects the “groupthink” of the NMAI as conceived by the director—what I call “There will be no unhappy history here.”
Rickard’s statement also suggests that the museum’s senior and curatorial staff imagine that destruction and colonialism have ended. Just as nineteenth- and twentieth-century anthropologists froze authentic Native people in exhibitions while Indians starved on reservations, the museum’s staff has created a modern hermetically sealed Native “community” that has “survived” something long passed. This distancing, forgetting, and desire to divert the public’s gaze from the past simply perpetuates the on-going erasure of authentic Native histories.
Experience, personal and otherwise, has shown me that it is not just white Americans who need to grasp the full scope of Native history. Native people can also benefit from a more just and accurate depiction of their past. No one can understand the experience of twentieth- and twenty-first-century Indians without understanding the U.S. laws and policies that radically reshaped their lives. In my mind’s eye, I see an image of a river (federal Indian policy and law) fed by many streams that shape the flow and form of the river: colonial-era treaties; the U.S. Constitution; early nineteenth-century Trade and Intercourse Acts; the Indian Removal Act; Ex parte Crow Dog, which clearly and unambiguously acknowledged Native sovereignty; and the legislative assault on tribal sovereignty in the last quarter of the nineteenth century. Through it all, Native people fought back against the Americans’ assault on the American Indian. They fought in Washington; they fought from the reservation; they fought through hired lawyers; and they fought in the courts, all the while preserving much of what the United States had relentlessly tried to destroy. This is more than mere “survivance”; it is a productive, structured, and structuring struggle that kept Native people from extinction.
The river is shaped by the actions of ancestral and modern men and women, tribal leaders and Indian activists who resisted removal, resisted allotment, maintained tribal social values and culture, overturned termination, and who continue to fight to this very day to protect tribal sovereignty. Much is at stake: reserved treaty rights for hunting, fishing, and gathering; the trust responsibility that binds the United States to its obligations to Native people as stipulated in treaties and court decisions; and treaties themselves, the Maginot Line for indigenous peoples who have been swept up under the American state umbrella. Native people continue to fight the genocidal policies of the United States government and the equally destructive practices of the private sector. Native lands contain the highest concentrations of toxic waste anywhere in the United States. Devastated Native economies, which never recovered from forced migrations and other government-imposed dislocations, leave numerous tribes with few economic options beyond the very unhealthy ones of selling toxic dumping permits to the government and to private companies.
Toxic waste is, of course, not the only health problem Native peoples face. Native women are raped and murdered by white males at a higher rate than women of any other racial or ethnic group. Part of the explanation for the non-prosecution of such crimes goes back to the Major Crimes Act of 1885, which extended federal authority over Indian lands in cases of major crimes, including rape. Because U.S. prosecutors fail to pursue white perpetrators and tribal authorities have no jurisdiction in such cases, the criminals go unpunished. It seems likely that such injustices have contributed to the high rates of alcoholism and drug addiction in the Native population. Diabetes, which afflicts a far higher percentage of the Native population than of the general population, had its genesis in the destruction of Native economies and diets and the introduction of rations. Plains people, for example, subsisted on buffalo meat and the meat of other ungulates, small game, and hundreds of plant sources for nuts, fruits, berries, legumes, tubers, and teas. The extermination of the buffalo, the appropriation of Native lands, the collapse of indigenous trade, and the introduction of government rations of lard, bacon, coffee, sugar, corn, rice, and poor beef radically changed the Indians’ diet. Rations were insufficient, irregularly delivered, and frequently unusable, either because they were spoiled or because they were unfamiliar to the Indians.
For Native people, especially young people, these trials, and the changes they produced, explain their world today, whether they live on reservations or have dispersed to cities with their families. This knowledge can be a source for recovery from a historic wound. It can also publicly affirm the experiences of younger generations of Natives and inspire them to follow their elders into activism and community leadership.
The importance of historical context to the stated mission of the NMAI had been raised in at least one community consultation. A Lakota man noted that Native ways of life have not been respected since the late nineteenth century. He called for creating a “better environment for our people because the way history books are written and the way we feel when you go to different places[, is] that we need to create a better environment, update our history. There are some things that cannot change our ceremonies, but we change the way we live.” What non-Natives see of Indians, in other words, are inauthentic and degraded people. Why? Because actual Natives do not uphold non-Natives’ crude nineteenth-century understanding of Native culture. This, too, is a form of colonial thought and ideology that is destructive to Native people but that can be corrected.
Native communities and individuals have emerged today from long struggles with the destructive consequences of American hegemony. The Chickasaw Nation, headquartered in Ada, Oklahoma, is a shining example of a tribal success story. One of the southeastern tribes removed in the early nineteenth century, the Chickasaws rebuilt their lives in Indian Territory only to lose land to other relocated Indian tribes. The tribe also lost lands as a result of the Curtis Act (1898), which extended allotment (the federal government’s program to distribute reservation lands to individual Native landholders) to the so-called Five Civilized Tribes of Oklahoma. The Chickasaws use casino proceeds to benefit the people in the form of a wellness center, counseling center, library, scholarships, an aviation and science summer academy, and rebuilt stomp grounds (for an annual green corn dance). The Chickasaw Nation is also promoting the increase of scholarship about Chickasaws and is funding a project to carry out that mission under the direction of a Chickasaw scholar. Governor Bill Anoatubby, who has led the Chickasaws for many years, embraces a forward-looking vision for the tribe and extends the tribe’s friendship, services, and resources to the town of Ada.
Cherokee professor Andrea Smith, University of Michigan, is an example of a Native academic who seeks to bridge her activism and her academic work. Her recent book, Conquest: Sexual Violence and American Indian Genocide, broadens the meaning of colonial sexual violence to boarding school experience and rape and pollution of the land. In addition to being a central focus of her scholarly work, sexual violence against Native women is also an area of her activism. Conquest draws from Smith’s activism through the creation of INCITE! Women of Color Against Violence, a national grassroots organization. She is one of a growing number of Native scholars who address the continuing consequences of U.S. internal colonialism and who are bringing international attention to U.S. indigenous issues. Her activism, along with that of Sarah Deer, a Muskogee lawyer and activist from Minneapolis, is the reason Amnesty International began studying violence against Native women in the United States. That work resulted in a stunning and highly disturbing report about the abuse of Native American women. The successes of the Chickasaw Nation, the devoted activism and scholarship of Andrea Smith, and the ongoing work of community-based activists are also Native America.
As a professor of history who teaches the history of Native North America and federal Indian policy and law at a midwestern university, I am frequently reminded of the depths of non-Native ignorance of Native America. Questions I pose in class might go something like this: What do you know about American Indians? Silence. Do you know any American Indians? Heads shake from side to side. What do you think a reservation is? The discussion picks up. “It’s where the Indians live.” Why? How did they come to live there? “They just live there.” Why? “So they can be together.” But why do they live at that particular place? “They just picked that place because they liked it or because it was away from the white people.” By the end of the semester, students have become fascinated with this destructive history and how it worked. I have delivered on my promise that a semester studying American internal colonialism and federal Indian law and policy will be like a trip to Mars. At the end of each semester a student will inevitably ask, “Why didn’t they [parents and teachers] tell us this stuff?” I remind them of the first day of class when I told them that if they were seriously wedded to the fictional national narrative, they might not be happy with my class. At semester’s end, they know what I mean by a fictional national narrative. I tell them that people will take extreme measures to protect and preserve it. And each semester, in that moment, some of my students and I experience reconciliation.
I wish I were confident that visitors to the NMAI experienced a similar sort of reconciliation. But reconciliation cannot happen in a vacuum. One must know the history before reconciliation can occur. Yes, the previous five hundred years represent but a brief moment in the long history of indigenous occupation of this land, as Rick West likes to point out. But those five hundred years have radically changed Native America for all time. The NMAI imparts no understanding by ignoring those five hundred years but only reinforces the invisibility of Native people and replicates practices of the Department of Education, the Bureau of Indian Affairs, and the academy. What else but willful ignorance can explain the continued existence of a sports team with a name like Redskins? or of romanticized movies like Dances with Wolves? Where do we find reconciliation in the midst of such denigration? Can we find reconciliation in a state institution?
For me, the National Museum of the American Indian represents a broken promise, no less consequential than the many broken treaty promises made by the United States to Native people. It represents a betrayal of our trust that this museum would be the Natives’ museum. In place of the stories of the Native past, it focuses on arts, culture, and commerce—the stuff of commodification. To paraphrase the historian Paul Kramer, cultural recognition and power do not connect. Seeing the museum sitting there in close proximity to the Capitol, one might think that the Indians were finally within reach of social justice, political power, and economic change. Not yet. Cultural recognition will not create a working arena where Native America might engage the United States government on something resembling level ground. Rather, cultural recognition is a distraction for Native people, a painless amusement for non-Natives, and a way for U.S. government politicians and bureaucrats to avoid the hard questions raised by the history of U.S. internal colonialism.
Further Reading:
Elizabeth Cook-Lynn’s essays can be found in Why I Cannot Read Wallace Stegner (Madison, Wisc., 1996). Jolene Rickard is quoted in Amy Lonetree, “Missed Opportunities: Reflections on the NMAI,” American Indian Quarterly (2006). Some of the Chickasaw tribe’s projects can be seen on the tribal Website. Andrea Smith’s book is Conquest: Sexual Violence and American Indian Genocide (Cambridge, Mass., 2005). Sarah Deer works for the Tribal Law and Policy Institute in Minneapolis. Learn more about the institute and access numerous reference sources at its Website. The Amnesty International report mentioned above is titled “Maze of Injustice: The failure to protect Indigenous women from sexual violence in the USA,” Amnesty International, April 24, 2007.
This article originally appeared in issue 7.4 (July, 2007).
Jacki Thompson Rand (Choctaw) is associate professor of history at the University of Iowa. She has contributed to Plains Indian Drawings, 1865-1935: Pages from a Visual History, edited by Janet Catherine Berlo (New York, 1996) and to Clearing a Path: Theorizing the Past in Native American Studies, edited by Nancy Shoemaker (New York, 2002). Her forthcoming book, Kiowa Humanity and the Invasion of the State, will be released in spring 2008 by the University of Nebraska Press. Her research interests center on state policy and law concerning Native American/American Indian/Indigenous peoples. Her next project focuses on Canada.