As we approach the 20th anniversary of the terrorist attacks on September 11, 2001, we might think that we’ve heard all we need to (and then some) about that day. In a nation consumed with mourning the more than 650,000 people lost to the continuing COVID-19 pandemic, what does it mean to remember the outpouring of unity and grief that followed the murder of close to 3,000 people after passenger-filled planes were piloted as missiles?
Innumerable accounts have combed the depths of the grief, shock, anger, and resilience of that day. They chronicle epic heroism and tiny acts of kindness. But “9/11 fatigue” does a disservice to history. Beyond the jingoistic rhetoric, the market-driven spectacle, the bumper-sticker sloganization of memory, the printed tourist guides and Twin Towers tchotchkes for sale around the site perimeter, lie essential, inspiring, and instructive stories about basic human goodness and the power of collective action. Episodes of pragmatism, resourcefulness, and compassion. Moments of grace and solicitude. Lifesaving efforts born of professional honor. The joining of unlikely hands.
Still, remarkably, some of the most affecting of these stories have gone unheard.
Here’s just one. On the morning of September 11, telecom specialist Rich Varela was working a contract gig on the twelfth floor of 1 World Financial Center, directly across the street from the South Tower. A few minutes into his workday, he looked around the windowless, nearly soundproof “comp data” room humming with servers, telephone switchboards, and other electronics, and noticed that things seemed oddly quiet. There must be a late bell today, he reasoned.
About 15 minutes after he’d sat down at his desk, a “crazy, ridiculous rumble” erupted, and the building did a little shimmy. Maybe it was one of those big 18-wheelers, thought Varela, picturing the thunder created when a large truck rolls over metal plates in the street. He gave a passing thought to his surroundings. If that had been an explosion, from a blown gas line or something, nobody would even know I was in here.
A few minutes later, his buddy called from Jersey. “Get outta there! They’re crashing planes into the World Trade Center!”
Because his friend had a history as a prankster, Varela didn’t buy his story right away. “No, I’m serious,” the friend said. And then it clicked. That rumbling sound. Varela gathered his things, neither panicked nor dawdling. He opened the thick door of the comp room that had shielded him from the screeching and strobes of the building’s fire alarm. Down the hall, phone receivers dangled off corkscrewed cords, papers lay strewn across worktables, and chairs loitered at odd angles rather than nestling neatly under their desks. The trading floor was wholly uninhabited. “It was like people just evaporated.”
In the lobby, Varela caught his first glimpse of what looked to him like Armageddon. The whole front of 1 World Financial Center had imploded, leaving the plate glass window in shards. Amid piles of smashed concrete and polished stone, pockets of flame feasted on combustibles. Varela stared at the blazing hole in the side of the South Tower. “You could hear a pin drop. You’re in Lower Manhattan. You could hear sirens in the distance but immediately in the area there was no motion of life. I thought that was so eerie.”
Outside, a series of artillery-like blasts made him duck for cover. The eruptions sounded like “cannon fire or missiles coming into Manhattan.” Boom! Boom! Are there battleships out in the water shooting planes out of the sky? Varela wondered. Turning to face the towers, he realized he was hearing the sounds of bodies hitting the ground. He tried to shake off the images as he headed toward the Hudson.
Halfway to the river, still reeling, Varela heard a rumbling. The South Tower was imploding. He tried to outrun the plume of ash, dust, and smoke that barreled toward him. At the seawall, Varela spotted a boat.
On September 11, 2001, nearly half a million civilians caught in an act of war on Manhattan escaped by water when mariners conducted a spontaneous rescue. This was the largest waterborne evacuation in history—more massive than the famous World War II rescue of troops pinned by Hitler’s armies against the coast in Dunkirk, France. In 1940, hundreds of naval vessels and civilian boats rallied to rescue 338,000 British and Allied soldiers over the course of nine days. But on that Tuesday in 2001, approximately 800 mariners helped evacuate between 400,000 and 500,000 people within nine hours. The speed, spontaneity, and success of this effort were unprecedented.
Somehow, in the sea of reporting that followed the attacks, this fact has garnered remarkably little attention. I wrote Saved at the Seawall: Stories from the September 11 Boat Lift to address that omission.
Within minutes after thick, gray smoke began spilling through the airplane-shaped hole in the World Trade Center’s north tower, adults and children—some burned and bleeding, some covered with debris—had fled to the water’s edge, running until they ran out of land. Never was it clearer that Manhattan is an island. Mariners raced to meet them, white wakes zigzagging across the harbor. Hours before the Coast Guard’s call for “all available boats” crackled out over marine radios, ferries, tugs, dinner boats, sailing yachts, and other vessels had begun converging along Manhattan’s shores.
So many people ached to contribute something that day and the days that followed. They donated blood, bagged up stacks of peanut butter and jelly sandwiches, built stretchers that went unused. But, as fate would have it, mariners who had skills and apparatus that could help right away became first responders. Their collaborative efforts that morning saved countless lives. So did other selfless acts made by people from all walks of life. Varela is just one shining example.
As the debris cloud rained down on him, Varela bounded over the steel railing separating the water from the land and leapt onto the bow of fireboat John D. McKean. He felt his leg buckle and almost snap when he hit the deck. A mass of people jumped on after him, falling onto the deck, some landing on him, and the boat rocked under the weight of the leaping hordes. Varela worried it might capsize. He stumbled over people on his way to the far side of the deck, away from the avalanche, then curled in on himself, choking as everything went black.
When the air cleared a bit, Varela saw casualties all around him. Somebody was nursing a broken leg. A woman lay splayed out beside the bow pipe. It looked as if she had landed face-first on the steel deck. He hollered to a nearby firefighter that she needed medical attention—that she was unconscious and might already be dead. There was little anyone could do on the boat, so reaching a triage center, quickly, was imperative. But other lives needed saving, and people continued to clamber aboard.
Quickly, Varela made his first choice of many that day to help others. Coughing and gagging, he yanked off his gray-green long-sleeved cotton shirt, tore off a strip, and wet it with water he found dribbling from some leaky hose on deck. He tied the makeshift filter around his face and then tore off more strips for fellow passengers.
Soon, the fireboat McKean would evacuate people to safety at a rundown pier in Jersey City. Before that, though, Varela helped heave up a docking line used to rescue a young woman from the water. While there on the Jersey side, he helped carry a chair to transport a man with a shattered leg off the boat.
Then, just as the fireboat crew prepared to cast off lines, the second tower collapsed. Varela saw the looks on the faces of the fireboat’s crew as 1 World Trade Center pancaked down, burying their fellow firefighters along with the civilians they’d been sent to save.
Their horror prompted Varela to make the decision of a lifetime. “I’m coming with you,” he said. “You guys need help.”
“Let’s go,” came the reply, and Varela jumped back on board. So did an older gentleman, explaining, “My son’s in that building.”
“It really felt like, I might die today,” Varela later said. “And I was okay with it.” These guys need help, he thought. And that was it.
So many people that day made choices to take risky action for the sake of others: Firefighters climbing the stairwells, co-workers helping those less mobile to escape the burning buildings, office workers in dress shirts hauling equipment with rescue workers in the plaza, and mariners who dropped evacuees at safer shores, over and over, then set course straight back to the island on fire to save still more people trapped at the seawall.
Surfacing these long-overlooked stories grants us a window onto who we have been for one another and who we can be again. Remembering that this, too, is part of our heritage can help us reclaim our humanity as we face the perils of today.
National 9/11 Memorial, Manhattan. Photo by author.
The post-9/11 era, which effectively began shortly after the terrorist attacks on September 11, 2001, has decidedly come to an end. Shaped not only by the loss of life on 9/11 but also by all that ensued in its wake, including the long wars in Afghanistan and Iraq, it was defined by fears of foreign terrorism, security culture, and a xenophobic and jingoistic form of patriotism. But the post-9/11 era was also significantly shaped by an under-appreciated force: the culture of memory.
As the 20th anniversary of 9/11 looms, it makes sense to reflect on this preoccupation, if not obsession, with memory. The urge to memorialize two decades ago was swift and strong, rising out of a deep sense of grief and loss. Over one thousand 9/11 memorials were built around the country and around the world. Many of these memorials were prompted by the decision of the Port Authority of New York and New Jersey to hand out pieces of steel recovered from the site for memorials from 2010 until 2016. In New York, the 9/11 memorial and museum garnered extraordinary attention when they opened in 2011 and 2014. They also cost, together, almost $1 billion, a significant percentage coming from public funds.
Memorialization has been a nationally affirming enterprise, providing comforting narratives of national unity whose coherence nonetheless required the exclusion of many aspects of the event. With memorials being built well into the 2010s, it seemed increasingly that the surfeit of 9/11 memory was not about 9/11, or even those who died that day. Rather it reflected a desire to return to that post-9/11 moment of national unity, in which, however falsely, the nation seemed to speak with one voice: we are Americans.
What does it mean to remember 9/11 20 years later, when an entire generation has been born since? The memory-focused rebuilding of Ground Zero in lower Manhattan most painfully raises this question. Does anyone still care that One World Trade Center, formerly known as the Freedom Tower, is 1,776 feet tall in a gesture of patriotism? Or, that the $4 billion publicly-funded Oculus shopping mall (excuse me, transportation hub) has a skylight that opens on the anniversary of 9/11? The 9/11 museum, which tells a nationalistic story of 9/11 as an exceptional historical event and sells 9/11 hoodies and souvenirs in its gift shop, looks increasingly dated. Both the museum and the memorial are hugely expensive to run, the memorial because of its water features and security costs. The museum’s business model of selling entry tickets to tourists for $26 has been challenged by the pandemic, forcing it to furlough or lay off more than half its staff and cancel its plans for 20th anniversary special exhibitions. It has become the subject of debate about its relevance a mere seven years after its opening.
Despite the proliferation of 9/11 memory, a notable shift in national memorialization was signaled when in 2018 the National Memorial for Peace and Justice and Legacy Museum opened in Montgomery, Alabama. A memorial to over 4,000 victims of lynchings, it demands recognition that terrorism is not a new or foreign aspect of American history, but has long been a part of the US national story as racial terrorism. The Legacy Museum re-narrates the US myth of racial progress to argue that the contemporary mass incarceration of Black Americans is evidence that slavery never completely ended.
National Memorial for Peace and Justice and Legacy Museum, Montgomery, Ala. Photo by author.
Here, memory is being deployed not to uphold a myth of national unity, as 9/11 memory did, but to demand that the national script be revised. It is notable that Bryan Stevenson, who founded the memorial and museum through the Equal Justice Initiative, felt that memorialization was the best strategy to raise public awareness about the legacies of slavery. At the same time, long fought-over Confederate monuments have been toppled and removed, a reckoning with the past that once seemed impossible and now seems inevitable.
We don’t know what the new era we have entered will bring. But the demand that the nation confront its own history of terrorism has been activated. This means remembering terrorism not as a force that comes from outside but as a fundamental aspect of the American project of settler colonialism and slavery that lives on today. The memory of this past must be confronted in the present. To recognize the history of US terrorism is not to demand shame but to open up the opportunity for the nation to move forward from its difficult histories.
My home state of Arkansas has been the subject of many recent news reports due to a low incidence of vaccination against COVID-19 combined with a high incidence of the unvaccinated rushing to feed stores to purchase cattle deworming agents under the belief that these are more effective against a raging respiratory illness than any of that stuff being “pushed” by the “medical establishment.” Many commentators have linked this behavior to the prevalence of conspiracy theories on social media, while others point to the increasing failure of average citizens to respect the knowledge and skills experts have accumulated through years of education and experience. The various books touted as providing some kind of means for understanding our present moment fall into these frameworks, from Nancy L. Rosenblum and Russell Muirhead’s A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy (2019) to Tom Nichols’s The Death of Expertise: The Campaign against Established Knowledge and Why It Matters (2018). However, I would argue that the best text for making sense of why people across the country are rushing to eat horse paste during a pandemic was published more than thirty years ago—Nathan O. Hatch’s 1989 The Democratization of American Christianity.
By tracking the evolution of five different religious movements in the early American republic (the Christian movement, Methodists, Baptists, Black churches, and Mormons), Hatch demonstrates how the revolutionary fervor for all things democratic pervaded the spiritual and cultural realms as much as it did the political, resulting in the conscience becoming individualized and all traditionally earned authority being held suspect. “Above all, the Revolution dramatically expanded the circle of people who considered themselves capable of thinking for themselves about issues of freedom, equality, sovereignty, and representation. Respect for authority, tradition, station, and education eroded,” Hatch writes. “It was not merely the winning of battles and the writing of constitutions that excited apocalyptic visions in the minds of ordinary people but the realization that the very structures of society were undergoing a democratic winnowing.”
These popular religious movements “denied the age-old distinction that set the clergy apart as a separate order of men, and they refused to defer to learned theologians and traditional orthodoxies.” They also empowered “ordinary people by taking their deepest spiritual impulses at face value rather than subjecting them to the scrutiny of orthodox doctrine and the frowns of respectable clergymen.” One early Baptist leader, John Leland, emphasized the right of any layman to read and interpret the Bible for himself, writing, “Did many of the rulers believe in Christ when he was upon earth? Were not the learned clergy (the scribes) his most inveterate enemies?” Some backwoods religious dissenters went even further and emphasized their own illiteracy as making them purer vessels for the Almighty, unable to be corrupted by traditional, elite education.
The craving for democratic equality went beyond the statehouse and beyond the church and infused all spheres of life. In his work, Hatch draws occasional attention to folk like Samuel Thompson, an “uneducated practitioner of natural remedies who learned his botanic medicine in rural New Hampshire at the close of the eighteenth century.” In his autobiographical narrative, Thompson argued that Americans “should in medicine, as in religion and politics, act for themselves.” He and his adherents published an array of pamphlets and journals that “made their case by weaving together powerful democratic themes.” As Hatch summarizes, the cornerstone of their teachings was this: “In the end, each person had the potential to become his or her own physician.” No wonder, then, that such medical practices became adopted by these increasingly democratic religious movements. And medicine was not the only profession increasingly undergoing a “democratic winnowing,” for everywhere, common people were determined to throw off the shackles of traditional expertise often associated with the old order as it pertained to law, economics, and more.
According to legend, as British troops surrendered to General George Washington at the end of the Siege of Yorktown, their band showed a remarkable sense of the historical import of the moment by playing a little tune called “The World Turned Upside Down.” Most historians hold this tale to be apocryphal, but there is some irony in the legend, for the ballad was first published in the 1640s as a protest against Parliament’s prohibitions of the celebrations of Christmas. In England, the common people defied the elite Parliamentary infringement upon their “lowly” traditions, singing:
Listen to me and you shall hear, news hath not been this thousand year: Since Herod, Caesar, and many more, you never heard the like before. Holy-dayes are despis’d, new fashions are devis’d. Old Christmas is kickt out of Town. Yet let’s be content, and the times lament, you see the world turn’d upside down.
Yet in America, those “new fashions,” associated with Herod and Caesar in this ballad, flew under the banner of Christ. But not only was Christmas “kickt out of Town” in America—so, too, was that more recent tradition of inquiry dubbed the Enlightenment. When I was in grade school, we were taught to regard the American Revolution as the apogee of Enlightenment thinking, the incarnation of those lofty ideas of liberty expounded by the likes of John Locke and Jean-Jacques Rousseau. And certainly, those who formulated the Declaration of Independence and the American Constitution drew from their works. However, the main thrust of American culture ran in the opposite direction.
“This vast transformation, this shift away from the Enlightenment and classical republicanism toward vulgar democracy and materialistic individualism in a matter of decades, was the real American revolution,” writes Hatch. Some writers may lament the so-called “death of expertise,” but we have to ask: when was it ever truly embraced here in the United States? Those empty shelves of cattle dewormer at the local feed store are as much a legacy of the American Revolution as the laws that govern this country. The motivation to self-medicate with horse paste is driven not by some kind of new, mad, conspiratorial thinking but, instead, by a fervent, foundational belief that “all men are created equal.”
Editor’s Note: HNN recently reposted an excerpt of a Medium post authored by Carol Berkin, Richard D. Brown, Jane E. Calvert, Joseph J. Ellis, Jack N. Rakove, and Gordon S. Wood. That post took the form of an open letter of critique of remarks made by Dr. Woody Holton in the Washington Post addressing the significance of Lord Dunmore’s proclamation promising emancipation to enslaved Virginians who took up arms on the side of the Crown in 1775, and of the broader significance of the preservation of slavery as a motive for American independence.
HNN has offered Dr. Holton the opportunity to publish a rejoinder, which he has accepted.
I am flattered that six distinguished professors of the American Revolution have taken an interest in my work—or at least its potential impact. Just one index of these scholars’ significance is that I cite all six of them in my reappraisal of the founding era, Liberty is Sweet: The Hidden History of the American Revolution, which is due out next month.
But it saddens me that these senior professors have chosen to deny the obvious fact that the informal alliance between enslaved African Americans and British imperial officials infuriated white colonists and helped push them toward independence. Surely the professors know that the Continental Congress chose as the capstone for its twenty-six charges against King George III the claim that the king (actually his representatives in America) had “excited domestic insurrections”—slave revolts—“amongst us.”
Congress’s accusation culminated more than a year’s worth of colonial denunciations of the British for recruiting African Americans as soldiers and even—allegedly—encouraging them to slit their masters’ throats (as writers in Maryland, Virginia, and North Carolina all expressed it). Indeed, the six professors’ timing is perfect. Others having also doubted this claim, especially in reaction to the New York Times’s “1619 Project,” I last month began a project of my own. Every day I tweet out one quotation from a white American of 1774-1776 who denounced Britain’s cooperation with African Americans, along with an image of the quoted document.
The book version of the #1619 Project appears in 76 days. 1 of its central claims—that colonial whites’ rage at the Anglo-African alliance pushed them toward Independence—has been disputed. So I will tweet 1 piece of evidence every day for the next 76. — Woody Holton (@woodyholtonusc) September 1, 2021
I will end the series after seventy-six days, but I have collected sufficient evidence to go on and on.
I am in no position to lecture these distinguished professors, who count three Pulitzer prizes among them, but since they have criticized my work, I have no choice but to speak plain: I think their critique betrays a fundamental misunderstanding of how the Declaration of Independence came about.
It happened in stages. In 1762, most colonial freemen were, all in all, satisfied with their place in the British empire. Indeed, as Prof. Wood’s former student Brendan McConville emphasizes in The King’s Three Faces, they loved their new king. The initiative for changing the imperial relationship came not from the colonies but from Parliament. From 1763 through late 1774, Parliament sought more from the provincials, especially in the areas I like to summarize as the 4 Ts: taxes, territory, trade, and treasury notes (paper money). And all the free colonists wanted was … none of those changes. Until late in 1774, they strenuously resisted Parliament’s initiatives, but most of them would have been perfectly happy to return to the status quo of 1762. They did not seek revolution but (to use another loaded word from English history) restoration.
The grand question then becomes, “What converted the colonists from simply wanting to turn back the clock—their view from 1763 to 1774—to desiring, by spring 1776, to exit the empire?” Many things: the bloodshed at Lexington, Concord, and Bunker Hill; the news that the administration of Lord North was going to send German (“Hessian”) mercenaries against them; the publication in January 1776 of Common Sense; and much, much more.
All I argued in the essay that the professors criticize is that one of these factors that turned these white restorationists into advocates for independence was the mother country’s cooperation with their slaves. It was not the reason, but it was a reason. And that is important, because it means that African Americans, who of course were excluded from the provincial assemblies and Continental Congress, nonetheless had a figurative seat at the table.
Nor was Blacks’ role passive. Congress depicted them as incited to action by the emancipation proclamation issued by Lord Dunmore, the last royal governor of Virginia, and the professors adopt that same formulation. But here again, the timeline is crucial. Whites began recording African American overtures to the British in the fall of 1774. At first British officials turned them away, but they kept coming, right up until Dunmore finally published his emancipation proclamation on November 15, 1775, four score and seven years before Lincoln’s.
The professors claim that white colonists were already headed toward independence in fall 1774, when these African American initiatives began. But in this they indulge in counterfactual history—assuming they know what would have happened. It seems clear to me that, even that late, had Parliament chosen to repeal all of its colonial legislation since 1762, it could have kept its American empire intact. What we are looking for are the bells that could not be unrung. Especially in the south, one of the British aggressions that foreclosed the possibility of reconciliation was the governors’ and naval officers’ decision to cooperate with the colonists’ slaves (as well as with Native Americans—the Declaration of Independence’s “merciless Indian Savages”—but that is another story).
In Liberty is Sweet, I supply much more evidence for my stadial (stages-based) view of the road to independence. I compare it to a mouse’s escape from a maze, since it was the product not of a grand design but of a series of discrete choices at intersections, from none of which the next was visible. Would that the distinguished professors had waited to judge my reinterpretation by my 700-page book rather than the 700-word Washington Post article I wrote to promote it!
The professors may be correct that we would still get independence even if we removed one of its main ingredients, like Dunmore’s Proclamation … or the Battle of Lexington and Concord, which I teach as not only the first battle of the revolution but also, for many, especially in New England, the final argument for independence. But I would never take that remote possibility as a reason to write a history of the American Revolution that omitted Lexington and Concord. And by the same token, I hope the professors would never omit the Anglo-African alliance.
I agree with the professors that it would be a disservice to pretend that enslaved Americans played a significant role in the origins of the American Revolution if there was no evidence that they did. But the evidence is overwhelming, and I invite you to sample it on Twitter at @woodyholtonusc. If we heed the professors’ call to ignore the influence of the enslaved people of the founding era, we will dishonor not only those heroic Americans but our own search for truth.
Good morning, HNN!
I’m pleased to present the first episode of Season 3 of Skipped History, chronicling the Attica Prison uprising of 1971. It’s been 50 years since the stunning rebellion, and still the consequences are unfolding:
Today’s story comes from Blood in the Water by Heather Ann Thompson. Have you read it? I found it truly jaw-dropping.
I hope you enjoy the video. Questions, comments, and suggestions for further reading are welcome!
The controversy over Critical Race Theory has animated teachers, school administrators and state legislators — not to mention parents. The former chancellor of the New York City schools, Richard Carranza, went so far as to proclaim that it was the duty of teachers to combat “toxic whiteness” — a disastrous term that was picked up by the New York Post.
One of the difficulties in discussing Critical Race Theory is that the term has become entwined with the ideas in Robin DiAngelo’s White Fragility. Endless disclaimers that Critical Race Theory (CRT) is about systemic rather than individual racism seem specious to those who conflate the idea with the so-called “anti-racism training” associated with DiAngelo, and the passive-aggressive personal confrontations offered in her training sessions. Educators and others are afraid of undoing the self-esteem of white students, and this is a legitimate concern. I imagine that many race-training sessions at workplaces are intimidating to adults, but the idea is even more of a danger to classroom teaching. No teacher should enter a classroom and announce that “I will be very cautious about this, but you need to understand that you all as individual white people are perpetuating racism in this country.” You cannot have a real discussion after that, no matter how gently you try to approach the subject. As a long-time teacher of American history, I hope to show that it is possible to discuss racism and the years of protests against it without intimidating students of color or white students.
This essay is dedicated to the students and teachers who want to cut through the controversy about teaching race and racism to confront the truths in American history with all its twists and turns, lights and shadows.
I was a teacher of American history for more than 30 years at a high school in Brooklyn, at several of the New York City community colleges, and Hunter and City College as an adjunct instructor. I taught abolition, slavery and Civil Rights, which consumed much of my class time from the first day to the last every semester. My students were a glorious mixture of nearly every race and color in New York City.
I treated them, whatever their academic level, as intellectuals-in-training by assigning them speeches and documents, long and short, for homework, which they had to bring to class the next day. I would not lecture or give them any questions to answer or ideas to look for when they read. Instead, they were asked to choose sentences they liked or disliked for whatever reason and to make brief comments explaining their choices.
In some classes I had the kids write the first few words of their sentences on the blackboard, and then we would discuss what they had chosen. They read their whole sentence out loud and everyone read silently along with them. These were works by Martin Luther King Jr., Frederick Douglass, Madison Grant (one of the founders of scientific racism), and Barack Obama. I also taught a class in which we read only American speeches. My first question nearly every day we did these longish readings of 10 or 15 pages was, for example, “What do you think about Frederick Douglass’s ‘Fourth of July Oration’?” That would lead to a discussion that served as an introduction to the lesson before we turned to their sentences. We would do shorter documents, sometimes in class, one or two times a week and the longer ones twice in three weeks. In my high school classes I called this the Tarzan Theory of Reading because we were swinging through the document by grabbing on to sentence after sentence.
To teach you have to “bring the things before the eyes.” When we discussed the clause “all men are created equal,” the understanding came from the students that it embodies the hopes and dreams of every American and, simultaneously, the nightmares of inequality and violence that people of color have been forced to live with in this country. I did this on the first day of every class, when the students suggested events for a timeline from 1492 to 1865. Each student would write down three events. I would ask for volunteers to share one event each, write the events on the board, and then we would discuss them as we went along. The Declaration of Independence and its most famous phrase always came up. When it came to teaching the American Revolution, I spent five days discussing the document and its implications for the Revolution and history up to the present. Whatever you think of the New York Times’ 1619 Project, with its attempt to de-emphasize the importance of the Declaration, it is still necessary to understand the most famous phrase in American, if not world, history. Here is a brief description of my lessons on the Declaration, concentrating on the last day in the sequence, when we discussed the meanings of “all men are created equal.”
The first assignment for the series of Declaration classes was to determine how many parts the document had, keeping within a limit of five. Then the students were asked to find and underline the references to the Native Americans, the Stamp Act, the Boston Port Act, the Quebec Act, and the Massachusetts Government Act. The last three were parts of the Intolerable Acts, which caused the Americans to respond by forming the First Continental Congress in 1774. By narrowing down the Declaration’s structure to three parts, the students could see that in the middle part of the document all the sentences began with He or For. Those were the Grievances.
The students pointed out each of the grievances that they had underlined. We concluded that “For taxing us without our consent” was ambiguous. It could refer to the Stamp Act, the Navigation Acts, the Townshend Acts, or the Tea Tax, which was the motivation for the Boston Tea Party in 1773. The Intolerable Acts were more straightforward to identify. All of these details had been covered in classes during the weeks prior to our study of the Declaration, which was written in June and July of 1776, more than a year after the first battles of the American Revolution at Lexington and Concord in 1775. Those battles, originally commemorated as Patriots’ Day in Boston on April 19, began with the famous “shot heard ’round the world.”
Then I asked for a volunteer to read the first paragraph of the Declaration itself, which begins with “When in the Course of Human Events….” and ends with “impelled them to the separation” in their version. (The document I used was the version from the Yale Avalon 18th century document site.) When I asked them what they thought of the opening words, the students concluded that it is a theory of history: people make history. This is a description of agency, a key term for historians that describes how all peoples can take control of their fate. In our case, the Americans became revolutionaries by protesting the Stamp Act, and the Tea Act, and forming the First Continental Congress in response to the Intolerable Acts.
The first paragraph also discusses the laws of nature and nature’s God that entitled the Americans to separate from Great Britain. Then a student read the next few sentences, which contain the clause “That all men are created equal” and list the unalienable rights to life, liberty, and the pursuit of happiness. Thus, “all men are created equal” is one of the natural laws, like the right to life and liberty and, of course, gravity, all on the same plane here: natural laws, the laws discovered by Sir Isaac Newton that describe how the universe runs. We discussed the most famous clause, “all men are created equal,” by itself on the last day of the week.
At this point I handed out the part of the Declaration written by Thomas Jefferson that the Second Continental Congress dropped from the final version. It was the section that blamed the king for slavery in the 13 American colonies. Here is the beginning:
He has waged cruel war against human nature itself, violating its most sacred rights of life and liberty in the persons of a distant people who never offended him, captivating and carrying them into slavery in another hemisphere, or to incur miserable death in their transportation thither. (T)his piratical warfare, the opprobrium of infidel powers, is the warfare of the Christian King of Great Britain.
As my students saw immediately, Thomas Jefferson described the slaves as humans with natural rights, and he called slavery “cruel war.” Clearly this omitted section was meant to be part of the grievances because the paragraph begins with “He.” Jefferson used the phrase “piratical warfare,” which might be obscure to readers today, but my students knew it referred to man-stealing, an abolitionist term for enslavement. They had read the five-page polemic “African Slavery in America” by Tom Paine for the third day of class; man-stealing is one of the key phrases in that abolitionist pamphlet, published in 1775 in the Pennsylvania Journal and Weekly Advertiser. Jefferson also referred to the Middle Passage, which caused the enslaved people to suffer “miserable death” on their journey to North America. He sarcastically called the slave trade the work of the “Christian King of Great Britain,” who was practicing the “execrable commerce” of the “infidel powers”: the Spanish, Portuguese, and Muslims who had preceded the British in the slave trade. The hypocrisy of future president Jefferson then became the topic of discussion, especially since his livelihood depended on the labor of hundreds of enslaved persons on two plantations. He blamed the king for foisting the slaves on the Americans and complained that the king was also
exciting those very people to rise in arms among us and to purchase that liberty of which he has deprived them, by murdering the people upon whom he also obtruded them: thus paying off former crimes committed against the liberties of one people with crimes which he urges them to commit against the lives of another.
Such blatant hypocrisy was common among those defending the system of slavery, especially in view of Jefferson’s words about equality and liberty for whites and blacks in this very document. The audacity of Jefferson to claim that “his” slaves were unfairly treated by the king and that the king was to blame for his (Jefferson’s) own ill-gotten gains reminds me of a drug dealer who claims it is fine to sell drugs to “get over.” Of course he does not take them himself, which would be dangerous to his health. It takes a close reading of the phraseology in the quote above to figure out who were the slaves and who were the Patriots in this convoluted grievance. It is remarkable that almost all of the Declaration is clearly written. It is a prose poem that drives you on, in the same way the Gettysburg Address does. This omitted section is turgid.
Finally, we must point out why the Second Continental Congress rejected the grievance on slavery and the king. The Congress had agreed that the Declaration had to be unanimous in order to create a united front against the king and his army. But the slaveholders, led by South Carolina, refused to vote for the Declaration if it included the section criticizing slavery. It was left out of the final version.
What makes this intimate bond of Enlightenment idealism and rank racism a grievance is the argument that the king is encouraging the slaves of the Patriots (not the Loyalists, truth be told) to kill the revolutionaries to obtain their freedom as members of the British Army. The slaves of the Loyalists were not offered that opportunity by Lord Dunmore in his recently famous but misunderstood Proclamation of 1775. After all, the Loyalists were supporting the king, so they could keep their slaves. This idea comes out in the final grievance, which says “he has excited domestic (slave) insurrections amongst us” and which Jefferson couples with a condemnation of the king for encouraging the “merciless Indian savages” to wage war against the Patriots by murdering our people of “all ages, sexes and conditions.” Students are not used to reading the word “savages”: shocking language for a document about equality. It is an assault on modern sensibilities, but of course it was a common way to refer to the Native Americans.
So this class began with a discussion of the causes of the turmoil in the 1760s and ’70s as examples of human agency, then moved to a description of unalienable rights based on nature or nature’s God, and then finally to a justification of the Revolution as part of natural law, with “all men are created equal” included as one of those natural laws. But, as has become apparent in context, all this is bound up with the deep hypocrisy of holding humans who possess natural rights in the bonds of what the slaveholding founding father called the “cruel” and “piratical” war that we call chattel slavery.
At this point we have read only about 80 words at the beginning of the final version of the document.
Part II of this essay will appear in the coming weeks on HNN.
FDNY Fireboat John D. McKean celebrates the 125th anniversary of the Brooklyn Bridge, 2008.
Editor’s Note: The FDNY fireboat John D. McKean, now retired, was pulled into emergency service on September 11, 2001, supporting firefighting with its pumps and transporting evacuees from Manhattan. The McKean and its crew are discussed in this week’s essay by Jessica DuLong on the 9/11 boatlift.
New Yorkers and visitors to the city will have the opportunity this month to visit the retired FDNY Fireboat John D. McKean at Hudson River Park Friends’ Pier 25 in lower Manhattan. The McKean will be available to visitors and for public and private rides in New York Harbor.
The McKean was purchased at auction in 2016. In 2018, the Fireboat McKean Preservation Project was formed with the mission of preserving this historic vessel for museum and educational purposes. In 2019, the McKean underwent major repairs to its hull at the North River Shipyard in Upper Nyack, NY, with subsequent restoration of the ship’s above-deck areas. Through Hudson River Park Friends, the McKean will be able to dock at Pier 25, in proximity to lower Manhattan, for the 20th anniversary of the attack on the World Trade Center on September 11, 2001.
John D. McKean was a marine engineer aboard the fireboat George B. McClellan when he was fatally burned by a steam explosion in 1953. Despite his injuries, he kept his post to help bring the ship in safely. An already-ordered but yet-to-be-commissioned fireboat was named in his honor the following year. John McKean’s spirit was reflected in the vessel’s service supporting firefighting and transporting evacuees on September 11. The McKean was also used to fight the 1991 Staten Island Ferry Terminal fire and assisted with the rescue of passengers of the US Airways Flight 1549 Hudson River landing executed by pilot Chesley “Sully” Sullenberger in 2009.
For more information, go to www.fireboatmckean.org.
National Security Adviser Henry Kissinger, President Richard Nixon, and Maj. Gen. Alexander Haig discuss the situation in Vietnam at Camp David, November 1972.
There have been any number of pundits comparing the fall of Saigon in 1975 to the horrors we are witnessing as the United States withdraws from Afghanistan. Events in Afghanistan are far worse, some argue; at least in Vietnam there was a “decent interval” between the Peace Accords signed in January 1973 and the collapse of South Vietnam in April 1975.
All of this misses the point. Both situations prove the folly of starting a war where there is little likelihood of unconditional surrender. History screams this lesson, and the United States in its delusional belief in its vast military power continues to fall into the trap that somehow this time will be different.
“Vietnamization” did not work any better than attempts by the United States to put the Afghan government and its military on their feet after the American invasion in October 2001. The chaotic end to both long wars was predestined from the first days of the conflicts. Unless an invading nation can impose its terms after an unconditional surrender, as seen at the end of the Second World War with Germany and Japan, the result will be the predictable loss of public support for the war and a withdrawal that leaves the situation in chaos.
Even the end of the First World War illustrates this problem. After four years of brutal war, the parties stopped fighting under the terms of an armistice brokered in large part by the United States. The war had been a draw—there were no clear winners and no clear losers. But the peace treaty put the blame on Germany, and a weak democratic government, known as the Weimar Republic, was installed. After a period of intense disorder and fighting in the streets of Germany, the Nazis rose to power, arguing that the peace treaty was “a stab in the back,” and a second, even more catastrophic world war followed after a 20-year interregnum.
In Vietnam, the key architects of the end of the war, Richard Nixon and Henry Kissinger, knew that the peace they were forcing on South Vietnam was highly unlikely to last and that President Thieu of South Vietnam was doomed by the very terms Kissinger was negotiating in Paris with the North Vietnamese Special Advisor, Le Duc Tho.
Looking back, the terms seemed almost ridiculous: leave the Vietcong in place in the South, remove the Americans, hold supposed free elections to reunite the country, and allow material to continue to pass through the demilitarized zone (DMZ). Worse, the United States had to agree to pay billions of dollars to North Vietnam to help it rebuild after years of war (euphemistically referred to in the peace accords as “healing the wounds of war and postwar reconstruction”).
In one of the more instructive Nixon tapes, Nixon and Kissinger spoke just days into the new year of 1973 in the Oval Office about the inevitability of the fall of South Vietnam. Two options were under consideration: Option One (peace now) and Option Two (continued bombing for return of POWs).
“Look, let’s face it,” Nixon told Kissinger, “I think there is a very good chance that either one could well sink South Vietnam.”
Nixon said he had responsibilities “far beyond” Vietnam alone and that he could not allow “Vietnam to continue to blank out our vision with regard to the rest of the world.” He spoke of his efforts at détente with the Russians and his recent opening of China.
And while the options both carried great risks, Nixon knew the American public wanted out and even a bad result was better than continuing. He had trouble even verbalizing the concept, given how painful the reality was to him. “In other words,” he told Kissinger, “there comes a point when, not defeat—because that isn’t really what we are talking about, fortunately, in a sense—but where an end which is not too satisfactory, in fact is unsatisfactory, is a hellava lot better than continuing. That’s really what it comes down to.”
Kissinger responded that if President Thieu’s government fell, it would be “his own fault.”
There is a unique humiliation brought down on a warring superpower like the United States when the enemy combatants know they can outlast public opinion in America. The North Vietnamese correctly read the results of the national election in the U.S. in November 1972. Yes, President Nixon won in a landslide; but in Congress, Democrats held onto their majorities in the House and the Senate. In fact, the Democrats gained two seats in the Senate—one being a freshman from Delaware named Joe Biden.
After that election, peace was no longer “at hand,” as Kissinger erroneously predicted. Instead the North Vietnamese dug in. Congress threatened to cut off all aid, military and otherwise, if Nixon did not end the war.
In Paris, Kissinger was rudely greeted with a request that the United States set a unilateral deadline for withdrawal. “North Vietnam’s sole reciprocal duty,” Kissinger sarcastically wrote, “would be not to shoot at our men as they boarded their ships and aircraft to depart.”
It took a massive bombing campaign in December 1972, the infamous “Christmas Bombings,” to finally exert enough pressure to obtain a peace agreement that was really nothing more than a veiled surrender.
We do not know what the future holds for Afghanistan. Predictions range from total disaster to perhaps, in time, normalization. Neither scenario is likely. Vietnam teaches us something. After 50,000 American dead and millions killed across the region, nearly fifty years later, Vietnam is on no one’s “axis of evil” list. Americans vacation there; trade is brisk between the two nations.
Korea, the one outlier to the rule that unconditional surrender is the only sure way to predict an outcome in war, actually proves the point, given the incredible price the United States continues to pay to maintain a military presence there. All the United States and the world have to show for this continuing commitment is a virtual madman in charge of North Korea who justifies an aggressive nuclear arms program by invoking the bogeyman of a continuing threat of war following an armistice signed nearly seventy years ago.
The lessons of the First World War, Vietnam and Afghanistan are plain: don’t overestimate American military power and don’t expect clean exits where it was predictable from the start that a long-term war was not winnable.
“No matter what antiabortion crusaders try, pregnant people will always find ways to have abortions — and networks that go beyond borders have long helped them navigate treatment options.”
“The Attica prison uprising was historic because these men spoke directly to the public, and by doing so, they powerfully underscored to the nation that serving time did not make someone less of a human being.”
Using comparisons to the Taliban or other Islamic radicals to attack anti-choice laws obscures the deep roots of misogyny in white Christian America.
A historian argues that a recent and influential book calling for reparations could strengthen its case by considering the arguments made by historians about the connections of American slavery to other manifestations of racism. What’s needed is to link reparations to a global overturning of racial inequality.
The adoption of rhetoric of “humane war” after Vietnam has allowed discussions of how to wage war to sideline discussions of whether to wage war at all, and encourages secrecy, surveillance, and long-term engagement.
Philadelphia’s housing crisis during the First World War shows that worker and citizen activism is essential to compel governments to act to secure adequate affordable housing.
The influential Black newspaper’s publisher Robert L. Vann has been criticized as a self-promoting tribune of the Black bourgeoisie. A historian argues he should be reconsidered as a pragmatist building alliances in a time of upheaval for Black America.
Political factions and then organized parties have fought over the size, composition and geographical ordering of the electorate since the founding. This legacy today undermines the legitimacy of government and the political will to protect the right to vote.
One scholar’s project is using Wikipedia and her students to recover the historical personhood of Dante’s women and elevate them above literary symbols or caricatures.
Whatever the causes of the decline in history enrollments, it’s not because history departments have rejected the study of war and military history.
It has now been twenty years since the terrorist attacks of September 11, 2001 plunged the nation into shock, consternation, grief, and fear. Amid the despair over the loss of nearly three thousand lives and the anxieties about further strikes, many questions arose over how such a devastating blow on American soil could have happened. The most important of them was also the most elusive: were the attacks preventable? After two decades of investigation, the answer remains an equivocal “perhaps.”
Presidents Bill Clinton and George W. Bush were well aware that the Islamist militant Osama bin Laden and his Al Qaeda network posed a serious threat to American interests and lives. Clinton compared him to the wealthy, ruthless villains in James Bond movies. To combat the dangers that Al Qaeda created, he and his advisers considered a wide range of military and diplomatic options that ranged from kidnapping bin Laden to U.S. military intervention in Afghanistan. But the use of cruise missiles against Al Qaeda camps in Afghanistan in 1998 produced acutely disappointing results. Other military alternatives seemed too risky or too likely to fail and diplomatic initiatives proved fruitless.
During the transition after the 2000 presidential election, Clinton and other national security officials delivered stark warnings to the incoming Bush administration that bin Laden and his network were a “tremendous threat.” The immediacy of the problem was heightened by Al Qaeda’s bombing of the destroyer USS Cole in the harbor of Aden, Yemen in October 2000, which caused massive damage to the ship and claimed the lives of 17 crew members. Clinton and his advisers strongly recommended prompt consideration of the options they had weighed.
Bush and high-level national security officials were not greatly impressed. They regarded terrorism as an important but not top-priority problem. The president later revealed that he did not feel a “sense of urgency” about bin Laden and that his “blood was not … boiling.”
The Bush administration viewed Clinton’s campaign against Al Qaeda as weak and ineffective, and it was dismissive of the advice it received. Rather than drawing on the experiences of its predecessor, it embarked on the preparation of a “more comprehensive approach” that National Security Adviser Condoleezza Rice believed would be more successful. During the spring and summer of 2001, it worked at an unhurried pace, even in the face of dire warnings from the U.S. intelligence community that Al Qaeda was planning attacks that could be “spectacular” and “inflict mass casualties,” perhaps in the continental United States.
Eight months after he took office, Bush’s White House completed its comprehensive plan to combat Al Qaeda. The steps it included in the form of a National Security Presidential Directive (NSPD) were strikingly similar to the options the administration had inherited from Clinton. The final draft of the NSPD called for greater assistance to anti-Taliban groups in Afghanistan, diplomatic pressure on the Taliban to stop providing bin Laden safe haven, enhanced covert activities in Afghanistan, budget increases for counterterrorism, and as a last resort, direct military intervention by the United States. This proposal was little different in its essentials than what the Clinton administration had outlined, and it offered no novel suggestions on how to carry out its objectives more successfully. Deputy Secretary of State Richard Armitage later commented that there was “stunning continuity” in the approaches of the two administrations.
The NSPD landed on Bush’s desk for signature on September 10, 2001.
The troubling question that arises is: could the calamities that occurred the following day have been prevented if the NSPD had been approved and issued earlier? There is no way of answering this question definitively; it is unavoidably counterfactual. Yet it needs to be considered. The 9/11 plot was not so foolproof that it could not have been foiled by greater anticipation and modest defensive measures.
The threat that Al Qaeda presented was well known in general terms within the national security apparatus of the federal government, even if specific information about possible attacks was missing. But responsible officials and agencies did not do enough to confront the problem. A presidential statement like the NSPD of September 10, if distributed sooner, could have called attention to the dangers of potential terrorists present in the United States. The CIA and the FBI failed to track the whereabouts or investigate the activities of two known Al Qaeda operatives who lived openly in California for about 20 months, took flying lessons, and participated in the hijackings on 9/11.
On July 5, 2001, high-level officials from seven agencies received a briefing from the National Security Council’s National Coordinator for Counterterrorism, Richard A. Clarke. He cited the dangers that Al Qaeda presented and the possibility that it “might try to hit us at home.” The agencies responsible for homeland security did not react in meaningful ways to the warning, largely because a terrorist strike seemed far less likely in the territorial United States than abroad. Perhaps an earlier NSPD, armed with the weight of presidential authority, would have sharpened the focus on the risks of a terrorist plot within America and galvanized security officials and agencies into effective action. Perhaps, for example, the Federal Aviation Administration would have tightened airline boarding procedures or made terrorists’ access to cockpits more difficult. The FBI instructed its field offices to make certain they were ready to collect evidence in the event of a terrorist assault, but it did not order them to take any special steps to prevent an attack from occurring.
Even if the “what-if” queries surrounding the failures that allowed 9/11 to happen cannot be answered, we can agree with Condoleezza Rice’s heartfelt admission in her memoirs: “I did everything I could. I was convinced of that intellectually. But, given the severity of what occurred, I clearly hadn’t done enough.” Earlier adoption of the NSPD might not have made a difference. But the haunting thought remains that it might have spared America the agony of 9/11.