There’s always something new going on in the History Department.
As a biographer I envy novelists, who can craft a captivating tale without needing to carefully document sources of information. Though they face definite challenges, fiction writers can, with whatever degree of knowledge and understanding they possess, invent composite characters and telling dialogue.
As a reader of Holocaust literature, I prefer non-fiction. Give me the facts straight up, please. I do not want to be left wondering whether some person existed or some action occurred. And though I love good stories, I see little need to manufacture them when the truth is powerful and strange and terrible enough.
As a daughter of survivors, I think my grandparents—two of whom were in their early forties and two in their early sixties when they were killed in Auschwitz-Birkenau—would have wanted the full truth of what happened to them and their children to be wailed unto the heavens—lest they all vanish without a trace of having existed, as evidence of the human capacity for evil.
I understand, however, that a case can be made for Holocaust fiction. In portraying virtuous, ignoble, or complex characters in extreme situations, novelists shed light on human behavior in normality. Vivid scenes facilitate our entry into foreign worlds. And in distilling events, the fiction writer can make the complicated comprehensible. Finally, readers who might not otherwise have known that Mengele experimented on twins, or that diplomat-rescuers saved some prominent Jews, or that tattooists engraved numbers on the arms of select Auschwitz inmates—or about any other of the innumerable dimensions of the maelstrom—might learn something. They may be spurred to further exploration.
When fiction is “based on true events” we receive more than the author’s imagination. Of course, the degree to which such stories can be relied upon for historical accuracy varies. How deeply did the author research the subject? How can those of us who are not scholars evaluate whether we are getting a true picture?
Some fiction writers use the Holocaust in the service of a good yarn—as if throwing perpetrators or victims or survivors into their narrative adds pathos or heft. Sometimes the most fantastical accounts (for example, the movies Life is Beautiful and Jojo Rabbit) dish up the absurd enmeshed in the plausible, like an SS officer barking orders at inmates in a language they did not understand. This happened. Nazi leaders training Hitlerjugend to defend the fatherland—this happened. While such works may be accused of trivializing the most serious of subjects, they make no claim to being other than farcical.
But pretenders to truth (such as Binjamin Wilkomirski’s Fragments: Memories of a Wartime Childhood) indisputably cross a line. Passing off a false account as true provides fodder for Holocaust deniers and affronts us all.
Sometimes true accounts are mistaken for fiction. Each semester that I taught a college course on the Holocaust, students would hand in papers that read, “In Elie Wiesel’s novel Night…” The Nobel laureate wrote several novels, but Night is a true account of teenaged Wiesel’s experience of the war. I wanted my students to know that.
Survivor accounts are among our most trustworthy sources. Though it was nearly impossible for people in extreme situations to remember precise dates and times (not even decently fed soldiers could recall such details), those who were there were (and are) experts on what they saw and felt on their own skins.
Perhaps, then, the greatest good that ever came out of a work of Holocaust fiction was the Institute for Visual History and Education of the University of Southern California Shoah Foundation. In March of 1994, after accepting an Academy Award for Best Picture for Schindler’s List (based on Thomas Keneally’s historical novel), Steven Spielberg launched an ambitious, seemingly impractical project: knowing that survivors’ stories would soon be lost to history, he would capture on video as many of their testimonies as possible. Among the roughly 350,000 survivors then alive, most had been young adults during the war; they were now seniors; it would be a race against time.
Moving quickly and efficiently, with the aid of historians, scholars, and production-logistics experts, and with project directors, coordinators, and, eventually, 2,500 interviewers in 33 cities in 24 countries, Spielberg set about achieving his goal. Wanting viewers to “see the faces, to hear the voices,” he insisted that survivors be interviewed in their own homes wherever possible. They were to tell their complete life stories but spend most of the interview recounting their Holocaust experiences. Spielberg’s team ultimately amassed 52,000 videotaped testimonies.
What possessed these (mostly) ordinary citizens, including those who were shy or humble or who had never before spoken about their experiences, to dress neatly, invite interviewers and videographers into their living rooms, and open up about the darkest period of their lives? For one thing, most knew about Steven Spielberg’s Schindler’s List. For another, they learned about the project through multiple media sources; flyers and ads with the headline “So Generations Never Forget What So Few Lived to Tell” awakened their sense of moral responsibility. Ms. Miller, who as a child had hidden with her family in a crowded farmhouse in the hills of Italy, said, “We come forward because we are aware of our own mortality and how important it is to share what happened.”
Fortuitously, during the six-year period (1994-2000) in which the interviewing took place, there was an explosion in the field of information technology, enabling Spielberg’s team to create a vast, searchable cyber-archive. The carefully catalogued and scientifically preserved videos were distributed to various organizations (including the U.S. Holocaust Memorial Museum and Yad Vashem).
It would take twelve years (or more than 105,000 hours) to watch all of the interviews. I have only had time to view some, obtained online through the USC Shoah Foundation. Once I began listening to certain survivors, I could not tear myself away. Their stories are inherently harrowing, gripping, and educational. And authentic.
For his noble work, all of humanity owes a debt of gratitude to Steven Spielberg (whose foundation has subsequently worked with the Kigali Genocide Memorial to capture the testimonies of survivors of the Rwandan genocide). Twenty-five years after the release of Schindler’s List, in December 2018, the filmmaker reflected on this “most important experience” of his career. Survivors who could bear to watch the film told him that it could not compare to what was. But they were glad he told the story—it should not be forgotten.
Owing to the singular circumstances, perhaps no author of Holocaust fiction can aspire to again produce a work as far-reaching as Schindler’s List. But writers who ignore or take liberties with the truth ought to reflect on their purposes. If in some measure they aim to edify, counter hate, and inspire empathy, they might be mindful of those who did not live to tell their stories—who, when they could, engraved their names and places of birth in the walls of barracks, or implored others to remember them. Had they had a choice, I believe Hitler’s victims would have wanted nothing about the mortal crimes against them falsified.
Note: this essay quotes a sign attached to the body of a man lynched in 1919, which uses a racial slur.
HBO’s recent release of a prestige adaptation of Philip Roth’s 2004 novel The Plot Against America makes it worthwhile to examine whether fascism is really so alien to the United States as many wish to believe. In Roth’s novel, a Jewish extended family in Newark experiences fascism’s arrival in America, with the 1940 election of Charles Lindbergh to the presidency, as an intrusion of European extremism against an American Way—and a short-lived one, at that. When many white thinkers ponder Roth’s Plot or the sardonic title of Sinclair Lewis’s 1935 novel It Can’t Happen Here, they often miss the many ways in which it has already happened here. After all, the public prominence of the Ku Klux Klan and the massive riots that rock the climax of Roth’s novel were already regular features of American life, depending upon where you lived. For example, in early May 1927, just a few weeks before the historical Lindbergh took off from Roosevelt Field in the Spirit of St. Louis, some 5,000 whites rioted in the black business district of Little Rock, Arkansas, where they burned the body of a man named John Carter. Local police officers, many rumored to be Klan members, did nothing to stop the violence—and may have even taken part.
Black observers of current trends, however, tend to be a little more astute. For example, in his February 21, 2020, New York Times column, Jamelle Bouie argues that the expansive authoritarianism of Donald Trump has its analogue in the Jim Crow South. However, we should not treat the Venn diagram of fascism and Jim Crow as a circle. The reality is a little more complicated.
The word “fascism” has also long been employed as a political Rorschach blot. As early as 1946, just one year after the end of World War II, George Orwell was complaining of this fact in his essay “Politics and the English Language,” writing: “The word Fascism has now no meaning except in so far as it signifies ‘something not desirable.’” But let us take this as a working definition:
Fascism is the attempt, birthed in reactionary politics, to resolve the contradictions of democracy for purposes of preserving elite power against the demands of the masses.
This will need some explanation. Although we today associate democracy with high ideals, its origins are a bit grubbier. For example, opposition to the Angevin kings of England (which led to the Magna Carta) included, according to historian Robert Bartlett, such charges as “heavy taxation, elevation of low-born officials, slow and venal justice, disregard for the property rights and dignities of the aristocracy.” Much of the Magna Carta focuses upon preserving the property rights and prestige of the aristocracy while limiting the king’s ability to levy certain taxes without “the common counsel of the kingdom.” The eventual emergence of a mercantile class in the late Middle Ages and early Renaissance sparked another expansion of “democracy,” as the bourgeoisie sought similar privileges in order to protect their own wealth. With industrialization, and the eventual concentration of the lower classes into cities, the emerging proletariat began to press for access to the franchise itself on both sides of the Atlantic, as exemplified in the UK by the Reform Act of 1832 and in the US by Jacksonian democracy and the removal of property qualifications for the vote.
At each step in the expansion of democracy, those who already possessed the franchise feared the loss of their power and wealth by allowing any “lower” classes the privilege of voting. The United States has been much more a racial society than a class society along the lines of the UK, and so here it was easier to get elite buy-in to the idea of universal male suffrage, so long as those males were exclusively white. The abolition of slavery and the expansion of suffrage to those former slaves and their eventual descendants provoked the rage of the south’s idle landlords, who initiated a campaign of violence in the immediate aftermath of the Civil War in order to return to the status quo ante of black servitude and submissiveness. What historians call the first Ku Klux Klan was an elite project to scuttle the political empowerment of African Americans.
The “contradictions of democracy” can be seen in this struggle between those who believe that republican government should preserve elite power and the democratic desire that all citizens be given a voice in how they are governed. Fascism is an attempt to short-circuit this tension through the advancement of a purely corporate figure who is cast as the savior of “the people,” not by empowering them but rather by emphasizing his own unique attributes to act on their behalf. As the Israeli scholar Ishay Landa points out in The Apprentice’s Sorcerer: Liberal Tradition and Fascism, while fascism regularly employs the rhetoric of collectivism, it centralizes such collective and democratic yearnings upon the individual strongman leader, so that he becomes democracy personified, the one true spokesman for “the people,” who no longer need engage in self-governance.
But there is more to it. As historian Aristotle Kallis observes in Genocide and Fascism: The Eliminationist Drive in Fascist Europe, fascist ideology was born with the specific aim of seeking redemption from recent “humiliations” by latching onto the glories of the past to drive a new utopian future. This “redemption” manifested itself externally, through expansionist policies of conquest, and internally, through a “cleansing” of the population aimed at eliminating those figures responsible for recent humiliations: socialists, communists, Jews and other minority groups. The drive to “cleanse” the state, Kallis writes, “helped shape a redemptive licence to hate directed at particular ‘others’ and render the prospect of their elimination more desirable, more intelligible, and less morally troubling.”
This has been just a brief overview, but it allows us to draw some parallels between fascism and Jim Crow. Both fascism and Jim Crow were means of limiting democratic participation and thus the political and economic emancipation of certain “others.” And both fostered a “license to hate” that resulted in massive violence against the enemies of the elite. But there are more parallels. As Landa writes in Fascism and the Masses: The Revolt against the Last Humans, 1848–1945, “Rhetoric of honoring labor aside, the Nazis strove to achieve the exact opposite: keeping wages low and increasing working hours, which was precisely what German business was insisting should be done throughout the years of the Weimar Republic.” Much the same held true in the Jim Crow South, where particular ire was reserved for those who resisted the southern tradition of racialized economic exploitation. In June 1919, after Clyde Ellison refused to work for Lincoln County, Arkansas, planter David Bennett for a mere 85 cents a day, he was hanged from a bridge, with a sign attached to his body reading, “This is how we treat lazy niggers.” Later that year, and not too far away, white mobs and soldiers would slaughter untold numbers of African Americans, in what has become known as the Elaine Massacre, for daring to organize a farmers’ union.
Although both fascism and Jim Crow constituted violent means of securing elite power, there are important distinctions to note. While southern states had their share of demagogues, Jim Crow was a multi-generational project of the Democratic Party, one not centered upon any particular individual. Too, while both fostered a “license to hate” against racial and ideological others, the Jim Crow project made a distinction between “good negroes” who “knew their place” and “bad negroes” who sought the privileges reserved to whites. The latter might have to be killed, and the region “cleansed” of those “outsider” whites who spread dreams of equality, but black people who were dutifully submissive could be tolerated insofar as their unpaid or underpaid labor created the region’s wealth.
Back in 2004, The Plot Against America was widely regarded as a commentary about the administration of George W. Bush. No doubt, the 2020 television adaptation will be viewed in the light of Donald Trump, whose rhetoric and policies have been compared by critics to both fascism and Jim Crow. Perhaps the television series will exhibit a more sophisticated understanding of fascism and America than did the book. Perhaps not. Either way, the series should provide a good opportunity for historians to educate the public on who, exactly, lies behind the centuries-old plot against all Americans.
Antonia Deacock, Anne Davies and Eve Sims were three rather extraordinary women who, in their mid-twenties and thirties, set off overland from England to Tibet in 1958. Their aim was to climb one of the Himalayas’ unexplored high peaks. They made the 16,000-mile drive to India and back, adding a 300-mile trek on foot to Zanskar, a remote province of Ladakh (part of Kashmir), and became the first European women to set foot there.
They called their adventure the Women’s Overland Himalayan Expedition, and were inspired and encouraged by their husbands, explorers and climbers themselves. Not wanting to be left behind “chained to the kitchen sink” when their husbands went on a trek, the women spent six months planning their own adventure. “It was just something we wanted to do,” said Anne matter-of-factly. “And it was a good idea.”
They had no vehicle, little money and no equipment, and didn’t even know each other very well at the start, but by working together they mustered enough support to fund their expedition, gaining sponsorship from, among other companies, Brooke Bond Tea (they mistakenly ordered enough tea “to keep a family going for 150 years,” according to Eve), Illustrated magazine, John Player & Son, and even the cosmetics company Max Factor. They persuaded Land Rover to sell them a demonstration model of a modified long-wheelbase all-terrain vehicle at a significant discount, and the British Ministry of Agriculture and Fisheries donated steak, vegetables, and berries preserved with the new technology of freeze-drying. A publicity campaign before they left captured the public’s attention, and headlines such as “Four Fed-up Wives” (one member had to pull out at the last minute, due to an unexpected pregnancy) helped raise the required funds.
They weren’t completely naïve, however. Anne was fluent in Urdu and Hindi and had previously trekked in Kashmir with her husband, their baby strapped to the back of a mule. Eve had spent two years motorbiking around Australia and New Zealand, and before that had learned to climb in Wales. Antonia was an experienced rock climber. They weren’t the kind of women who would let the fact that two of them couldn’t even drive when they started planning their trip deter them.
Eve and Antonia had not yet had children, but Anne left behind her three sons, aged 15, 14 and five. “I didn’t feel guilty,” she said. “Because Lester had gone off on expeditions, and this was my turn.”
Their five-month journey saw them battle illness, delays, inhospitable terrain, the effects of altitude, and terrible roads. They occasionally had to fend off unwelcome advances and negotiate with recalcitrant porters. Rest days were spent scrubbing their laundry on rocks in a nearby river, catching up on correspondence–communication was slow and difficult, and they could only occasionally get word to their husbands and families that they were safe–and checking their supplies. The entire trip was a considerable feat of organisation – after the women estimated what they would need to last the entire trip, several crates were shipped to Bombay and had to be retrieved from the docks after bureaucratic delays.
Due to the heat in Iran, they were often forced to drive at night, and were faced with constant enquiries as to where the ‘sahibs’ were and incredulity that the women were travelling on their own. They camped almost everywhere, and were welcomed by Land Rover’s agents in the European cities they travelled through, who praised the women for their maintenance of the vehicle, though they often expressed surprise that they successfully completed the trip.
Following an audience with the Indian Prime Minister Nehru, they were granted the rare privilege of travelling beyond the “inner line,” a boundary across India and Tibet that had been drawn up in the 1800s and beyond which no British subject might rely on government protection or rescue. No non-Indian had been granted permission to cross the inner line since before the Second World War, and it was one of the last regions on the planet untraveled by Europeans. They were also thought to be the first European women to cross Afghanistan unescorted.
Fording fast-flowing meltwater streams, sometimes up to their waists in water, they trekked through snow and ice, climbing an 18,700-foot peak and naming it Biri Giri (“Wives Peak”). As they travelled, they passed through villages untouched by European contact, meeting locals who had never seen things such as zippers or nylon climbing ropes. One elderly woman was terrified at the sight of her own face in one of their mirrors. “How privileged we were to witness and partake in societies that were virtually strangers to the modern world that we know,” said Antonia afterwards.
The women were bound by a common desire to prove themselves. “I’d been my father’s daughter, my husband’s wife,” said Eve. “But this time I was somebody on my own.” Despite living in close proximity and enduring many hardships, surviving dust and discomfort, the freezing cold and the intense heat, they claim never to have argued, preferring to thoroughly discuss issues and abide by a majority rule on decisions.
After their expedition, they carried on the spirit of adventure in their lives. Antonia wrote a book about their exploits, then eventually moved to Australia, where she and her husband established an Outward Bound school and the world’s first dedicated adventure travel company, also forging close links with Nepal. Eve had three children and went on to run an Outward Bound centre with her husband, and Anne helped her husband run an Outward Bound school in the Lake District.
The 1950s were a time when relations between British and other western travellers and the newly independent nations of South Asia were in their infancy, and the women’s fearless endeavour is all the more remarkable for it. Although trekking in remote lands is now far more common, it is unlikely that they would be able to make such a journey today, particularly through Afghanistan and Iran.
A film about their expedition, by Pulse Films and Britain’s Film4, is in development. Antonia Deacock’s book, No Purdah in Padam, is now out of print, but available from antiquarian booksellers.
“Zoom” – this playful kid-word, which once referred to fast cars, now signals a fast-approaching sea change in higher education, unfolding before our eyes thanks to COVID-19 and the need to move live university classes online. Zoom, for those who do not know, is the video meeting platform to which we faculty are all migrating our classes.
We all might have larger things to worry about in the next few weeks and months, like our loved ones, our colleagues, and our students becoming terribly ill. If this happens, then the nature of online education will hardly be our biggest problem.
But for the moment at least, as an academic who has been teaching at state universities for nearly thirty years, I am torn concerning the issue at hand. On the one hand, the students who signed up for my History of the Holocaust class this semester at the University of Florida did so because they were interested in the topic, some intensely so. As I try to move my lectures and discussion sessions from a classroom to a Zoom format, I want to provide something as close to the classroom experience as I can. On the other hand, I suspect, as do many of my colleagues, that university administrators and state legislators throughout the US will study this crash experiment in online education very closely one day. Are we academics showing them how they might replace us in the name of heightened efficiency?
We can agree that some of the efficiencies are indeed desirable. Those of us who remember putting books on reserve in the library for twenty-five students at a time will attest to this. A certain number of online classes, moreover, have existed for the last couple of decades, helping place-bound students and those students, younger and older, who work full time. But what happens when everything goes online, and all at once? If we discover that all classes can be delivered online from a remote location, then what is the point of having lecture halls, classrooms, or for that matter a diverse faculty of broad expertise and talents? The arguments that have been percolating in universities for the past decade will intensify overnight. Yes, there are faculty who have put in an immense amount of time in order to develop fine online experiences. But I have also seen half-baked efforts over the years that are rather disastrous, even within the oft-cited rationalizing context of the apocryphal 1970s professor (I never actually had one of these guys) who mumbled through his yellowing lecture notes.
My colleagues are proceeding cautiously. One colleague warned me not to record my lectures into the Zoom cloud, but to provide them live through Zoom. Everyone, I hear, is giving synchronous (“live” in Zoomspeak) as opposed to asynchronous (“recorded” in Zoomspeak) lectures. Anyone who has seen the infatuation with online learning in higher education administration over the past twenty years knows that this is hardly a paranoid reaction. The university would own the recorded content, as the work is done for the university in return for compensation. I actually recorded the first few lectures for my class. I needed to crawl with this technology before I could walk, and the students, I thought, would need time to adjust to the new reality of a full course load online as they simultaneously move from Gainesville back to their homes in Florida and elsewhere in the US. Nonetheless, like some of my colleagues, I am uneasy even with synchronous content. If I have learned anything from other people’s travails with Facebook and Twitter over the years, it is that nothing put online, even briefly, is truly protected or truly deleted.
If I know what faculty will say once the experiment is over, I am less sure about the students. I hope that they give a contextual yet roundly negative assessment – something like, “I understand the situation, but I can’t wait to get back to live classes.” But today’s students are online-surveyed to death, starting with online evaluations of faculty each semester that most do not complete. Worse, twenty-somethings believe that they can effectively multitask – what the rest of us would call diffusing one’s focus. The professor telling them at the start of class to turn off their cell phones and laptops is now the professor who depends on these devices to lecture or hold discussion sections from remote locations. Zoom actually has a feature that discloses a student’s level of attention—have they left their screen? Are they messaging? Are they watching Netflix? But I really don’t want to check that feature, and the fact that Zoom has it at all reveals the nature of the problem. Live classes promote a level of decorum that everyone in the classroom understands and from which everyone in the room benefits. But taking a class alone in one’s kitchen or bedroom? There is a reason that different rooms have different names, and in every language.
Finally, there is the quality of our own work, our pedagogical preparation. Like most of my colleagues in certain disciplines, I believe that each lecture and each discussion is the result of having worked at our craft over a period of years. What material shall we present to make a particular point about, say, Jewish resistance in the Warsaw ghetto, or about Reconstruction after the Civil War, or about Robespierre’s dictatorship? How shall we present it? What wording will we use? What visuals will we use? When will we leave the lectern for a stroll up the aisle? When will we pause and urge the students to think rather than just take notes? What questions will we pose to them when they discuss? How can we encourage them to interact and even debate with one another face-to-face-to-face, complete with expressions and gestures? How will we get them to understand that there are no black-and-white answers but only arguments, some thoughtful, some needing intensive development?
These questions and many others form the very stuff that makes live higher education on a university campus an experience for faculty and students that cannot be replicated online, at least through the Zoom technology with which I have become familiar. Even if all of the technology “works,” our broader efforts, squeezed through the portal between a faculty computer and those of the students, may come out distorted on both ends in ways that we cannot yet fully recognize. Zoom is fine technology – for conference calls. It enables business executives to talk to one another over long distances while presenting flowcharts and such. Academics can even have faculty meetings via Zoom, so that we ourselves can “multitask” to our heart’s content while discussing the minutiae of departmental by-laws.
But the real interaction that results in true learning? I am not sure at all. Zoom at its bandwidth-driven heart allows us to see one another and hear one another only to the point where we can talk to one another’s images (with cheesy optional backgrounds of the tropics or of outer space no less) and not speak to one another as human beings. We can see, but our vision is circumscribed. We can listen, but our hearing is muffled. We can connect, but our interaction is impeded.
For this, we all need to be, once again, in the same room.
Let’s hope it is soon.
If you teach history at any sort of educational institution, whether K–12 or higher ed, chances are your institution has “pivoted” to remote (online) instruction, if not closed campus entirely, as a response to the rapid spread of COVID-19 and the imperative for social distancing. Over the past week, a wave of these pivots and closures has left many of us scrambling for alternative means of engaging our students in an online and likely asynchronous setting. This is not the optimal way to teach and learn history, given that good online courses take more time and planning to develop than the handful of days most of us have been given.
So what do we do? How do we keep as many of the essential elements of our course as possible, even if those look different online? We want to help our students continue to be engaged with history; that is, to still feel present in the course and actively work with the course material. We want our students to do things like discuss, analyze, work with primary sources, and be able to communicate their interpretations to others.
Many historians have already been doing this kind of teaching online, so there may be no need for you to rediscover fire. Check out the #twitterstorians hashtag on Twitter to read and participate in ongoing conversations and resource sharing. Waitman Beorn, a senior lecturer in history at Northumbria University, has generously created a spreadsheet of teaching tools, digital history sites you can direct students to, digital tools for historical scholarship, digital humanities projects, and digital archives (use the tabs at the bottom of the spreadsheet to navigate between categories). H-Net has put together a repository for resources on teaching history online, which should soon grow into a thriving community. One of the most exciting cross-disciplinary products has been the “Keep Teaching” online community set up by the staff of Kansas State University’s Global Campus, an excellent virtual gathering spot where faculty, staff, and designers are sharing tips, tricks, techniques, and—most importantly—solidarity as we navigate this rapidly changing landscape together.
As historians and teachers, we pride ourselves on being able to engage students with the complexity and wonders of the past. Though our current circumstances are far different from what we anticipated, we have the research skills and critical faculties to help solve this new set of problems. Being analytical and discerning about the tools we use is a necessary part of that process, but so, too, is our discipline’s remarkable willingness to collaborate and share expertise. If you’re one of the thousands of us “moving online,” good luck, and see you on the internet!