There’s always something new going on in the History Department.
I am glad to see that structural critiques of racism are finally gaining some traction in university curricula and professional training in the United States. It’s been a long wait.
In the fall of 1972, while legal scholars were contemporaneously developing what would become Critical Race Theory, I was co-teaching a course on radical criminology to some six hundred undergraduates at Berkeley. Our curriculum included sections on “crimes of racism,” as well as crimes of imperialism, exploitation, sexism, and survival. Students read about the trial of the Panther 21 and institutional racism in prisons, and watched documentaries on the bombing of Hiroshima and the murder of Fred Hampton. I taught another class on criminal justice racism with David Du Bois, W. E. B. Du Bois’ stepson, who was then editor of the Black Panther Party’s newspaper.
Not surprisingly, I was not a criminology professor much longer. Informants in my classes reported to the FBI that I had “continually and consistently displayed anti-American ideas” and that I was “one of the first individuals to wear extremely long hair.” Berkeley’s Chancellor, in a confidential memo to a university committee considering my promotion to tenured professor, complained that my political activism had contributed to the “decline in morale” of the local police department and that my scholarship smacked of “Orthodox Marxism of the 1930s.” Moreover, he added, “I do believe some of his colleagues would be somewhat relieved if he weren’t around.” By the mid-1970s, the teaching of radical theories of racism was marginalized in most universities. In Berkeley, the University closed down the School of Criminology.
It would take another forty years for advocates of Critical Race Theory to again carve out a niche in academia. And again it is under relentless attack. There’s a lot of noise these days about how it’s promoting racial discord and generally wrecking the country. Sufficiently demonized to have its own acronym, CRT has replaced affirmative action and political correctness as the Right’s target in the culture wars. “It weaponizes diversity,” says Utah Senator Mike Lee, and turns “college campuses into grievance pageants and loose Orwellian mobs.” Opportunistically jumping on the populist bandwagon, Donald Trump accuses CRT of “indoctrinating America’s school children with some of the most toxic and anti-American themes ever conceived.” Its “ridiculous left-wing dogma” promotes “divisive messages [that] verge on psychological abuse.” How dangerous is it, you may ask? “It is a program of national suicide,” replies Trump.
No doubt Lee and Trump have in mind initiatives recently taken by universities, such as UC Berkeley’s promise to become an “anti-racist campus” that draws upon “antiracist frameworks” in teaching and research. With its progressive reputation for social justice activism, Berkeley is always an easy target for the Right, though I’ve not yet encountered any multicultural hordes running rampant on my campus.
For most of its history, from its founding in the 1870s through the 1990s, long before it took tentative steps to incorporate critical race theory into its curriculum, Berkeley was a bastion of uncritical race theory. The self-righteous indignation of leading politicians and literati was resolutely muted when academia nourished scientific racism.
Nobody yelled about disrespect for our ancestors when Berkeley’s second president Daniel Coit Gilman (1872-1875) praised California’s founding fathers for making “this wilderness rejoice and blossom like a rose,” thus erasing thousands of years of Indigenous history prior to genocide. The biblically inspired catchphrase fiat lux, a regional expression of manifest destiny, is repeatedly embedded in the University’s iconography, from its inscription on an arch honoring philanthropist Phoebe Hearst after her death in 1919, to its use as a sign-off in the Chancellor’s messaging today. Testimony to the staying power of fiat lux is former Senator and Republican pundit Rick Santorum’s recent claim to Young America’s Foundation that “there isn’t much Native American culture in American Culture…. We birthed a nation from nothing — I mean, there was nothing here.”
Nobody raised the specter of divisiveness when the university appointed John LeConte as its third president (1876-1881) and his brother Joseph as professor of geology. Both were unreconstructed racists from a slave-owning family in Georgia and owners of a weapons factory during the Civil War. There was only a slim hope, Joseph LeConte argued in 1892 in his capacity as president of the American Association for the Advancement of Science, for civilizing the “plastic, docile, imitative Negro.” As for the American Indian, “extermination is unavoidable.”
Nobody registered their concern about cancel culture when the university appointed as its ninth president (1919-1923) David Prescott Barrows, whose experience as General Superintendent of Education in the occupied Philippines left him convinced that Filipinos had an “intrinsic inability for self-governance” and that “the white, or European race, is above all others the great historical race.”
Nobody seemed worried about the dangers of hyphenated Americanism when Berkeley’s most distinguished historian, Herbert Bolton, promoted the global significance of Spain’s colonial ventures, commending its “frontier genius” and efforts to “discipline the Indian in the rudiments of civilized life.” His 1922 textbook for children helped to solidify the myth that Spain’s deadly mission system taught “wild Indians the Christian faith and how to do useful things in the white man’s way.” Nor did university officials express any concern about the possibility of quid pro quo when the King of Spain made Bolton a Commander of the Royal Order of Isabella the Catholic in 1925.
Nobody charged two of Berkeley’s leading academics – zoologist Samuel Holmes and medical scientist Herbert Evans – with un-Americanism when they joined an organization that publicly praised Hitler’s regime for forcibly sterilizing 200,000 women during its first three years in power. They remained loyal members of the rightwing Human Betterment Foundation from its founding in 1929 until its dissolution in 1942, nine years after the Nazi government enacted the Law for the Prevention of Genetically Diseased Offspring (Sterilization Law) and seven years after the publication of the first American edition of Mein Kampf. “There is a good deal of discussion in Germany over the curtailment of the increase of Poles, Jews, and other elements not in the good graces of the present régime,” wrote Holmes in 1937, two years after the Nazis enacted the Nuremberg Laws. “What the Germans may accomplish – and many of their best minds are giving serious thought to the problem – remains to be seen.”
Holmes put his best mind to work in the United States by lobbying for a quota on Mexican immigrants (“the least assimilable of foreign stocks”), advocating financial incentives for white female students and faculty wives to procreate, and helping to make California the country’s leader in forced sterilization. You can hear strong echoes of this reactionary angst in the Great Replacement Theory that is today a foundational tenet of rightwing ideology.
The university did more than tolerate Holmes’ fascist-inspired ideology: it also funded and published his extensive Bibliography of Eugenics (1924) and his racist screed The Negro’s Struggle for Survival, in which the author commended the Ku Klux Klan for promoting the health and welfare of survivors of slavery in the South, no irony intended. To Holmes, “the more primitive race” (Native People) had failed to “adjust itself to a more highly developed social order” and thus “disappeared” because they “remained, in large part, a sort of foreign element in our population.”
Nobody railed about cultural exclusion when Phoebe Hearst bankrolled archaeological expeditions to Egypt, Peru, Mexico, and California that returned to Berkeley with the spoils of mummies, funerary artifacts, and human remains, aptly described by Hazel Carby as “fetishes of conquest.” During the 20th century, university officials welcomed the plunder of thousands of Native graves and the accumulation of the country’s second largest collection of human bones and crania in the name of “salvage archaeology.”
Nobody decried race-based theories when Edward Gifford, the university’s expert on anthropometry, promoted essentialist quackery about how “the living aborigines of California fall into two main groups, one low-faced, the other high-faced.” His research focused on the measurement of racial differences as indicated by the length and breadth of heads, noses, and ears; the degree of slope in forehead; the axis of nostrils; and the degree to which “the fleshy lower margin of the septum is exposed.” The university rewarded Gifford by publishing his research, putting him in charge of public education about anthropology, promoting him to full professor, and naming a room in his honor after his death in 1959.
In the 1970s and 1980s, while activists all around the country condemned psychologist Arthur Jensen’s arguments about the hereditarian roots of intelligence, the university protected his freedom to peddle updated eugenics while it justified closing down the School of Criminology on the grounds that we had abandoned our “professional mission.” I don’t remember any administrators suggesting that Jensen’s scholarship represented an attack on American values.
“How Much Can We Boost IQ and Scholastic Achievement?” asked Jensen in his influential 1969 Harvard Educational Review article. Not much, he replied, and not worth social investment in trying to do so. For the next twenty-five years as professor of education at Berkeley, he stuck to his reactionary views about the fixed, genetic nature of intelligence. Moreover, his research boosted the popularity of Richard Herrnstein and Charles Murray’s neoconservative proposal in The Bell Curve that “it is time for America once again to try living with inequality, as life is lived.”
The logic of eugenics that was embedded in the work of the LeConte brothers, David Barrows, Herbert Bolton, Samuel Holmes, Edward Gifford, and Arthur Jensen has had devastating consequences for millions of people: poor women, disproportionately Latina, Native, and African American, who were forcibly sterilized; Mexican immigrants who were restricted from entering the United States on racial grounds; families of color who were tracked into segregated housing and substandard education; people with physical and intellectual disabilities who were denied a right to full lives; and Native People who were represented in popular literature and educational texts as biologically destined to extinction.
The ideas that settler colonialism and capitalism represented a progressive civilization bringing light to the darkness and that social inequality is rooted in biological predisposition shaped public education and everyday common sense for decades. They live on, for example, in Trump’s depiction of “American carnage” as inherent in “crime-infested” communities of color, and in police use of the pseudo-diagnosis “excited delirium” to explain why force is necessary to restrain and kill African American men.
In the twilight of my career, I was invited in 2014 to return as a resident scholar to Berkeley, the site of my banishment in the 1970s. It’s good to be back where I started my climb up the academic ladder and to witness the resurrection of efforts to infuse the curriculum with critically informed ideas about race. It’s the least we can do to make up for generations of what Inuit artist David Ruben Piqtoukun calls an “education in forgetting.”
I hope, though, that the university’s anti-racism initiatives also keep in mind one of Critical Race Theory’s central tenets: that racism is systemic and structural, that the past bleeds into the present. To move forward we need to understand how and why universities such as Berkeley hosted and emboldened pro-racist initiatives for so long.
As I recently walked through the Berkeley campus – past buildings, statues, and plaques that celebrate an American president who authorized the appropriation of Native lands to finance the University of California; that dignify entrepreneurs who built their fortunes from the plunder of war and conquest; and that enshrine academics who polished their careers by making white supremacy respectable – I was reminded of Yurok Judge Abby Abinanti’s admonition that “the hardest mistakes to correct are those that are ingrained.”
Gold medal winner Tommie Smith holds one Puma shoe aloft on the podium. Smith and fellow American sprinter John Carlos engineered a guerrilla product placement in protest of International Olympic Committee rules that forced athletes to choose between penury and taking endorsement payments on the sly.
At the 1968 Olympics in Mexico City, photographers captured the moment when Tommie Smith and John Carlos staged their protests. What most people don’t know is that there were not one but two protests: one at the start of the medal presentation ceremony, and the other at the end. The one that has become iconic shows the two athletes, barefoot, heads bowed, each man raising a single gloved fist in the air while the national anthem was played. The second photograph was taken at the start of the ceremony. Standing on the podium, before bending down to receive their medals, each man raised a gloved fist, this time with head held high, and in his other hand was a single Puma shoe. This was the most audacious example of product placement at an Olympic Games up to that point.
“They [the Puma shoes] were as important as the black glove and the black sock,” explained Smith afterwards. “I have them [sic] on the stand, because they helped me get there during the race and long before.” After the men had held up their shoes, they placed them next to where they stood, and then accepted their medals. The placement of the shoes was quite deliberate, according to Carlos. “If you look at the way the shoes were placed on the victory stand, Mr. Smith took his shoes and placed them behind him. I took my shoes [sic] and put it where everybody could clearly see the Puma logo.” They had purposely placed the shoes at right angles to each other so that the logo would be picked up regardless of where cameras were positioned.
During those Olympics, Puma and Adidas representatives plied athletes with thousands of dollars to wear their shoes, making medal winners into running and jumping billboards. Best of all for the shoe companies, they did not have to pay the television networks a cent for prime-time exposure of their products.
In the past the IOC and athletic associations had enforced their amateur rules with Robespierrian rigor. Yet in Mexico City they did nothing. Why?
There is no question that they knew about the inducements. Robert Lipsyte revealed in the New York Times that “American athletes had accepted money for exploiting athletic equipment they used in Olympic competition.” Syndicated columnist Red Smith was horrified that the purity of amateur sport was ruined by Puma and Adidas, which he accused of having “poisoned the well,” with athletes receiving up to $7,000. How many athletes were involved? According to the Tribune de Lausanne, more than two hundred athletes had received money from either Puma or Adidas. The Los Angeles Times’s estimate was higher, reporting that “virtually every country in the Olympics are [sic] guilty of violations in the equipment scandal.”
After the Games, once newspapers exposed the extent of the payments, Horst Dassler, who represented Adidas, went to see Avery Brundage, the IOC president, and confessed that Adidas had paid athletes to wear its shoes. “Things happened that should not have happened, but we would have lost nearly the whole American team, and instead of 85% of the medal winners we would have had 30%,” explained Dassler. His point was that if he had not paid athletes who normally wore Adidas shoes they would have defected to Puma. With remarkable incuriosity, Brundage did not demand the names of athletes who accepted payments.
Then, on his way home after the Olympics, Dassler stopped off in New York where he caught up with Ollan Cassell, track and field administrator of the Amateur Athletic Union (AAU). Again, Horst blamed Puma for starting the bidding war and he wanted the AAU and other sports organizations to write new rules to put an end to it. “We’ll both go bankrupt if this continues,” he explained. As a sign of his good faith, without being asked, he offered to give Cassell the names of all the American athletes that Adidas had paid: names, amounts, dates and even signed receipts. Cassell responded: “Horst, you can’t ask me to do this when there are other athletes in the world who also took your money, perhaps more than the American athletes.” He told Horst that he did not want to see his list.
Had Cassell accepted the list of names, the AAU would have been forced to investigate and possibly suspend quite a few US medal winners. Not only would this have been a disaster for American prestige, but the Olympic Games might not have survived a scandal of that magnitude, as bribes were also paid to athletes in other countries.
Having come clean with the IOC and AAU, Dassler had cleverly made them complicit, which provided him with confidence that no action would be taken against the shoe company.
During the 1980s, the IOC ran up the white flag, and athletes were able to endorse products provided that they did not cut across IOC sponsorship deals, which were helping the Olympic movement avoid bankruptcy.
Tommie Smith and John Carlos are remembered for their salute protest against racism, and their shoe protest was indelibly linked to racism. Amateur rules meant that athletes had to survive on a $2 per diem, and while they attended the Olympics they lost income from jobs at home. Both Smith and Carlos had young families, who would have teetered on the brink of penury if not for money slipped to them by Puma; most African American athletes were in the same position.
Wu Zetian (624-705) was the only legitimate female sovereign of China. Illustration c. 1690
There’s a reason that women who managed to gain power throughout history were often legendary beauties: they had to be beautiful to gain status, and once they were in power, their looks and choices helped set the fashion trends and beauty conventions they were then measured against. Egypt’s Cleopatra was a ruler who became known for her beauty, as was the Mughal Empire’s Nur Jahan. In ancient China, a woman known as Empress Wu, or Wu Zetian, rose from being an emperor’s concubine to become the only woman to rule China as emperor in her own name. She did this in part by leveraging the image of beauty and power she created. And in the 1500s in England, Queen Elizabeth I used makeup to create an image of a virginal, beautiful woman to help keep her power all to herself. These two rulers in particular, though from different cultures and time periods, illustrate how makeup can help a woman project an image of strength and sovereignty, and also how her political enemies can use rumors and stereotypes about makeup to undermine her power.
Wu used her image as a pretty teenage girl to first put herself in the position to be near the politically powerful as a concubine to the emperor. Wu was not from a notable or aristocratic family; she was the daughter of a merchant who became a government official in the Tang dynasty. She was first a concubine under Emperor Taizong, who ruled from 626 to 649, and then formed a relationship with, and became a concubine to, Taizong’s son, Emperor Gaozong, who ruled from 649 to 683. Eventually, Gaozong removed his empress wife and installed Wu as his empress instead, a rare and risky move for an emperor at the time.
Wu was known to have been particularly gifted with makeup. Society women in China in the Tang Dynasty like Wu used white lead for face paint and cinnabar or vermilion to make rouge for their cheeks and lips. They would also pluck out or shave their eyebrows and draw on patches of a green shadow high on the forehead, called moth eyebrows. In the early seventh century, this style, which resembled the wings of the insect, was so popular that officials supplied a daily ration of twenty-seven quarts of pigment to the emperor’s concubines.
Over decades, Wu cultivated her image by wearing lavish cosmetics to signal her rising status and increase her visibility in public. She gradually moved beyond her culture’s conventions for women, which emphasized domesticity and patriarchal rule. As Wu gained status in the palace and increased her participation in government affairs, certain male officials argued that her social climbing made her untrustworthy when she tried to gain power for herself. This faction vocalized its opposition to her and worked against her as she eventually became emperor. This belief that women who use makeup are untrustworthy, vain, and obsessed with status persists today.
There are no paintings or images of Wu from her own time, but paintings of her from later periods illustrate her with thin painted eyebrows, with three lines painted below her eyes and three dots in the center of her forehead. What remains from her time are stories of the glamorous version of herself she created. Her natural beauty may have been striking, but it was a studied combination of makeup and charisma that drew people to her and created an image of strength, youth, and wealth. She used elaborate makeup to hide wrinkles and flaws in her skin and emphasize her status as emperor as she gained power in her old age. And when she was younger, Wu and other concubines used their looks to try to attract attention from the emperor—the man who largely controlled their lives. Beauty could be a strategy for women to gain a better life, and Wu used it in part to help control her future.
In the 1500s in England, Queen Elizabeth I used a white lead makeup known as ceruse to portray a white face with rosy cheeks to indicate perfect, virginal femininity. This trend inspired women at court to also sport white faces with pink cheeks and red hair like the queen. Looking a certain way helped women gain favor with those at court, receive material wealth, or marry into a higher class; ignoring social customs like using certain makeup could lead to being ostracized and falling out of favor.
Elizabeth never married and had no children, so there was no one to automatically succeed her as king or queen. Elizabeth felt that with the question of who would succeed her unresolved, she couldn’t risk allowing her subjects to see any evidence of her advancing age. As she aged, she became more sensitive about wrinkles and sagging skin, using ceruse to cover wrinkles or spots and controlling what paintings of herself were allowed in the public eye. Any image of her had to show white, smooth skin.
Elizabeth’s advisors claimed they were concerned about stability for the country if there were no set plan for succession, so they encouraged her to consider marriage and having children. But as queen, Elizabeth knew she had more power as a single woman. If she were to get married, she would have to submit to her husband, according to the predominant religious and cultural beliefs in England at the time. So to look like she wanted to get married, Elizabeth maintained relationships with eligible bachelors throughout Europe. She’d begin marriage negotiations and then she’d renege at the last minute. Her beauty and virginal image were key to these negotiations. In this way, Elizabeth used every tool she had to support the idea of marriage while maintaining the image of her sexual purity as a single woman, all toward the goal of holding onto her power.
Elizabeth’s beauty and power were so entwined that years after she died, people critiqued the monarchy by criticizing her looks. In the late nineteenth century, writers and painters began to depict Elizabeth as a vain spinster, ridiculously trying to hold on to her youth by using cosmetics and wigs. To chip away at Elizabeth’s legacy, these men claimed she hadn’t fit the image she had tried to present.
Empress Wu and Queen Elizabeth illustrate how women in power can be undermined by their political enemies talking about their makeup and reinforcing stereotypes that belittle women who try to gain political power. Both of these women used makeup to create an image of strength, but they were also often portrayed as ridiculous because they used makeup to create that image. This pattern is often repeated throughout history, harming the women who try to lead, with repercussions for modern women in everyday life.
Four centuries of western dominance culminated in undisputed American preeminence as global power and influence passed from Pax Britannica to Pax Americana in the post-World War II era. The US has steered the postwar liberal international system, one distinct from previous European colonial empires, more or less through persuasion grounded in concepts of universal rights and dignity and the freedom to pursue one’s own economic and political path.
Many of these precepts are echoes of the US founding document. As in its constitution, paradoxes lurk in the shadows. Contemporary observers of the world would argue that a new shift is taking place and that the continuing American delusion of grandeur and exceptionalism lacks an understanding of the world in the 21st century. The United States’ greatest problems are internal rather than threats from the East.
Evidence shows that the US lags behind other industrialized nations on important indicators of social well-being, including life expectancy at birth, access to healthcare and quality primary and secondary education, equitable distribution of societal resources, social and racial harmony, and democratic representation and processes. And yet, the US steadfastly wields its moral-political gavel to split nations and people into categories of good and evil, democratic and authoritarian, as part of a dichotomous weltanschauung. Make America Great Again is a subconscious, albeit ill-conceived, realization that America requires a rejuvenation from within to achieve its aspired glory.
Splitting, an immature defense mechanism, is the tendency to perceive oneself or others through a dichotomized lens, as all-good or all-bad, and it is classically exhibited by borderline personalities. Although an innate component of healthy human development, splitting becomes pathological when a predominance of frustrating experiences in early life disrupts normal representations of good and bad. It is a binary view that leaves no room for a multi-dimensional world of gray areas that both coalesce and compete.
In 1852, in his historic speech documenting his refusal to commemorate the 4th of July, Frederick Douglass declared that America lingered in the impressionable stage of childhood, with the potential for the young nation’s history to be either of stately majesty or a sad tale of departed glory. More than a century later, US policymakers’ drive to “split” complex global actors and situations remains deeply rooted in its own history and domestic cultural binary divides: enslaved and freed, Blacks and Whites, conservatives and liberals, Mitt Romney’s makers and takers, and Donald Trump’s urban and suburban dwellers. W.E.B. Du Bois theorized double consciousness as splits within the American psyche, ingrained within the color line, the quintessential American problem of the 20th century.
The simplistic duality of the world is also a naïve and dangerous continuation of George W. Bush’s choice between good and evil, with us or against us. Such binary classifications of phenomena are propagated and reinforced within all spheres of our society, and feed downstream, such that many are unwilling to further interrogate these ideas.
Recent history shows that prominent proponents of such bisected worldviews, including Robert McNamara, Donald Rumsfeld, and George W. Bush, had fundamental flaws in their conceptualizations of the world, which led to immense human suffering. In the aftermath of the disastrous Trump presidency, Washington is on course to rejuvenate a similar trajectory.
The Biden Administration presents this split as a choice between democracy and authoritarianism. Washington seeks to magnify perceived external threats, rather than to rebuild our institutions and to invest in the American people. As the US share of the global economy declines and China’s share increases, the weight of American anxiety has shifted to a rising “bad” China.
Similarly, Chinese propaganda against the US and the west is also re-emerging. At the Communist Party’s centennial celebration, Xi Jinping warned that China would not be bullied by foreign adversaries and that a wall of steel built from the flesh and blood of 1.4 billion people would protect China. And yet, a collision course should not be inevitable.
Bernie Sanders cautions about the emerging Cold War between the US and China, noting that conflicts between democracy and authoritarianism occur not just between states, but within sovereign territorial spaces. In the US, this bent toward authoritarianism was on full display during the Trump Administration. It continues with Senate Republicans’ curtailment of voting rights and the conservative-leaning Supreme Court’s agreement with these measures.
Nicholas Kristof rightly notes that China may represent a greater threat to its own internal system than to any other nation. China’s stringent population control, persecution of Uyghur and other minority groups, repression of political dissent, restriction of religious freedoms, and other human rights abuses should be condemned; however, our guiding agenda should be self-corrective measures at home. Reducing poverty, increasing access to healthcare and quality education, and advancing racial harmony and political representation should take center stage in the American national debate.
China’s overly surveilled and restrictive system has no chance of supplanting open and free societies. Around the world, people, and especially the young, continue to demand their rights, freedom, equality, and participation. They will not trade the privileges of the digital revolution and the benefits of an open society for one that is oppressive.
As Americans, we are comfortable with children of affluent families receiving a high-quality education, while other people’s children are semi-educated. Policies that rely on local taxes as the primary funding for public schools and limit federal funding have contributed to the underfunding of education. In the long run, these policies inflict great harm upon us all, as American children are less educated than their peers and less able to successfully compete in the global workforce.
In the US, less than one-half (48.3%) of adults have completed tertiary education, as compared with more than one-half (56.7%) of adults in Russia. While the US continues with segregationist policies, other nations, including adversaries, prioritize education as a right and engage in reforms to standardize education systems for all their citizens, including preschool education, quality of teaching, resources and services, and conditions of learning environments, irrespective of socio-economic and other statuses.
Similarly, Americans should not be content with high infant mortality and maternal death rates, even when they are disproportionately concentrated among ethnic minorities and not within their own homogeneous groups. The infant mortality rate for Blacks is more than double the rate for Whites, and 2 out of 3 pregnancy-related deaths are considered to be preventable.
Deeply rooted social blights lead to poor physical and mental health, which partly manifests in the unrelenting violence of our streets. National efforts should be directed to improving the well-being of the most vulnerable Americans, and not to magnifying the threat from China.
With the un-triumphant declaration that America is back, President Biden is intent on America’s resumption of its position as a global hegemon. However, ascendancy does not require more sophisticated military hardware or a simplistic split of the world that pushes us further into conflict. More than rhetoric, ascendancy now requires investment in the American people. Strengthening our institutions and dismantling racial and social hierarchies are its prerequisites.
US Marines occupying Haiti, c. 1919
Haiti is frequently in the news, Ireland occasionally so. Haiti is one of the world’s disaster areas. It has been wracked by hurricanes, earthquakes, and political instability. By a number of measures, Haiti is the poorest country in the Western Hemisphere and one of the poorest in the world. Sixty percent of its population lives in poverty and one-fourth are trapped in extreme poverty. Ireland, on the other hand, is one of the world’s success stories. From the mid-1990s until the late 2000s, the Celtic Tiger experienced phenomenal economic growth. Although the economy has tailed off since then, Ireland continues to have one of the highest GDP per capita rates in the world.
The divergent paths of Haiti and Ireland are rooted in the history of 19th century European colonialism, European and American racism, and the very different alternatives offered to the people of the former colonies over the past two hundred-plus years. In both cases, relatively small islands could not support growing populations. The difference was that the white Irish were allowed, or even forced by landowners, to leave, while Black Haitians remained trapped on the Caribbean island they share with the Dominican Republic.
Great Britain’s first colony was misruled into the 20th century, when Ireland finally achieved independence after a series of wars against British occupation. Even today, six northern Irish counties remain under British control, though this may finally end because of demographic changes and the stupidity of Brexit.
In six of the seven years between 1845 and 1852 the Irish potato crop failed because of a fungus, and millions of people starved, died from disease, or were forced to emigrate. With a commitment to laissez-faire capitalist policies and contempt for Irish lives, the British government let Ireland weather a natural disaster with minimal aid, transforming crop failure into famine. Between 1841 and 1880 the population of Ireland plummeted from over 8 million people to under 4 million. The estimated population of Ireland today is 5.2 million people.
John Mitchel, an Irish patriot who challenged British control over Ireland, charged: “The Almighty sent the potato blight, but the English created the famine.” What Mitchel meant was that crop failure was transformed into famine because the British failed to act to save Irish lives. The crop failure was an act of nature, but the famine was an act of man.
While the Irish diaspora was experienced as a tragedy, it may have been what eventually saved Ireland. From 1815 to the start of the Great Irish Famine, between 800,000 and 1 million Irish emigrated to the United States and Canada. The Great Irish Famine accelerated the Irish diaspora. Almost 2 million Irish men and women arrived in the United States between 1845 and 1855. By the end of the 19th century, as many people who were born in Ireland lived outside the country as continued to live there. Today, people of Irish ancestry make up more than 10% of the population of the United States, Canada, and Great Britain, where, despite intense discrimination, they became the industrial workforce at the end of the 19th and beginning of the 20th centuries, and over 30% of the population of Australia, where they were a major part of the European settler population.
Haiti has a very different history. In 1791, a revolt broke out in the French Caribbean colony of St. Domingue, which was located on the western third of the island of Hispaniola. One of the wealthiest colonies in the Americas, St. Domingue, known as the Pearl of the Antilles, produced half of all the sugar and coffee exported to Europe and the United States. It owed its wealth to the work of brutally treated enslaved Africans. The average life expectancy for an enslaved African in St. Domingue was only 21 years, and it was lower for newly arrived Africans transported to the island during the trans-Atlantic slave trade.
The rebellion started when free Blacks in Haiti were not accorded rights promised by the French Revolution’s Declaration of the Rights of Man. Because enslaved Africans and free Blacks outnumbered whites by more than 10 to 1, the revolt quickly spread. Toussaint L’Ouverture, the leading general of the Black revolt, became the undisputed leader of the entire island. Although Toussaint was captured and exiled to France, where he died in prison, the Republic of Haiti became an independent country in 1804, the second independent nation in the Western Hemisphere and the first independent republic governed by people of African ancestry.
The western countries, including the European colonial powers and the United States, would not accept a free and independent Black nation. In the United States, Southern planters feared that the slave rebellion in Haiti could spread and destroy their way of life. The virus of freedom was essentially quarantined. Europe and the United States placed a trade embargo on the new country and France demanded that Haiti repay French planters the modern equivalent of over $20 billion for their lost human property. That debt, partly financed at a profit by a Citibank predecessor bank, was not repaid until after World War II. The United States would not formally recognize Haitian independence for six decades. Closed out of the global sugar market, Haitians turned to subsistence farming on rocky and mountainous terrain that eventually led to deforestation and soil erosion.
In the twentieth century, United States troops repeatedly invaded and occupied Haiti to protect American economic investment, force payments to France, and support dictatorial regimes. The U.S. military has “policed” Haiti at least since 1888, when war boats were sent to “persuade” the government to be friendlier to U.S. shipping interests. Troops were dispatched to protect American agricultural producers in Haiti in 1891 and 1914. In 1910 the American State Department supported efforts by the National City Bank of New York, now known as Citibank, to take control over the Banque Nationale d’Haïti, Haiti’s only commercial bank and its national treasury. In 1915 United States Marines invaded Haiti, secured control over the national capital and government, and occupied and ruled the country until 1934. U.S. economic interests controlled the economy of Haiti until 1947. In 1994, the United States sent 20,000 troops to occupy Haiti, and a small military contingent was sent there again in 2004.
Since 2005, over 150,000 Mexicans have legally immigrated to the United States each year, along with approximately 70,000 Chinese, 60,000 each from India, the Dominican Republic, and Cuba, and fewer than 20,000 Haitians. While Cuban refugees continue to be welcomed, the U.S. Coast Guard turns back people trying to sail from Haiti to the United States.
The only thing that can save Haiti as a viable nation is what “saved” Ireland – mass emigration and a drastic reduction in population to bring the number of people into balance with the environment that must support them. But while the colonial power that controlled Ireland, Great Britain, wanted the Irish to emigrate for its own ends, the United States, the imperialist power that has dominated Haiti for the last century, does not want the Haitians to emigrate because it does not want them to move here. Both the Trump and Biden administrations have supported humanitarian aid and autocratic rule in Haiti, keeping Haiti on life support so they did not have to address the underlying problems.
In 2010, following a massive earthquake, President Barack Obama delivered a powerful statement to the people of Haiti, recognizing: “Few in the world have endured the hardships that you have known. Long before this tragedy, daily life itself was often a bitter struggle.” He promised, “you will not be forsaken; you will not be forgotten. In this, your hour of greatest need, America stands with you. The world stands with you.”
However, what President Obama did not say was that Haiti was in this precarious position because the United States and the rest of the world not only refused to help the Haitian people in the past but also established the conditions that made Haiti so vulnerable.