To study history is to explore the powerful interaction between the past and the present—and to cultivate the knowledge, skills, and habits of mind essential for active global citizenship.

Why study history with us?  

A deeply engaging education… 

Thrive in small courses taught by expert and enthusiastic faculty. Investigate global topics and understand diverse perspectives. Learn to rigorously question, research, analyze, discuss, and advocate.

…that extends beyond the classroom…

Complete an internship, study abroad, or undertake independent research. Present at a conference, publish in a journal, or aspire to join an honor society. Experience a sense of community and pride through events, trips, and workshops.

…to prepare you for the future.

Enjoy individualized advising and career-oriented events aimed at post-graduation success. Get to know, and then join, Ursinus History alumni, who enjoy graduate degrees and successful careers in a wide range of fields. 

“The past is never dead. 
It’s not even past.” 

William Faulkner
Requiem for a Nun

Join us on Facebook
Welcome UC history students, alumni, and friends! We hope you’ll use this group to post questions, personal updates and photos, links of interest, history-related events, and internship/job opportunities. We are looking forward to hearing from you!


February 27th, 2019

  • Feb 27 2019 6:45pm
    David Perry: Colleges Have 99 Problems (But Political Correctness Ain’t One)
    Lenfest Theater, Kaleidoscope Performing Arts Center
    David M. Perry, a journalist, activist and undergraduate career advisor for history majors at the University of Minnesota, argues that there’s nothing wrong with political correctness, that it’s necessary to be careful about giving platforms to hate, and that we need to shift public attention to the real threats to higher education today.



From the History News Network 

  • As the Senate Judiciary Committee holds its confirmation hearings for William Barr, the current nominee for Attorney General of the United States, it is clear Barr needs to brush up on his constitutional law, as well as U.S. history.  

     

    During yesterday’s hearing, Senator Mazie Hirono (D-HI) asked Barr whether or not he believed birthright citizenship was guaranteed by the 14th Amendment. The question is important as the idea of birthright citizenship has come under increasing attack from the right in recent years. From the Republican primaries onward, Donald Trump has repeatedly asserted that birthright citizenship is unconstitutional, should be eliminated, and can be ended by executive order. While some on the right have balked at the last claim, Trump has tapped into an ever-present disdain among conservatives for birthright citizenship. 

     

    For his part, Barr seemingly tried to sidestep the politically divisive issue. However, his answer to Senator Hirono’s question was not only vague; it also suggested that the soon-to-be Attorney General doesn’t know basic constitutional law or history. 

     

    “I haven’t looked at that issue legally. That’s the kind of issue I would ask OLC [Office of Legal Counsel] to advise me on, as to whether it’s something that [is] appropriate for legislation. I don’t even know the answer to that,” Barr answered.

     

    There are a couple of worrying signs in this response. First, birthright citizenship is part of the 14th Amendment, meaning any change to it would require a constitutional amendment, not legislation. This is a basic tenet of constitutional law. The fact that Barr, who previously served as Attorney General under George H.W. Bush, thinks action can be taken against birthright citizenship through simple legislation shows one of two things: (1) he isn’t competent enough to understand basic constitutional processes in the United States, or (2) he was, rather insidiously, actually answering Senator Hirono’s question. 

     

    The latter point warrants a bit of explanation. Barr quite visibly looked like he was attempting to simply move past the question and not answer Senator Hirono. However, if Barr does in fact think that birthright citizenship can be dealt with through congressional legislation, then the only logical explanation for this, barring the above first option, is that he doesn’t believe the 14th Amendment guarantees this status. Whereas the first possibility of incompetence warrants a refresher in constitutional law, this second one demands a lesson in history. 

     

    History is quite clear on the intent of the 14th Amendment: it was meant to create birthright citizenship in the wake of emancipation. The 14th Amendment was created to guarantee that freed slaves, free blacks, and their posterity would forever be considered American citizens. Before its adoption, citizenship was a murky, ill-defined status. The Constitution mentions citizenship only a few times and does not provide a concrete definition of what a citizen is or who can be a citizen. To this day there is no legal definition of what citizenship actually is.

     

    From the Constitution’s ratification to the adoption of the 14th Amendment, black Americans had repeatedly claimed they were citizens because of their birth on American soil. Scholars such as Elizabeth Stordeur Pryor and Martha S. Jones have shown the myriad ways in which black Americans made claims on this status, only to be rebuffed in many cases. Citizenship could provide black Americans with a recognized spot in the nation’s political community. It represented hope for a formal claim to certain rights, such as suing in federal court. 

     

    This leads to the infamous 1857 Supreme Court decision Dred Scott v. Sandford, when Chief Justice Roger Taney crafted an opinion that quite consciously attacked the very possibility of black citizenship. Taney concluded that Dred Scott, an enslaved man, could not sue in federal court because he was not a citizen. He was not a citizen, in Taney’s words, because black people “are not included, and were not intended to be included, under the word ‘citizens’ in the Constitution… On the contrary, they were at that time considered as a subordinate and inferior class of beings who had been subjugated by the dominant race, and, whether emancipated or not, yet remained subject to their authority, and had no rights or privileges but such as those who held the power and the Government might choose to grant them.”

     

    Taney went out of his way to create a Supreme Court decision that attempted to put the legal nail in the coffin of black citizenship. The 14th Amendment was, quite consciously, crafted to upend Dred Scott, which was still the law of the land after the Civil War.  Thus when conservatives rail against birthright citizenship and claim that it is not, in fact, a part of the Constitution, they are ignoring America’s long history of slavery, discrimination, and segregation. 

     

    When the soon-to-be Attorney General William Barr states that he thinks legislation can be used to make changes to birthright citizenship, it is because he does not believe the 14th Amendment guarantees it. And when he and other conservatives espouse such an opinion, it is because they are once again willfully ignoring American slavery and its legacy of racism. This is, admittedly, not surprising. Barr also expressed the opinion during his confirmation hearing that the justice system “overall” treats black and white Americans equally, despite mountains of evidence proving otherwise. 

     

    While the attack on birthright citizenship from the right deserves attention and should be fought at every turn, the underlying historical erasure of slavery and discrimination also requires our attention. This willful amnesia is why the potential next Attorney General of the United States can, in one day, ignore so many aspects of America’s fraught history with race. And it is why we all must be on guard. 

     

  • In June 1945, Winston Churchill, who had just overseen the British contribution to victory in the Second World War, was voted out of office in one of the most unexpected election outcomes of the twentieth century. Remarkably, the very soldiers who had enabled victory on the battlefield were central to the routing of the Conservative Party at the ballot box; they, and their social networks, voted overwhelmingly for Labour.

     Churchill was a man of grand vision, of big ideas, and there was no greater mission in his life than the defeat of Nazi Germany and Imperial Japan and the retention of Britain’s special place in the world. The necessity to defeat the Axis and maintain the Empire, however, blinded him to a key dynamic: he fundamentally undervalued and misunderstood the central aspiration of his citizen soldiers in a second world war; they desired immediate and profound social change. For the ordinary citizen soldier, his participation in the war was, at heart, about building a better post-war world at home – a world with better housing, health care provision and jobs.

    Churchill’s inability to fully empathize with his citizen soldiers was to have profound implications for his great mission. Churchill, and others in charge of strategy, were convinced that ordinary English, Welsh, Scots, Irish, Africans, Australians, Canadians, Indians, New Zealanders and South Africans would fight with the required determination and intensity to guarantee victory and save the Empire in its hour of need. This was the key assumption, or understanding, that drove British strategy during much of the first half of the war. It was accepted that in a newly raised citizen army men would be inadequately trained and might not be provisioned with the theoretically ideal scale or quality of materiel. But, it was expected, in this great crisis, the “great crisis of Empire,” that in spite of these drawbacks they would rise to the challenge. 

    As we know, they did not always meet these lofty ambitions. The defeats in France in 1940 and at Singapore and in the desert in 1942 put a nail in the Imperial coffin. In the end, Britain did not even achieve her initial reason for going to war: the restoration of a free, independent Poland. As the war dragged on, the British and Commonwealth Armies played a proportionally ever-smaller role in fighting the Axis. On 1 September 1944, with the Normandy campaign completed, General Bernard Montgomery reverted to commanding an army group of roughly fifteen divisions. His erstwhile American subordinate, General Omar Bradley, on the other hand, rose to command close to fifty in what clearly signaled a changing of the guard. By the end of March 1945, of the 4 million uniformed men under the command of General Dwight D. Eisenhower in North-West Europe, over 2.5 million were American, with fewer than 900,000 British and about 180,000 Canadian. The British and Commonwealth Armies advanced across Europe into an imperial retreat. 

    The result was that the post-war empire was a “pale shadow” of its former self. The cohesion of its constituent parts had been irretrievably damaged. Much of its wealth had been lost or redistributed. Nowhere was this more apparent than in the Far East. The loss of Singapore was not only a serious military defeat, but it was also a blow to British prestige. Barely two years after the war, and only five years after Churchill had uttered his famous words that he had “not become the King’s first minister in order to preside over the liquidation of the British Empire,” Britain’s Imperial presence in India had ended. The loss of the subcontinent removed three-quarters of King George VI’s subjects overnight, reducing Britain to a second-rate power.

    The history of Britain and the Commonwealth cannot, therefore, be understood outside of the context of the performance of British and Commonwealth soldiers in the Second World War. The Empire failed not only because of economic decline, or a greater desire for self-determination among its constituent peoples, but also because it failed to fully mobilize its subjects and citizens for a second great world war. Africans, Asians and even the citizens of Britain and the Dominions demonstrated, at times, an unwillingness to commit themselves fully to a cause or a polity that they believed did not adequately represent their ideals or best interests. This manifested in morale problems on the battlefield, which, in turn, contributed to extremely high rates of sickness, battle exhaustion, desertion, absence without leave and surrender in key campaigns. When the human element failed, the Empire failed.

    Today, the Anglo-Saxon world is enmeshed in another era of what some might term big ideas, although thankfully not in a world war. In the United Kingdom, instead of imperial unity we talk about Brexit, and in the United States, President Trump’s vision for America, to make it Great Again, has captured the imagination of a significant cohort of the population. Whether one agrees with these movements or not, they are certainly radical; however, in a similar vein to Churchill’s grand vision of the 1940s, they are vulnerable to ignoring the needs of the many in preference to the visions of the few. In Britain, hardly a week goes by without the announcement of a new set of figures outlining the collapse of basic public services and amenities. Violent crime is up, hospital waiting times have risen, and child poverty is increasing, to mention just a few social metrics. It seems evident to many that leaving Europe will not address the fundamental issues faced by the people who voted for Brexit. Trump’s presidency, the evidence suggests, will harm the welfare of those who were most likely to vote for him; tax cuts for the wealthy and building a wall along the Mexican border will not bring back a lost prosperity to middle America. 

    Trump and the Brexiteers might heed lessons from the Second World War. The American President, Franklin D. Roosevelt, believed that the efforts the state demanded in order to meet the global cataclysm of the Second World War required legitimacy, accorded by citizens who were invested materially and ideologically in that same state. In this sense, he intimately linked the questions of social change and social justice with the performance of American armies on the battlefield. By comparison, in his obsession with defeating the Axis, the British Prime Minister lost sight of the goals and ambitions of the ordinary man, the smallest cog in the “machinery of strategy,” but a vital one all the same. For the citizen soldier, the war was not an end in itself; it was a step on the road toward a greater aspiration: political and social reform. To succeed, big ideas had to take account of the little stories of ordinary people. When they do not, they are very likely to fail. 


  • For years conservative broadcasters and the right-leaning print media have denounced liberal control of American higher education. This assertion is based on the large number of academic instructors who belong to the Democratic Party and on actions taken at some colleges to promote a sense of inclusiveness and toleration to a point that, some say, discourages free speech. Both of these observations about academe are true to a limited extent, but they indicate that college faculty and administrators are conservative, not liberal, at least in the philosophical sense. And for intellectuals, liberal and conservative social and political values have traditionally rested on philosophical views of human nature. 

    Liberal programs initially emerged from the belief that human beings are inherently good or are attracted to the good. Seventeenth century religious liberals like the Quakers used the term Inner Light, or the presence of the Creator in all humans, to explain this, while later liberal theorists like Henry David Thoreau used the term conscience. On the other hand, early classical conservatives rooted their policies in the idea that people are by nature either evil or selfish. Religious conservatives like the Puritans believed that Original Sin left all with a powerful inclination to evil, whereas secularly oriented conservatives like Adam Smith, the father of capitalism, asserted that innate “self-love” drives human action.

    Although we often lose sight of the philosophical origins of liberal and conservative policy, today’s public agendas reflect those roots. Conservatives have traditionally supported powerful militaries, believing that strong nations selfishly prey on weak ones, while liberals have downplayed the need for military spending and substituted investment in education and social programs in order to help individuals maximize their latent moral and intellectual capabilities. Similarly, conservatives advocated criminal justice systems characterized by strict laws and harsh punishment to control people’s evil or selfish impulses, while liberals favored systems that focus on rehabilitation to revitalize latent moral sensibilities. Conservatives traditionally opposed welfare spending, believing its beneficiaries would live off the labor of society’s productive members, while liberals believed such investments help those who, often through no fault of their own, find themselves lacking the skills and knowledge needed to succeed. Though the philosophical roots of these policies are frequently forgotten today, these agendas continue to be embraced by liberals and conservatives.

    College professors are philosophical conservatives. This is a product of their daily experiences, and it shapes their professional behaviors. First, the realm of experience: senior members of the profession are intimately familiar with the old excuse for failure to complete an assignment, “my dog ate my essay,” and its modern replacement, “my computer ate my essay.” Years ago, missed exams were blamed on faulty alarm clocks; today that responsibility has been shifted to dying iPhone batteries. Term papers cut and pasted from Wikipedia are an innovation; plagiarism is not. A philosophically liberal view of humanity is difficult to sustain amidst such behaviors.

    The clearest manifestation of philosophical conservatism in the teaching profession is seen in tests and grades. Testing is based in part on the assumption that individuals will not work unless confronted by the negative consequences of failure, an outlook that is steeped in philosophical conservatism. (For historians, who spend inordinate amounts of time examining wars and economic depressions that often resulted from greed and avarice, the academic discipline itself encourages a philosophically conservative outlook.)

    How then can academicians be accused of being liberal? As noted, this is partly because the majority of faculty are registered Democrats. Counterintuitively, this reflects a philosophically conservative, not a liberal, outlook, especially on all-important economic policy. During the debate over the tax bill last year, Republicans continued their traditional support for supply-side or trickle-down economics by proposing to lower taxes on high earners and corporations, whereas Democrats continued to advocate demand-side economics by proposing to shift the tax burden from the large middle class to the small upper class and to provide tax credits for workers. Supply-side economics is based on the assumption that reducing the tax burden on the rich will lead them to invest in plant and equipment, which will create jobs, the advantages of which will trickle down to workers in the form of wages and benefits.

    This is a philosophically liberal notion. It assumes that people and corporations will invest in plant and equipment even when wages are stagnant, leaving many people without the income needed to purchase the goods new factories will produce. Demand-side economics, on the other hand, is partially based on the philosophically conservative notion that no rich person or corporation will build a plant if the masses lack the income needed to buy the product. In support of this position today, demand-siders note that many corporations are using most of the surplus capital from last year’s tax cuts to buy back stock instead of investing in capital assets, because wealth and income are more concentrated in the hands of the few than at any time in recent history, which minimizes the purchasing power of the many. In supporting tax schemes and other economic policies that put money in the pockets of the many, demand-siders embrace the conservative idea that such programs will stimulate selfishly based investment spending by corporations seeking to tap the rising wealth of the majority of consumers. Moreover, demand-side policies will unleash the selfishly oriented entrepreneurial inclinations of working people by giving them the wherewithal to open small businesses that spur economic growth.

    College professors and administrators are also attracted to Democratic economic policy because they are aware of the successes of demand-side economics. There has not been a major depression since the New Deal, though there have certainly been recessions. This is because that movement largely achieved its goal of shifting the weight of the government from supporting a supply-side to a demand-side approach to economics by institutionalizing minimum wages, overtime pay for work beyond forty hours a week, unemployment insurance, Social Security, and strong labor unions. This was not part of some left-wing socialist agenda. The goal was to put money into the hands of the many and thus incentivize the capital class to invest in new productive capacity, and more importantly to maintain demand and spending when the economy slows.

    Academicians generally, and historians especially, realize that prior to the New Deal, depressions (aka panics) occurred every ten to twenty years and were exacerbated by wage cuts, which reduced demand and led to further layoffs and wage reductions. Minimum wages and union contracts, which guarantee a wage for the life of the contract, have slowed the downward economic spiral that turned recessions into depressions by limiting wage cuts, while Social Security and unemployment insurance also slowed economic downturns by helping sustain demand as the economy slackened. Though the contemporary right often argues that New Deal programs sought to create a liberal safety net for the poor, academics realize those programs were less attempts to help individuals directly and more attempts to jump-start a stalled economy and to keep it humming, in part by incentivizing the capital class to continue to invest in productive capacity.

    Conservatives also label academics as liberals because of their attempts to encourage inclusiveness and discourage what some term hateful speech on campuses. To the extent that this is true, and it often seems exaggerated, it is rooted in philosophical conservatism. Academics realize that language has great symbolic power, and symbols have a tendency to generate emotional as opposed to rational responses which colleges and universities rightly scorn. Academics also recognize that negative symbolism, including language, has served to dehumanize groups, and dehumanization has often led to discrimination and persecution. Only philosophical conservatives can have so little faith in human reason and goodness as to believe that emotionally laden language has the power to perpetuate injustice.

    Ironically, the right, in both supporting supply-side economics and tacitly accepting ethnically insensitive and sexist language, is embracing policies rooted in liberal, not conservative, thought, while the university – in favoring the opposite – adheres to a philosophically conservative outlook. Indeed, a traditional conservative would argue that the appeal of supply-side economics and insensitive speech lies in their ability to protect the wealth of the rich and to sustain the increasingly fragile sense of dignity of the middle class.

     

  • Chris von Saltza, Olympic champion (photo: Harry Pot, Dutch National Archives, CC BY-SA 3.0 NL)

     

    Today the #MeToo movement puts the spotlight on young women in college who have been abused without much recourse. Most media attention exposes flagrant violations by men, from date rape on campus to gender harassment by executives in the workplace. Looking back to the 1960s, however, another pervasive abuse was the benign neglect practiced by colleges and universities. Women students were treated inequitably in campus activities, especially in intercollegiate sports. Graphic examples can help us remember and learn from past practices.

     Between August 26th and September 11th in 1960 Chris von Saltza stood on the victory podium at the Olympic Games in Rome four times to receive swimming medals, a total of three gold and one silver. She then entered Stanford University and graduated in 1965 with a bachelor’s degree in Asian history, gaining prominence in her long career as a computer scientist. After the 1960 Olympics Chris never had an opportunity to swim competitively for a team again. Stanford, after all, did not offer varsity athletics teams for women. What was a young woman to do? There was no appeal. For better or worse, this was the way things were in American colleges back then. 

    In contrast to Chris von Saltza’s experience, over a half-century later another high school senior, American swimmer Katie Ledecky, won five medals at the 2016 Olympics held in Rio de Janeiro. She, too, was about to graduate from high school and would enroll at Stanford as a freshman in the fall of 2016. The historic difference was that she had a full athletic grant-in-aid plus a year-round national and international schedule of training and competition along with prospects for substantial income from endorsements and a professional athletics career. 

    By 2018 there were pinnacles of success that indicate how much has changed since the 1960s. Katie Ledecky has excelled as a student and works as a research assistant for a Stanford psychology professor. Ms. Ledecky also led her Stanford women’s swimming team to two National Collegiate Athletic Association championships and recently signed a $7 million professional swimming contract.

    Connecting the dots to explain the comparisons and contrasts between these two Olympic champion swimmers who were students at Stanford requires reconstructing the condition of college sports in the decade from 1960 to 1969. Chris von Saltza’s lack of collegiate opportunities at Stanford was not an isolated case. Following World War II, American women had triumphed in the Olympic Games every four years - but with little base provided by high school or college sports. 

    At the 1964 Olympic Games in Tokyo, the women’s swimming star was Donna DeVarona, who won two gold medals. In 1964, she was featured on the covers of both Time and Life magazines and named the outstanding woman athlete of the year. Despite her achievements, her competitive swimming career was over, as she and other women athletes had few if any options for formal training and participation in intercollegiate sports or elsewhere.

    Young women from the U. S. won gold medals in numerous Olympic sports. A good example was Wilma Rudolph, who won three gold medals in track and field at the 1960 Olympics in Rome. Rudolph benefitted from one of the few college track and field programs for women in the U. S., coached by Ed Temple. Most of their competition was at Amateur Athletic Union (AAU) meets, with no conference or national college championship meets available. Furthermore, at historically black Tennessee State University, funding and facilities were lean. 

    The limits on women’s sports are revealed in college yearbooks of the era. A coeducational university campus yearbook devoted about fifty pages to men’s sports, especially football and basketball. In contrast, women’s athletics typically received three pages of coverage. In team pictures, the uniforms were often gym-class gear. The playing format was for one college to sponsor a “play day” in which five to ten colleges within driving distance gathered to hold tournaments in several sports at once. Softball, field hockey, basketball, and lacrosse were foremost.

    Coaches, usually women, received minimal pay. Most held staff appointments in physical education, where they taught activity classes. The women’s gym had few, if any, bleachers for spectators. Coaches of the women’s teams usually lined the playing fields with chalk, mopped and swept the gymnasium floors, and gathered soiled towels to send to the laundry. One indispensable piece of equipment for a woman coach was a station wagon, as players and coaches piled in with equipment to drive to nearby colleges for games and tournaments. The women’s athletic activities often had their own director – yet another example of “separate but unequal” in intercollegiate athletics and all student activities. There was a perverse equality of sorts: all women students were required to pay the same mandatory athletics fee as male students, even though the bulk of it went to subsidize varsity teams that excluded women.

    Despite the lack of intercollegiate sports for women in the 1960s, there were some signs of life. One was the creation of alliances that eventually led to the chartering of a national organization, the Association for Intercollegiate Athletics for Women (AIAW), in 1971, with over 280 colleges as members. The first action the Division for Girls and Women’s Sports (DGWS) took was to establish the Commission on Intercollegiate Athletics for Women (CIAW) to assume responsibility for women’s intercollegiate sports and championships. 

    One heroic figure associated with women’s sports to emerge in the decade was Donna Lopiano, who graduated with a degree in physical education from Southern Connecticut State University in 1968. She excelled in sports as a girl and was the top player picked in the Stamford, Connecticut Little League local draft. However, she was forbidden to play baseball with the boys because of the gender restrictions in the league’s by-laws. Lopiano started playing women’s softball at the age of sixteen. After college, she was an assistant athletics director at Brooklyn College, coached basketball, volleyball, and softball, and then took on leadership roles in national women’s sports associations. Eventually she became Director of Women’s Athletics at the University of Texas, along with appointments in sports policy and programs. She also was one of the most honored athletes of her era. Her experiences, including exclusion from teams, shaped her dynamic leadership over several decades. 

    The bittersweet experiences of women athletes such as Donna Lopiano, Chris von Saltza, Wilma Rudolph, and Donna DeVarona show that although the 1960s have been celebrated as a period of concern for equity and social justice, colleges showed scant concern for women as student-athletes. One conventional analysis is that the passage of Title IX in 1972 ushered in a new era for women in scholastic and college sports. That was an unexpected development. In congressional deliberations around 1970, neither advocates nor opponents of Title IX mentioned college sports. All sides were surprised when the issue surfaced in 1972. The National Collegiate Athletic Association opposed inclusion of women’s sports – until it made an unexpected reversal in 1978. Many colleges were slow to comply with the letter or spirit of Title IX. As late as 1997 the burden was on women as student-athletes to file lawsuits against their own colleges, pitting them against athletics directors, presidents, boards, and university legal counsel.

    Title IX eventually demonstrated how federal legislation could prompt universities to provide programs and services accessible to women that they would not have provided if left to their own volition. It has required contentious oversight of resources for student-athlete financial aid, training facilities, coaching salaries, and other parts of a competitive athletics program. It includes television coverage of women’s championships in numerous sports. Equity and opportunity across all college activities, ranging from sports to fields of study along with hiring and promotion, remain uneven. The caution is that the experience of a Katie Ledecky at Stanford, including her professional swimming contract, is exceptional. Sixty years after Chris von Saltza won her four Olympic medals and entered Stanford, the inclusion of women as full citizens on the American campus is still a work in progress.

  • The separation of children from parents who entered the USA illegally is seen as an aberrant and inhumane deviation from American tenderness for the family. This orphaning, as a matter of policy, is not “who we are,” as many liberals and some conservatives despairingly say.

    But in many ways it is, and indeed, has long been. For the state, in the United States and earlier in Britain, has been a formidable creator of orphans. Perhaps this helps to explain the ambiguity in the attitude to the orphan: great display is made of theoretical pity and piety, but the way such children have been actually treated has frequently been punitive and repressive. Whether in orphanages, asylums, schools or other receptacles for those guilty of losing their parents, the extent of abuse by those in whose power such unfortunates have fallen is only now becoming clear.

    Whatever charitable sentiments are kindled by the plight of orphans, such compassion has rarely prevented countries from making yet more of them by waging war, or by failing to prevent it, in those places – Syria and Yemen – where the indiscriminate harvesting of human life yields its sorry crop of abandoned children.

    But it has not required war for governments, charities and even private individuals to rob children of their parents. From the first time a ship sailed from London to Virginia in the early 17th century taking “a hundred children out of the multitude that swarm about the place” until the last forced child migrants from Britain to Australia in 1967, thousands of young people were orphaned, not only of parents but of all ties of kinship, country and culture. The orphans sent from Britain to Australia alone numbered some 180,000.

    A long association of derelict and orphan boys with the sea was formalized in a statute of 1703, which ordered that “all lewd and disorderly Man Servants and every such Person and Persons that are deemed and adjudged Rogues, Vagabonds and Sturdy beggars…shall be and are hereby directed to be taken up, sent, conducted and conveyed to Her Majesty’s Service at Sea.” Magistrates and overseers of the poor were empowered to apprentice to marine service “any boy or boys who is, are or shall be, of the age of ten and upwards or whose parents are or shall be chargeable to the parish or who beg for alms.”

    Transportation removed 50,000 felons – among them many juveniles – to the American colonies, and in the process robbed many more children of at least one parent. In the 1740s, recruiting agents in Aberdeen sowed fear by luring children into service in the plantations. Peter Williamson and his companions, shipped to Virginia in 1743, were sold for sixteen pounds each. In 1789 the Lady Juliana, the first convict ship carrying only transported women and girls, set sail for Australia.

    The historical fate of a majority of orphans is unknown. Many were taken in by kinsfolk or neighbors, and while many must have been fostered out of duty or affection, others were certainly used as cheap labor, for whom their foster-parents were accountable to no one.

    It was not until the industrial era that the policy of removing children from their parents in the interests of society became widespread. The Poor Law Amendment Act permitted parishes to raise money to send adults abroad. One of the Assistant Commissioners claimed that “workhouse children had few ties to their land, and such as there were could be broken only to their profit.” In 1848, Lord Salisbury also advocated emigration for slum children.

    Annie McPherson, Quaker and reformer, was the first private individual to organize mercy migrations, the rescue of children from their “gin-soaked mothers and violent fathers.” She set up a program of emigration in 1869. Dr Barnardo used Annie McPherson’s scheme, before implementing his own in 1882. He referred to “philanthropic abduction” as the rationale behind this disposal of the offspring of misery. 

    At the same time, “orphan trains” carried children from New York and Boston to the open plains of the West, under the auspices of the Children’s Aid Society, established in 1853 by Charles Loring Brace. Sometimes children were “ordered” in advance; others were chosen as they left the train or were paraded in the playhouses of the small towns, where farmers could assess their strength and willingness to work. These “little laborers” responded to a shortage of workers on farms. Between 1854 and 1929 a quarter of a million children were dispatched in this way.

    In Britain, what were referred to as “John Bull’s surplus children” were promised a future of open air, freedom and healthy work. Some were undoubtedly well cared for; but others were exposed to exploitation, life in outhouses and barns, freezing in winter, stifling in summer, isolation and deprivation of all affection. The proponents of such schemes argued that this would provide the children with a fresh start in life; but the cost of a one-way journey to Canada was far less than their maintenance by payers of the poor-rate. 

    Joanna Penglase has called babies and infants taken from their mothers’ care for “moral” reasons, or simply because it was regarded as socially impossible for a woman to raise a child on her own, “orphans of the living.”

    In 2010, the then British Prime Minister Gordon Brown apologized for the removal of children from their parents under the Fairbridge scheme, which took them to Australia, a practice that continued into the late 1960s. In 2008 Kevin Rudd, then Prime Minister of Australia, apologized to indigenous families whose children had for generations been removed. In 2013 the Irish Taoiseach apologized for the abuse of orphans and illegitimate children by the Magdalene laundries from 1910 until 1970. 

    It is in this context that former Attorney General Jeff Sessions declared zero tolerance of illegal immigration in April 2018. All such people would be prosecuted. Families were broken up, because detention centers were “unsuitable” for children. In June, after harrowing scenes of forcible separations, Trump signed an executive order that families should be kept together. All children under five were to be reunited with their families within 14 days, and those over five within 30 days.

    It might have been thought that the creation of orphans by government had been consigned to history. Was it amnesia or dementia that made the administration, in its determination to be tough on illegal migration, separate parents from their children in its retreat to a tradition of punitive indifference to the most vulnerable? 

    And then, what of the orphans of addiction, of mass incarceration, the abductions of the offspring of the marriage of technology with commerce, orphans of the gun-fetish and the multiple social estrangements created by social media and the engines of fantasy which lure children from their parents, protectors and guardians? The orphan-makers have never been busier in this era of wealth and progress.

“The great force of history comes from the fact that we carry it within us, are unconsciously controlled by it in many ways, and history is literally present in all that we do.”

James Baldwin
The Price of the Ticket