There’s always something new going on in the History Department.
In the traditional view of the Pilgrims, one told in many 20th century textbooks, the Pilgrims came to Massachusetts on the Mayflower to make a break with old England and start a unique society.
In her new book, The World of Plymouth Plantation, Dr. Carla Gardina Pestana corrects the “false impression that Plymouth was disconnected from the world that gave rise to it.”
In fact, the Plymouth settlers were “deeply connected” to many other places through England’s transatlantic trading network. They envisioned creating an English “plantation,” and carried with them many traditional English customs. The term “Pilgrims” was bestowed upon them in the 1700s in order to distinguish them from other settlers, particularly the Puritans of Boston.
Dr. Pestana, the Joyce Appleby Endowed Chair of America in the World in UCLA’s History Department, is the author of four previous books including The English Atlantic in an Age of Revolution, 1640-1661.
Pestana discussed her work with the History News Network.
Q. While we may think of Thanksgiving as a uniquely American holiday, you describe the gathering that took place in the fall of 1621 as one that “invoked the seasonal festivals of English rural culture.” The event was not labeled “thanksgiving” until the 1800s. What shaped our modern conception?
Edward Winslow supplied a brief description of this gathering, the only mention we have of it. His account makes clear that the event marked the first harvest with a meal enjoyed in a community gathering. The martial display that kicked off the event may have attracted the 90 warriors whom Winslow mentions as showing up, but in the end, they supplied venison and took part in the feast. To mark a harvest with a community festival was a standard celebration in any English village, and, from what Winslow describes, that sort of event was their model. It occurred probably in September, a more appropriate time for a harvest gathering in New England, and it represented a chance to rest and celebrate after the hard work of bringing in the crops. The women who prepared the food did not experience a rest from their labors, however, as remains true of most American Thanksgiving celebrations to this day.
The label “Thanksgiving” got added later, and it invoked a different colonial practice, getting it slightly wrong too. In colonial New England, in later years, days of Thanksgiving occurred irregularly, prompted by specific occurrences. Local leaders called for such a day when they wanted to thank God for some good turn of events. A successful harvest could prompt such a day, but so did many other occasions: good news about events in England, success in war, a possible calamity (such as a fire) averted.
Thanksgiving days had a counterpart in days of humiliation. When events went against the community, leaders called for such an occasion, in which fasting and prayer brought everyone together, to beg God for forgiveness. They saw communal suffering as judgement for past sins and successes as a reward or a mercy. In this providential world view—in which God was thought to take a regular interest in the community’s affairs—New Englanders (and Christians elsewhere) responded to what they perceived to be the message behind each major change in their fortunes.
Thanksgiving as a national holiday denoted, at the time of its 19th century beginnings, hard work, sacrifice for the common good, and what we would call family values. Abraham Lincoln wanted to emphasize these values at the height of the Civil War, and they remain central to the holiday’s image. If today we have gotten far from a harvest festival and farther still from a one-off moment to thank God for a specific event, family remains at its heart.
Q. You write that the Plymouth settlers considered themselves “planters” intent upon “transplanting the society they knew as well as the household work regimens that made it possible.” For many Americans, “plantation” invokes images of southern aristocrats lording over enslaved workers.
The idea of plantation today means a large-scale agricultural undertaking that used enslaved laborers and (usually) produced a single crop like sugar, tobacco, or cotton. Planters were individual landowners who privately owned both the plantation itself and the slaves who worked it. Plantation is thus associated with the worst abuses of racial slavery, and activists today are understandably interested in eliminating that language.
Yet those who named Plymouth a plantation (or for that matter the recently renamed Rhode Island) were not thinking of slavery. Rather, they understood plantation to mean transplanting English people, their culture and ways of organizing society, into another location. It was, in other words, a synonym for colony or settlement, using a then-common meaning for the term. Ireland had plantations dating from the 16th century, created when lands seized from Irish landowners were turned over to English (or later Scottish) migrants who supported the English conquest of the island. In both the Irish case and the North American one, plantations introduced what we would today call settler colonialism, with the goal of displacing native peoples and replacing them with transplants.
That idea of plantation contained a certain arrogance, as those who did the planting clearly believed they had some right to occupy the lands of others. While the practice was hardly blameless, it does not deserve censure on the grounds that modern critics of the term assume.
Q. The Mayflower Compact is often cited as the “birth of American democracy,” but you believe the men who wrote it in 1620 had another purpose in mind.
The men who signed the compact (which they did not call the Mayflower Compact) did so because they found themselves unexpectedly in a place with no government. They had hoped to land in what was then northern Virginia (around modern-day Delaware). Had they been there, they would have come under the jurisdiction of the Virginia Company. In the unhappy event of landing in New England, they decided they needed to curb an impulse toward anarchy that some leaders feared would overtake them. The document they drafted placed the group firmly under the authority of the English monarch, King James I. They agreed to create their own laws and ordinances, and pledged everyone’s “Submission and Obedience.”
The signers considered it a stop-gap measure until they could arrange a better basis for government, in particular a charter from the king. They never attained that charter, and they had to fall back on a ragged collection of inadequate documents, including a land grant from a defunct company and their shipboard agreement. Far from celebrating democracy, the agreement aimed to suppress individual impulses, granting more duties than rights. The document opens with allegiance to the king and ends with promising that they would submit and obey because those were its main concerns.
Q. You are skeptical that the “Plymouth Rock” memorialized in contemporary Plymouth, Massachusetts was ever used as a docking point for the Mayflower passengers.
Plymouth Rock was identified as the possible landing site more than a century after 1620, when a group of interested townsmen took an elderly man out to the beach to question him about what he knew. If anyone had earlier worried about the precise landing site or discussed it, we have no evidence that was the case. There is little reason to believe that the elderly man (who could not have been an eyewitness) knew anything about the matter. Once he spoke, however, the townsmen dubbed the rock the landing site.
In a surreal twist, they then dragged this random boulder around, breaking it in the process, and installed it in various locations in town. It came to its final resting place, in the classical temple by the highway, much later, and that site was not where it originally lay. So, the boulder that is supposed to denote the landing site is not at its original location, nor is it likely to have had anything directly to do with the landing in any case.
A boulder as a landing site—often depicted as a sort of natural dock in various later images—makes no sense. Wooden boats steer away from boulders, not toward them. In addition, we have descriptions of the passengers wading through the surf, dampening their clothing, when they disembarked. Clearly the sailors knew better than to deposit them on boulders, and the idea that they had done so may have been fueled by drink. I envision those who dragged the boulder around and broke it as being the worse for having imbibed, in a sort of early version of a fraternity prank.
The Black Lives Matter movement precipitated a national discussion over who and what should be honored with statues, monuments and place names. Recently the New York City Parks Department announced that ten “park spaces” would be named in “honor of the Black experience in New York City.” The newly named park spaces recognize national figures with New York ties, including Langston Hughes, James Baldwin, Elston Howard, and Ella Fitzgerald, as well as local community leaders.
There is a statue of Mary E. Lease in Wichita, Kansas, erected in 2001 by a Kansas women’s club she founded in 1886. As someone active in the push to take down statues and rename places, I have been thinking a lot about that statue and Lease’s role in history.
As a high school United States history teacher, I introduced students to Lease as a leader of the late 19th century Populist movement that introduced many democratic reforms into American politics, as a dramatic speaker who had the ability to mobilize a mass movement, and as an example of a powerful woman whose contributions to American history are often ignored.
Mary E. Lease was a late 19th century Populist leader who campaigned for the rights of farmers, workers and women. She was a leader of both the women’s suffrage and prohibition of alcohol movements and was active in the labor movement as a member of the Knights of Labor. As a Populist, Lease led a crusade against the power of the corporate monopolies and banks that dominated the American and global economy. She is credited with advising Kansas farmers to “raise less corn and more hell.”
Lease is featured as a representative of Populism in The Americans (Houghton-Mifflin, 2005), one of the standard texts used in United States high schools. In the chapter on “Farmers and the Populist Movement” there is a photograph of Lease, a mini-biography, and a quote from one of her speeches (425). She is described as a “spellbinding” speaker (427).
Unfortunately, Lease was also an anti-Semite who used anti-Jewish tropes and direct anti-Semitic references to stir up her audience. In her standard stump speech, Lease warned,
this is a nation of inconsistencies. The Puritans fleeing from oppression became oppressors. We fought England for our liberty and put chains on four million of blacks … Wall Street owns the country. It is no longer a government of the people, by the people, and for the people, but a government of Wall Street, by Wall Street, and for Wall Street.
She also accused the Vice-President of the United States of being a “London banker,” a veiled reference to supposed ties with the Rothschild banking firm. This speech is quoted in Howard Zinn’s A People’s History of the United States, but the reference to “London banker” is left out.
In another speech, this one to the Women’s Christian Temperance Union, Lease claimed “the government, at the bid of Wall Street, repudiated its contracts with the people … in the interest of Shylock,” the stereotypical Jewish villain in Shakespeare’s The Merchant of Venice.
On August 11, 1896, the New York World reported on Lease’s “seductive oratory” at a Cooper Union “free silver mass-meeting” in Manhattan. Lease “attacked the entire social system” and “every reference to wealth and its owners received with wild delight.”
During her speech “Every mention of gold or wealth was greeted with shouts and jeers, and the names of Whitney and Cleveland, of Vanderbilt and Rothschild were hailed with hisses and cat-calls.” Whitney was William C. Whitney, financier, coal baron and political insider. Cleveland was President Grover Cleveland. Vanderbilt was Cornelius Vanderbilt II, who inherited his family’s control of the New York Central Railroad. Rothschild was the European Rothschild banking house, reportedly the wealthiest in world history. In the World article, Lease did not identify the Rothschilds as Jews. However, the New York Times also reported on Lease’s speech at the Populist rally. The Times article quotes Lease: “We are paying tribute to the Rothschilds of England, who are but the agents of the Jews.” The speech was “received with close attention, and was heartily applauded.”
In her book, The Problem of Civilization (1895), Lease wrote
“Our commercial system would be sadly disturbed if our government granted a monopoly of gallons, bushels and yards to a company of Jews. Then the man who conducts a wholesale or retail business would be compelled to hire a bushel, gallon or a yardstick from the Hebrew before waiting upon his impatient customers. Hunger, haste and pressing necessity alike would have to wait the pleasure and interest of the Jew.”
She continued: “[W]hy should money be conceded the quality of intrinsic value? Because tradition and superstition have invested it with an artificial nobility, similar to that of the Divine right of kings; because gold and silver are commodities which can be manipulated in their value by speculators and pirates of the financial world, and because, in the hands of the Rothschilds and their imitators the measure of values may be expanded or contracted to suit their interest.”
Later, Lease described the Rothschilds as the “hook-nose harpies of the House of Heber” and she accused President Cleveland, “Grover the First” of being “the agent of Jewish bankers.”
I am a descendant of the people Mary E. Lease called “hook-nose harpies of the House of Heber.” However, my Eastern European ancestors, like those of most American Jews, were impoverished village peddlers and factory workers, not global bankers. There is no question that Mary E. Lease believed in and used anti-Semitic tropes. As the Black Lives Matter movement challenges statues and place names celebrating the Confederacy and racists, what do we do about Mary E. Lease?
In an October 1962 article in the American Historical Review (68/1, pp. 76-80), Norman Pollack of Michigan State University challenged “The Myth of Populist Anti-Semitism.” Based on his research in archival collections of the papers of Henry Demarest Lloyd, William Jennings Bryan, and Ignatius Donnelly, Pollack argued that incidents of Populist anti-Semitism were “infinitesimal.” He dismissed accusations leveled at Lloyd and Bryan and described Donnelly’s anti-Semitism as “ambivalent and complex.” According to Pollack, Populist newspapers were principally concerned with the Rothschilds as international bankers, not as Jews. Mary E. Lease’s use of anti-Semitic references in her speeches raises serious doubts about Pollack’s conclusions.
Unfortunately, anti-Semitism has a powerful and disturbing history in the United States and, through the speeches of Mary E. Lease, a connection with late 19th century Populism. Three decades after the peak of the Populist movement, quotas were imposed that severely limited Jewish immigration to the United States. Prior to World War II, President Franklin D. Roosevelt, fearing political repercussions if he were seen as aiding European Jewry, refused to allow Jewish refugees fleeing Nazi Germany to enter the United States and during the war would not order U.S. forces to bomb rail lines transporting millions of Jews to their deaths in Nazi concentration camps. In 2017, at a Unite the Right rally in Charlottesville, Virginia, marchers who were later defended by President Trump, chanted anti-Semitic slogans.
“The Devil and Daniel Webster” is a short story by Stephen Vincent Benét first published in the Saturday Evening Post in October 1936. It became an Academy Award-winning movie in 1941.
Daniel Webster, the famed Senator from Massachusetts, is the lawyer for a New Hampshire farmer accused of selling his soul to the Devil. The Devil acts as prosecutor and produces a jury of assorted villains that he expects will convict the farmer and honor the contract. But Webster is very persuasive, and the Devil is incredulous when the farmer is acquitted.
Mary E. Lease, populist and feminist, sold her soul to an anti-Semitic Devil. If I were on her jury, I would vote to let her statue remain but insist that a plaque, and all textbook references, include the disturbing fact of her anti-Semitism.
David P. Barash is professor of psychology emeritus at the University of Washington. His latest book is Threats: Intimidation and its Discontents (2020, Oxford University Press).
USAF General Jack D. Ripper (Sterling Hayden) and RAF Group Captain Lionel Mandrake (Peter Sellers) overcome the credibility gap in Stanley Kubrick’s Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb (1964)
An ancient dilemma faced by leaders throughout history has been how to prevent — deter — attacks on their realm from outside (invasions) or from inside (rebellions). And an ancient answer, albeit not the only one, has been to threaten that any such perpetrators will be punished. The most prominent alternative has been attempted deterrence by denial, which has experienced mixed success, from the Great Wall of China to the Maginot Line of 20th century France.
Deterrence by punishment gets particular attention, not only because it underlies nuclear deterrence (there being no effective deterrence by denial), but because its consequences have been so horrific. Some of the most riveting accounts of murderous cruelty come down to us from Bronze Age kings, who famously flayed their opponents and made mountains out of human skulls, often as a “lesson” to would-be opponents.
A more recent but nonetheless hair-raising statement of this perspective came from Sir John Fisher, First Sea Lord, Admiral of the Fleet, and widely regarded as the most important British naval figure after Horatio Nelson. Speaking as the British naval delegate to the First Hague Peace Conference in 1899, Fisher emphasized that deterrence by punishment is likely to be effective in proportion as the threatener has a fearsome reputation:
“If you rub it in both at home and abroad that you are ready for instant war … and intend to be first in and hit your enemy in the belly and kick him when he is down and boil your prisoners in oil (if you take any), and torture his women and children, then people will keep clear of you.”
Connoisseurs of deterrence by punishment have long struggled with how to make the concept work, challenged not only by a desire to be something less than incorrigibly bloodthirsty, but also — especially in the Nuclear Age — deeply worried about how to make an incredible threat credible. Here is one of the more intriguing and incredibly dangerous “solutions.”
Even if you’re not a mountain climber, imagine for a moment that you are. Moreover, you’re roped to another climber, both of you standing by the edge of a crevasse. You’re having a heated argument, trying to get the other to do something that she doesn’t want to do — or alternatively, trying to get her to keep from doing something that she wants to do. The details don’t matter; what does matter is that the two of you disagree, strongly, about what should happen next.
How can you get your partner/opponent to bend to your will?
This sets the stage for an imaginary situation developed by the late Thomas C. Schelling, one of the leading theoreticians of nuclear deterrence and co-recipient of the 2005 Nobel Prize in economics. In his book, Arms and Influence, Schelling used a mountaineering model to explain what he called the “threat that leaves something to chance.” He proposed it as a way to get around the problem of credibility when it comes to the use of nuclear weapons. It is a very big dilemma, one that has bedeviled nuclear strategists for decades and that despite efforts by the best and brightest (including Schelling) remains unsolved to this day.
When it comes to nuclear deterrence, the credibility gap is easy to state, impossible to surmount: Nuclear weapons are so destructive and their use is so likely to lead to uncontrollable escalation and thus, to unacceptable consequences for all sides, that the threat to use them is inherently incredible. And this, in turn, presents an immense difficulty for strategists hoping to use the threat of nuclear war as a way of either coercing another side to do something they’d rather not (say, withdraw from a disputed region), or to refrain from doing something that they might otherwise do (e.g., attack the would-be deterrer). So, let’s return to those disputatious climbers.
If you simply announce your demand, the other might well reject it. What, then, might you do if you really, really want to get your way? You could threaten to jump, in which case both of you would die; remember, you’re roped together. Because of its suicidal component, however, such a threat would lack credibility, so the other person might well refuse to take it seriously. But suppose you move right to the edge, becoming not only more insistent but also increasingly erratic in your movements.
What if you start leaping up and down, or shuffling your feet wildly? Your credibility would be enhanced, not because falling in would then be any less disastrous, but because you would have increased the prospect of shared calamity, adding a soupçon of potentially lethal unpredictability over which you have no control. Your literal brinkmanship just might kill both of you: not on purpose (as we already saw, that threat would lack credibility), but because chance factors — a sudden loss of balance, a gust of wind — might do what prudence would otherwise resist. Thus, according to Schelling, unpredictability — leaving something, somewhat, to chance — would surmount the problem of incredibility. This terrifying loss of control wouldn’t be a bug, but a feature.
In the world of nuclear strategy, the problem of credibility is like that of an adult trying to deal with a child who refuses to eat her vegetables; the frustrated parent might threaten “Eat your spinach or I’ll blow up the house.” (Don’t try this at home; first of all, it probably won’t work.) Or consider a police officer, armed with a backpack nuclear weapon, who confronts a bank robber by demanding “Stop, in the name of the law, or I’ll blow up you, me, and the whole city.” It’s what led a NATO general to complain during the Cold War, when the West’s nuclear weapons were deployed to deter the Red Army from over-running Europe, that “German towns are only two kilotons apart.” And what led to interest in neutron bombs (designed to kill troops but leave buildings intact), as well as in doctrines (“limited nuclear war-fighting”) and devices (battlefield nuclear weapons), designed to be usable and thus, credible.
But the downside of dancing on the edge of a crevasse in order to make your threat credible by leaving it somewhat to chance, is that, well, it leaves that thing — and a rather important one at that — to chance! By the same token, making nuclear deterrence more credible by deploying weapons that because of their size and ease of employment are more usable means that they must in fact be more usable, a paradoxical situation for weapons whose ostensible sole purpose is to make sure that they won’t be used!
In an earlier book, The Strategy of Conflict, Schelling had discussed the means whereby one side might coerce another, despite the fact that it cannot credibly threaten nuclear war, by employing “the deliberate creation of a recognizable risk of war, a risk that one does not completely control, deliberately letting the situation get somewhat out of hand … harassing and intimidating an adversary by exposing him to a shared risk.”
Schelling’s hair-raising mountain metaphor may have been inspired by the word “brinkmanship,” which seems to have been first used by Democratic Party presidential candidate Adlai Stevenson, who, at a campaign event in 1956, criticized Republican Secretary of State John Foster Dulles for “boasting of his brinkmanship — the art of bringing us to the edge of the nuclear abyss.” At about this time, Henry Kissinger (then a little-known university professor) began developing both the notion of “limited nuclear war” as a way of circumventing the credibility problem, and the concept of the “security dilemma,” in which “the desire of one power for absolute security means absolute insecurity for all the others.”
The idea of security dilemmas has typically been applied to the problem of reciprocal arms races, whereby a country’s effort to counter a perceived military threat by building up its arsenal results in its rival feeling threatened, which leads that rival, in turn, to build up its arsenal — and so on. As a result, both sides end up less secure than they were before. Brinkmanship, à la Dulles and Schelling, introduces yet another dilemma: when a lack of credibility leads to various stratagems intended to enhance credibility, they may well succeed in doing so, but in the process reduce security on all sides.
So, the next time you find yourself tethered to an adversary at the edge of a crevasse — whether in a thought experiment or reality — you might want to recall the advice offered by the supercomputer in the 1983 movie WarGames: “the only winning move is not to play.”
William Thornton’s Design for the U.S. Capitol, 1796
On the Sunday after the November 3rd presidential election, Utah Senator Mitt Romney, the 2012 Republican presidential candidate, congratulated President-elect Joe Biden but insisted that the overall election was an endorsement of conservative principles. He pointed to the gains Republicans made in the House, though they are still in the minority, and the failure of the Democrats to capture control of the Senate, at least so far. Romney found further evidence in the Democrats’ inability to flip GOP-controlled statehouses.
Romney, however, is mistaken in his basic assertion. First of all, Biden won by more than 5 million popular votes, nearly 4 percent more than Trump’s total. The president-elect obtained the highest number of popular votes in the nation’s history. Biden’s margin of victory, contrary to Romney’s claim, is not a mandate for conservatism. Rather, at the very least, the election was a referendum on President Trump’s leadership, which of course Trump used to promote conservative ideas concerning tax cuts for the wealthy and the relaxation of business and environmental regulations.
No presidential election outcome reflects any single issue and it remains for the experts to crunch the numbers and analyze the ingredients that secured the Biden-Harris victory. Yet we already have sufficient evidence that the majority of the American people favor progressive positions on many issues. Surveys by Gallup, Pew, and other reliable organizations consistently show that a significant majority of Americans favor “Medicare for All,” tighter gun safety restrictions, and the freedom of women to have abortions in most situations. A number of states, including Florida, have voted for the $15 minimum wage. Last but not least, polls show that a majority of Americans believe that racial discrimination continues to exist and should be addressed. These are progressive not conservative principles, and are sustained by the Biden-Harris victory.
Nevertheless, Romney is correct in one sense. In the United States what I call “systemic conservatism” continues to prevail. I draw this phrase from our reawakened realization of “systemic racism,” and the two are related. By systemic conservatism I mean the institutional barriers created by the founding fathers to limit popular democracy. One obvious example is the Electoral College. Presidential candidates have to win a majority of electoral rather than popular votes. Had it been otherwise, we would not have had a President George W. Bush or Donald J. Trump. Each state’s share of the Electoral College equals the number of votes it has in the U.S. House and Senate (with the non-state of Washington, D.C. receiving three electors). The choice of an Electoral College to decide the presidency resulted from efforts of small-state and slave-state delegates at the Constitutional Convention to ensure their ongoing power. Most troubling, under the three-fifths compromise slave states increased their electoral votes. They did so by securing the constitutional right to count 60 percent of their enslaved people for purposes of representation in Congress and the Electoral College.
In addition, the Senate created rules to frustrate a majority of its members. Until the 1960s, southern senators used the filibuster rule, which allowed unlimited debate in the absence of a supermajority vote, to frustrate attempts to pass civil rights legislation. Republican Majority Leader Mitch McConnell has used this rule to thwart progressive legislation for the past ten years. Even if the Democrats wind up gaining two seats in Georgia, resulting in a Senate tie, they will need sixty votes to enact legislation unless the filibuster rule is changed. And if they manage to do so, a conservative majority on the Supreme Court can still overturn that legislation.
The federal system has often blocked the effects of progressive policies initiated at the national level. The post-Reconstruction Jim Crow era that lasted into the 1960s saw the southern states eviscerate the Fourteenth and Fifteenth Amendments in a variety of ways. Even when the Supreme Court struck down racial segregation in schools in 1954, southern states adopted so-called freedom of choice plans to sidestep the court’s ruling for another two decades. President Franklin D. Roosevelt’s New Deal legislation was instrumental in combating the Great Depression, but it had to be administered through the states. This gave states, particularly in the South, the opportunity to reinforce racial segregation within these programs and also ensure that agricultural subsidies benefitted plantation owners to the detriment of their tenant farmers and sharecroppers, a disproportionate percentage of whom were African American.
Progressive change does happen within our political system but it faces serious obstacles. The abolition of slavery and the extension of citizenship and voting rights to African Americans required a Civil War. It took the Great Depression to achieve Social Security, minimum wages, and anti-child labor laws. The Civil Rights Movement was necessary to re-enfranchise African Americans and people of color, just as the Women’s Suffrage Movement had been necessary decades earlier to extend the vote, mainly to white women. Democratic Party victories following the 2008 Great Recession provided for a short time the majorities needed to move incrementally toward universal health insurance.
There is also ample precedent within the federal system of states serving as laboratories for progressive policies, as was the case in Wisconsin during the early twentieth century. Under the leadership of Governor Robert M. LaFollette, Wisconsin joined government officials together with academic advisers to create a reform agenda that was copied throughout the nation. In the early 2000s, Massachusetts under the leadership of Governor Mitt Romney created a system of statewide health insurance that became the model for President Barack Obama’s Affordable Care Act. However, in 2020, with most state legislatures in the hands of Republican majorities, the prospects for reform measures bubbling up from the bottom to the top of the political mainstream are dim.
Just as President-elect Biden will have to confront systemic racism, he will also have to deal with systemic conservatism. It does not look like he will have the legislative majority necessary to achieve his programs. Incremental rather than sweeping change is the more likely outcome.