Survivors, Apprentices, and Entrepreneurial Sharks: The Mark Burnett Reality TV Presidency

We now have more reliable information about President Donald J. Trump’s financial situation—that he has often paid remarkably little in taxes, that he is not as successful a businessman as he has frequently claimed, and that he may soon have to come up with money to pay off huge debts. As a historian of entrepreneurship, however, I find the most interesting revelation to be that the British-born but self-proclaimed self-made American Mark Burnett rescued Trump by picking the future president for a starring role on The Apprentice, which launched in 2004. The reality television show followed Burnett’s Survivor (2000- ) and preceded his Shark Tank (2009- ).

From a working-class background and without a college education, Burnett arrived in the United States in 1982. He quickly persuaded a wealthy Beverly Hills family to hire him as a nanny, having convinced them that his experience as a member of an elite British paratrooper force in Northern Ireland and the Falkland Islands meant they were getting a nanny and security guard all in one. A job as a nanny for a second wealthy family, this time in Malibu, enabled him to hone his skills as an apprentice to the household’s head, who provided contacts and information and, eventually, a job in his insurance office. That job helped Burnett realize such employment would neither make him rich nor sate his adventuresome restlessness.

A critical element in the narrative Burnett has offered of his own life is the role Trump played in motivating him. He insisted that reading The Art of the Deal (1987) had inspired him “to jump in, take risks, and pursue a career as an entrepreneur.” Trump returned the favor, not only by agreeing to star in The Apprentice, but also by writing an introduction to Burnett’s Jump In! Even If You Don’t Know How to Swim (2005), which the real estate tycoon described as “the inspirational rags-to-riches story of an American immigrant.” Burnett, he continued, was someone who “transformed hard work and an inspired vision into the realization of the American dream.” The values Burnett promoted in Survivor, The Apprentice, and Shark Tank were ones he and Trump shared. Those values reliably produced successful reality TV programs, but they work far less well in governing a nation in crisis or in sustaining a Hollywood career.

As Ben Smith wrote in the New York Times on October 19th, 2020, “like his greatest creation, Mr. Trump—who sought and then lost an idiotic ratings war on Thursday night with Joe Biden—Mr. Burnett seems to be struggling to keep his grip on the cultural moment.” It has now been almost ten years since Burnett last hit it big. Both of them have done their best to deploy religion to shore up their careers. Recently Trump controversially, and some say cynically, held up a Bible for a photo op at St. John’s Church. In 2013 Burnett produced The Bible for the History Channel, casting his third wife, Roma Downey, as the Virgin Mary. “An earnest but shallow take on the Greatest Story ever Told,” one commentator remarked, “The Bible suffers from leaden pacing and mediocre special effects.”

The selection of Trump as the host of The Apprentice assured Burnett’s success as a television producer and, more significantly for the nation and the world, helped rescue Trump from an uncertain economic fate and eventually elevated him to the presidency. Burnett had contributed to Barack Obama’s campaign, but in late 2016 he resisted pressure to release potentially damaging tapes from The Apprentice that had ended up on the cutting room floor. During the fall of 2016 Burnett criticized Trump, remarking that “my wife and I reject the hatred, division and misogyny that has been a very unfortunate part of his campaign.” Once Trump was in the White House, Burnett danced a delicate dance to avoid alienating the president and harming his own brand.

Before Trump began firing people on The Apprentice, his brand, especially outside the world of real estate, was not very valuable. That changed dramatically in the show’s first year, when his appearances brought him almost $12 million, a figure that rose to almost $48 million in 2005. From that high point the numbers drifted downward, and he may well have launched his successful bid for the presidency not to achieve that high office but to restore the luster and power of his brand.

Whether or not entrepreneurship can rescue the nation from the intertwined threats of the COVID-19 pandemic, an economic meltdown, and a potentially contested election outcome remains to be seen. So far, it appears that what is needed is less neoliberal self-government in which entrepreneurship plays the central role and more vigorous action by the federal government.

Moreover, we have yet to see the full results of what Burnett and Trump have wrought. On November 3rd, American voters will decide whether Trump will be a survivor or whether they want to say to the apprentice president, “You’re Fired.”

Trump: Superhero or Superspreader?

When Donald Trump left Walter Reed military hospital on October 6, he wanted to show his strength by channeling Clark Kent. But rather than rip off his shirt, he ripped off his mask. “Don’t let it dominate you,” he commanded on a day that the national death toll from COVID-19 reached 213,000.

Maybe our fearless leader got the Superman idea from the White House Gift Shop. Anthony Giannini, CEO and chief designer of this private entity based in Lititz, Pennsylvania, had just announced that a Superhero-themed medal celebrating Trump’s “historic” defeat of the evil coronavirus was in the works. Available for pre-order and priced at $100, it is scheduled to ship ten days after the election.

Honoring heads of state with medals is a time-honored custom embraced by Roman emperors as well as autocratic sovereigns like Louis XIV, France’s Sun King. In the U.S., the first Presidential medals were presented to Native American chiefs as a way to promote “peace and friendship” between First Nations and the federal government. Others, including the so-called “challenge coins” gifted to military and law enforcement personnel, commemorated the inaugurations and achievements of our country’s highest elected officials.

Trump has revived and expanded this tradition far beyond that of any previous White House occupant. His official challenge coin – bigger and glitzier than its predecessors – replaces the national motto, E pluribus unum (“out of many, one”), with his campaign slogan, “Make America Great Again.” Another coin touts Mar-a-Lago, his personal Versailles. Trump aides and acolytes have gotten in on the game, resulting, for instance, in a challenge coin commissioned by his private security detail that proclaims “Have Gun, Will Travel.”  

Medals made by the White House Gift Shop also seek to puff up the President and spread his “l’état, c’est moi” message. Memorializing everything from “Genius Level Thinking” to summits with North Korea’s Kim Jong Un, they include an image of the impeached Trump gloating above the phrase “Acquitted for Life.” One example – released on April 30, 2020, the day after Florida announced its reopening – depicts a coronavirus particle hovering villainously above a globe encircled by the words “Together We FOUGHT The UNSEEN Enemy/Everyday HEROES Suited Up.”

Trump has often been likened, by the outgoing French ambassador and others, to Louis XIV. Known for their mutual love of gold, big hair, and shameless self-promotion – one of Louis XIV’s medals portrayed him as the sun warming the earth beneath his motto Nec Pluribus Impar (“not unequal to many”) – the two leaders also share an appetite for newfangled medical treatments. Long before the President was given Regeneron, the “miracle” antibody cocktail with its own Superhero name, the Sun King underwent experimental surgery for an anal fistula, a painful ailment that his doctors had tried to treat with enemas and laxatives.

A medal heralding Louis XIV’s recovery shows him dressed as a Roman emperor, lifting his eyes heavenward below a Latin inscription that reads, “God the Protector of the Prince.” Other royal medals praise the Sun King’s “immortality” along with his persecution of Protestants, his crushing of European rivals, and his naval campaigns against North African Muslims.

All of this gilded propaganda made Louis XIV look like an invincible ruler with a glorious lineage. It also made him the butt of scathing scatological critique. Satirical medals produced by his European enemies condemned the king for his cruel despotism and mocked his capacity for self-destruction. Mimicking the royal medals they ridiculed, several fake designs were slipped into a pirated edition of the monarch’s official medal book. An especially “humiliating” example shows Louis XIV getting a rectal cleanse from the pope while vomiting coins into a basin held by the ruler of Algiers. On the reverse side, an exploding bomb foreshadows “The French Empire” imploding “of its own accord.”

Inspired by the Sun King’s detractors, the White House Gift Shop’s self-parodying statements, and Trump’s own Superman fantasies, we offer a medal honoring the Super Spreader. Unlike Louis XIV, who collected the coins that mocked him and apparently enjoyed viewing them with friends, the American leader has proven himself to be not only humorless but impervious to satire. Sorry, Mr. President. It is what it is. 

FDR Was Right to Propose Enlarging the Court

Franklin Roosevelt rarely made political mistakes. But on February 5, 1937, he made one of the great blunders of his administration when he sent a message to Congress seeking a reorganization of the federal judiciary. Roosevelt sought to increase the number of justices on the Supreme Court, up to a total of fifteen, if justices over seventy years of age refused to retire. For every such non-retiring justice, the president would be permitted to appoint a new justice—potentially six new positions.

FDR was right when he asked for the authority; his mistake was not talking directly about why the expansion was needed. He argued that the courts were not effectively handling increasingly overcrowded dockets and that many judges, taking advantage of life tenure, were staying on the bench too long. “This brings forward the question of aged or infirm judges,” Roosevelt wrote Congress, “a subject of delicacy and yet one which requires frank discussion.”

These came across as sham arguments, and the president thereby lost credibility. The public knew right away that court efficiency and age were not the problems. The problem was that the Supreme Court was trying its best to undo all the badly needed legislation passed under the New Deal to address the deepest depression the country had ever known.

FDR feared this conservative reaction—the drive to negate the New Deal—was about to get worse, even though in 1936 he had just scored one of the greatest presidential election landslides in history.

What the president knew was that as prosperity returned the need for reform would wane and the accomplishments of the New Deal were at risk. As Stanford historian David M. Kennedy noted in his Oxford history, Freedom from Fear, the chief danger to the New Deal came from the Supreme Court. “Most ominous,” Kennedy wrote, “the threat of judicial nullification loomed over virtually every New Deal measure thus far enacted. The Supreme Court had already gutted many of the reform initiatives of the Hundred Days, notably NRA (the National Recovery Administration) and AAA (the Agricultural Adjustment Administration).”

Kennedy points out that Roosevelt had reason to set his sights on the Supreme Court and the federal bench. He had appointed no justice to the Court during his first term. Woodrow Wilson, his only recent Democratic predecessor, had nominated one liberal, Louis Brandeis, and one conservative, James McReynolds. Republican presidents had named the remaining seven justices. Moreover, nearly 80 percent of all judges sitting on federal courts had been appointed by Republican presidents.

Through a series of cases the Supreme Court threw out the National Industrial Recovery Act in Schechter Poultry Corporation (where owners violated wage and hour codes, not to mention sold diseased chickens), the Agricultural Adjustment Act in United States v. Butler (where a tax on processors went to pay farmers to limit production), and even a state law—the New York minimum wage law in Morehead v. New York ex rel. Tipaldo.

There was little question that the Social Security Act and the National Labor Relations Act, among others, were imperiled. The judicial assault, based on grandiose arguments about “freedom of contract,” challenged the very power of the national government to address the problems and disruptions of a modern industrial economy.

We have seen this phenomenon play out in our times. A financial emergency in 2008-2009 provided the basis for a Democratic president to pass, among other reforms, the Affordable Care Act, providing insurance to millions who were uninsured. But as prosperity returned (and before the pandemic), conservatives organized to undo the ACA and have been singularly focused on its destruction. And despite the fact that the Act barely survived a challenge in the Supreme Court, there is little question that Trump appointees, including Justice Amy Coney Barrett, will declare the ACA unconstitutional, to say nothing of potentially unwinding gay marriage and a woman’s right to choose.

Is it farfetched to see even Social Security as a target?

The case to expand the Court today differs somewhat from 1937. Life tenure in FDR’s time resulted in a highly conservative federal bench stubbornly holding on to power. Roosevelt believed older judges from earlier generations were not best equipped to adjust to modern realities. “Little by little,” he wrote, “new facts become blurred through old glasses fitted, as it were, for the needs of another generation; older men, assuming that the scene is the same as it was in the past, cease to explore or inquire into the present or the future.”

Today, the case for expansion arises from what many perceive as the “theft” of a seat--if not two--on the Court. Republicans denied President Obama his right to appoint a justice in an election year and have now reversed course to permit President Trump a nominee just weeks before an election.

A Barrett confirmation will result in a 6-3 conservative majority when the nation does not begin to resemble that sort of political alignment. The only real remedy given Republican machinations is to restore balance by adding justices. As FDR pointed out, there is nothing immutable about the size of the Court. He reminded Congress: “The Supreme Court was established with six members in 1789; it was reduced to five in 1801; it was increased to seven in 1807; it was increased to nine in 1837; it was increased to ten in 1863; it was reduced to seven in 1866; it was increased to nine in 1869.”

What FDR needed to do in 1937 was to state plainly his real reasons for increasing membership on the Court. Where a court decidedly ignores the will of the people and nullifies critical national legislation, it is a political and constitutional option for a president and Congress to increase the size of the Court so that it becomes representative of the nation as a whole.

Lifetime appointments should not mean that a badly out-of-step group of judges can overrun the will of the people.

A Star-Spangled Moment of Reckoning for U.S. Civil-Military Relations?

As the American people hurtle toward the presidential election - a civically dispiriting event-in-the-making that promises a nastily contested outcome and the threatened non-peaceful transfer of power - the U.S. military hurtles toward its own come-to-Jesus moment of reckoning.

What comes to mind, oddly perhaps, is not previously contested presidential elections - Bush-Gore in 2000, say, or Hayes-Tilden in 1876 - but the April 25, 2003 National Basketball Association playoff game between the Portland Trail Blazers and the Dallas Mavericks. There, a visibly nervous 13-year-old, eighth-grade girl, Natalie Gilbert, stepped onto the court at Portland’s Rose Garden to sing the National Anthem. Midway through the fourth line - “. . . at the twilight’s last gleaming” - her mind went blank, and she forgot the words. Facing humiliation, she looked around frantically for help. To the rescue - no blaring bugles, no thundering hooves - came Blazers coach Maurice Cheeks, who walked quickly to her side, put his arm around her in encouragement, started singing the words, and coaxed her the rest of the way through to “the home of the brave” ending, the crowd roaring its grateful approval.

What a fitting metaphor. But why? It isn’t at all surprising that such a young person, unaccustomed to the glare of public performance, might choke in front of a raucous crowd of 20,000, plus national and international television audiences. That said, she nonetheless had the retentive powers that accompany youth and, only a year and a half out of elementary school, could reasonably be expected to have fully internalized the words to both the National Anthem and the Pledge of Allegiance. 

By way of contrast, when the U.S. military is directed, come November, to intervene in the aftermath of the election to put down popular dissenters accused of insurrection or to surround the White House to “defend” its commander in chief from physical removal (should that be necessary), its leaders will have no excuse for forgetting or ignoring the Constitutional oath they took, lo those many years ago. There will be no Maurice Cheeks-like savior to bail them out and excuse their behavior. They won’t be blessed by a “small” crowd of merely 20,000 sports-obsessed onlookers otherwise oblivious to the affairs of state; all the world, not just American society, will be watching. Such a situation will represent the nadir of the advancing politicization of the military that has taken place over the past four years; with it may come the final dissolution of “the deal” that lies at the heart of the social contract of civil-military relations. So the question will be: what is the military to do? To obey or disobey? To speak or remain silent? To serve their political masters, regardless of motive, or the Constitution and the American people?

On its surface, the Constitutional oath is seemingly simple and straightforward - the all-too-familiar imperatives being to “support and defend the Constitution of the United States against all enemies, foreign and domestic” and to “bear true faith and allegiance to the same.” Regrettably, few who have sworn the oath have given it a second thought since they originally raised their right hand; and fewer still have ever devoted time to understanding its underlying meaning. Here, arguably, is what it means:

·      On my honor, I willingly and unreservedly promise to . . .

·      Believe in, subscribe to, be loyal to, and employ legitimately authorized means at my disposal to preserve and protect . . .

·      The principles, values, processes, and institutional prerogatives and arrangements . . .

·      Specified and embodied in (a) the Preamble, (b) the main body, (c) the amendments, and (d) the philosophical foundation (the Declaration of Independence) . . .

·      Of the U.S. Constitution against all parties – inside and outside government, inside and outside the United States – whose actions threaten any of the foregoing.

So, for the many who have taken the oath but never thought about its meaning, it means that all parts of the document (even the aspirational Preamble, which could be said, in toto, to be America’s Security Credo) embody principles, values, processes, prerogatives, and structural arrangements that must be embraced, protected, and practiced. But the Declaration of Independence, you say? Isn’t that a bridge too far? No, not at all. The seminal second paragraph of the Declaration underscores the purpose of government (to secure and preserve the specified and unspecified natural rights all human beings, not just citizens, deserve to enjoy), emphasizes the essentiality of government derived from the consent of the governed, and enjoins those subjected to tyrannical rule to exercise their right - and their duty - to dispose of such tyranny. If one accepts this expansive view of the oath, it puts those who have taken the oath in the position of having to think deeply about the lengths and limits of their authority to either enable or inhibit legitimate dissent - and to act accordingly.

The presidential oath of office is seemingly more oblique and contingent: “to preserve, protect and defend the Constitution” to “the best of [his] Ability.” Who can say how firm that commitment is expected to be in light of its reliance on presidential ability (and ability’s relationship, in turn, to will)? At least, in principle, the Constitution also directs that the President “shall take Care that the Laws be faithfully executed,” the Old English use of the word “shall” connoting an obligation born of necessity. By association, therefore, as executive agents of government, the military itself is expected to uphold the rule of law by taking care that the laws be faithfully executed, with the Constitution to which they have sworn their loyalty being “the supreme Law of the Land.”

The oath is the tangible instrumentality of the social contract of mutual rights, obligations, and expectations that binds the military to its civilian overseers (executive and legislative) and both, in turn, to society as a whole. Though there are plenty of authorities on the subject who would argue that the military is obligated - unconditionally, regardless of circumstance - to obey properly constituted civilian authority, there is a fully defensible argument to be made that at the heart of this contract is a contingent, quid-pro-quo “deal.” The deal is basically this: In return for the deference, obedience, compliance, and silence the military willingly relinquishes in the interest of remaining politically neutral, uniformed professionals expect certain things in return from their civilian masters: specifically, strategic literacy, strategic competence, and strategic leadership. Where civilian authorities fail, willingly and repeatedly, to demonstrate such attributes, the deal is broken.

What if the military rightly expects the cherished precepts of popular sovereignty, the rule of law, separation of powers, checks and balances, and civil liberties (such as due process, equal protection, free press, habeas corpus, and search and seizure limits) to be observed and demonstrated by those in power; but instead experiences a reality of polarization, inertia and entropy in the processes of government (e.g., no declarations of war, no budgets, executive orders taking precedence over laws, and executive agreements superseding treaties for reasons of convenience and expediency), executive-legislative collusion without cooperation, party and ideology regularly overriding institutional responsibilities, and the unceasing imperialization of the presidency?

What if the military understandably expects such traits as vision, courage, competence, integrity, accountability, and empathy from the political leaders to whom its members have subordinated themselves; but instead, reflecting what might be called “the saga of the vanishing role models,” sees only myopia (tunnel vision, parochialism, and provincialism), cowardice (sycophancy, toadyism, and obsequiousness), incompetence (cluelessness and incomprehension), hypocritical expediency (opportunism and dissembling), unaccountability (finger-pointing and blaming), and contemptuous intolerance (in the form of hateful “-isms” too numerous to mention)?

What if the military legitimately expects from public officials unwavering ethical propriety in the form of conscience (the “inner voice” of right), character (emulation-worthiness), principle (consistent action in the service of the public), example (practicing what one preaches), altruism (where Other-interest is accorded priority over Self-interest), and an unequivocal commitment to truth and justice (where fact provides the basis for fair treatment); but instead is forced to tolerate unhesitating shamelessness, excessive pridefulness, self-absorption, and me-ism, convenience and expediency in lieu of adherence to consistent standards, the irrelevance of model behavior, incessant selfishness, demonization, and marginalization, and, overlaying all else, the Death of Truth (in favor of disinformation, misinformation, and false narrative)?

These rhetorical what-if questions are ones the military should already have been asking itself, and we – the public – must yet ask of the military as November 3 fast approaches. The answers to these questions will determine whether a deal of such mammoth national importance, if broken, means that all bets are off and higher-order imperatives command the allegiance of those in uniform. When answers come, they will set the course – perhaps a new course – for civil-military relations in this country. 

Where Ross Douthat Goes Wrong

Five years ago I diagnosed the New York Times conservative columnist Ross Douthat with what I playfully dubbed “Ideological Displacement Syndrome” (IDS) – in his case, a seemingly irrepressible drive to accuse liberals and the left of harboring dictatorial impulses that he somehow couldn’t find in his own supposedly beleaguered, underdog conservative movement and Republican Party.

Now, like some other longtime Republican loyalists, Douthat has acknowledged that Trump is feckless and reckless and that some rightists get out of hand, too -- but not as dangerously, he still insists, as liberals and progressives, who he claims are even more powerful and dangerous than Trump. 

He said so on October 19, writing under the headline “Where Liberal Power Lies.” But this kind of “blame the liberals” thinking -- ‘IDS’ to the max -- is worth contextualizing, because not only Douthat but also many others have been its carriers, on the left as much as on the right. At this moment, in a column portentously (pretentiously?) headlined “The Last Temptation of Never Trump,” Douthat is trying to wiggle out of the grip of what he calls his “right-wing id.” But he still exemplifies how people who think ideologically succumb to delusions and twist hard realities, all the more so when they’re straining for trans-ideological omniscience.

In a 2015 column condemning the satirical (and, yes, reckless) French magazine Charlie Hebdo for mocking Muslim reverence for the prophet Mohammed, Douthat rebuked the slain cartoonists for having harbored “progressivism’s present confidence (even in the face of murder) in its prescribed hierarchies of power and victimhood…” In the space of just 830 words, he accused “the western left,” “the contemporary left,” “today’s progressives,” “contemporary progressive” thinking, “idealistic and progressive-minded figures,” “progressivism’s present confidence,” and “today’s progressivism” of fomenting crises that were driven far more often by excesses of the right, about which he was silent.

And just recently, in an October 11 Times column, he announced that “There Will Be No Trump Coup.” He presented Trump as “weak” and “ranting,” but no more so than “liberals” who “have spent the last four years persuading themselves that their position might soon be as beleaguered as the opposition under Putin, or German liberals late in Weimar.” He even insisted that liberalism under Trump “has become a more dominant force [his italics], with a… monopoly in the commanding heights of culture. Its return to power in Washington won’t be the salvation of American pluralism; it will be the unification of cultural and political power under a single banner.” Against these proto-totalitarian dangers, Douthat assured us, our “infected-by-COVID chief executive is not plotting a coup, because a term like ‘plotting’ implies capabilities he conspicuously lacks.”

Maybe so. Maybe this election is on a precipice not because Trump has grand, authoritarian reasons for subverting it but because, as the historian Timothy Snyder shows devastatingly in the liberal Catholic journal Commonweal,  Trump is obsessed with escaping the legal and economic ruin that awaits him personally if he loses presidential immunity. In his (and much of the country’s) manic state, he may indeed derail the election, in effect pulling off a coup, absurdly but decisively, with more than a little help from the Republican Party.

But Douthat, too, is obsessed: Although he concedes that “[t]he threat of far-right violence is certainly real” (often enough at Trump’s suggestion), he writes that “America’s streets belong to the anti-Trump left” – a delusion dramatized memorably by Rudy Giuliani in his speech to this year’s Republican National Convention.  

I’d like Douthat to assess Rudy’s rant but also to respond to NBA coach Doc Rivers’ almost-plaintive, at one point tearful, lament after he’d watched the Republican convention:

“All you hear is… all of them talking about fear,…. We’re the ones getting killed…. [We] protest…. They send people in riot outfits. They go up to Michigan with guns….  [but] Nothing happens…. My dad was a cop. I believe in good cops…. It’s amazing why we keep loving this country, and this country does not love us back. It’s really so sad. [I]f you watch that video, you don’t need to be black to be outraged. You need to be American and outraged.” 

Douthat isn’t outraged, not even by his claim that the streets belong to the anti-Trump left. He’s almost insouciant about the riot cops, the military in Lafayette Square, the police murders of unarmed young Black people, and the militias with assault rifles converging on state capitols. He seems insulated from it all by his oft-expressed faith that the only escape from this fallen world of ours is on your knees, eyes heavenward, while serving techno-capitalism: His book The Decadent Society ends with precisely that admonition: “So down on your knees — and start working on that warp drive.” 

The faux-brotherly, faux-humorous second phrase seems intended to ease our submission to the first by affecting a characteristically American, bro-ish fondness for technology. Never mind that liberal pluralism -- not today’s conservatism, let alone corporate governance -- protects Douthat's right to propound his beliefs and prevents him or anyone from imposing them on others.

In 2008, Douthat paid wiser attention to precincts less heaven-bound, in his and Reihan Salam’s book Grand New Party: How Republicans Can Win the Working Class and Save the American Dream (which I reviewed half-approvingly for Commonweal). But since then, he and others, of many faiths, have drifted away from loving America as Doc Rivers has tried to love it: for ordinary Americans’ untiring, uplifting efforts to sustain a democratic, pluralist, economically sane alternative to authoritarian religiosity that accommodates itself to Trump or to authoritarian leftism that would impose “the unification of cultural and political power under a single banner.”

Excuse me, but isn’t “the unification” of cultural and political power what Douthat, Amy Coney Barrett, the late Antonin Scalia, and crypto-authoritarian thinkers such as Adrian Vermeule have been urging upon us? Doesn't Douthat's disdain for Trump disguise his own inclinations to such thinking? Even those of us who agree that America is in trouble partly because its liberalism has a hole in its soul  aren’t inclined to fill it by wearing holes in the knees of our jeans while working on our warp drives mainly for the benefit of plutocrats and their apologists. 

Bergen-Belsen Through the Eyes of a Teenaged Inmate: A Conversation with Bernice Lerner

A Teen Inmate, a Physician Liberator, and Crimes Against Humanity: A Conversation with Dr. Bernice Lerner on Her New Book, All the Horrors of War: A Jewish Girl, A British Doctor, and the Liberation of Bergen-Belsen

By April 1945, as the Second World War neared an end in Europe, it was obvious that Germany was losing. Yet, many Nazi death camp and concentration camp commanders were furiously bent on exterminating as many “enemies of the state” as possible before the collapse of the Third Reich. 

In an odd turn of fate in mid-April, the Germans surrendered the notoriously brutal and overcrowded Bergen-Belsen concentration camp to British troops on orders of Reichsführer Heinrich Himmler, the official in charge of the Final Solution, the Nazi effort to destroy all European Jewry.

On entering the camp on April 15, 1945, Brigadier H. L. Glyn Hughes, Deputy Director of Medical Services of the British Second Army, was shocked. He was not prepared for the squalid hellscape that greeted him: 60,000 living but extremely ill, starving, and wasting prisoners, and 10,000 putrefying, unburied corpses, as epidemics raged through the camp. Hughes assumed the monumental task of setting up medical services for this city of pain, suffering, and death in the middle of a combat zone.

A highly decorated veteran of both world wars, Hughes served with the invading Allied forces in the bloody and costly campaigns through France and Belgium and into Germany. Once at Bergen-Belsen, he called for and coordinated medical units and employed innovative tactics to treat as many of the ill and injured prisoners as possible. Survivors admired his compassion.

The experience of witnessing the horrific conditions at Bergen-Belsen unnerved and profoundly moved Hughes. He testified about the horrors of the camp at the trial of accused Nazi war criminals from Bergen-Belsen: “I have been a doctor for thirty years and seen all the horrors of war, but I have never seen anything to touch it.” 

When the British arrived at Bergen-Belsen on April 15, 1945, 15-year-old prisoner Rachel Genuth was critically ill. By then, she and her sister Elisabeth had survived deportation from their home in Sighet, Transylvania; two months at the Auschwitz death camp where the rest of their family was murdered; enslavement at the Christianstadt labor camp; and then a death march to their last site of imprisonment and abuse, Bergen-Belsen. Rachel was near death by the time rescuers attended to her, days after the British arrived.

Author and scholar Dr. Bernice Lerner juxtaposes the stories of her mother, Holocaust survivor Ruth Mermelstein (née Rachel Genuth), and heroic British physician and liberator Glyn Hughes in her moving and compelling new book, All the Horrors of War: A Jewish Girl, A British Doctor, and the Liberation of Bergen-Belsen (Johns Hopkins University Press).

In this first Holocaust history to focus on a high-ranking liberator and a Holocaust survivor, Dr. Lerner traces the separate journeys of Hughes and her mother during the final year of the Second World War. She documents the Allied advances and costly setbacks that Hughes and the Allied armies endured as she intersperses the vivid story of Rachel’s deportation from her home to her horrific and heroic journey through cruel incarceration and enslavement, from brutality and dehumanization to survival and renewal.

As Dr. Lerner stresses, although Hughes and her mother Rachel never met, Rachel was the beneficiary of Hughes’s commitment to saving as many prisoners as possible at Bergen-Belsen. The book reveals harsh truths about war and atrocities and human suffering, but a story unfolds ultimately about empathy and courage and the will to live.

The book is based on extensive historical research and a trove of resources including the papers of Glyn Hughes, oral histories, interviews, and more. Dr. Lerner masterfully combines the fruits of her scholarly research with gripping and engaging storytelling.

Dr. Lerner is a senior scholar at Boston University’s Center for Character and Social Responsibility. She also wrote The Triumph of Wounded Souls: Seven Holocaust Survivors’ Lives, and co-edited Happiness and Virtue beyond East and West: Toward a New Global Responsibility. She earned her doctorate at Boston University’s School of Education and her master’s degree from the Jewish Theological Seminary. A specialist in adult education, she has lectured extensively on ethics and character in the US and around the world. Among the courses she taught at Boston University were Resistance During the Holocaust and Character and Ethics Education. She also designed and taught Ethical Decision Making for Education Leaders for Northeastern University’s College of Professional Studies.

Dr. Lerner graciously responded to questions by telephone from her home. It was heartening to learn that her mother, Rachel Genuth—now Ruth Mermelstein—is living in her own home and thriving at age 90. Ruth is also a frequent and popular speaker on the Holocaust, and especially enjoys talking with school groups. She finally learned the details of her rescue at Bergen-Belsen from her daughter’s research.

Robin Lindley: Congratulations Dr. Lerner on your groundbreaking new book All the Horrors of War that interweaves your mother’s Holocaust story with the story of British officer and physician, Brigadier Glyn Hughes, who supervised medical care during the liberation of the Nazi concentration camp at Bergen-Belsen, the last site where your mother was imprisoned. 

Before getting to your book, I want to ask first about your background as a writer. You're also a scholar with the Center for Character and Social Responsibility at Boston University.

Dr. Bernice Lerner: I was a previous director of the Center at Boston University where I worked for seven years after completing my doctorate in the School of Education. Many of the scholars I worked with were philosophers, so I became steeped in Aristotle and Plato and contemporary writers on virtue ethics. The Center trained teachers on principles and methods of character education. We worked with educators from all over the world, helping them to think deeply about goals for their students--at all grade levels, from preschool to college. 

I did a lot of teacher training stateside and I went as far as Indonesia and Singapore and Japan. Virtue ethics fascinated me because it provides a lens through which you can analyze any material that you're reading or viewing or teaching. It involves asking questions about people’s choices. What is the right course of action in various situations? How do our habits and dispositions show who we are, our character? What does it mean to act out of character? 

The study gave me a framework and a lens. And then of course, I was dealing with the most evil acts in the history of the world when doing my work on the Holocaust. And that subject has always been an interest—my parents are both survivors. I had a lot of questions, about what happened to them specifically, and what happened to my grandparents and my parents’ siblings. 

How did your new book evolve?

At first, I didn't tackle my parents or my own family at all. My first book was about seven Holocaust survivors who were very different from anyone in my family—after having missed years of schooling they went on to earn advanced or terminal degrees. (My relatives did not have much formal education.) Finally, I wondered what happened to my mother at the end of the war, after she fell unconscious. There was a hole in her memory—she could not tell me what happened. How actually was she saved? Why am I here? How am I here? That led me to more questions. What were the mechanics of it? What if the British had come in two days later? I wouldn't be here. My children wouldn't be here. And my grandchildren. None of us would be here. 

Of course, the tragedy is that so many lives, so many generations were cut short. And Bergen-Belsen was a dumping ground for people who had survived the entire war until the end. They were the ones who had evaded the gas chambers at Auschwitz and were doing slave labor and endured the death marches. It took so much to make it to the end of the war, and then people died by the thousands in Bergen-Belsen.

It was a miracle that your mother survived as you describe so vividly in your new book. I admire your lively writing and extensive research. What inspired your book apart from your mother’s story?

When I was trying to figure out exactly how my mother survived, that led me to Glyn Hughes. He was the man most prominently associated with the liberation of Bergen-Belsen.

I set out initially just to write a biography of Glyn Hughes. I was interested in what his character was like and what he was thinking and feeling when he entered and surveyed Bergen-Belsen. I wanted to know about his background and what he brought to the experience. And how it affected him. 

Hughes was such an important figure to the Jewish survivors who knew him in Bergen-Belsen. He was a Schindler-type character in that he befriended survivors and he kept in touch with many of them for the rest of his life. He appreciated his role in their history. So who was this man? I tried to figure out who he was by meeting his surviving relatives and friends. So that was a journey, and that began almost 16 years ago. 

What are a few things you'd like readers to know about Dr. Hughes? 

He saw the humanity in the throngs of “living skeletons.” His motives were moral—immediately, he vowed to save as many lives as possible. As a doctor, he would have wanted to treat more people, but he had such a big responsibility. He faced an absolutely impossible situation that was unprecedented in the history of humankind. 

When he came into the camp, there were 60,000 people who were still breathing and there were 10,000 corpses. Many of the people were dying, emaciated skeletons. Inside barracks built to hold a maximum of 100 people, 600 to a thousand were crammed in and there were no sanitary facilities. Hughes described what he saw when he came into the camp, and he was totally unprepared, totally shocked. 

In Bergen-Belsen, the British liberators settled on a triage system, a factory-like approach that would help them save the most lives possible. Medics went into the huts and marked the foreheads of people who were still alive, who might have a chance. And they were dealing with contagious diseases. Typhus was raging and its germ was in the dust. 

I would never compare anything to that time and place, but we face a situation of medical rescue with COVID-19, and it’s not over yet. Back in April, doctors in Boston wrote about how they might have to triage patients and treat only a limited number. They were using ventilators and they didn't have enough. And they were going to have to make decisions about who to try to save and who they couldn't—who was not worth the effort. This sounded to me like a wartime decision. 

You describe Hughes’ duties when he was in charge of dealing with mass casualties suffered by British troops after the invasion of Western Europe and during the Allied push into Germany. You do an excellent job of juxtaposing the Allied military advances and setbacks with your mother's experience. Perhaps some younger people think that the Allies landed in Europe on D-Day and then got into Germany and that was it. As you chronicle, there were many losses and setbacks for the Allies and months of brutal combat before they got into Germany. You do a commendable job of reminding people of just how incredibly bloody that Allied advance was.

I had read a lot about the Holocaust, but I felt very ignorant about the battles and what it took for the Allies to advance. 

I traced Glyn Hughes’s journey and his responsibilities because I wanted to know what he had already seen before he got to Bergen-Belsen. He was in charge of medical services—first for the British Army’s 8 Corps, and then for the entire British Second Army. He had to decide, for example, how to efficiently evacuate casualties. And how to lift men's morale and make them feel that medical care was near and that they'd be taken care of. And they were facing the most feared units of the German Army. The Panzer and SS units were ferocious fighters and completely dedicated to Hitler. So many young men were maimed, so many died on the way to my mother's rescue. That's a personal way of putting it, but the sacrifices were enormous. 

So Hughes had big responsibilities and he was always looking at the mega-picture. Where can I commandeer a hospital? How should the transportation work? And he was always liaising with higher-ups and meeting with his assistant directors of medical services.

Hughes had overall responsibility for treatment of the wounded and sick and setting up hospitals and all sorts of logistics. So, this was far beyond what we see in a movie or television program like MASH. 

Yes. It was fascinating how he instituted down-to-a-science protocols and was also very innovative. His units had to learn to set up and take down casualty clearing stations and regimental aid posts very quickly. Everything had to be movable and they had to figure out ways of treating those who suffered wounds of various types and degrees. They computed exactly how long each surgical case would take. That was 48 minutes and 32 seconds or so. Attention was paid to every conceivable detail and there was a lot of practice and preparation. Finally, at Bergen-Belsen, he and his men met an unfathomable situation for which they were totally unprepared. 

And you describe vividly Hughes’ impressions when he entered Bergen-Belsen, and how this horrid experience changed his life. 

That was where you really saw his humanity because Hughes had seen every horrific aspect of military combat. He was a highly decorated veteran of the First World War. When he was a Regimental Medical Officer he would run onto the battlefield to try to save wounded men. He saw the bloodiest aspects of war, and he displayed great courage. 

When he arrived at Bergen-Belsen, he had seen nothing like it. He said that he had seen all the horrors of war, but nothing to touch Bergen-Belsen—it was so obscene and perverse. Many who were there describe it as being like Dante’s Hell, with the gruesome visions inside and outside the huts. And the stench. Hughes broke down crying, and I think that says so much because he was a tough, hardened military man. And he cried. He did not initially know how he would go about creating order.

Hughes didn't follow Army protocol and file reports. He just immediately went into action to find help and impress upon the Second Army that, even though there were ongoing battles in northwest Germany, this was a humanitarian disaster and they needed to divert some units to assist at Bergen-Belsen. And he put very good people in charge of procuring resources and readying a hospital, and brought in experts in typhus control and feeding the starved. He tried to get help from wherever he could. 

And the way people deal with disasters, as we see now with COVID-19, is to track numbers. Numbers are a way to get a handle on things, so that’s what the British were trying to do when they arrived at Bergen-Belsen. More than 500 people were dying every day for several weeks after the liberation.

At the beginning of our current pandemic, not-yet-graduated medical students were pressed into service. In early May 1945, Hughes brought 97 medical students to Bergen-Belsen. They had been scheduled to do famine relief work in Belgium, but instead were diverted to Bergen-Belsen. And these young men did a very good job treating the backlog of patients remaining in the huts.

There were criticisms and questions about whether more could have been done.  If you put yourself in Hughes’s shoes, it was just an impossible situation. 

By a month or two after the liberation, some people began to recover. Some, who had been active in Zionist groups before the war, emerged as leaders. They started to organize the survivors, to build a community of “displaced persons.” Many, in their twenties and thirties, paired up. There were a record number of weddings, and then, within a few years, of babies—born in the Glyn Hughes hospital. (Survivors who observed Hughes witnessed his compassion. They named the hospital that was set up near the camp for him.)

Hughes saw this forlorn group of people organizing themselves. They brought in entertainment. They had a theater. They had their own police force. They had their own newspaper. And once people had food and clothes and some supplies, they started to show their true personalities and all this captivated Hughes. So even when he didn't have to go there anymore, he kept going every day to the Belsen DP (Displaced Persons) camp. He witnessed a remarkable transformation. The summer of 1945 was a watershed in his life.  

So the Glyn Hughes hospital was built at Bergen-Belsen? 

No, it was a short distance from the camp. It had formerly been a hospital for the Wehrmacht, the German Army, and there was also a nearby complex that had been used for German soldiers. There was a “roundhouse,” a large hall adorned with portraits of Hitler. All these facilities were taken over for use by the Jewish DPs. 

It’s striking that liberation didn't occur at the moment the British arrived. And the statistics you mention are staggering with more than 10,000 unburied dead when the British entered on April 15, 1945. And then 2,000 people died right after their first meal.

Yes. The British soldiers saw these starving people begging for food, and they gave them their rations. They gave them Spam and other foods that the digestive systems of the prisoners could not handle. Their intestines were all shriveled; their bodies were dried out and dehydrated. They were eating this very rich food and they had cramping and diarrhea and they died. They just died. That was very tragic. 

The British liberators did not initially know what kind of food to feed these people. They didn't have experience with this level of starvation and abuse. In India, the British gave starved people “Bengal Famine Mixture,” some kind of gruel that proved too sweet for the European palates of Bergen-Belsen survivors. Hughes eventually worked up five different diets for people in various stages of emaciation and starvation, with very gradual increases in nutrients. 

And one would expect the killing to stop with the arrival of the British, but it continued for days. Not just the Germans but also the Hungarian guards were shooting survivors. And didn’t Dr. Hughes witness shootings of prisoners by either the SS or the Hungarian guards?

Yes. When he first came into the camp he saw some inmates running to a potato patch, and the SS guards were shooting them. He saw it. He and the British officer he was with had to put an end to what had become a matter of habit.

People think that the liberation happened in one day and prisoners were cheering when the Allied soldiers came in, but it didn't exactly happen that way. It was really a process. 

I would say that the liberation took place over an extended period. For the first couple of days in Bergen-Belsen, Hungarian guards were left in charge-- the British didn't have enough personnel to keep order and make sure contagious prisoners didn’t leave the camp. The Hungarian guards in watchtowers were shooting those who ran to the potato patches because they were starving. 

There was chaos. The liberators faced problems you might not think of: trying to bring in food and water, repairing the water main break, restoring the electricity that had gone down. The Germans sabotaged camp operations before they left. It was a crazy interim period and the British were struggling to set up the facilities. 

The cruelty you describe was horrific. You write that, shortly before liberation, SS guards baked ground glass into bread and fed it to prisoners as a way of eliminating more people before the Allies arrived.

Yes, and those who got the bread were so hungry that they ate it. I thought maybe that was a rumor that my mother heard, but I came across a survivor account and he said that's exactly what they did. It destroyed people's intestines, and so many died that way. The man who survived said he could feel the crunch of the ground glass between his teeth. 

And then to your mother’s harrowing story. Have you been collecting your mother’s stories and those of other survivors since your youth? 

Yes, since I was maybe 13 or 14 years old, but not intentionally or consciously. When I was a kid, maybe six or seven years old, I would ask my mother about her childhood because it was so interesting and different than how I was growing up on Long Island. She grew up in Romania, which seemed exotic and romantic to me. And then she would tell me about her postwar life in Sweden. 

She was smart in sharing her stories. She's just such a positive person. She never wanted to tell me how hard things were: how poor she was in Romania or how sick she was in Sweden. Mostly she told me about her adventures, the fun and daring things she did. And she talked about how kind the Swedish people were, and what a wonderful country it was. 

But when I was about 14, the age she was when she had been taken away, she started to tell me what happened during the war. What happened in Auschwitz. What she experienced as a 14- and 15-year-old. She said, What would it be like if someone were to tell you that in two months your family would be killed and you would lose your friends, your entire community, everything you ever had or owned? You'd think they were crazy. You couldn't imagine that happening. 

And she would tell me all this before the word Holocaust was out there. This was what happened during the war, and she wasn’t talking about it to other people. She wasn’t even talking about it with my father, who was also a survivor. But late at night, we’d be down in the basement laundry room, and she’d tell me. She was ironing one night and she put down the iron and she stretched her arms out behind her and bent over. She said this was how she, then 50 percent dead, had to drag the dead to a mass grave. Some were not even dead-- they were still breathing.

And I couldn’t shake that image from my mind. I was going to high school then and I wasn’t hearing anything like that in my history classes. Later, I studied and taught the Holocaust. But I knew little about the war. Finally, I started to research events—larger contexts—that bore on my mother’s fate. But I also held her particular story. By following an individual, one can begin to grasp the wider story. Writers and film producers understand that. 

The story of your mother and Glyn Hughes would make a gripping movie. Her odyssey was incredible. She and her sister Elisabeth were rounded up by the Nazis. They were taken to Auschwitz first and then to a labor camp and then marched to the horrific Bergen-Belsen. She experienced different forms of incarceration. Each was brutal and dehumanizing, but younger readers may not understand the different forms of imprisonment used by the Nazis. 

Yes. She was captured in the last year of the war. The Germans were losing the war, and already millions had been murdered. My mother and her family were taken in the massive Hungarian deportation in the spring of 1944 and deported to Auschwitz—the largest death camp where one and a half million people were killed. 

My mother was shocked and she might have been numb. In Auschwitz, those who were temporarily spared were given ersatz coffee or “food” laced with bromide, a drug that numbed their senses. 

There was a chance—for those who were fit—of surviving Auschwitz. There was a tension among German higher-ups between needing slave laborers and wanting to kill as many Jews as possible. About ten percent of the more than 424,000 arrivals from the Hungarian provinces, those deemed strong enough, were siphoned off; they could be worked to death slowly. 

Some were tattooed and given a number—they were meant to be around for a while. My mother was not tattooed. She was among thousands of “depot prisoners” who were being held to see if they might be needed for the war effort or sent to the gas chambers, which were operating day and night. It must have been hell seeing the smoke from the ovens and the red sky and smelling the stench of burning bodies. When my mother asked a longtime prisoner where her parents were, the woman told her to look at the smoke. That’s where they are. 

It was just horrific. And to think she was a kid who had never been outside her little town. She had never traveled anywhere away from home. She had never slept anywhere else. And here she was in this inconceivable place called Auschwitz—a death camp. And everyone around her was in the same terrifying situation.

She missed her parents’ protection, but she was the type of kid who could fend for herself. She had had big responsibilities at home—heavy chores and helping with her grandmother’s butcher business. She had to deliver orders of poultry to distant parts of town, and made her way back in the dark after curfew. And so, once she somehow acclimated to Auschwitz-Birkenau, she looked to what she could do to survive. 

And she volunteered for different duties. She took out the pail of excrement at night to see if there was something useful she might find. She volunteered for work that would earn her a piece of bread. She dared to beg privileged prisoners for a bit of something they might have on them. 

For the two months she was in Auschwitz, she did not know whether she would die the next day. I describe in the book the various “selections”—to think that some SS officer would determine whether you would live or die by looking you over for a second is crazy-making. Harrowing. And so difficult for those of us who were not there to imagine. 

Bergen-Belsen, this center of one of the most horrific atrocities in human history, must also have seemed insane to an innocent young teen. 

Yes. And no matter where you came from, no matter what your background or profession, everybody was equal there. It didn't matter if you were rich or poor or had an education or not. Everyone was in the same horrifying boat. But some people knew better than others how to cope with hardship. I would ask my aunts and uncles—all survivors—about their experiences. They told me that those who were not used to hard work at home, those who had maids and had been pampered, had a harder time than people who had not been coddled.

That my mother and her sister Elisabeth managed to leave Auschwitz together was a miracle. They were selected to work in one of the thousands of labor camps because again, the Germans needed slave laborers.

Conditions varied by camp and much depended on the type of work that you were forced to do, the dispositions of the overseers, the rations that you were given—the Germans realized they had to feed prisoners if they wanted them to produce before dropping dead. 

At the Christianstadt labor camp, my mother was picked to work in the kitchen. This was like winning the million-dollar lottery. She could eat the SS officers’ leftovers. And that's probably the reason she ultimately survived: she had six months in this environment. It was still a very dangerous place, but she could take chances and get nourishment.

But at the beginning of February 1945 came the death march. During the final chaotic months of the war the Nazis evacuated camps in the paths of would-be liberators so no inmate would fall alive into Allied hands. 

That was the Nazi plan. You vividly describe the death march of your mother and her sister to the camp. So many people died or were killed by guards on that brutal trek to Bergen-Belsen. 

Yes. My mother and her sister were on this death march. After five weeks on the road and one torturous week in a cattle train they arrived in Bergen-Belsen. It was mid-March, about two weeks after Anne Frank died there. She was older than my mother. And death was the norm. About 17,000 people died in March in Bergen-Belsen.

Didn’t Anne Frank die of typhus? 

Yes. And probably of other things as well.

Many people don't understand the difference between Bergen-Belsen and Auschwitz. Auschwitz was a killing factory. You didn't see emaciated people there because people had come (in the case of the Hungarian transports) straight from their homes. Most inmates didn't last long. They were killed right away or within a short time. 

Bergen-Belsen was a camp of the war-ravaged. It had the largest number of inmates at the end of the war. They had been through so much. Many were but Muselmänner, living skeletons. And disease was rampant. At least three epidemics were raging in the camp at the time the British came in. 

Tens of thousands of prisoners were crowded into Bergen-Belsen. Didn’t the Nazis disagree on whether these people should be exterminated or still used as slave labor? And didn't Himmler suggest keeping the camp intact because he knew that the war was nearly over and he didn't want to be responsible for more extermination? 

Yes. At that point in the war, there wasn't a question of using the prisoners for slave labor. The focus was on getting them away from the liberators, on following Hitler’s orders: no inmate was to be left alive. That's why they were dumped in Bergen-Belsen and other camps inside Germany. 

In early April, Himmler ordered the killing of all the inmates in certain camps. Then, he turned Bergen-Belsen over to the British Army intact. This is a “truth is stranger than fiction” story. His masseur played a part in it. As I describe in the book, it’s just an unbelievable story about how he was convinced to hand over this camp to the British Second Army rather than kill everybody. Maybe he thought that a show of humanity would somehow save him. But he killed himself when the British found him. 

Anyway, in this unprecedented move, the Germans handed over Bergen-Belsen to the British. It was a crisis situation: if the camp were bombed or fighting broke out in the area, some prisoners could escape and spread disease throughout the countryside, a risk to the German and British forces as well as to civilians. 

The handover occurred just three weeks before the end of the war. If that had not happened, my mother wouldn't have survived. I wouldn't be here. It was a race against time for her and the other inmates to “hold on.” Tragically, the race was lost for too many. Thousands kept dying even after the liberation.

There are so many strange twists to the story. You write that the Bergen-Belsen Commandant Josef Kramer and a brutal SS guard Irma Grese conducted a tour of Bergen-Belsen for the first British troops who arrived on the morning of April 15. Kramer and Grese seemed quite proud of this hellscape they’d created.

It was bizarre. They were in the habit of killing. This was what they did for their jobs. And they believed in what they were doing. They regarded the inmates as subhuman.

And in the meantime, your mother registered the liberation and was elated, but within a couple of days she was very ill, and then some fellow prisoners beat her mercilessly. And so, your mother was actually dying? 

Yes. In telling the story, I wanted to show what was actually happening on the ground, behind the scenes. My mother was beaten to a pulp by her fellow inmates days after the British arrived. People treated so poorly had been reduced to this animalistic state, and they didn't just snap out of it on the day of the liberation. It was a long process to come back to life. My mother was near death after having been beaten so badly. I explain that in the book. 

My mother was placed in a makeshift hospital room for dying prisoners. Every day for three weeks, 11 of the 12 people in her room died, and 11 more near death were brought in to fill their beds. She hung on. She described those details to me when I was a teenager. Later, when I wrote her story, I could calculate practically the day that she was evacuated to the hospital because I had read accounts of the evacuation. The bits of information she told me were windows into larger contexts. 

It’s an amazing survival story—a story of the narrowest escape. The rescuers presumed your mother would die, yet she hung on for weeks. If they used triage, then she was grouped with those who weren’t expected to survive. 

Yes. And she was unconscious when she was taken to “the human laundry.” She didn't know it was called that until I researched the rescue. 

Before she was beaten, but after the British arrived, she wandered to this warehouse that contained tons of clothing and she picked her way through seams and lapels and found all these treasures—gold pens, rings, currency—the deportees had brought with them. And her greatest heartbreak was the moment she came to and realized her precious stash had been taken away from her. There were all these heartbreaks. And then, when she came to full consciousness she thought, “I survived, but how lucky am I? I lost my home, my family, and my health.” 

Her sister Elisabeth also survived and was with your mother for much of the time? Helping each other must have played a role in their survival. 

Yes. It was very important to have a partner in one’s struggles. Elisabeth was ready to sacrifice her life for my mother at Auschwitz: when she herself was picked for possible labor, she was prepared to die with my mother instead. At that moment, Elisabeth showed her love and deep compassion for her sister. From that point on, my mother did everything in her power to help my aunt survive whatever trials they went through. She would “organize” food for the both of them. They helped each other. And when my mother was in that makeshift hospital, knowing that her sister was alive and in decent shape was a real driving force for her because, if she died, she would be leaving her sister all alone in the world. 

My mother mustered her will to live because of Elisabeth and because she was only 15 and felt she had not yet lived much of life. She didn't know how very sick she was or how long recovery would take, but she fought to have a chance. And her father's parting words to her in the cattle car before they got to Auschwitz came back to her—he had confidence that she would make it. She had to live up to his words.

Your mother was eventually evacuated from Bergen-Belsen to Sweden. What was the role of Sweden in helping survivors of the camps, and how did your mother, unlike many other people, wind up there and then live there for ten years?

The Swedes led a humanitarian mission to save these people and help them get back on their feet. Perhaps they felt guilt over their neutrality or how they helped Hitler during the war. Who knows? But they took in about 7,000 really sick people from Bergen-Belsen. The idea was to rehabilitate them and, after about six months of medical treatment, they would go on their way and maybe be repatriated to their home countries. 

When my mother got to Sweden, she was very sick. She had tuberculosis, and she was in various TB sanatoriums and rest homes in Sweden for ten years. I don't know any survivor who went to Sweden who didn't say that Sweden was wonderful and the Swedish people were kind, and that meant so much. These people had seen the worst of humanity and then in Sweden they were so well cared for. My mother had certain post-war experiences that showed her that there was still humanity in the world.

My mother loved Sweden. When we (my sister and I) were growing up, our house was a mixture of cultures, with certain traditions and foods from Hungary, Romania, and Sweden. Though my parents wanted to be American, they couldn’t help but transmit what they carried from Europe. 

You describe many moving moments in your writing. I can’t recall if this scene was from your new book, but after the liberation there were thousands of displaced persons left at Bergen-Belsen. One drop of supplies included a large shipment of lipstick. The soldiers thought the shipment was useless, but women survivors were thrilled and eagerly accepted the lipstick. It was almost part of their resurrection—a restoration of human dignity after being dehumanized for months and years. That story was so moving. 

Yes, that was fascinating. 

Humanitarian organizations, such as the Red Cross and Jewish organizations, were sending shipments to Bergen-Belsen. And they got this huge box of lipsticks, and whoever opened it thought it was ridiculous and completely useless. But then they distributed the lipsticks, and putting them on was the best feeling for the women. They felt like human beings again. And when they were given clothes or a needle and thread and some old garments that they could tailor, life came back to them. They wanted to make themselves look presentable and attractive to the opposite sex. Little things that you might not think about really mattered. 

Yes. A marvelous story of renewal. 

And becoming human again. There are so many of those little stories. In one instance, a soldier turned to a Jewish leader and said, “Look at that woman. She's crazy. She's combing her hair with a broken piece of a comb.” And the leader said, “You give her a real comb and see which she would choose. Then you could see if she were crazy.” 

These people were so deprived and they didn't have the basic supplies that we take for granted. If they had a choice, and they were given the real thing, they wouldn’t have looked crazy. And they were used to saving every little thing they could get their hands on—a piece of string had uses.

Adjusting to life after the war had to be challenging. How is your mother doing now? 

She's doing well. Thank you for asking. I worry about her because of the pandemic. I can't visit with her and she normally has a lot of speaking engagements. She’s really wonderful. She has such a great message when she speaks to kids, and she speaks to a lot of middle and high school students about the Holocaust. 

What did she think of your book? 

She read drafts of it, and I kept her abreast of the entire publishing process, so she learned a lot. She is happy that I achieved the goal of writing her story, and we are both happy to have saved members of our family—a few of the six million—from oblivion. 

She must be really proud of you. 

We are proud of each other. 

Does she live in a senior facility?

No. She’s going to be 91 in a few weeks, and she lives in her own home and takes care of everything in the home herself. 

That’s amazing. She’s still doing well after all of those narrow escapes. Please give her my best regards.

I will. Thank you so much for your interest and this interview.

It’s been a pleasure talking with you, Dr. Lerner. Thank you for sharing your thoughtful and moving comments. And congratulations on your compelling and illuminating new book, All the Horrors of War, on the journeys of your mother and her liberator, Brigadier Glyn Hughes, MD. 

Robin Lindley is a Seattle-based writer and attorney. He is a features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Documentary, NW Lawyer, ABA Journal, Re-Markings, Real Change, Huffington Post, Bill Moyers.com, Salon.com, and more. He has a special interest in the history of human rights, conflict, medicine, and art. He can be reached by email: robinlindley@gmail.com.

 

‘One Man, and One Man Alone’: Mussolini’s War

 

 

On October 3, 1935, Mussolini sent his armies into Abyssinia. Fascist Italy was now at war, and would be so more or less continuously for the next eight years: first Abyssinia, then Spain, then France, North Africa, the Balkans and finally Russia. Gradually, inexorably, defeat and disaster became the order of the day. Finally, on 25 July 1943, as the Fascist house of cards collapsed, Mussolini was unseated. Over the next forty-five days, the conservative elements around the king extricated themselves from Fascist Italy’s alliance with Nazi Germany and changed sides, abandoning their men to the new enemy. Several thousand soldiers were shot by the Germans, more went into prisoner-of-war camps, and more again went over to their one-time enemies and fought alongside Greek and Yugoslav partisans. Grandiose dreams of a new Roman Empire ended in catastrophe. How did this happen? And why did things go so badly wrong?

Fascist Italy may have looked strong and successful before the wars began – Mussolini had things to his credit, rebuilding a shattered state in the 1920s and pulling the country through the Great Depression – but once the fighting started her economic weaknesses became increasingly apparent, and increasingly important. In his speeches, the Duce made great play with manpower, boasting on the eve of the Second World War that he could put eight million men in the field (he couldn’t). But Italy had none of the industrial and economic resources that she needed to keep her soldiers in the field, her ships at sea, and her planes in the skies: no coal, no oil (though ironically she was sitting on massive reserves in Libya), no iron ore, no metals like zinc and manganese. Her industrial sector was small, concentrated in northern Italy (which would make it vulnerable to Allied bombing), and heavily dependent on imports. The war Mussolini joined with great enthusiasm was a capital-intensive war. When she entered it in the summer of 1940, Italy was, in the words of one of Mussolini’s chief advisers, ‘like a bath with the plug pulled out and the taps turned off’.

None of this was unknown to Mussolini. Sheaves of economic data crossed his desk before the war and during it – and he simply ignored the numbers or downplayed their significance.  The hard facts of economics were brushed aside in favor of will power, which Mussolini valued above everything else.  Italy was now in the hands of a man who had neither the experience – he had been a corporal for a couple of years in the Great War before being hospitalized and had not taken part in any big battles – nor the ability to run the war.  Advice and counsel might have helped, but Mussolini had no time for that.  Massive personality defects ruled out listening to other people.  Narcissistic to a degree, Mussolini presented himself to his subordinates and to the public as all-knowing – he could reel off statistics about how many sheep and cows there were in the country – and all-wise.  A master of the media of his day – newspapers and state-controlled radio – he ruled on the basis of intuition and extemporization.  

Autocrats – and Mussolini was certainly a would-be autocrat, although because Italy still had a king his authority was not complete – may sit at the top of the pile, but they do not fight wars, or run countries, alone.   In the 1930s Mussolini found generals who promised him quick, fast-moving wars fought with lightly-motorized and mechanized troops.  Between 1940 and 1943 they fought his wars for him – and lost them as they were out-gunned and overwhelmed by more powerful opponents.  Airmen flew planes which had been state-of-the-art in the 1930s but by 1940 were fast becoming obsolescent thanks to poor leadership and direction, while their overlords quarrelled with sailors who demanded more air support as their battleships slowly ran out of fuel. Long-standing inter-service jealousies played out alongside budget rivalry as the men around Mussolini manoeuvred for their boss’s favor. Absent collective analysis and advice, misdirection at the top went hand-in-hand with mismanagement lower down.

Just as there was no real order to Mussolini’s direction of the national war effort, so there was no ordered thought behind the strategic decisions he made. Italians fought in the Mediterranean, North and East Africa, the Balkans and Russia, spreading their limited strength and magnifying their structural weaknesses, because Mussolini sent them there. He had his reasons – no dictator, would-be or actual, is simply a madman acting without logic – and his own way of reasoning. Every campaign had a rationale – colonial expansion in Abyssinia, great power status in the Mediterranean and North Africa, ideology in Spain and Russia, regional dominance in the Balkans. A kaleidoscopic programme was shaped by opportunism and by rivalry: Mussolini acted on the spur of the moment, always sensitive to the need to be seen as Hitler’s equal. Rarely did anyone ever try to talk him out of a chosen course, and when they did so they failed. You couldn’t reason with him.

When Mussolini took Italy into the Second World War on June 10, 1940, the short-term circumstances were in his favor and the long-term odds were against him.  Then, complete failure was perhaps not yet inevitable – though once the United States entered the war on December 7, 1941 it surely was.  En route to defeat, Mussolini made bad choices, disregarded warnings that his military machine was not up to the demands he was making of it, and turned a blind eye to economic realities.  Many of the failures and setbacks were his fault – though not all of them.  Had he lived to write his memoirs, he would no doubt have railed against incompetent generals and inadequate subordinates.  That would have been a smokescreen. The man who took his country to war, pouring scorn on Roosevelt and Churchill as he did so, was a lightweight when compared with the war leaders he had measured himself against.  

The lesson of this history?  Choose your leaders with great care, for they can do real and lasting damage.

The Queen's Two Bodies

 

 

When on August 9th, 1588 Queen Elizabeth I appeared on horseback to inspect her troops at the seaside town of Tilbury, the soldiers having massed in preparation for a potential Spanish invasion, she was supposedly resplendent in gleaming armor, with a martial helmet framing her blazing red hair. The Spanish Armada was famously halted by the so-called “Protestant Wind,” the storms which spared the English both inquisition and occupation, and if there is an element of that respite which is central to its mythology, it’s Elizabeth’s appearance at Tilbury. “I am come amongst you… at this time, not for my recreation and disport, but being resolved, in the midst and heat of the battle, to live and die amongst you all,” the Queen said during her famed speech to the troops, “to lay down for my God, and for my kingdom, and my people, my honour and my blood, even in the dust.” 

Descriptions of the speech present Elizabeth as a stoic, regal, and dignified warrioress, with historian Garrett Mattingly recounting in The Armada that the queen was “clad in all white velvet with a silver cuirass embossed with a mythological design, and bore in her right hand a silver truncheon chased in gold.” Elizabeth has been remembered, in contemporary epics such as Edmund Spenser’s The Faerie Queene and recent movies like Shekhar Kapur’s Elizabeth: The Golden Age, as a war maiden, a virago, an Amazon. She has been remembered, at her own urging, as effectively a man.

“I know I have the body of a weak, feeble woman,” Elizabeth said to her soldiers, “but I have the heart and stomach of a king, and of a king of England too.” Writing in the academic journal PMLA, Mary Beth Rose says that Elizabeth is often seen as having functioned “politically by disarmingly acknowledging her femininity and then erasing it through appropriating the prestige of male kingship.” The speech at Tilbury is rightly considered an exemplary demonstration of Renaissance rhetoric, the strategically and politically brilliant monarch rallying a nation that faced an implacable foe, a ruler whose own position was precarious, but who combined both classical learning and the plain style into a patriotic argument concerning her own sovereignty. 

Her call to arms does something else as well though, something which defined her reign – it gendered the Queen in a specifically masculine way, while acknowledging the reality that she was biologically a woman. Elizabeth’s speech thus has some fascinating resonances for us today, seeming to anticipate a more scientific understanding as summarized in the American Psychologist that “individuals show variability across the different components of gender/sex, presenting a mosaic of biological and psychological characteristics that may not all align in a single category of the gender binary.” It’s widely understood today that sex and gender are two different things, and that the latter need not match the former – something that Elizabeth gestured toward at Tilbury in 1588. Her speech helps give us a more nuanced understanding of “gender” over the centuries, and shows how her rhetoric anticipated a more scientifically accurate view. 

So often the supposed strict gender binary between male and female, which is enshrined by traditionalists who are adamant on its universality, fails to take account of the ways in which our understanding of what it means to be a woman or a man has been different in the past, and often for surprising reasons. It should be made clear that the argument is neither that Elizabeth considered herself to be a man, nor that it would be appropriate to classify the Queen as being transgendered. 

Yet as scholar Janel Mueller explained in a lecture delivered before the Washington DC chapter of the University of Chicago Alumni Club, Elizabeth had “no anxiety in representing herself as being above and beyond the social and biological mandates that her age attached to womanhood.” What Elizabeth’s speech does is provide an example of the difference between sex and gender from a surprising source, an illustration of the fact that gender is more complicated and variable than is sometimes assumed, and that during a time as authoritarian and reactionary as the Renaissance there could exist a progressive understanding of sexuality, albeit for ends which would seem foreign to us today.

Because if we can think of the Tilbury speech as “progressive,” insomuch as that’s a useful term, it was in spite of a profoundly regressive social context. When Elizabeth ascended to the throne, it was after the death of her sister Mary I, who, alongside her husband the Spanish King Philip II, ruled as the first female monarch given the traditional authority of a king. With the exception of her cousin Mary, Queen of Scots, Elizabeth was the sole European queen to be vested with the same awesome monarchical power as the absolute rulers of the continent, and the fact of her female body complicated her power during an otherwise profoundly misogynistic century. 

Writing in 1558, the year of Elizabeth’s coronation, the Scottish Presbyterian theologian John Knox argued in his pamphlet The First Blast of the Trumpet Against the Monstrous Regiment of Women that to “promote a woman to bear rule, superiority, dominion, or empire above any realm, nation, or city, is repugnant to God… it is the subversion of good order, of all equity.” Knox’s pamphlet was written during an era marked by the violence of witch-burnings, whose thousands of casualties have been interpreted by some theorists as a gendercide which marked women for extermination. That within this era Elizabeth was able not just to rule, but to thrive and oversee a cultural golden age, is remarkable. She was able to do this, in part, by drawing upon theological and political understandings of what a monarch was supposed to be that relied on a malleable model of gender.

Traditionally and mythically Elizabeth has been regarded as a cagey tactician regarding her own cult of femininity; the emphasis on the Queen’s chastity and propagandistic representations of her meant to conjure an image of the Virgin Mary, whose icons had so recently been deposed in the English Reformation, all attest to the ways in which womanhood was used for political purposes. But as Rose writes, Elizabeth’s “rhetorical technique involves appeasing widespread fears about female rule by adhering to conventions that assert the inferiority of the female gender only to supersede those conventions.” Elizabeth’s words at Tilbury indicate that relying on tropes of femininity and virginity wasn’t the only way she solidified support, nor necessarily the most effective. What was even more authoritative was that Elizabeth was able to imagine herself, and more importantly to compel her audience to imagine her, as being a man even while her biological sex was female. The authors of the American Psychologist article summarize the orthodoxy of the gender binary system which “typically assumes that one’s category membership is biologically determined, apparent at birth, stable over time, salient and meaningful to the self, and a powerful predictor of a host of psychological variables,” but Elizabeth’s declaration that “I have the heart and stomach of a king” greatly complicates that conservative perspective. 

For both theorists and activists, clinicians and physicians, there has been a developing understanding that the gender which someone knows themselves to be may not culturally match the biological sex which others understand them to have. Research on the relationship between sex and gender has enriched our understanding of the blurriness of those terms, and the complexity between them. Interestingly, however, this more modern, complex, scientific, and accurate understanding was anticipated during the sixteenth century and earlier. Historian Ernst Kantorowicz supplied a vocabulary for comprehending Elizabeth’s language in his seminal 1957 study The King’s Two Bodies: A Study in Medieval Political Theology. In that work, Kantorowicz explains how for medieval and Renaissance political theorists, a monarch had to be understood as composed of two elements – the “body natural” which referred to their actual anatomical being, and the “body politic” which was the abstracted, spiritual, transcendent quality which marked them as sovereign. He writes that the “King’s Two Bodies thus form one unit indivisible, each being fully contained in the other.” For male monarchs, it necessarily holds that the masculinity of both the body politic and the body natural are in alignment; what makes the Tilbury speech so radical is that Elizabeth allowed for the possibility of a female body being unified with the male body politic; she in essence spoke in a language which anticipates the subtlety, nuance, and complexity of gender which we’ve come to more fully understand in the past century. 

The King’s Two Bodies was a political theology designed and understood to uphold a reactionary system, and yet by serendipity it could also inadvertently allow for more enlightened conclusions, as when Elizabeth challenged not just the Spanish Armada at Tilbury, but the strictures of a constructed gender binary as well. 

COVID Man: Life during Wartime 523

Reading Pope Francis's "Fratelli Tutti" through Carl Sandburg

 

 

 

In reading Pope Francis’s latest encyclical, Fratelli tutti [All brothers]: On Fraternity and Social Friendship, I was struck by how much it reminded me of Carl Sandburg’s life (1878-1967) and works. The latter included not only his poetry and many volumes on Abraham Lincoln, but also many other works of prose. Most noteworthy were the Pope’s words “we need to think of ourselves more and more as a single family dwelling in a common home” (Point #17) and similar words of Sandburg in the Prologue for the extremely popular The Family of Man (1955).

 

The Family of Man was primarily a book of photographs put together by Sandburg’s brother-in-law, Edward Steichen. It featured pictures from around the world meant to demonstrate humanity’s oneness. In it Sandburg wrote: “Everywhere is love and love-making, weddings and babies from generation to generation keeping the family of Man alive and continuing. . . . Though meanings vary, we are alike in all countries and tribes in trying to read what sky, land and sea say to us. Alike and ever alike we are on all continents in the need of love, food, clothing, work, speech, worship, sleep, games, dancing, fun.”

 

Like Pope Francis, Sandburg was especially insistent on the oneness we should display toward the poor, unfortunate, and victims of discrimination. In his encyclical the pope writes, “Racism is a virus that quickly mutates and, instead of disappearing, goes into hiding, and lurks in waiting” (#97). In 1961, the poet’s friend Harry Golden wrote that for Sandburg “the fight against anti-Semitism and Negrophobia had been a special project.”

 

In his book The Chicago Race Riots, July 1919, the poet indicted the racism that had propelled attacks against African Americans. In 1966, he was made a lifetime member of the National Association for the Advancement of Colored People (NAACP), and its head, Roy Wilkins, declared him “a major prophet of Civil Rights.” (Unless otherwise indicated, all quotes about and from Sandburg are taken from my “The Wisdom of Carl and Paula Sandburg.”) During World War II the poet hired two Japanese-Americans to work for him during the same period that over 100,000 other such Americans were being uprooted and sent to internment camps. He also wrote a column warning against such prejudice.

 

Although not a Catholic, Sandburg shared the Pope’s abhorrence of religious prejudice. While Francis wrote of the necessity of “harmony and understanding between different cultures and religions” (#279), Sandburg once indicated his appreciation of various religions by saying, “I am a Christian, a Quaker, a Moslem, a Buddhist, a Shintoist, a Confucian, and maybe a Catholic pantheist.” 

 

The works of both the poet and the Pope also display a deep sympathy for immigrants. Sandburg was the son of two Swedish immigrants, and Francis’s father was an Italian immigrant to Argentina, where he married Francis’s mother, also of Italian ancestry. In such poetic works as his Chicago Poems (1916) and The People, Yes (1936), Sandburg displays his affection for immigrants. In his encyclical, Francis indicates that our attitude toward them should be “welcome, protect, promote” (#129).

 

In his The People, Yes and other works, in the words of one scholar, Sandburg “wanted Americans to understand and appreciate the common man and the common laborer.” Toward the end of his last volume on Lincoln, he wrote “to him the great hero was The People. He could not say too often that he was merely their instrument.” This emphasis on the masses was a consistent element in Sandburg’s writing. In his Chicago Poems, in the poem “I Am the People, the Mob,” he wrote, “I am the people—the mob—the crowd—the mass, / Do you know that all the great work of the world is done through me?” Francis’s encyclical displays a similar spirit: “To be part of a people is to be part of a shared identity arising from social and cultural bonds. . . . a slow, difficult process . . . of advancing towards a common project” (#158).  

 

In Fratelli tutti, Francis uses the phrase “common good” more than 30 times. As he made clear in his 2015 Address to the U.S. Congress, this should be the main goal of politics (Sandburg himself had been privileged to address the Congress in 1959, on the occasion of the 150th anniversary of Abraham Lincoln's birth). 

 

To achieve this good, both the Pope and the poet advised openness, tolerance and dialogue. In The People, Yes, the poet wrote: 

 

Let the argument go on

. . . . . . . . . . . . . . 

The people have the say-so. 

Let the argument go on

. . . . . . . . . . . . . . 

Who knows the answers, the cold inviolable truth? 

. . . . . . . . . . . . . . . . . . . . . . . . . . . 

And how few they are who search and hesitate and say: 

"I stand in this whirlpool and tell you I don't know and if I did know I 

would tell you and all I am doing now is to guess and I give you

my guess for what it is worth as one man's guess.”

 

In 1959, Sandburg went with his brother-in-law to a traveling photography exhibit in Moscow, where both men hoped the exhibition would further understanding among Cold War rivals.

 

Chapter Six of Francis’s encyclical is entitled “Dialogue and Friendship in Society,” and in it he writes, “if we want to encounter and help one another, we have to dialogue.” He eschews rigid dogmatism and intolerance and believes that people should reason together to advance the common good. In his 2015 encyclical on climate change he mentioned dialogue about two dozen times and in his address to Congress a dozen times, while also adding that “a good political leader is one who, with the interests of all in mind, seizes the moment in a spirit of openness and pragmatism.” 

 

Although the pope does not use the word “capitalism” in his present encyclical, just as he did not in a detailed 2013 criticism of capitalist excesses, he again employs various terms to decry the economic practices of countries, like the USA, that are often labeled “capitalist.” For example, “Local conflicts and disregard for the common good are exploited by the global economy in order to impose a single cultural model.”  

 

Sandburg was also a strong critic of capitalist excesses. For at least a decade, from 1907 to 1917, he considered himself a socialist, and strongly defended Socialist Party positions until he split with most socialists in 1917 by supporting President Wilson’s decision to go to war with Germany. Yet in the 1920 presidential election he still voted for his friend, the Socialist candidate Eugene Debs. Later on, Sandburg was a strong supporter of Franklin Roosevelt, who hemmed in capitalist excesses with his New Deal and was, according to historian Robert Dallek, “ever the pragmatist.” 

 

In my essay on Sandburg’s wisdom is a section on “Achieving Wisdom and Balance in an Age of Consumer Culture and Mass Media,” which points out the poet’s early criticism of TV’s emphasis on consumption--e.g., “more than half the commercials are filled with inanity, asininity, silliness and cheap trickery.” Similarly in a Fratelli tutti section labeled “Information without wisdom,” the pope writes, “The flood of information at our fingertips does not make for greater wisdom. Wisdom is not born of quick searches on the internet nor is it a mass of unverified data. That is not the way to mature in the encounter with truth.” 

 

In his encyclical the pope also emphasizes, or at least alludes to, other subjects not stressed by Sandburg. For example, he writes that social love “makes it possible to advance towards a civilization of love, to which all of us can feel called.” This love is a “force capable of inspiring new ways of approaching the problems of today’s world, of profoundly renewing structures, social organizations and legal systems from within” (#183). Although Sandburg does not theorize about “social love,” his political philosophy suggests it. 

 

Nor did the poet speculate much about some of the other subjects directly addressed by the Pope’s encyclical. These include war and capital punishment, both of which Francis criticizes at length, and mistreatment of our environment, to which Francis devoted an earlier lengthy encyclical and which he now mentions once again.

 

Nevertheless, Sandburg’s multi-volume biography of Lincoln did include the four-volume Abraham Lincoln: The War Years (containing over a million words). Early in World War I, he wrote several poems lamenting the loss of human lives. In August 1941, he stated, “Who wants war? Nobody. Only fools and idiots want war. . . . Yet sometimes the issue comes before a nation of people: Will you fight a war now, or would you deliberately choose another later inevitable war?” In WWII, he wrote a weekly column supporting the U.S. war effort for the Chicago Times syndicate, some of them later published in his 1943 book, Home Front Memo. 

 

Today, however, more than a half-century after Sandburg’s death, we live in a nuclear age. In his encyclical Francis writes: “At issue is whether the development of nuclear, chemical and biological weapons, and the enormous and growing possibilities offered by new technologies, have granted war an uncontrollable destructive power over great numbers of innocent civilians. The truth is that never has humanity had such power over itself, yet nothing ensures that it will be used wisely” (#278). (His words remind us of those of WWII General Omar Bradley: “Ours is a world of nuclear giants and ethical infants. If we continue to develop our technology without wisdom or prudence, our servant may prove to be our executioner.”) 

 

Fratelli tutti also condemns capital punishment--“All Christians and people of good will are today called to work . . . for the abolition of the death penalty.” Although Sandburg does not theorize about the death penalty, a few of his poems such as “Killers” and “Legal Midnight Hour” suggest his disapproval of it.

 

Finally, Sandburg and Pope Francis share two other convictions--the greatness of Abraham Lincoln and the importance of history. In his address to Congress, the Pope listed Lincoln as one of four great Americans. The poet’s six-volume biography of Abraham Lincoln won the Pulitzer Prize for history in 1940, and he frequently pointed out Lincoln’s significance to leaders such as FDR. 

 

In Sandburg’s long novel Remembrance Rock (1948), which his publisher declared “follows the growth of the American dream through more than three centuries of our nation’s history,” he has his hero say, “When a society or a civilization perishes, one condition can always be found. They forgot where they came from.” Francis also bemoans the loss of “historical consciousness.” He advises people not “to ignore their history, to reject the experiences of their elders, to look down on the past”; not to “spurn the spiritual and human riches inherited from past generations” (#13); and not to “forget the lessons of history” (#35).

Joe Biden Is Setting A Record That No One Has Noticed

 

 

Most articles about this year’s Democratic presidential candidate, Joseph Biden Jr., mention his age, 77, which makes him the oldest major party standard-bearer in U.S. history.  Yet few mention that Biden’s third appearance on a major party’s national ticket – he won the vice presidency in 2008 and 2012 – vaults him into third place among Americans achieving multiple national nominations, in a tie with six predecessors.  

Biden shares this distinction with giants in our past (Thomas Jefferson, Andrew Jackson) and with some who left smaller footprints in history (George Clinton, Rufus King, Charles Cotesworth Pinckney, and William Jennings Bryan).  The qualities required for that achievement include good health, a tenacious appetite for electoral politicking, and a knack for sustaining popularity among the party faithful for an extended period of time.

Winning multiple nominations on major party tickets has not been a reliable predictor of presidential excellence.  Franklin D. Roosevelt leads the overall list with five such nominations, having won the presidency in four consecutive elections between 1932 and 1944, thereby redeeming his unsuccessful run for vice president in 1920.  Roosevelt’s reputation has remained sky-high.  He finished third in C-SPAN’s 2017 survey asking historians to rank U.S. presidents.   

Yet Roosevelt is tied at the top of the list with Richard M. Nixon, who won the vice presidency in 1952 and 1956, lost the presidency in 1960, then came back to take the top spot in 1968 and 1972.  Nixon, however, left the White House in disgrace and finished 28th in the same C-SPAN survey of presidential performance.

Second place on the frequency list is shared by two one-term presidents with four runs on a major party ticket, but only middling rankings for performance in office.  John Adams won the vice presidency in the nation’s first two elections (1789 and 1792), the presidency in the third (1796), and lost a presidential run in the fourth (1800).  George H.W. Bush followed the identical pattern, serving two terms as vice president to Ronald Reagan in the 1980s, winning his own term as president in 1988, then losing his race for re-election.  Perhaps fittingly, Adams and G.H.W. Bush finished neck-and-neck in the C-SPAN survey, ranked 19th and 20th, respectively.  

Two political figures require judgment calls in the frequency-of-nomination rankings.  Theodore Roosevelt was the Republican candidate for vice president in 1900 and for president in 1904.  But what about his campaign in 1912 as the Progressive Party candidate for president?  The Progressives swiftly faded to obscurity, but Roosevelt finished second in 1912, well ahead of Republican incumbent William Howard Taft.  On balance, that seems to qualify him as a “major party” candidacy.

An easier case involves Martin Van Buren, a Democrat who won the vice presidency in 1832, and the presidency in 1836.  Van Buren returned to presidential politics a dozen years later at the top of the Free Soil Party ticket.  Since Van Buren’s Free Soil Party won zero electoral votes in 1848, it seems fair to exclude that campaign from the frequency-of-nomination ranking.

Then there is the sad record for futility shared by Rufus King, Charles Cotesworth Pinckney, and William Jennings Bryan.  Though all three ran on the national tickets of major parties on three occasions, none ever won.

If former Vice President Biden comes out on top in this November’s poll, he will achieve one more distinction, joining Franklin Roosevelt, Nixon, Adams, Jefferson, and George H.W. Bush as the only Americans to win at least three elections as a major party candidate.  That’s pretty fair company, at least as American presidents go.

An Interview with Ian W. Toll, author of "Twilight of the Gods: War in the Western Pacific, 1944-1945"

 

 

 

Books about World War II generally fall into two categories. The first is the hour-by-hour account of individual battles, often told via accounts of individual soldiers. One example is Eugene Sledge’s With the Old Breed: At Peleliu and Okinawa, which was the basis for the HBO mini-series The Pacific.  

 

The second type is the “big picture” or strategic view, such as A World at Arms: A Global History of World War II, by Gerhard Weinberg.  

 

Ian W. Toll’s unique approach is to skillfully interweave narratives of the bloody, on-the-ground combat with an in-depth look at the Washington D.C. decision-makers who picked the commanders and determined long-term strategy.  

 

Twilight of the Gods is the last of Toll’s magisterial three-volume account of World War II in the Pacific. His first two volumes, Pacific Crucible: War at Sea in the Pacific, 1941-42 and The Conquering Tide: War in the Pacific Islands, 1942-1944, covered the earlier years. 

 

Why did we make our first stand against the Japanese at Guadalcanal and not another island? Why did Iwo Jima cost so many Marine casualties? And why was the Japanese Zero so deadly in the first years of the war? Twilight of the Gods answers these questions and gives the reader thrilling eyewitness accounts of the major battles of the war’s last year.   

 

Toll has an unusual background for a best-selling historian. After graduating with a master’s in public policy from the Kennedy School of Government at Harvard University, he served as a political aide to U.S. Senator Paul Sarbanes and a New York lieutenant governor. He then worked as an equity research analyst at three major investment banks before quitting to become an author. 

 

Q. Twilight of the Gods is one of the few WWII histories in which the author includes both detailed battle scenes (e.g. Iwo Jima) and a “macro view” of the strategic discussions by FDR and the Joint Chiefs.  What were your challenges in accomplishing this?  

 

A. While researching my first book, Six Frigates, I concluded that a history of naval operations for that period was insufficient. I had to tell the entire story of the early American republic and its major political and diplomatic challenges in order to place the military history in context. The dilemma was essentially the same for my Pacific War trilogy. I believe that military histories have tended to take a “stay in your lane” approach, adhering to accounts of battles and operations. I prefer to weave the strands of politics and foreign policy into the fabric of the narrative. 

 

Q. You did not start writing history books until you’d had successful careers in two other fields. Why such a late start? 

 

A. I had always wanted to be a writer, since childhood; and I had always been a voracious reader of history. In 2002, I had read almost everything there was to be found on the early American Navy, and I came to believe that there was a very good book that had not yet been written. I left my job (in banking) after 9/11, wrote a first chapter and a book proposal for Six Frigates, and to my surprise the book was acquired by a major trade publisher, W.W. Norton. 

 

Q. WW II is now 75 years distant, yet it remains fascinating to Americans in a way that more recent wars (e.g. the Korean War, the Vietnam War) are not. Is this because the war was so large, or because America seemed more “united”?  

 

A. The Second World War was the largest and bloodiest conflict in human history, and for that reason alone it commands more attention than regional wars such as those in Korea and Vietnam. But I think you are right in suggesting that we tend to see the war as a contest between good and evil, and that enhances its appeal and fascination, particularly to Americans and our allies, the British.

Q. One of the most interesting figures portrayed in your trilogy is Admiral Ernest J. King. In your books he emerges as a key architect of victory in the Pacific and an individual who excelled at picking the commanders who operated underneath him.  He has been less favorably portrayed in other WW II histories. What factors caused you to rate him so highly?  

A.  It seemed obvious to me that Ernest King did more than any other American military figure to shape the course of the Pacific War, but he is barely remembered at all by those who are not aficionados of World War II and naval history. Admiral King kept a low-profile and did not seem much interested in leaving his mark in history. 

This is a larger issue, generally, for historians and biographers -- figures who shy away from publicity and do not take much of an interest in promoting their own place in history tend to receive less attention than those who seek more attention. The Pacific War provides many good examples. Why are there so many biographies of Admiral Bull Halsey, but only one on Raymond Spruance, who was a better fleet commander? Why does everyone know George S. Patton, while many have forgotten the 5-star general Omar Bradley, who stood above him in the chain of command? Historians and biographers should not allow themselves to be led astray by disparities in 1940s-era publicity.

 

The Roundup Top Ten for October 23, 2020

The Framers of the Constitution Didn’t Worry about ‘Originalism’

by Jack Rakove

"Some of the key words and terms in our constitutional vocabulary were subject to pounding controversy and reconsideration. One has to engage these debates to understand how Americans were thinking about these issues at the time."

 

Religious Identity And Supreme Court Justices – A Brief History

by Nomi Stolzenberg

In recent decades, religious influence on the Court has been shaped by conservatives of different faiths, construed as part of a mythical Judeo-Christian tradition, coalescing around a common agenda defined less by affiliation with a religious denomination than by opposition to liberalism and secularism.

 

 

What Fans of "Herd Immunity" Don't Tell You

by John M. Barry

Prolonged isolation measures to fight COVID-19 do cause harm--social, emotional, and economic. But advocates of "herd immunity" are not offering a practical or safe plan to protect the vulnerable if the virus spreads on a mass scale. 

 

 

American Exceptionalism Gives Voters a False Sense of Security about the Election

by Melissa J. Gismondi and Shira Lurie

American democracy won’t endure just because it always has. In this moment, American exceptionalism could prove fatal.

 

 

Disenfranchisement in Jails Weakens our Democracy

by Charlotte Rosen

Because the pretrial population is disproportionately non-White, this kind of “de facto disenfranchisement” constitutes an abhorrent form of racist voter suppression, despite rarely gaining the headlines and outrage that long voting lines do. 

 

 

The Women Behind the Million Man March

by Natalie Hopkinson

Community archives such as the District of Columbia’s are critical interventions into the omissions of history. This one, like others, makes clear that behind every great feat in the public record lies an untold story of the unsung foot soldiers, architects, analysts and fixers — and these are often women.

 

 

We All Think History Will Be on Our Side. Here's Why We Shouldn't

by Priya Satia

We would do better to listen to today’s historians in order to understand how we got here and recover other guides to conscience, not just look to future historians for consolation.

 

 

Toward a Global History of White Supremacy

by Daniel Geary, Camilla Schofield, and Jennifer Sutton

We need to understand the history of global connections between white supremacists if we are to grasp what has sustained white nationalism despite global trends toward liberation and equality.

 

 

Conservative Activists in Texas Have Shaped the History All American Children Learn

by Rob Alex Fitt

"Liberal groups such as People for the American Way were aghast at what was happening in Texas. They launched counter campaigns in the early 1970s to try to break conservative activists’ stranglehold on the textbook selection process, to no avail."

 

 

1619, Revisited

by Nicholas Guyatt

Argument isn’t an obstacle to the work of historians; it is the work of historians. Public interest in 1619 has suggested something truly profound: that Americans have the capacity to think differently about their history. 

 

Fraught Family Reunification After the Holocaust

Jewish Youth liberated from Buchenwald, en route to an Oeuvre de Secours aux Enfants (OSE) home in Ecouis, France, 1945

 

 

 

As the Second World War ended, an estimated 150,000 – 180,000 child survivors of the Holocaust emerged from their hiding places or centers of internment. They were a tenth of Europe’s pre-war population of Jewish children, a fragment of an entire generation. In the months and years that followed, these child survivors began to search for their families, and some (we will never know exactly how many) were reunited with surviving mothers and fathers. But this was rarely the happy ending that we would like to imagine. 

 

In my latest book, Survivors: Children’s Lives After the Holocaust, I found that the stories of child Holocaust survivors upended my assumptions again and again, and nowhere was this truer than around the emotive topic of family reunions. We are so used to encountering scenes of joy as families are reunited in Holocaust film and literature that it is challenging to accept the reality of a bleaker picture. The notion that family reunification was the best possible outcome for a survivor child is deeply seductive, particularly as so many survivor children found themselves with no remaining family members at the war’s end. Children who found their way back to surviving mothers and fathers were frequently told how lucky they were. Yet of the 100 children whose stories I examine in the book, not one who was reunited with surviving parents described the experience as joyful. They often had far more painful post-war experiences than those who found themselves living in care homes, without family. Their stories expose just how deep is our desire to see the survival of a family in and of itself as a victory.

 

If we consider the complexities of the early postwar period, we begin to see why it was intensely difficult for surviving children and parents to find themselves together again. The reunified Jewish family was profoundly fragile. Parents and children had often been apart for years, and the war had changed them. They sometimes could no longer speak the same language. Children had acquired new wartime identities, particularly if they had survived in hiding: they may have had new names, new religious identities, and new attachments to families who had hidden them during the war. If they were young enough, they may have had no memory of their parents at all.

 

Moreover, parents had changed. They had experienced internment and concentration camps, forced labor, life in hiding, or the anxiety-ridden experience of those who tried to pass as Aryan. Many were in a state of physical and emotional collapse by 1945. These exhausted adults returned to find strangers living in their homes, their possessions scattered, their jobs gone. Adult Holocaust survivors faced staggering poverty in the early postwar period, and children who went back to live with surviving parents often found that their families lacked housing, food and clothing. In such circumstances, the additional burden of caring for a child could push a family to the point of collapse. 

 

We can see this in the numbers. The American Jewish Joint Distribution Committee supported 120,000 child Holocaust survivors after the war, and a majority of these – 85,000 children – were living with a surviving parent or relative who was too impoverished to care for the child without financial aid. Children living in care homes intended for orphans also frequently had surviving parents. In September 1946, the French aid organization Oeuvre de secours aux enfants (OSE) had 1,207 Jewish child survivors living in its network of care homes, but only a quarter were full orphans. We might expect to see these numbers decrease over time, as parents found their feet in the postwar world and reclaimed their children from care, but the opposite was true: the OSE was shocked to find that as time went by, more and more surviving parents tried to place their children in the agency’s care. Surviving parents saw in such institutions an opportunity for their children to have a better quality of life than in the family home. These arrangements were often meant to be temporary, but they also extended the period in which children and parents became ever more estranged from one another – sometimes until too much time had passed for the relationship to be restored.

 

Many children struggled to trust survivor parents who were essentially strangers. Some felt anger that their parents had abandoned them, and further anger when these parents removed them from wartime rescue families where they had felt happy and comfortable. Whether they had survived the war in hiding or in internment, children had often needed to be obedient, quiet and good to stay safe. With that need gone, they could rebel. They withdrew emotionally from their parents, refusing to touch them or even to accept them. Henri O., who survived in hiding in the Netherlands as a very small child, was reunited with both his mother and father after the war. He was five years old. He recalled the discomfort of their reunion:

 

When they turned up, I recognized my mother, and I said, 'you stayed away a very long time.’ Yeah, two and a half years, half my life. Okay. And then somebody says, ‘Why don’t you want to sit on your daddy’s lap?’ So I sat on my daddy’s lap. But it wasn’t quite the same.

 

For some child survivors, one of the greatest frustrations in these reunited homes was the enforced silence around the wartime past. Many had urgent questions that surviving parents did not want or could not bear to answer, particularly about murdered relatives. Those living with surviving mothers asked after their murdered fathers (or the reverse), and frequently had their enquiries rebuffed. That precious knowledge – what was my father or mother really like? – was withheld by parents too traumatized to engage in the work of remembering their dead partners. In other cases, the enormity of a parent’s own traumatic memories intruded on the daily life of the household. Cecile H., who had survived the war by escaping with her mother to neutral Switzerland, was reunited with her concentration-camp survivor father after the war:

 

Father had found photos in the barracks or SS lockers when he was liberated, and had taken some because he was worried that no one would believe him. He hid these pictures in the house on top of a closet, and once I took those pictures and lay them all out on the floor, and I’ll see those pictures until my dying day. […] I kept having nightmares after that, and [my father] burned them. He said, ‘I’m never going to talk about that again.’ It was a difficult time for us, after the war.

 

When thinking about families after the Holocaust, we should be attuned to the incredible challenges that these families faced. We should consider what sorts of outcomes awaited children who went back to live with surviving parents, how this felt to children, what it meant to them, and what role the memory of the recent past played in reconstructed households. We should, moreover, be attuned to what light these stories of reunified families might shed on a present world in which child refugees are regularly separated from their parents – a practice which, these stories suggest, is not only unimaginably cruel in the present but also carries harrowing long-term consequences for refugee families as they journey into uncertain futures.

"Every Goodbye Ain’t Gone and Every Close Eye Ain’t Shut": Black Georgians' Memories and Election Unease

 

 

During the June 9th primary elections in Georgia, Black voters were alarmed as they stood in line to vote, in some cases for three to four hours.  Terri Russell told the New York Times, “I refuse not to be heard and so I am standing in line.” Her dedication was even more poignant as Russell, who is 57 years old and who suffers from both asthma and bronchitis, had requested and never received an absentee ballot, and consequently was risking her life by standing in line during a pandemic in a county with one of the highest rates of COVID-19 cases in the state.  Elections officials blamed the problems, which primarily plagued Black and Democratic regions of the state, on new voting machines. For many, including Stacey Abrams, the problems harkened back to her narrowly defeated 2018 bid to become the state’s first Black governor and the nation’s first Black woman to win a governorship.

Nearly four months later, on September 29th, when Chris Wallace called on President Donald Trump to denounce white supremacist groups like the Proud Boys during the first Presidential Debate, the President responded by calling on the militia group to “stand back and stand by.”  For many in Black communities, a collective shudder like steel against porcelain was palpable.  The primary elections and Presidential Debate of 2020 remind many in Black communities of efforts to mobilize Black voters during the 1960s, but the reality is that semblances of these events reach further back to Reconstruction, when formerly enslaved people were first able to vote en masse.  This was a time when millions of Blacks, fresh from the horrors of slavery and the Civil War, held onto the promise that with Emancipation they could participate actively in the American experiment. But in Georgia, it would not be until the spring of 1868 that they would be able to run for office and vote for laws that they had participated in writing.

In the spring of 1868, African American voters faced long lines at the polls—when and where polls opened at all.  Georgia’s newly ratified constitution expanded education for all Georgians as well as property rights for women, yet local White officials frequently questioned Black Georgians’ right to vote.  Black Georgians faced tremendous odds to vote, even in regions of the state with Black majorities, and only thirty-three of the nearly two hundred legislators elected in 1868 were Black.  Less than five months later, they would be expelled from office because they were Black, and it would be more than a year before they would regain their legislative seats.  During that year, White officials throughout the state targeted Black voters with the poll tax. For Black Georgians who organized politically, the message was clear: the Black franchise was a threat to be quelled.  A well-known incident of Black voter suppression was the Camilla Massacre of September 19th, 1868, in Southwest Georgia near Albany.  Hundreds of local, primarily Black residents marched to Camilla to attend a political rally, but before they could reach the town square, White residents and officials shot and killed nearly twenty of them while countless others were injured and hunted down as they fled back to Albany.

Efforts to dissuade Black voters frequently materialized as campaigns of threats against both Black voters and Black political leaders, who were often beaten in the dark of night.  Others involved in political organizing were held indefinitely in jail without cause, as was the case with F.H. Fyall, one of the Black legislators expelled from office.  Fyall was eventually arrested for his political activity and held in jail for nearly two months because of his refusal to switch his political affiliation.  Among the more high-profile targets of Black voter suppression were Henry McNeal Turner, a Georgia legislator, minister, and future Bishop in the African Methodist Episcopal Church, and sixty-four-year-old Tunis Campbell Sr., a Georgia legislator and an Elder in the African Methodist Episcopal Zion Church.  Turner faced systematic threats for his political activity, and Campbell would find himself incarcerated and on a Georgia chain gang. Blacks were also threatened economically for political activity: Sam Gilbert, a freedman from Houston County, had his wages reduced for attending a political meeting.

By the fall elections of 1872 the state was firmly under conservative control, and state leaders reinstituted poll taxes as a means of discouraging Black voters who were already facing economic challenges due to low wages.  Governor James M. Smith legally reconvened the state militia under the guise of promoting peace and civility, and he supplied guns and munitions to state-sanctioned militias that were overwhelmingly composed of former Confederate soldiers.  During the elections that fall, these armed groups frequently surrounded polling stations, bringing violence and intimidation to virtually every part of the state, blurring the line between state-sanctioned armed groups and the local Klan, and promoting their form of “law and order.”

Black voters and politicians were helpless to oppose state-sanctioned suppression, and the number of Black legislators declined even in counties with clear Black majorities.  Moreover, Georgia officials, hoping to avoid any return to federal oversight, moved state elections back a month to October and required Black voters to produce poll tax receipts.  Unsurprisingly, Ulysses S. Grant again lost the state during his bid for re-election in November of 1872.

For most Georgians, the history of Reconstruction in their state is not a familiar one, but the unspoken realities of past voter suppression in Georgia during this era resonate in the present experience of many Black Georgians.  Many have a tacit understanding that, even with the promise embodied in the election of Barack Obama in 2008 and 2012, the past is not gone, but promises to resurface like a shark in deep water with efforts to de-register voters or with voting machines that don’t work in Black or brown communities.  Voter suppression also resurfaces as Black voters are forced to choose between staying at home in an election and risking their lives to COVID-19 by waiting to vote in long lines, enduring the potential threat of intimidation by “patriots” emboldened by the President himself to monitor polling stations.  A century and a half after Emancipation, many Blacks in Georgia and throughout the nation still feel a familiar unease, and many understand the proverb, “Every Goodbye Ain’t Gone, and Every Close Eye Ain’t Shut.”

Treason, the Death Penalty, and American Identity

Pueblo de Taos, 1847

 

 

In 1847, in a dusty plaza in Taos (now part of the American state of New Mexico), American authorities tried and executed a thirty-nine-year-old man named Hipolito Salazar for treason against the United States.  He is the only person ever executed for this crime since the adoption of the United States Constitution.  His story has faded into almost complete obscurity, but it tells us much about American national identity and whom we are willing (and not willing) to execute for acts of national betrayal.         

According to the solemn declarations of judicial opinions and legal treatises, treason is the highest crime in American law, worse even than murder.  Legal historians, however, know that such statements should not be taken at face value; the law in action is often quite different from the law stated in the books.

Since the late eighteenth century, America has executed thousands of people for murder, but only a minuscule number for treason.  In addition to the Salazar execution, a small handful of people were executed as traitors to the individual states during the American Revolution, and John Brown and Edwin Coppoc were executed for treason against Virginia in 1859 for their role in the raid on Harpers Ferry.

If murder were committed far more frequently than treason, this disparity would be unremarkable.  In most years, treason is indeed rare.  But in a handful of turbulent years, treason has been widespread.  Although precise data do not exist, it is clear that tens of thousands of Americans have committed treason.  During the American Revolution, thousands of people sided with the British and committed acts that brought them within the technical scope of state treason laws.  The largest number of offenses came during the American Civil War.  All the members of the Confederate military, along with any person who provided them aid and assistance, committed the crime of levying war against the United States.

But the Civil War did not lead to any treason executions, and there were very few during the American Revolution (many prosecutions failed because juries refused to convict, and where convictions were obtained, governors usually granted clemency).

So the disparity between treason and murder executions must be explained by other factors.  Two are particularly salient.

First, treason has generally been perceived as a political crime, not an individual crime of violence, even if it involved fighting as a soldier in the enemy’s military.  This was particularly true during the Revolution, when almost everyone knew someone—a friend, a neighbor, a relative—who had chosen the other side.  Such people were not incorrigible criminals, but ordinary Americans who could easily be welcomed back as productive members of society once the conflict was over.  Treason, in short, could be forgiven, whereas murder could not.

Second, conflicts like the Revolution and the Civil War would have been even more horrific if a wave of executions had followed their conclusion.  Once the war was over, the primary goal became national reconciliation, welcoming back fellow Americans who had laid down their arms.  There was little appetite for further bloodshed, or for the creation of martyrs around which disaffected individuals could rally for generations.

All of which suggests that something distinctive was happening in that plaza in Taos in 1847.  Until the publication of my book On Treason: A Citizen’s Guide to the Law, Salazar’s case was entirely unknown to American treason scholars, who insisted that no such executions had occurred (and I include myself in that category, having only learned of the case in 2019).

“Distinctive” would be an understatement.  In 1847, New Mexico did not belong to the United States, but to Mexico.  It would not be formally transferred to American jurisdiction until the 1848 Treaty of Guadalupe-Hidalgo.  Hipolito Salazar was a Mexican citizen who had never set foot in the United States.  As such, he owed no allegiance to the United States and could not be lawfully prosecuted for treason against it.

But when American military forces stormed into New Mexico in the Mexican-American War, they ignored these legal niceties, issuing proclamations stating that New Mexico was now part of the United States and that all residents of New Mexico owed allegiance to the United States.  Any military resistance to the American occupation would be treated as an act of treason.

Taos would later erupt in violent resistance.  The “Taos Revolt,” as it came to be called, was eventually suppressed by the American military, but American officials were determined to place the surviving leaders on trial.  Some were charged with murder, some with treason.  Of the treason defendants, Salazar was the only one who was convicted and executed.

As word of these trials seeped back to Washington, DC, many members of Congress responded with horror—why were American officials trying Mexican citizens, on Mexican soil, for the crime of treason against the United States?  The Polk Administration was forced to concede that the treason indictments had been issued in error.  Since New Mexico had not yet been formally ceded to the United States, treason was an inappropriate charge, and Salazar’s conviction was legally invalid.  But administration officials argued that this was a mere technicality—Salazar was basically a murderer who deserved to die, and whether the charge was treason or murder, he had received his just deserts.

Curiously, the same argument could potentially have been made eighteen years later, in the aftermath of the Civil War.  The Confederates could have been viewed as murderers, attacking American forces in the same way as had Salazar and his men.  But no one referred to the Confederates in this manner.  They were once and future Americans, part of the political community who had made a political mistake, but who could be welcomed back with open arms.

Salazar, by contrast, had never been an American; he had always been an outsider, perceived to be racially different.  And that fact, perversely, made him far more susceptible to execution for treason against the United States.  His offense could be casually compared to murder, in ways that other forms of treason would not.  And as a quasi-murderer, execution was entirely appropriate.

At the gallows, Salazar complained bitterly about the unfairness of his trial.  As the platform was about to drop, he uttered his last words, perfectly capturing his outsider status: “Caraho, los Americanos!”  Or, in English, “F--- the Americans!”

It is not a story that is recounted in our textbooks, but it should be.  It reveals much about who counts, and who doesn’t, in dealing with crimes of betrayal.  White Americans who waged war against their country and killed thousands of other Americans in defense of race-based slavery were forgiven and allowed to return to their regular lives.  By contrast, the only person to die for betraying America was not even an American at all, but a Mexican defending his homeland. In this case, as in so many others, the law in action leaves much to be desired.

  

Fear of the "Pussification" of America: A Short Cultural History

 

 

Of all the responses to the COVID-19 pandemic in the United States—ranging from debates over mask wearing to school closings— perhaps the most bizarre is the suggestion that this deadly disease can be avoided simply through manliness.

 

Nowhere was this made more explicit than when former US Navy SEAL Robert O’Neill shared a photo of himself, unmasked, on a Delta Airlines flight. “I’m not a pussy,” declared O’Neill on Twitter, as if to suggest that potent, masculine men, like those on SEAL Team 6, would not be cowed into wearing cowardly protective gear. (Never mind that a passenger sitting one row behind O’Neill, in a US Marine Corps baseball cap, was wearing his mask.)

 

 

O’Neill’s use of the “P-word” was far from an outlier; in fact, it has been employed near and far in recent months. Adam Carolla stoked public outcry only weeks later when he maintained, incorrectly, that only the “old or sick or both” were dying from the virus. “How many of you pussy’s [sic] got played?” the comedian asked.

 

Nor were these remarks limited to COVID-19. Not to be outdone by such repugnant rhetoric, President Donald Trump—who elevated the word during the 2016 presidential campaign for other reasons—reportedly lambasted senior military leaders, declaring that “my fucking generals are a bunch of pussies.” On the opposite end of the military chain of command, 2nd Lt. Nathan Freihofer, a young celebrity on TikTok, recently gained notoriety for anti-Semitic remarks on the social media platform. “If you get offended,” the young officer proclaimed, “get the fuck out, because it’s a joke…. Don’t be a pussy.”

 

What should we make of these men, young and old, employing the word as a way to shame potential detractors? Perhaps the most telling, and least surprising, explanation is that sexism and misogyny are alive and well in Trump’s America. Yet it would be mistaken to argue that the epithet has regained popularity simply because the president seemingly is so fond of the word. Rather, such language—and more importantly, what it insinuates—is far from new. 

 

In July, after Alexandria Ocasio-Cortez (D-NY) was verbally accosted on the Capitol steps by fellow representative Ted Yoho (R-FL), the congresswoman delivered a powerful speech on the House floor. The problem with Yoho’s comments, Ocasio-Cortez argued, was not only that they were vile, but that they were part of a larger pattern of behavior toward women. “This is not new, and that is the problem,” she affirmed. “It is cultural. It is a culture of lack of impunity, of accepting of violence and violent language against women, and an entire structure of power that supports that.”

 

She’s right. This “violent” language—calling women “bitches” and men “pussies”—and the understandings that accompany it have a long history in American popular culture. And few cultural artifacts depict such sexist notions more overtly than Cold War men’s adventure magazines.

 

 

These “macho pulps” were an outgrowth of earlier men’s periodicals, including Argosy and Esquire. In the aftermath of World War II, magazines with suggestive titles like Battle Cry, Man’s Conquest, and True Men exploded in popularity. The February 1955 issue of Stag, for example, sold more than 585,000 copies nationwide. The stories that filled these magazines portrayed the ideal man as physically tough, sexually virile, and unabashedly patriotic. Women, conversely, were represented either as erotic trophies of conquest or as sexualized villains to be overpowered.

 

Take, for example, an illustrative story from the March 1963 issue of Brigade. In “Castration of the American Male,” pulp writer Andrew Petersen decried how the “manly virtues—strength, courage, virility—are becoming rarer every day…. Femininity is on the march,” Petersen claimed, “rendering American men less manly.” To put a finer point on the message, Brigade included with the article a photograph of a sullen husband, in a floral apron, doing the dishes. The message seemed clear. The masculine ideal of sexual conqueror and heroic warrior, touted in nearly every issue of the pulps, was under assault.

 

 

Indeed, in Cold War men’s adventure magazines, “real men” were never “pussies.” They courageously defeated former Nazi henchmen and evil communist infiltrators. They exposed femmes fatales who were engaging in “sexological warfare,” using their physical bodies as weapons of war. And they seduced women across the globe, one navy vet describing himself in the pulps as a virile “bedroom commando.”

 

 

Yet just below the surface of these hypermasculine narratives, a subtext of anxiety loomed. Read a different way, the pulps might also be seen as a form of escapism from deep anxieties about not measuring up in a rapidly changing postwar society. Fears of being emasculated by Cold War suburbia and a consumeristic society pervaded these men’s magazines. Pulp writers, as seen in the Brigade article, habitually expressed concerns over American men becoming “soft.”

 

Arguably, these fears of losing one’s masculinity not only engendered hostility toward women but also spawned a backlash against those supposedly “weak” men who weren’t holding the line against an allegedly aggressive feminism. As Betty Friedan argued in The Feminine Mystique (1963), male outrage was the result of an “implacable hatred for the parasitic women” who apparently were denying husbands and sons a more vigorous, manly lifestyle.

 

The Vietnam War, at least in the pages of men’s magazines, seemed only to widen the gap between “real men” and their “pussy” compatriots. Saga lashed out at members of the “new left” and the blatant “draft dodging underground” taking hold on college campuses. Man’s Illustrated condemned the “cardburners” and “slackers” who had worked the system to stay out of uniform. One antiwar activist recalled hearing epithets of “faggots” and “queers” as often as “commies” or “cowards.” In the pulps, the best American men went to war, while the weaklings stayed home.

 

 

Such narratives outlasted the pulps themselves, which died out in the early 1970s. Stories glorifying war and sexual conquest seemed out of step with the cultural revolutions rippling through the United States in the immediate aftermath of a failed overseas war. Yet the macho pulp storylines retained their attraction enough to resurface only a few years later.

 

By the mid-1980s, “re-masculinized” men returned in full force. A finely chiseled Rambo deployed back to Vietnam to save American prisoners of war still held captive there. So too did Chuck Norris’s Colonel Braddock in the Missing in Action films. Even President Ronald Reagan took his cue from these tough-minded action heroes, quipping in 1985 that he would now “know what to do” if faced with a hostage crisis after watching Rambo: First Blood Part II. Would anyone call Rambo or Braddock a “pussy”?

 

The militarization of masculinity portrayed in the Stallone and Norris action movies had clear roots in the Cold War macho pulps. Nor should we be surprised by former SEAL O’Neill’s use of the term “pussy.” Because the veteran had achieved his manhood through military service, especially in an elite unit, he could be secure in demeaning others who didn’t meet his masculine ideals—which apparently also inoculated him from deadly viruses. 

 

Yet it’s not only the militarization of the “P” word that resonates, but the politicization of it as well. When Senator Ted Cruz (R-TX) recently claimed that “many liberal males never grow balls,” he was purposefully contrasting his own supposed conservative masculinity with the femininity of his political rivals, whether they be male or female. One wonders, though, if Cruz truly fashions himself as the new archetype for twenty-first-century manhood or simply hopes to score a few cheap political points via social media name-calling.

 

Or, conceivably, Cruz is channeling what has underscored a decades-long anxiety over American masculinity: that “real men” are on the verge of extinction because of political correctness gone awry, or a feminist movement subverting traditional gender norms, or any other imagined threat that stokes fears among mostly white, young, angry men.

 

Perhaps the most revealing expression of these anxieties comes from right-wing, all-male groups like the Proud Boys, who see themselves as “aggrieved, marginalized, and depressed.” These traditionalists extol the imagined superiority of western culture, believe they are being disenfranchised by the left, and have found in Trump’s America a “place to put [their] political resentment.” As if to demonstrate their masculinity, the Proud Boys, according to one critic, “like to spoil for a fight.”

 

 

According to the “pussy” narrative, it’s not just the sensibilities of persecuted white men who are under attack, though, but the nation’s security as well. When I posted to social media a few covers from the macho pulps to promote my forthcoming book, one retired colonel who believes conservatives must win the current “culture war” replied that we all should focus more on crafting a militarized notion of masculinity “because a lot of Americans are pussies.” Another Twitter respondent claimed that the “pussification of America’s youth is a matter of national security. We need a touch of testosterone added to the water with fluoride,” he argued. “No more cavities and fewer softies.” By this logic, if only US soldiers and marines were equipped with more hormones, they might have achieved more lasting results in Iraq and Afghanistan.

 

So, what to make of this perceived “pussification” of America? Most importantly, we need to accept the fact that real violence stems from imagined grievances and misogynistic language. A congresswoman being verbally assaulted. A group of wrathful white men seeing themselves as a “right-wing fight club.” A militarization and polarization of society based on outdated gender norms.

 

For those who don’t want to examine the violence they perpetrate—against women or minorities or immigrants or any other perceived social or cultural threat—the “pussification” of America provides a warped justification for violent means.

 

Popular narratives of what it means to be a man, to paraphrase Josephine Livingstone, need not rest on connecting “the vulva with weakness” for those men who don’t act like chiseled Hollywood action heroes. In the Cold War men’s adventure magazines, “real men” were depicted as heroic warriors and sexual champions. More than a half century on, such depictions continue to resonate far too widely across American society. It seems well past time to evolve beyond mid-1950s mindsets and begin conversations about alternate models of masculinity. Chances are, America will survive, even if all men aren’t Rambo.

Ranking Donald Trump: No Cause for National Happiness

 

 

As editor of a recent book about presidential misconduct from George Washington’s administration through Barack Obama’s, I’m often asked where Donald J. Trump stands in the rankings of American presidents.  I respond that, in observance of historians’ practice, it’s too early to tell.  After all, the president’s term in office hasn’t yet ended.  And although we already know much about the Trump presidency from press coverage and court filings, the administration’s records are closed to examination.

But today this kind of reticence seems difficult to defend on civic grounds.  In this moment of political, environmental, public health, and resulting economic crisis, Americans deserve a considered answer to the question they ask: How does Trump fare in comparison with his predecessors?

The truth is that historians have never arrived at agreed-on criteria by which to compare American presidents.  Anyone who tries his hand at placing a single president in a ranking of some sort runs up against the fact that, despite many previous attempts, no group of specialists in presidential history has used the same set of measures.

Furthermore, although a number of historians’ presidential rankings have appeared since 1948, when Arthur M. Schlesinger Sr. led the first effort to create one, none is recent.  Consequently, what follows is my own stab at an assessment of Trump’s presidency.  Although it adopts some of the same yardsticks used in earlier attempts, it shouldn’t be taken to represent the views of historians generally.  Nevertheless, with another presidential election rushing at us, let me try my hand at determining where the incumbent president ranks against those who’ve occupied the presidency before him.

Preparation for office.  It used to be asked about candidates for the White House whether they were “presidential timber.”  By that was meant two things: previous experience as an elected official and possession of the proven knowledge, bearing, and authority appropriate to the presidency.  Measured by office-holding experience, all but three pre-Trump chief executives served in elective posts before their presidencies.  The three who didn’t—George Washington, Andrew Jackson, and Dwight D. Eisenhower—demonstrated their leadership and command abilities by leading large military forces in the field.  Earlier experience—whether in national posts (like congressional seats and the vice presidency), in high military command, or in state office (say, a governorship)—is assumed to give a chief executive the necessary political skills, leadership abilities, and knowledge essential to governing.  Not all presidents have possessed all these qualities even after previous experience.  One thinks of, say, Andrew Johnson as lacking what’s needed in the presidency.  Trump, an amateur never tested for leadership or command in public office, fails when measured for preparation, too.

Fitness for office.  This large measure encompasses aspects of a president’s mind and character—knowledge about the history, constitution, and culture of the nation as well as possession  of honesty, skill in selecting cabinet officers and advisors, balance of judgment, prudence of expression, empathy toward others, calmness in action, strength in decision, and the moral compass necessary for effective governance.  Fitness also includes the absence of inherent personality traits that inhibit soundness of judgment, calm behavior in the face of critical challenges, and balance in decision-making.  It’s difficult to find any previous president who exceeds Trump in his lack of so many of these qualities.

The successful pursuit of stated goals.  Central to an administration’s record are the aims it sets for itself, the quality of those aims, and its success in achieving them.  Historians’ favorite example of the successful achievement of campaign objectives is the one-term presidency of James K. Polk.  During the 1840s, Polk met all four of his campaign objectives: incorporating, after negotiations with Great Britain, the Pacific Northwest (today’s states of Washington and Oregon); gaining the American southwest from Mexico through war; lowering tariff rates; and establishing a federal monetary system independent of private banks.  Most other presidents have achieved only parts of their platform goals.  Trump’s major aims have been to free the U.S. of military and other foreign entanglements (half win), see to the repeal of the Affordable Care Act (fail), convince NATO to reduce its dependence on American funding (half win), erect barriers against immigrants, especially via a wall at the Mexican border (half win), nominate conservative federal judges (win), and reduce federal taxes and regulations (win).  Whether Trump’s aims have been beneficial to the US and the world is debatable, just as the Polk administration’s acquisition of additional slave territory has never sat well with historians.  But measured against campaign goals, Trump has done well, especially having been in office for less than four years.

Protecting the national interest.  This is considered the bedrock responsibility of a president.  It’s a major constituent of every campaign platform.  Its pursuit always faces serious challenges, its achievement many obstacles.  Central to it is the avoidance of war, victory in any conflicts that prove necessary, and the creation and preservation of good relations with other nation-states so as to guard and enhance the national interest.  On these grounds, Trump does better than, say, James Madison, James Polk, William McKinley, Woodrow Wilson, Franklin Delano Roosevelt, and the two Bushes, all of whom, claiming provocation or having seen the nation attacked, took the country into wars, some of them of questionable justification.  But are the unforced errors that have led to our recent fraying relationships with NATO, Iran, and China and our hard-to-explain cozying up to Russia and Saudi Arabia productive of greater American security?  By this metric, Trump comes out somewhere in the middle of former presidents.

Skill in governing.  Leading a real estate development firm is unlikely to give a president the political skills and other capabilities needed to govern.  In the White House, you’re head of a political party as well as head of state and chief of government, and it helps to be good at all three.  Sometimes—think of Millard Fillmore and Warren Harding—even prior experience in elective office fails to prepare you to be president of the United States.  Moreover, you’re president of all Americans, not of just some of them.  You’ve got to manage cabinet departments; represent the U.S. with dignity abroad; distinguish between campaigning and governing—all of these and many more skills and sensibilities being central to leading a large, powerful, and diverse nation like the U.S.  Abraham Lincoln and Franklin Roosevelt stand out as brilliant at governing the nation, their cabinets, and Congress in the midst of war.  Donald Trump?  His governing skills put him in the bottom half of the pack in any ranking of his predecessors.

Truth-telling and exemplary conduct.  Exemplary conduct, honesty being its chief ingredient, is the coinage of effective governance.  Six-year-old George Washington’s celebrated statement (a fictional one) that he could not tell a lie has set the standard for each president who followed him.  Most presidents, most notoriously Richard M. Nixon, have shaded or avoided the truth, often by covering up misdeeds.  Citizens are likely to shrug off a president’s lies if they’re infrequent and venial.  But if, like Nixon’s, they’re many and strike at the heart of the government’s integrity, they pass the bounds of tolerability.  The number of Trump’s lies, tabulated by the press and other organizations, surpasses previous records by such an order of magnitude that it staggers belief.  Never has a previous president proved as mendacious as today’s incumbent.  In this category, Trump resides at the bottom.

Staying within the law.  Being on the defensive is normal for a president; no act goes without scrutiny and attack.  But once a president has to say, as Nixon did, that “I am not a crook,” that president has lost the authority and credibility to lead.  Nixon was the previous champion of illegal behavior—in his case being the first to orchestrate misconduct (what we know as the Watergate Affair) from the Oval Office.  But compared to Trump, Nixon was a lightweight.  Where Nixon acted purposefully to break the law through illegal acts and cover-ups, Trump has flouted the law, using his office to enrich himself at the public’s expense while ignoring existing legislation and breaking venerable norms of governance.  Seeking favors from foreign governments (Russia’s and Saudi Arabia’s), being exposed for corruption (“Individual #1”), padding his company’s pockets in contravention of the Emoluments Clause of the Constitution, using his foundation’s tax-protected funds for personal purposes, and failing to see that his eponymous university delivered the education promised to its students—never before has a president so flouted the spirit and substance of the law.  Another case of Trump’s landing at the very bottom of the list.

Lifting hearts, banishing fear.  The greatest presidents, through the words they use, summon people to the nation’s service by raising their hopes, routing their anxieties, and helping the best in human nature express itself.  An effective president speaks as a wise counselor and rousing coach, as FDR did in telling his fellow Americans that “the only thing we have to fear is fear itself.”  Never before has a president referred to his fellow citizens as “scum,” “haters,” “losers,” “lowlifes,” and “thugs” or tried to set American against American.  Although Nixon indulged himself privately in insults against his political enemies, no other president has ever done so publicly or with the same viciousness.  Trump comes out at the bottom in this respect, too.

Empathy toward others.  Humility.  Social conscience.  An ability to bring people together.  To merit the reins of government, chief executives must have the capacity to project themselves imaginatively into the feelings, thinking, and situations of those whom they lead—to act, in Abraham Lincoln’s words, “with malice toward none, with charity for all.”  Trump possesses no capacity to understand, accept, and empathize with others’ difficulties.  One recalls our great presidents for their magnanimity and inspiration.  On this score, Trump also falls to the end of the list.

Steering clear of self-dealing.  It has long been established—by the Constitution, law, and norm—that an incumbent president must not use his office for self-enrichment.  A chief executive may have inherited or accumulated wealth as, say, Washington and FDR did, and office may fit him for future earnings as it has Bill Clinton and Barack Obama.  But as we now know in detail, Trump has repeatedly manipulated law and regulation to protect and enlarge his and his family’s wealth while in office.  He has brazenly hawked his own products, directed the government to house federal officials at his resorts, made clear his expectation that foreign delegations and party stalwarts use his Washington hotel, and thwarted the move of the FBI to the Maryland suburbs so as to prevent the construction of a competing hotel in place of the FBI’s current headquarters.  No other president has attempted anything like Trump’s efforts to line his own pockets while in office.  On this count, too, he ends up at the bottom of any ranking.

Abiding by existing norms of government.  A nation’s code of governance, as much as its laws, reflects the character of the nation itself.  Since the birth of constitutional government in 1789, all presidents have abided by most of the standards of behavior, presentation, and action adopted before their time in office.  Without exception—even Nixon at his most flagrantly illegal—no president has openly stated his defiance of constitutional and other norms as has Trump.  None has flaunted his intention to challenge the outcome of a presidential election.  This threat alone places Trump at the bottom, in a league of his own.

An assessment like this one constitutes a bill of indictment of Trump’s presidency.  But it also gains his record one gold star of sorts.  He’s accomplished what no other president has been able to achieve since the first presidential ranking in 1948.  He’s managed to raise James Buchanan, Millard Fillmore, Andrew Johnson, and Warren Harding off the floor.  The sad thing is that this is no achievement we can cheer.

Does the "Divided Loyalty" Question Still Dog Catholic Politicians?

Al Smith campaigns for president, 1928

 

 

The candidacy of Joe Biden, a cradle Catholic, together with the nomination to SCOTUS of Amy Coney Barrett, a cradle, charismatic Catholic, has again raised questions about Roman Catholicism’s place in American politics.  Presenting too strict a faith can generate reactions like Mark Sumner’s at Daily Kos, who wrote in reaction to Barrett’s nomination that 

 

the far right is so excited to see her name put forward [because] Barrett is a religious extremist, a member of a small sect that takes the inherent misogyny of traditional Catholicism and adds to it by doubling down with … more misogyny.

 

At the same time, Biden’s generic appeals to Christ or prayer barely register with voters and journalists accustomed to Christian rhetoric in political campaigns. A piece at NPR on how “Biden’s Catholic Faith Shaped His Life” included a quote from John McCarthy, the candidate’s national deputy political director.  When McCarthy said, “It’s about the vice president being who he truly is, which is a Catholic and a deeply devout person of faith,” NPR’s reporter did not ask for an explanation. One way to account for the difference in coverage of Biden and Barrett is to acknowledge that a faith that inspires or comforts a politician is an easier sell than one that seems to challenge existing policies. 

 

For some in the press corps, however, Biden’s faith needs more spice.  Elizabeth Bruenig at the New York Times, for instance, has argued that Biden needs to incorporate more of Pope Francis’ teachings into the Democratic nominee’s campaign.  “Mr. Biden could look to the example of Pope Francis as a model for a kind of Catholicity that is both pious and challenging to the powers that be — if he, or anyone else, were interested in that sort of thing.” For Bruenig,  Francis provides grounds for challenging America’s ruling class and Biden, who needs to move to the left, has spent too much time in the moderate middle. 

 

That advice might make sense for someone like Bruenig, a socialist and relatively recent convert to Rome.  But the recent release of Fratelli Tutti, a papal encyclical that appeared after Bruenig’s column and that challenges Donald Trump’s brand of conservatism, raised the stakes of the columnist’s point.  Will Francis give Biden cover to move to the left?  Aside from differences between Republicans and Democrats over domestic policy, should a Catholic candidate aspiring to the presidency appeal to the papacy for support for public policy? The answer from American history is that for almost a century, Roman Catholic presidential hopefuls have distanced themselves from the church both to silence anti-Catholic critics and to affirm national norms. 

 

John F. Kennedy likely set the standard for all Catholic successors when he ran for POTUS.  In 1960, the U.S. Senator from Massachusetts had to answer many critics who worried that a Catholic president could not uphold the Constitution owing to competing loyalties.  In fact, anti-Catholicism had received a new lease on life only ten years earlier when Paul Blanshard wrote the best-seller, American Freedom and Catholic Power (1949).  The author was no fundamentalist bigot.  His anti-Catholicism tapped reputable Protestant sources, such as those that accompanied him from Harvard Divinity School to Union Seminary (New York) and into the Congregationalist ministry.  Although Blanshard wound up agnostic, he believed Roman Catholics were a threat to liberal democracy and wrote a book to prove it.  This was the predictable objection to Catholicism, namely, that a higher religious loyalty conflicted with duties to uphold American law (for some reason, it never applied seriously to Bible-thumping Protestant politicians). In Blanshard’s mind, Rome stood for “antidemocratic social policies” that were “intolerant,” “separatist,” and “un-American.”  Perhaps the reason the book sold well was that Blanshard expressed what most white Protestants whether mainline or evangelical thought about Roman Catholicism.  Even Reinhold Niebuhr, who remains practically every faith-friendly politician’s favorite theologian, believed Rome’s “authoritarianism” was fundamentally at odds with the “presuppositions of a free society.” 

 

Blanshard’s anti-Catholicism was hardly original.  The year before the 1928 presidential election, for instance, New York governor Al Smith, the Democratic nominee, needed to answer a long article in The Atlantic by Charles C. Marshall, a prominent New York City attorney and lay Episcopalian, that questioned whether a Roman Catholic could be loyal to both the Constitution and the pope.  Smith, who wondered “what the hell is an encyclical?”, was blindsided. His reply, written with the help of a priest famous for his service and heroism as a chaplain during World War I, was to affirm every point of the American creed and claim that no tension existed between American patriotism and church membership.  To the litany of quotations from papal pronouncements, Smith replied, “I have been a Catholic all my life and I never heard of these encyclicals and papal bulls.” On questions about his loyalty to American government, he blazed the trail for Kennedy: “I believe in the worship of God according to the faith and practice of the Roman Catholic Church” and “I recognize no power in the institutions of my Church to interfere with the operations of the Constitution of the United States.”  Smith added that he believed in “absolute” freedom of conscience and the “absolute” separation of church and state.  

Even if Smith’s devotion was closer to the norm for most American Catholics than a supposed following of every utterance from the Vatican, Rome’s reaction to the American Jesuit John Courtney Murray indicated that parts of anti-Catholicism had merit.  Murray came onto the Vatican’s radar in the early 1950s when he tried to defang Blanshard’s book and demonstrate the harmony between the American Founding and Roman Catholic natural law.  In the process, he challenged the Vatican’s default position on religious freedom, which, put simply, was “error has no rights.”  Murray’s ideas became sufficiently alarming that his superiors instructed him to stop writing about church and state.  Although Murray eventually served as an advisor to the bishops at the Second Vatican Council, his positions were suspect within the Vatican even as the bishops convened in Rome. When the Council ultimately embraced religious freedom in Dignitatis Humanae, Murray’s views seemed to prevail.  

 

Whether Vatican II also vindicated JFK’s earlier declaration of independence from church authority is debatable.  Although Murray had had to worry about offending his superiors, the church’s bishops offered no objections to Kennedy even when he said, before a body of Houston’s Protestant clergy: “I believe in an America where the separation of church and state is absolute, where no Catholic prelate would tell the president (should he be Catholic) how to act, and no Protestant minister would tell his parishioners for whom to vote.”  Chances are that Kennedy’s strict wall of separation was not what the bishops at Vatican II had in mind when endorsing religious freedom.  At the same time, Kennedy’s position of independence from the church has been the pattern over the last century for Catholic politicians in the United States. 

 

This is not only true for public figures; Rome’s bishops have also come around to a position that the United States is not a nation in need of correction by the church but is in fact a beacon of freedom and hope for the world.   In 2015, when Pope Francis visited the United States and spoke outside Independence Hall in Philadelphia, he avoided a prophetic witness and chose words of inspiration.  He invoked the examples of Abraham Lincoln, Martin Luther King, Dorothy Day, and Thomas Merton to underscore themes of his own papacy.  He also declared that the Declaration of Independence’s famous words – “all men are created equal” – were on the side of such Christian ideals as protecting “the good of the human person” and “respect for his or her dignity.”  He confessed a hope that the United States would “continue to develop and grow, so that as many young people as possible can inherit and dwell in a land which has inspired so many people to dream.”  Francis was echoing what the American bishops had been affirming since 2012 when they launched the Fortnight for Freedom, an annual two-week period prior to Independence Day that called on American Catholics to recognize and express gratitude for the freedoms their nation protected.  “We are Catholics. We are Americans,” the bishops asserted. “We are proud to be both, grateful for the gift of faith which is ours as Christian disciples, and grateful for the gift of liberty which is ours as American citizens. To be Catholic and American should mean not having to choose one over the other.”

 

Joe Biden will likely do what JFK and Al Smith did, namely, fit his faith into the norms of American politics.  In fact, Biden did something like that in his acceptance speech at the Democratic National Convention: 

 

We have a great purpose as a nation: To open the doors of opportunity to all Americans. To save our democracy. To be a light to the world once again. To finally live up to and make real the words written in the sacred documents that founded this nation that all men and women are created equal. Endowed by their Creator with certain unalienable rights. Among them life, liberty and the pursuit of happiness.

 

That may not be a vigorous expression of Roman Catholic conviction over and against the excesses of American society, but Biden is in a sense following the church hierarchy.  Since the 1960s, Roman Catholicism has provided more room for affirming America’s secular and liberal forms of government than the church ever did in the four centuries preceding Vatican II.

My Memories of Voter Suppression

Andrew Goodman, James Chaney and Michael Schwerner were murdered in 1964 for their efforts to secure Black voting rights in Mississippi. 

The author joined an interracial movement for voting rights in Louisiana in the years before.

 

 

Back in July 1962, when, according to Donald Trump, America was “great,” I was in the Deep South, working to register Black voters.  It was a near-hopeless project, given the mass disenfranchisement of the region’s Black population that was enforced by Southern law and an occasional dose of white terrorism.

It all started in the fall of 1961, the beginning of my senior year at Columbia College.  My roommate (Mike Weinberg) and I, both white, had joined the campus chapter of the Congress of Racial Equality (CORE) and participated in a few of its New York City projects.  The real action, though, was in the turbulent South, swept by sit-ins and Freedom Rides that demanded an end to racial discrimination and, especially, the right to vote.

On an evening in the spring of 1962, Ronnie Moore, a Black CORE Southern field secretary, brought the news of the Southern freedom struggle to our Columbia CORE meeting.  Having headed up desegregation efforts in Baton Rouge, Louisiana, Ronnie and three other students at Southern University, an historically Black institution, were out on bail on “criminal anarchy” charges.  The laws under which they were charged and imprisoned, which provided for a penalty of ten years at hard labor and a hefty fine, dated back to the state’s early twentieth century repression of union organizing among Black and white timber workers.

Stirred by what Ronnie told us, Mike and I went up to him after his talk and asked him how we could help the cause.  Looking us in the eyes, he said, smiling: “What are you boys doing this summer?”  In reply, we explained that, inspired by Jack Kerouac’s On the Road, we would be driving around the country.  “Any chance that you’ll get to Baton Rouge?” he asked.  “We could manage it,” we said.  “Well, do it,” he remarked, adding: “Maybe we could arrange to get you arrested!”  We all had a good laugh about that.

That July, as Mike and I drove along Louisiana roads enveloped in an atmosphere of racial segregation, racist remarks, and unbearably hot and steamy weather, the venture no longer seemed quite as amusing.  Nor, after arriving in Baton Rouge, was it easy to find Ronnie, for the Congress of Racial Equality wasn’t listed in the phone book.  But we did find a Committee on Registration Education and figured that, with the same acronym, it must be his group.  It was.  The state authorities had obtained a court order to shut down its predecessor.

When we arrived at CORE’s tiny office, Ronnie was delighted to see us and, together with his coworkers, took us to an all-Black hangout for coffee.  In his view, and ours, the only safe people in the South were Black.  As for local whites, we considered them all actual or potential Nazis, and stayed clear of them and their institutions.  Whether they would stay clear of us remained uncertain.  Mike and I slept on the Moore family’s entry hall floor, and local residents had been known to fire bullets into it through the front screen door.

Although most of the voter registration campaign Mike and I worked on in Baton Rouge was rather mundane, one evening was particularly exciting.  At dinner time, Ronnie suggested that we drive over to Southern University, from which he and the other CORE activists had been expelled for their “crimes.”  As we entered the all-Black dining hall, students started yelling: “It’s Ronnie!  It’s Ronnie!”  Hundreds of students swiveled around and cheers rent the air.  Leaping onto one of the tables, Ronnie made an impassioned speech about the freedom struggle and, then, announced that he had brought with him two movement supporters from the North.  “Get up here, Larry and Mike!”  So we jumped up there, too, and did our best to deliver strong messages of solidarity.  We had just about finished when someone rushed in, warning that the campus security police were on their way and that we had better get out of there fast!  While students ran interference for us, we did.

One day, Ronnie suggested that Mike and I drive him to Jackson, Mississippi, where a region-wide CORE-SNCC conclave would be held at the local Freedom House.  Accordingly, after dinner, we hit the road through northern Louisiana (where a local gas station operator threatened to kill us) and, then, through Mississippi to Jackson.  Here, in an abandoned building taken over by the movement and around which police cars circled menacingly, we joined dozens of CORE and SNCC activists from the Deep South.  At night, they had lengthy political discussions, in which they expressed their bitterness toward the Kennedy administration for its failure to back civil rights legislation or to protect movement activists from racist violence.

During the days, Mike and I joined Luvaughn Brown, a Black activist recently incarcerated at the county prison farm, to go door to door in a Black Jackson neighborhood and encourage its residents to register to vote.  This was a tough job because people feared retaliation if they dared to exercise their voting rights and, also, because they would almost certainly be rejected.  At the time, Mississippi used a “literacy test” to determine if a citizen was qualified to vote.  A voting registrar would ask a potential registrant to define the meaning of a section in the lengthy state constitution.  If you were Black, the registrar announced that you had failed the test; if you were white, you passed.

Voter registration work was not only frustrating, but exceptionally dangerous.  The following summer, Medgar Evers, head of the local NAACP, was murdered in Jackson by a white supremacist for his leadership in a voter registration campaign.  The next June, James Chaney, Andrew Goodman, and Michael Schwerner—participants in the Mississippi Freedom Summer voter registration project—met a similar fate.  Although rattled by our fairly brief Southern venture, Mike and I escaped with our lives, as did Ronnie.

Mike and I kept in touch, and were delighted when Congress responded to the scandal of Southern voter suppression with the Voting Rights Act of 1965, which outlawed the discriminatory voting practices of the past and established federal oversight of any new voting procedures in the offending states.

Imagine, then, our sense of sorrow, mingled with disgust, when, in 2013, by a 5-4 vote, the Republican-dominated U.S. Supreme Court gutted the Voting Rights Act.  This opened the door for numerous Republican-controlled state governments—many but not all Southern—to implement mass purges of their voter rolls, closure of polling places in minority neighborhoods, government ID requirements, felony disenfranchisement, and other barriers that deprived millions of Americans of the right to vote.

I wonder how Republican leaders can live with themselves when they betray the most basic principle of democracy.  Of all the things they have done during their time in power, this is surely one of the most despicable.

Where In The World Are You? How My Great-Grandmother’s Letters Helped Me Locate My Great-Uncle after 78 Years

 

 

When did you last send a handwritten letter, or receive one? Can't remember? Me neither, but there was a time, not that long ago, when the snap of the letterbox brought people rushing to see what had been delivered: reassuring words from a much-missed loved one, or the dreaded telegram bearing the very worst of news.

 

Much of what we know about the world wars comes from the letters and diaries of ordinary people caught up in those extraordinary events. Without these personal reflections, we wouldn't have such an intimate knowledge of how war affected those who lived through it. Those like my great-grandmother and my great-uncle.

 

 

Between October and December 1942, my great-grandmother wrote a number of letters to her youngest son, Jack, who was reported as missing in action shortly after leaving home to fight for the British Army with his regiment from East Yorkshire. Jack, aged 25, was sadly never found and, heartbreakingly, my great-grandmother’s letters to him were all returned to her, marked simply, ‘To Mother’. Seven of these letters survive, and have been passed down through the family over the decades. In 2017, my father gave them to me.

 

 

Reading my great-grandmother’s unimaginable anguish, her desperate worry for Jack and her agony in not hearing from him, had a profound impact on me, and made me think about war in a different way. Her words made me realise that the pain of separation, of not hearing from loved ones for months, years, or ever again, wasn’t something that had happened to strangers in grainy black and white photographs, but had happened to my own family; to a mother, like me. Reading her letters inspired me to write a novel about the experience of ordinary people caught up in the war, and cut off from their loved ones, and soon after being given the letters, the right story found me: a story of the war in the Pacific; a story of resilient women and resourceful children; a story from a distant corner of that terrible war. 

 

I first learned about the events that inspired When We Were Young & Brave in a podcast. The episode began as an amusing anecdote about waylaid Girl Scout cookies, but went on to reveal the remarkable true events surrounding a group of schoolchildren and their teachers who were taken to a Japanese internment camp in China, following the bombing of Pearl Harbor in December 1941. The children's parents were mostly British, American and European missionaries and diplomats, and their lives, up to that point, had been ones of great privilege. Many of the children were part of the school's Girl Guides unit, and the principles of girl guiding, the routine of patrol meetings, the practical skills learned, and a willingness to Lend a Hand and Be Prepared, became increasingly important in helping them, and their teachers, endure their ordeal over the next five years. The account stirred fond memories of my own years as a Brownie, of the BBC drama Tenko, and of the Ingrid Bergman movie The Inn of the Sixth Happiness, based on the life of missionary Gladys Aylward.

 

I was intrigued, not only because World War II was an event I wanted to write about, but also because Girl Guides, schoolchildren and war simply didn't belong together. I wanted to understand how it had happened, how the children and their teachers had coped in circumstances so far from home, and how the experience, especially the prolonged separation from their families, had affected those children in later life. What I hadn't expected to discover during my research was a story not only of unimaginable hardship, but also of extraordinary hope, friendship, community and kindness as the children and their teachers adapted to their rapidly changing circumstances.

 

As a historical novelist, I spend a lot of time walking in the shoes of those who have lived through world-changing events, so it felt very fitting to finish editing When We Were Young & Brave during lockdown in March, at the start of our own world-changing event. Now, more than ever, it seems to me that the past is not a foreign country where people do things differently, but is a reassuringly familiar place, one from which we can draw comfort, one from which we can learn. Disaster and tragedy are often where we find our strongest bonds, and as we find ourselves separated from family and distanced from loved ones, stories of community and shared hope are, arguably, more important than ever. 

 

 

 In February this year, with the help of the War Graves Commission website, I located my great-uncle Jack. The family had always believed he was lost in France, but he wasn’t. He was in North Africa, in Tunisia, so very far away from his rural Yorkshire home. His final resting place is marked with a memorial plaque, noting his age and the date of his death. He died on the day great-grandma wrote her final letter to him. 

 

Dec 27, 1942

 

My dear Jack, Where in the world are you? I keep writing and still no news of you. We are all ears at news time but seldom glean any news about your special team. How did you spend Christmas? We are all thinking of you … Everybody here has victory on their lips now, but I keep on looking for you. It does seem such long months since you went away.

 

Before my grandma passed away in May (after reaching her 100th birthday), we were able to tell her we’d found her brother’s final resting place, and show her the photograph of his memorial plaque. It gave her great comfort. Without my great-grandmother’s letters we would never have known how to find Jack, and I would never have understood how deeply the war had affected my family. 

 

 

How incredible that the words of a mother to her son, written almost eighty years ago, still evoke such powerful emotions. I now look at the faces in old family photographs with a renewed sense of connection, sadness, and above all else, a profound sense that the past is not so different, or far away, after all.

Lessons from the 18th Century Dutch Republic

Binnenhof, seat of the States General of the Netherlands, The Hague

 

 

 

On the eve of the 2020 US presidential elections, American society remains deeply polarized. Study after study demonstrates that Republican and Democratic voters disagree on key policy issues, are becoming increasingly partisan, and rarely switch parties. Whoever gets elected president of the United States in November will unquestionably govern a divided and restless nation.

 

The 2020 election may be pressing and significant, but more relevant for the long term is how to move forward. If the United States is to survive, what can be done to mitigate the country’s divisive politics?

 

The history of the Netherlands serves as both a warning and an opportunity for the United States in its current polarized state. The Dutch lesson is that there is a way to achieve reconciliation and deal with the divisions that naturally arise in any society: the embrace of political pluralism.

 

Like the United States today, the Dutch Republic - the predecessor to the current Kingdom of the Netherlands - was a deeply divided country around the time of the American Revolution. Comparable to the Democrats and Republicans in American politics, the Patriots and the Orangists formed the two opposite poles of the Dutch political spectrum.

 

By the late eighteenth century, the Orangists were defenders of the status quo and their most ardent supporters hailed from the urban working classes and farmers in the countryside. They favored an alliance with Great Britain and large standing armies to defend against land invasions from France. 

 

In contrast, the Patriots were an opposition movement drawn predominantly from the urban mercantile middle classes. They supported larger navies to protect their overseas trade. The Patriots regarded the incumbent government as a corrupt aristocracy and sought to reform it through elections and the creation of citizens' militias.

 

As in the United States today, various developments amplified the political stakes as well as the polarization between the two parties. The Dutch Republic had been a world power in the seventeenth century, famous for its riches, military might, and high culture, but the country experienced a gradual decline of prestige in the eighteenth century. Foreign powers such as France and Great Britain sought to exploit Dutch internal divisions for their own benefit. Meanwhile, wealth inequality had grown to epic proportions, with real wages barely rising.

 

“Imbecility in the government; discord among the provinces; foreign influence and indignities; a precarious existence in peace, and peculiar calamities from war”, was how Alexander Hamilton and James Madison portrayed the Dutch Republic in the Federalist Papers in 1787. In other words, much like how many would currently characterize the United States.

 

The Orangists and the Patriots insisted that polarization was the root cause of Dutch decline. Both parties agreed that only unity (eendracht) could restore Dutch glory, basing their argument on the Dutch Republic’s motto, Eendracht maakt macht, or strength through unity.

 

But the pursuit of unity in a country in which people would never fully agree on policy was as cynical as it was destructive. Though the Patriots favored the implementation of elections, they often limited who could vote for local offices by requiring membership in a citizens' militia to cast a ballot, a rule that heavily favored the Patriot party. Similarly, Orangists thought that unity could be achieved through collective deference to the Stadtholder, an executive with monarchical pretensions who was allied with them. In the end, neither party got its way, at least not permanently. After an invasion by French revolutionary forces in 1795, the Dutch Republic ceased to exist.

 

As in the eighteenth-century Dutch Republic, contemporary American politics has two major factions endlessly battling for dominance over the nation's institutions. But where the eighteenth-century Dutch Republic represents a warning, the nineteenth- and twentieth-century history of the Netherlands can provide a path towards national reconciliation for the United States.

 

Political compromises on universal suffrage, labor rights, and freedom of education in the nineteenth and twentieth centuries institutionalized political pluralism in the Netherlands, which proved an effective method of easing tensions between various factions in politics and civil society. These compromises ushered in the period of "pillarization" (verzuiling) in Dutch society, the cultural foundation of political pluralism in the Netherlands today. It was broadly understood that different factions in society could peacefully coexist, as long as each group respected the others' "sovereignty," as Calvinist politician Abraham Kuyper put it. Socialists, various Christian denominations, and liberals each formed their own civil society in which they practiced their beliefs and shaped their political ideas.

 

The period of pillarization in the Netherlands is long gone, but the culture of political pluralism persists. The Dutch Parliament currently counts thirteen different parties, with the largest commanding only thirty percent of the seats. Though Dutch people complain about the quality of contemporary political debate and the splintering of political parties, having a different view than your neighbor on a political or cultural issue is hardly problematic.

 

We understand the contemporary United States as fundamentally different from the Netherlands in this regard, in part because politicians and their partisans have manufactured a red/blue, Republican/Democrat division to benefit electorally from polarization. Yet the idea that American citizens, nearly 330 million people from an endless variety of ethnicities, faiths, and socioeconomic backgrounds, can be categorized as either Republican or Democrat is a grotesque oversimplification. Even on abortion, arguably one of the most divisive issues in American politics, American citizens actually have a much less binary view than is usually understood.

 

It is unrealistic to expect that the United States will become a multiparty democracy any time soon. But what needs to be broadly recognized is that societies will always have a wide variety of political ideas that somehow have to be reconciled through compromise in public policy, a fact the Dutch Patriots and Orangists never understood. Important steps can be taken towards gradually building a culture of E pluribus unum, the foundation of a new era of political pluralism in the United States.

 

Practically, citizens should be encouraged to envision themselves on a larger political canvas than just Democrat, Republican, or even independent. In the Netherlands, millions of voters fill out the StemWijzer (literally "Vote Indicator") before every election to orient themselves on the political spectrum. State governments and educational institutions should popularize the use of similar nonpartisan tests, such as the Pew Research Center's Political Typology Quiz.

 

The humanities also have an important role to play in promoting political pluralism. Through the humanities, students learn to appreciate how diverse political views and compromise shaped the society around them. For instance, documents like the Declaration of Independence or the United States Constitution should be examined as products of compromise among people with a wide variety of ideas, as opposed to works of genius that simply fell out of the founders' heads.

 

Likewise, governments on every level should do more to combat the civic illiteracy of American students. A holistic and deep understanding of civics is key to creating an informed and politically tolerant citizenry and can become a vehicle for political pluralism. In addition to learning about the Republican and Democratic parties, students should know more about the rich socialist and libertarian political traditions in the United States to help them recognize the diversity of opinions on the American political spectrum.

 

The embrace of political pluralism is essential to the health of the American Republic and will be even more so as polarization persists in the coming years. Voting is often seen as the duty of every citizen in a democracy and that is certainly important. But voting is just one part of citizenship. The history of the Dutch Republic demonstrates that polarization can gradually destroy a country from within and can easily be exploited by foreign actors. The embrace of political pluralism by every citizen is the key antidote to the rot of polarization and partisanship that haunts American politics today.

Return to the Presidential Succession Act of 1886 (With Some Modification)

Ronald L. Feinman is the author of "Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama" (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

 

A major controversy has arisen over the issue of presidential succession in the wake of President Donald Trump’s diagnosis with COVID-19.

There have been three presidential succession laws enacted. The first, in 1792, placed the President Pro Tempore of the Senate and the Speaker of the House of Representatives first and second in line after the Vice President. That law survived the crises that followed the deaths of William Henry Harrison, Zachary Taylor, Abraham Lincoln, and James A. Garfield, without the need to go beyond the Vice President.

However, during Abraham Lincoln's abbreviated second term and the administration of his successor Andrew Johnson (1865-1869), the nation twice faced a potential succession crisis centered on Andrew Johnson. John Wilkes Booth plotted to eliminate both Lincoln and Johnson. Had conspirator George Atzerodt not gotten drunk and abandoned his assignment to assassinate Johnson, Connecticut Senator Lafayette Foster, the President Pro Tempore of the Senate, would have succeeded Lincoln, a point this author makes in his book on presidential assassinations (Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama, Rowman & Littlefield Publishers). Also, had Andrew Johnson been removed from office upon his impeachment, then President Pro Tempore of the Senate Benjamin Wade of Ohio would have become President. Wade was a major critic of Johnson and refused to abstain from the vote to convict him. That fact led a group of seven Republican Senators, who disliked Wade and his lack of ethics, to join with 12 Democrats to save Johnson from conviction and removal from office in 1868.

In 1886, Congress wisely changed the succession law of 1792, eliminating both the President Pro Tempore of the Senate and the Speaker of the House of Representatives from the line of succession. In so doing, it took partisan politics out of the question of who should succeed a President. The Cabinet officers, in order of the creation of their departments, became the new line of succession, and remained so until 1947, spanning the deaths in office of William McKinley, Warren G. Harding, and Franklin D. Roosevelt.

However, as reported in this author's Assassinations book, Theodore Roosevelt faced a mostly unknown threat on September 1, 1903, when Henry Weilbrenner approached Roosevelt's family home at Oyster Bay, New York, attempting to get past the Secret Service detail created after President William McKinley's assassination in September 1901. Possessing a firearm, Weilbrenner claimed he wanted to marry the President's daughter, Alice. Fortunately, Weilbrenner was never able to meet the President late that evening. Had an untoward event occurred, however, Secretary of State John Hay, who had been a private secretary to Abraham Lincoln in the White House, would have become President.

After President Franklin D. Roosevelt's death in 1945 and the succession of Harry Truman to the Oval Office, the Republican opposition gained control of the 80th Congress in the midterm elections of 1946 and passed the Presidential Succession Act of 1947, again putting the Speaker of the House and the President Pro Tempore of the Senate in the line of succession after the Vice President and before the Cabinet officers. This was a purely partisan political act, with Republicans Joseph Martin and Arthur Vandenberg leading the way.

The result, over the 74 years from 1947 to 2021, has been a situation in which the Speaker of the House has belonged to the party opposing the president for a total of 44 of those years, 60 percent of the time, while the opposition party has held the position of President Pro Tempore of the Senate for 34 of the 74 years, nearly half the time.

This is not a tenable arrangement in today's hyper-partisan environment. Reverting to the Presidential Succession Act of 1886, updated for the Cabinet positions created since, therefore makes sense. The principle that the order of succession follows the order in which the agencies were created needs one modification: the Secretary of Homeland Security, the most recent position, created after September 11, should move up to next in line after the Attorney General and before the Secretary of the Interior, given the national security ramifications of a Presidential vacancy.

Even though Cabinet officers are not elected, placing those selected by a President in the line of succession makes for better continuity if we ever have to go beyond the Vice Presidency in an unforeseen emergency, and it ensures the continuation of the political party chosen by the voters to control the Executive branch for that term of office.

Vaughn Davis Bornet, RIP at 102

This blog post was written by Rick Shenkman, founder of the History News Network, and the author of Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books).

Let's begin with the cliche.  "So, Mr. Bornet, what is the secret to living a very long life?"  It was the question he got used to being asked. And it was the subject of a speech he gave to the Medford, Oregon Rotary Club when he turned 100, which we published a few months later. The following year in another piece on HNN he went into more detail. There was no secret, he answered.  He lived a long life, he supposed, because he didn't smoke, he stayed active "mentally and physically," and he married a good woman.  (She passed away in 2012.) 

He did have one secret about that speech, however, which he shared with me afterwards.  He managed to deliver it standing upright only with the help of a man who stood behind him out of view.  He may have graduated from Emory and Stanford, risen to the rank of commander in the US Navy during World War II, worked at RAND and written an armful of books on labor, Herbert Hoover, and Lyndon Johnson, but he was human.

He also was unsentimental. Or was it just his endearing sense of humor on display when he wondered whether having children helped or hurt one's chances of living a long life:

"I am of two minds about children’s effect on longevity. They may shorten your life by sometimes almost driving you nuts. Or, they may actually lengthen your life, as they may pay part of the bill for that fancy retirement home. They can provide a really good motive to stay alive as they visit weekly or monthly, bringing chocolates."

Vaughn wrote some sixty articles for HNN through the years, beginning in 2007 with "How Race Relations Touched Me During a Long Lifetime."  Characteristically, it showed his continuing engagement with world affairs. When Mitt Romney was on pace to win the GOP nomination in 2012, Vaughn penned a piece that helped put the issue of Romney's LDS faith in historical perspective.  Mixed in with the articles on politics were dozens that spoke specifically to historians: reminiscences on the death of his friend, the diplomatic historian Norman Graebner, and reflections on life as a historian here and here. (If you're a student thinking about a career in history, those two articles might help you make up your mind.)  Along the way he wrote numerous articles about life in America as it used to be: here and here, for example.

Throughout those articles from the early years of HNN, Vaughn took the attitude that he'd seen it all and we'll be fine.  In May 2016 he declared flatly: "Why I'm Optimistic About Our Future." Then Donald Trump was elected president.  From then on Vaughn often seemed like a man in a state of shock.  This historian who had seen it all in his 100-plus years -- in the Great Depression he'd watched, powerless, as his family lost their house and car and he was shipped off to live with an aunt -- now seemed dumbfounded by events, caustically commenting on the "spectacle of government by guesswork."  "So it has come to this," he observed in despair.

As events unfolded he pleaded with me to do whatever I could to draw attention to Trump's failings.  Meanwhile, he did all he could. He reviewed Michael Wolff's book, then Omarosa Manigault Newman's, then Bob Woodward's.

Vaughn was most comfortable in the role of patriot.  These were the kind of articles he wanted to write: "How Military Service Changes You," "It Has Been 63 Years Since I Raised My Right Arm and Joined the Navy," and "Good Luck, People of Our 50 States!"  And in his final piece for HNN, written back in May, he suggested, "'This Too, Shall Pass.' History, and Life, Say So!"

Still, Trump unnerved him. His last book, published just a few weeks ago, is titled, "That Trump!" In the book, which consists of both new material and his HNN Trump articles, he aims to be objective but his disgust with Trump is self-evident.  At one point he hopes Trump will simply resign.

Vaughn hoped to live to see the end of Trump's presidency.  He didn't.  But maybe we will -- and soon. 

You can read Vaughn's many articles here at HNN and at his website, Clioistics. A family memorial can be found here and his obituary here.
"The Silent Guns of Two Octobers" Reviewing a New History of the Cuban Missile Crisis

 

HNN Editor's Note: This review was originally published in Washington Decoded on June 11, 2020, and is republished here with permission at the anniversary of the crisis. 

 

[Note to readers: Theodore Voorhees, whom I did not know, contacted me in 2017 about reading his manuscript. I concluded that his work added an important and fresh perspective to Cold War scholarship and, with Professor Martin Sherwin, assisted in finding a receptive university press. This article originally appeared on washingtondecoded.com on June 11, 2020] 

 

Part I: The Author's Argument

 

The standard view of the Cuban missile crisis is engraved in our historical memory. My own books reflect that outlook, describing those iconic thirteen days as the most dangerous episode of the nuclear era and the thirteenth day, October 27, 1962, as the most perilous twenty-four hours in human history. That view is so widely shared in missile crisis literature that it was startling to read a book in which that interpretation was all but relegated to the status of “the conventional wisdom.”

 

Theodore Voorhees, Jr., Senior Counsel at Covington & Burling LLP in Washington, DC, concludes “that much of the Cold War rhetoric the leaders employed was posturing and that neither had any intention of starting a nuclear war.” Voorhees begins by dissecting the October 1961 confrontation along the Berlin Wall at Checkpoint Charlie when some sixty Soviet and US tanks faced each other “across a tense Cold War border.” His conclusion, however, is that John F. Kennedy and Nikita Khrushchev were personally determined to avoid escalation. Indeed, in a matter of hours, they maneuvered to assure that the confrontation evaporated without violence or casualties.  

One year later, a vastly more dangerous crisis arose when US surveillance aircraft discovered that the Soviets had secretly placed medium- and intermediate-range ballistic missiles in Cuba (the IRBMs were never actually delivered because of the imposition of the US naval blockade). How, Voorhees asks, did the rival leaders resolve the crisis "with lightning speed"? [i]

The simple answer is that the sudden, seemingly miraculous, restoration of peaceful coexistence was possible because both the underlying point of dispute and the ultimate deal terms that ended each crisis were matters under the personal control of each leader. When Kennedy and Khrushchev chose to settle, each man had the authority and the power to do so almost instantaneously. The two leaders personally directed all key decisions down to precise details…. It has become increasingly clear that Khrushchev and Kennedy felt free to reject the views of their closest advisers and brush aside the consternation they caused their alliance partners. … Neither Kennedy nor Khrushchev, whatever his publicly stated position, actually believed that his adversary's actions presented a problem whose substantive importance warranted even a conventional military engagement, far less a nuclear showdown.

Voorhees acknowledges that hawks on both sides of the divide regarded the missile crisis as an opportunity to settle the Cold War militarily and “there was always the danger that men lower down the chains of command might pull the trigger, whether by mistake, through personal belligerence, through fear, or all three.” However, this shared outlook at the top also significantly diminished the potential for unwelcome contingencies. The two leaders kept both the conventional and nuclear buttons under tight control and used back-channel diplomacy (involving the president’s brother Robert and Khrushchev’s son-in-law Alexei Adzhubei) to make sure that the other side received unmistakable signals of their ultimate intent to restore the status quo. JFK intended the naval quarantine of Cuba as a sign of caution and sober restraint, 

and that is how Khrushchev and his colleagues at the Kremlin immediately interpreted it—with great relief. On the other hand, the president’s DEFCON-2 alert unmistakably signaled to the Soviets the dire peril into which their gamble in Cuba had placed them. … In the days that immediately followed, both Khrushchev and Kennedy were literally tripping over one another to be first to make a settlement proposal that would be so generous that his adversary would be unable to turn it down.  

Both leaders, Voorhees contends, understood that the US held “all the cards” in the nuclear balance of power with a twenty-to-one advantage in nuclear warheads. The extraordinary Kennedy-Khrushchev missile crisis correspondence, he insists, once the Cold War bluster is discounted, reveals two anxious men committed to “keeping the lid on” and ready “to get the deal done.” 

And, most importantly, the rivals understood the danger posed by the tinder box in West Berlin, located deep inside Soviet East Germany, and carefully avoided any sign of aggressive intent to alter the status of that divided city. The US had nuclear superiority, but the USSR, with a substantial advantage in troops on the ground in East Germany and the Soviet satellites in Eastern Europe, could quickly overrun West Berlin. President Kennedy had remarked at a White House meeting that “It is insane that two men, sitting on opposite sides of the world, should be able to bring an end to civilization.” Khrushchev, fortunately, shared that point of view. The antagonists “realized that no politician in his right mind was going to use nuclear weapons first.” 

There were, Voorhees concedes, unanticipated and very dangerous incidents: most notably the October 27th downing of a U-2 by a surface-to-air missile fired without Kremlin authorization by a Soviet officer on the ground in Cuba. Sergei Khrushchev recalled his father's near-hysterical reaction to that stunning development, which led to the death of the American pilot, the only fatality of the missile crisis. The furious Khrushchev even threatened to exile the officer to Siberia because "Everything is hanging by a thread as it is." From Voorhees' perspective, Khrushchev's response, surely one of the dramatic highpoints in missile crisis literature, coupled with Kennedy's decision not to retaliate against the SAM site(s), confirms the shared determination in Moscow and Washington to avoid nuclear war.

 

“Could it be,” Voorhees argues,

that the Cuban missile crisis proved exactly the opposite of what was widely feared: namely, just how much safer and better protected the world had become from the risk of war arising between the superpowers given the widely appreciated horrors that nuclear weapons had introduced to modern war-fighting? … The lesson—perhaps counterintuitive to generations who have long accepted that the world came close to a nuclear holocaust in October 1962—is that the fearsome prospect of nuclear war-fighting of any kind virtually guaranteed that the crisis would be settled with remarkable speed and certainly well before the parties came anywhere near a point of no return.

Part II: The Reviewer's Response

After listening to hundreds of hours of recorded meetings and telephone conversations, I agree that JFK would never have chosen the nuclear option. Kennedy eagerly pursued a secret fallback plan, the so-called Cordier Ploy, in the wee hours of October 27-28 to give Khrushchev a face-saving way out by offering a Cuba-Turkey missile withdrawal plan that would appear to the world at large to have been put together by the United Nations rather than the US. JFK was ready, albeit reluctantly, to face the inevitable political fallout in the upcoming midterm elections if the secret missile swap had to be made public to avert war. The president, in a state of near despondency, told his 19-year-old mistress that he would rather his children be red than dead—not the predominant view in the United States in 1962. The only other choice was nuclear fallout.

Voorhees, however, in my judgment, seriously exaggerates the ability of the Kremlin to successfully micromanage a complex operation—carried out in secret for many weeks and more than 6,000 miles from the USSR. Soviet Ambassador Anatoly Dobrynin later acknowledged that erratic and limited communications severely undermined Moscow's ability to cope with every conceivable or inconceivable eventuality in real time because their Washington Embassy did not have direct phone or radio communications with the Kremlin; coded messages had to be sent by Western Union Telegram—which could take 8-12 hours—after being picked up by bicycle couriers who, oblivious to the urgency of the situation, were known to stop for a snack or to flirt with a girl. JFK and the ExComm struggled with similar constraints—for example, waiting hours to receive State Department translations of Khrushchev's messages. And, of course, neither Kennedy nor Khrushchev was able to control a potentially lethal wild card in the crisis, Fidel Castro—as revealed by his October 26 cable to Khrushchev advocating a nuclear first-strike on the US and his refusal to accept on-site UN inspection of the missile sites even after the October 27-28 negotiated breakthrough.

 

There were, of course, several other perilous and potentially unmanageable episodes. Khrushchev had also ordered the nuclear warheads in Cuba to be stored miles away from the missile bases to prevent an accidental or rogue launch; but at least one base commander, again without authorization from Moscow, secretly moved them to his site. And, even more ominously, tactical nuclear cruise missiles had been put into position to obliterate the American naval base at Guantanamo if the US bombed or invaded Cuba. If the Soviets had killed thousands of Marines using tactical nuclear weapons, could Kennedy have kept the public demand for retribution in check? Voorhees seems confident that the answer is yes, despite the fevered Cold War context of 1962 (which included a poll in which most Americans concluded that a nuclear showdown with the USSR was inevitable). 

 

Perhaps the most striking incident, which has gained a great deal of notoriety in recent decades, involves a Soviet submarine near the quarantine line forced to surface on October 27 after the US Navy dropped so-called “practice depth charges” [PDCs]—with the explosive force of a hand-grenade—producing “harmless explosive sound signals.” Voorhees recapitulates:

 

One of these PDC hand grenades may have detonated close enough to inflict some modest damage on at least one of the Soviet submarines, B-59, which would have allowed its captain under his standing orders to respond to any presumed damage-causing attack by firing torpedoes, one of which available to him in this case carried a nuclear warhead. … This incident has earned an outsized place in missile crisis lore owing to reports that a Soviet naval officer named Vasily Arkhipov on board B-59 allegedly stood up to his vessel’s captain, Valentin Savitsky; single-handedly talked him out of his threat to arm the submarine’s nuclear-capable torpedo for possible firing at US naval vessels; and thereby became known as ‘the man who saved the world from nuclear apocalypse’. 

 

Voorhees argues that Savitsky “had received notice of the new American [PDC signals] policy,” sent from Washington to Moscow on October 25, and “presumably [my italics] knew the difference between the sound of signaling PDCs and a determined lethal attack using real, full-strength depth charges.” However, JFK and the ExComm, Michael Dobbs concluded, “assumed that the Soviet submarine captains had been informed about the new procedures and understood the meaning of the [PDC] signals. They were mistaken.” [my italics] The Kremlin failed to confirm receipt of the message about the underwater signals and did not alert their four submarines in harm’s way near Cuba. Savitsky “knew nothing about the signaling procedures” and “nobody [on board] knew what was going on.” The submarines, Svetlana Savranskaya stressed, were also unable to contact Moscow without reaching “periscope depth” or surfacing in waters teeming with US Navy vessels.[ii] Voorhees remains confident, however, about “the essential inevitability of the actual outcome.” 

 

Finally, also on Black Saturday, October 27, a U-2 from a Strategic Air Command base in Alaska, apparently on a "routine air sampling mission" to check on nuclear testing in the USSR, "accidentally" strayed into Soviet air space. MiG fighters scrambled and the plane was permitted to return to its base escorted by US F-102 fighters equipped with nuclear air-to-air missiles. Voorhees insists that the Soviets, "already facing actual [my italics] oncoming attack threats" from American B-52s, "took no responsive measures." In short, he concludes that the evidence suggests that the threat was not an "actual" threat and the Soviets knew it. Fortunately, however, the MiGs could only reach a maximum of 60,000 feet and the U-2 flew at 70,000 feet—thus limiting the Soviet fighters, at least initially, to tracking the path of the American intruder.

 

However, when Dean Rusk updated the president about the U-2 "accident" just hours later, he was reading from a prepared text—unlikely to have been written in the brief time since the intrusion: "Would there be," Rusk asks President Kennedy, "any advantage [my italics] in our saying that 'an Alaska-based U-2 flight engaged in routine air sampling operations in an area … normally 100 miles from the Soviet Union had an instrument failure and went off course … overflying a portion of the Soviet Union?'" Rusk's calculated language and tone, captured on the tape recording, suggest that he was proposing a public relations cover story rather than simply presenting the facts to the president.

 

Decades later, at a conference, Professor Scott Sagan asked Robert McNamara if the U-2 flight was part of the ultra-secret Strategic Integrated Operational Plan (SIOP) for nuclear war. The former defense chief curtly denied it but refused to discuss details—intensifying the skepticism of the panelists and the audience. Fred Kaplan, however, has documented that JFK, in 1961, had read and seriously discussed a nuclear first-strike plan that could have led to a million Soviet casualties in the first attack alone.[iii]

 

Michael Dobbs later utilized some newly released documents and interviewed U-2 pilots and senior SAC officers to nail down additional details on the overflight.[iv] He nonetheless stressed that the full report, originally ordered by McNamara, remains classified. Can historians rule out, without this potentially definitive evidence, the possibility that this episode was linked to a botched or aborted effort to "resolve" the crisis with a pre-emptive nuclear strike—in other words, that it began as a strategic gamble which contingency morphed into a hazardous, unanticipated consequence?

 

Both Kennedy and Khrushchev, Voorhees insists, were resolved to avoid the use of nuclear weapons. But, as explicated above, the micromanagement of historical contingency is an illusion. “The destinies of nations,” Martin Sherwin demonstrates, “just as the lives of individuals, are moved inexorably forward through crossroad after crossroad by decisions and chance, with the influence of each in constant flux. The disconcerting conclusion … [is that] a global nuclear war was averted because a random selection process had deployed Captain Vasily Arkhipov aboard a particular Soviet submarine.”[v]

 

Theodore Voorhees, Jr. has written a boldly original and impressively researched account of how events, fortunately, did turn out in October 1962.  But, if those fateful thirteen days could be repeated one hundred times, it is all but inconceivable that fortuitous contingency, branded as “plain dumb luck” by former secretary of state Dean Acheson, would substantiate Voorhees’ confidence in “the essential inevitability” of a peaceful outcome. Kennedy was steadfast about deterring nuclear war—a fact incontrovertibly documented by the real-time tape recordings; Khrushchev’s apparently analogous motives must be deduced from his actions, his memoirs, and the testimony of those around him. Nonetheless, that shared outlook alone could not and did not predetermine the outcome. As historian Fredrik Logevall recently warned: “we should avoid the trap of hindsight bias, or what the philosopher Henri Bergson called ‘the illusion of retrospective determinism’—the belief that whatever occurred in history was bound to occur.”[vi]

 

 

[i] If, as Voorhees maintains, the Checkpoint Charlie standoff provided Kennedy and Khrushchev with "a kind of blueprint and preview in miniature" of the missile crisis, it did not have a noteworthy impact, despite the persistent angst about Berlin, on the ExComm discussions or the correspondence between the two leaders.

[ii] Michael Dobbs, One Minute to Midnight: Kennedy, Khrushchev, and Castro on the Brink of Nuclear War, 2008, 297-303; Svetlana Savranskaya, “New sources on the Soviet submarines in the Cuban missile crisis,” Journal of Strategic Studies, 28/2 (2005) 233-59.

[iii] Fred Kaplan, "JFK's First-Strike Plan," Atlantic Monthly, October 2001, 81-86.

[iv] Dobbs, op. cit., 258-65, 268-72.

[v] Martin Sherwin, www.cornerstone.gmu.edu/articles/4198 and Gambling with Armageddon: Nuclear Roulette from Hiroshima to the Cuban Missile Crisis, 1945-1962, forthcoming September 2020.

[vi] Fredrik Logevall, JFK: Coming of Age in the American Century, 1917-1956, 2020, 361.

The Roundup Top Ten for October 16, 2020

Republican Voter Suppression Efforts were Banned for Decades. Here's what Changed

by Kevin M. Kruse

In 2020, as in 1981, the realities of voter fraud don't matter. Republicans insist that their very real efforts at voter intimidation are warranted because, they claim, Democrats have done or will do or possibly might do something much worse.

 

The Right's War on Universities

by Ruth Ben-Ghiat

"From the fascist years in Europe, nearly a century ago, to our own times, right-wing leaders have accused universities of being incubators of left-wing ideologies and sought to mold them in the image of their own propaganda, policy, and policing aims."

 

 

How Do Pandemics End? History Suggests Diseases Fade but are Never Truly Gone

by Nükhet Varlik

"Whether bacterial, viral or parasitic, virtually every disease pathogen that has affected people over the last several thousand years is still with us, because it is nearly impossible to fully eradicate them."

 

 

For 200 Years Courts Upheld Rules to Protect Americans’ Health. Until Now

by John Fabian Witt

"Now a new generation of judges, propelled by partisan energies, look to deprive states of the power to fight for the sick and dying in a pandemic in which the victims are disproportionately Black and brown."

 

 

Stop Othering Latinos

by Geraldo L. Cadava

When politicians see us as more than voters, we may give them our votes.

 

 

#WEWANTMOREHISTORY

by Greg Downs, Hilary N. Green, Scott Hancock, and Kate Masur

At historic sites across the United States on September 26, dozens of participating historians presented evidence to disrupt, correct, or fill out the oversimplified and problematic messages too often communicated by the nation’s memorial landscape.

 

 

Higher Ed’s Shameful Silence on Diversity

by Hasan Kwame Jeffries

Right-wing diatribes about diversity training often ended with a call for Trump to issue an executive order banning federal agencies from holding them. So it was not unexpected when, on September 22, Trump signed an executive order forbidding diversity training within the government.

 

 

The Real Black History? The Government Wants To Ban It

by Priyamvada Gopal

Tory attacks on "victim narratives" in the history curriculum defend entrenched power and ignore the fact that Black British histories are about the power of protest and activism to make social change. 

 

 

The Political History of Concealing Illness, from Brezhnev to Trump

by Joy Neumeyer

Trump's predilection for pageantry, like that of his Communist counterparts, offers a hollow illusion of vitality while letting potentially fatal problems fester.

 

 

America Has No Reason to Be So Powerful

by Stephen Wertheim

"There was a time when Americans believed that armed dominance obstructed and corrupted genuine engagement in the world, far from being its foundation."

 

"Provided I Can Fuse on Ground Which I Think is Right": A Lincolnian View of the White House History Conference

Drs. Wilfred McClay and Allen Guelzo, National Archives, September 17.

 

 

 

Some friends have importuned me for an explanation of why I joined the panel that spoke at the National Archives as "The White House Conference on American History" on September 17th. Having been engaged in teaching the subject in various ways for forty years, I can say bluntly that I am not happy about its present condition. That I would say so at the behest of the White House set off an overabundance of anxiety in some quarters and over-congratulation in others, mostly about the fact that the Vice President and President spoke on the same subject later in the event. I am not sure what the cause of either the anxiety or the congratulation was, since my comments, of course, were not directed to the President or Vice President, or made in consultation with them. I have never even met the former, and the latter only once, at a reception.

 

The issue for me was history education; and if I anticipated causing upset, it was more for making no secret of my conviction that the Enlightenment universalism of the Founding, the Declaration and the Constitution is a remarkable and exceptional moment in human history, or for my resistance to the worrisome versions of tribalism which I see bidding to replace it. I am not ashamed to say that I am a Lincolnian on this point, and subscribe myself fully to Lincoln’s opinion. In 1858, he said that half of Americans then alive had come from some place other than the United States. “If they look back through this history to trace their connection” to the American Founding strictly “by blood,” then “they find they have none.”

 

But when they look through that old Declaration of Independence they find that those old men say that “We hold these truths to be self‑evident, that all men are created equal,” and then they feel that that moral sentiment taught in that day evidences their relation to those men, that it is the father of all moral principle in them, and that they have a right to claim it as though they were blood of the blood, and flesh of the flesh of the men who wrote that Declaration, and so they are.

 

The fact that Americans have not always lived up fully to that Enlightenment universalism, or that ethnicity has often gotten bloodily in the way of it, merely shows that we are human, not that it is wrong. Lincoln again:

 

It is said in one of the admonitions of the Lord, “As your Father in Heaven is perfect, be ye also perfect.” The Savior, I suppose, did not expect that any human creature could be perfect as the Father in Heaven; but He said, “As your Father in Heaven is perfect, be ye also perfect.” He set that up as a standard, and he who did most towards reaching that standard, attained the highest degree of moral perfection. So I say in relation to the principle that all men are created equal, let it be as nearly reached as we can. If we cannot give freedom to every creature, let us do nothing that will impose slavery upon any other creature. 

 

I do not see that aspiration represented in much of our history teaching today. I complained, in my panel comments, that to look through the tables-of-contents of our flagship quarterlies is frequently to encounter a witches-sabbath (and Night on Bald Mountain was thrumming in the back of my mind as I wrote that) of complaint about injustices, deportations, genocides, failures, co-optations, and miseries. Unhappily, I am not alone in this lament. As David Hackett Fischer complained years ago, we have made "the American past into a record of crime and folly" and told ourselves "that we are captives of our darker selves and helpless victims of our history." Perhaps this fulfills a certain Puritanical gene in our national make-up which is never entirely happy unless we are unhappy; perhaps it's because human nature is drawn to misanthropy and outrage because it makes us feel so powerful; or perhaps it's an illustration of what Tocqueville observed when he said that the nearer we approach genuine equality, the more screamingly intolerable the remaining inequalities feel. I am not equipped to decide which of these preponderates in every case, but I see a good measure of each in the squinting vision that pervades our profession.

 

In my comments, I was particularly severe on critical theory, and especially critical race theory, which strikes me as indulging precisely the same circular reasoning as the Calhounites long ago, with the same appeal to the supremacy of “community” (that was Amos Kendall’s word in shutting down the circulation of abolitionist literature in the mails in the 1830s) and race. It’s that circular reasoning which leads me to reject the theorists’ despair; nor does their despair carry much persuasion when I hear it coming from people who, in flat contradiction of despair, occupy positions of prosperity, privilege, influence and (yes) property which would otherwise be the envy of the preceding three thousand years. Adorno thought that Enlightenment reason discards difference and thus victimizes two-thirds of the world’s population. The very point of the Enlightenment was that reason understood difference, and saw difference as the cult-goddess of violence. 

 

I don’t deny that academic historians always run the risk of being manipulated, infantilized and traduced, especially by the political classes. But if, to avoid that, we say nothing except to ourselves, then I think we forfeit what we owe to the history we purport to serve. And I do believe we have a responsibility as historians, both to those who cannot speak from the past and to those whom we teach, a responsibility not to wallow in guilt or drag others into the wallow, and it does not seem to me at all unreasonable to ask what souls we are forming as we teach. If we laugh at honor, we should not then profess shock when mass murder is perpetrated. I do not think it wrong to ask myself whether what I say builds up, or destroys; whether it would strengthen the resolve of some eighteen-year-olds to storm Omaha Beach, or whether it would incline them to sell nuclear secrets for the first offer in spot cash. History is an art that holds off dissolution; it should improve life rather than debasing it. 

 

I have no sympathy whatsoever with the pompous foolishness which argues that all Americans have been right, valiant, brave, noble, innocent, blue-eyed and pure. But the myths of the mindless patriots on the Right are not worse than the myths of the mindless cynics on the Left, and I do not need to explain that it is the Left that dominates in our profession. I suppose that this will invite the accusation that I am merely bourgeois. Very well. Susan B. Anthony was bourgeois, Frederick Douglass was bourgeois, and Lincoln was certainly the most bourgeois of all.

 

So, I will take the opportunity of any platform offered me short of outright tyrants, depraved fools and genocidal murderers to talk about American history -- I have done that for Dinesh D'Souza and was roundly condemned for doing so; I did it for the World Socialist Web Site, and was roundly condemned for doing that, too. I think I can do both without being either a Trotskyist or a D'Souzaist. Lincoln once more: “I have no objection to ‘fuse' with any body provided I can fuse on ground which I think is right.” I would be just as willing to do so as an officer of the American Historical Association, except, of course, that I was told by the chair of the committee on nominations years ago that people who thought like me were not wanted. So much for diversity and inclusion.

 

What is the way forward? That is the question I wish more people would ask. I begin with a comment John Cheever once made in Falconer, about “the inestimable richness of human nature” -- that we are tragic, selfish, and cruel, and yet capable of great vision. I then would take us to the foundations of law -- divine, natural and positive -- and superimpose the conviction that the American democracy opens up for us, in a way seen in no society before 1776, the chance to fight for the natural rights with which we have all been endowed. Next, I would find in the American experience the rejection of tribe -- of blood, soil, and kings -- and the achievement, more than in any previous epoch, of happiness, of eudaemonia. To write this story, I would borrow a rule from John Gardner (from whom I have borrowed a lot): there is no true compassion without will, and no true will without compassion. And the last word should be from Sgt. William Carney: “The old flag never touched the ground, boys.”

Who Owns Churchill?: Three Mythic Configurations

We sent our manuscript, The Churchill Myths, to Oxford University Press on 24 July 2019, the very day that Boris Johnson became prime minister of the United Kingdom. The manuscript had been in conception, and came together in various drafts, for almost exactly a year: its completion and the dramatic shift in the political world were no coincidence. The conjunction of the two felt strangely unnerving to us, when for once our own historical study appeared unusually punctual, addressing immediately the contemporary moment.

 

Johnson’s career had become increasingly identified with Churchill’s memory, not least in his own dreams, which he proved keen to share with the nation. His bestselling book, The Churchill Factor (2014), signalled not only a literary but a political event. Since Churchill retired from office in 1955 many politicians, in the UK, the USA, and elsewhere, have endeavoured to appear “Churchillian.” Like Johnson, they have wished to exploit Churchill’s reputation as steadfast, determined, and resolute. Yet although the core of Churchill’s image -- indelibly linked with his cigar and V-sign -- has remained constant, there have been significant historical changes in how successive political generations have deployed him.

 

This is why we refer to myths in the plural. Churchill is constantly being reshaped in public debate and put to new political uses. How he is understood will continue to evolve in our own present, as well as in the future.

 

We suggest, speaking broadly, that there have been three overriding phases in the mythic configurations of May 1940, with the figure of Churchill himself assuming ever greater prominence as the story evolves.

 

First, during the global crisis of May 1940, when the Nazi invasion seemed a matter of hours away, the place of Churchill in the telling of the story was not what we would expect. Mythic Churchill was present, of course, not least in Churchill's own self-dramatizations. But this “Great Man” version of history had to compete with the role of “the people” as radical, and as the principal historical actor, with Churchill himself assuming a significant, but not an absolute, part in the story. This idea of “the people” itself had mythic properties, while Churchill was relegated to acting as an adjunct to the larger popular mood, the one that was expressed in Labour's 1945 election victory.

Second, from the late 1940s – that is, after Churchill lost the premiership – mythic Churchill really takes off. His own hand in the making of this state of affairs was not slight. In these years, Churchill became the means by which the compact between the state and the people could be harmoniously accomplished, magically unified through the commanding person of Winston Churchill. This is when Churchill himself became the totemic distillation of the nation.

 

In the last twenty years, thirdly, a new configuration of meanings has emerged. This is partly evident in Boris Johnson's own revision of the story. Churchill still remains the incarnation of “the people.” But this is a more nativist, more belligerent people, understood to be not in harmony but in contention with the state. This reimagining of Churchill sees the “respectable,” mainstream Conservatives as the enemies and betrayers of the people. The historic Conservative Party is re-imagined as the enemy within.

 

The tangled relationship with Europe lies behind this. Johnson himself, when adopting the garb of Churchill, does so as a “man of destiny,” saving the people from the corruptions of the old Establishment, which includes both the advocates of Europe and the institution of the Conservative Party. This signals a more Jacobin, intemperate, and resolutely populist politics. 

 

When we were completing the book we knew Johnson’s electoral triumph was not the end of the story. But we could not have possibly anticipated how the explosion of Black Lives Matter would come to represent such an electric current in the public life of the British nation. History can always take us by surprise. Churchill, this time as the “racist,” is again emphatically a charged sign in the polarization of contemporary British politics.

 

But notwithstanding these divisions Churchill as transcendent figure remains powerful. When his name is uttered in public, chances are that it will be underwritten by faith in the empire as the unilateral medium for the benediction of others. Repetition of Churchill as the nation's story runs deep. Its popular variations are impervious to critique. It's not that contrary voices don't exist. They do, and frequently they're heard. It is, rather, that the Ur-story reproduces itself regardless of whatever manifestations of dissidence cross its path. The more repetitious it becomes, the more impregnable it is, and the further it departs from historical realities. In its telling, thought is eviscerated. The past is acted out, with no need for reflection. The story -- the myth -- obliterates history. The story tells itself.

Churchill becomes the means for the nation's exceptionalism to exert its authority. There persists a puzzling faith in the notion that England, providentially, has been peculiarly immune to the existential darkness which shadows modern selfhood. Whether anyone actually believes this as a literal truth is doubtful. But its echoes nonetheless persist. This is a voice heard in different frequencies. It constitutes a barely conscious substratum of popular experience, flaring into the light of day in tabloid headlines. It turns on an attachment to the protocols of an English fundamentalism. When critics point to the dark matter in the nation's past -- to the violence of colonialism, to enslavement, to the racialization of others -- a reflex is triggered, as if even to say such things carries the lifeblood of English selfhood to the precipice of destruction.

Even now, to question Churchill’s historical record triggers all manner of aggrieved reactions. To do so is perceived to be denigrating not only the man but the nation. Churchill, in this sense, remains a charged element in the civilization of the British. The aim of our book is to help the reader understand how and why. 

Judicial Overreach in High Partisan Times: How the Dred Scott Decision Broke the Democrats and Boomeranged on the Court

With the imminent confirmation of Amy Coney Barrett, Republicans appear to have a solid Supreme Court majority in their grasp. But they -- and the conservative Supreme Court majority -- ought to heed a lesson from the Court's history: Beware of overreaching.

 

The most dramatic example comes from the Court's most infamous case. We usually parse Dred Scott v. Sandford as the Worst Decision Ever, but it also offers an overlooked political lesson. The Court waded into a highly partisan battle and badly damaged the institutions behind the ruling. The Democratic Party broke in two and the Supreme Court itself endured a decade of court packing.

 

Start with the case itself. An army surgeon was posted to the free territory of Wisconsin and took along a slave named Dred Scott. While in the territory, Scott married in a civil ceremony – something he could not have done as a slave. When the army sent the doctor back into slave states, Scott sued for his family’s freedom (by now they had two daughters) citing the traditional legal rule, “once free, always free.” After a 12-year legal saga through state and federal courts, Chief Justice Roger Taney decided to use the case to settle the fiercest question of the 1850s: Which of the vast western territories should be open to slavery? 

 

Democratic President James Buchanan, who never missed an opportunity to side with slaveholders, used his inaugural address to cheer the awaited court decision as the final word on the matter. Like all good citizens, intoned this soul of innocence, “I shall cheerfully submit … to their decision … whatever it may be.” Except that he already knew perfectly well what it would be. Buchanan had pushed Justice Robert Cooper Grier (a fellow Pennsylvanian) to join with the five southern justices in order to improve the optics when the legal bomb detonated. 

 

Two days later, on March 6, 1857, the Supreme Court announced its ruling in Dred Scott v. Sandford, an historically inaccurate, legally implausible, virulently partisan decision marked by eight different opinions (two dissenting). A clerk even misspelled the defendant's name -- it was Sanford, not Sandford -- so that even the name of this infamous decision memorializes a typo.

 

At the heart of all the jurisprudence sits Justice Taney's majority opinion, an implacable picture of race and exclusion. The Constitution and its rights could never apply to Black people. What about Scott's claim to have lived in a free territory? Not valid, ruled Taney, and for a blockbuster reason: no one had the authority to prohibit slavery in any territory -- not the federal government, not the residents of the territory, not anyone. What about the Missouri Compromise of 1820, which forbade slavery above the 36°30′ parallel? “Not warranted by the Constitution and therefore void.” How about the Compromise of 1850 and the Kansas-Nebraska Act, which had turned to popular sovereignty? Nope. No one could limit slavery in any territory.

 

The political fallout quickly spread to both the parties and the courts. The Republicans had sprung up, in the mid 1850s, to stop the spread of slavery into the territories. The Dred Scott decision, which was the first to strike down a major act of Congress in more than half a century, ruled out the party's very reason for being. As historian George Fredrickson put it, the ruling was “nothing less than a summons to the Republicans to disband.” Republican leaders denounced the Court as part of the Slave Power and accused Chief Justice Taney of conspiring with pro-slavery Democrats in the White House and Senate. The decision itself helped propel these new-found enemies of the court to power.

 

Across the party aisle, the detonation unexpectedly wrecked the Democrats. At their next political convention, in 1860, they paid the price of their victory. When it came time to write a party platform, northern Democrats opted for the same slavery plank they had used during the last presidential election: White men in the territories should decide the slavery question for themselves. After all, these politicians could not very well go before their voters -- who were eager to claim western lands for white men and women -- and announce that every territory was open to slavery regardless of local opinion.

 

The southern Democrats, however, insisted on a plank that said exactly that. They bitterly denounced their party brethren for casually handing back what the Supreme Court had given. The southern version of the plank proclaimed that Congress had a “positive duty” to protect slaveholders wherever they went -- “on the high seas, in the territories, and wherever else [Congress's] constitutional authority extends.” The high seas? A sly call to bring back the Atlantic slave trade, which had been banned in 1808. When the convention narrowly chose the northern version of the platform, the southerners walked out and eventually nominated their own candidate.

 

A divided Democratic Party eased the way for a Republican victory in 1860 and that, of course, gave the nation a hard shove toward the Civil War.  Democrats had dominated Washington throughout the antebellum period. Now they fell from power. They would not control the Presidency and Congress again for forty-two years. 

 

The recoil from the Dred Scott decision also shook the Court. Lincoln bluntly expressed the Republicans' skepticism in his Inaugural Address. “The candid citizen must confess that if the policy of the government … is to be irrevocably fixed by decisions of the Supreme Court, the people will have ceased to be their own rulers.” The other branches of government were every bit as capable of enforcing the Constitution, he continued, and the Court had no business claiming that right for itself.

 

Republican Senator John Hale (NH) added that the Court “had utterly failed” and called for “abolishing the present Supreme Court” and designing a new one. The new majority did not quite go that far, but they packed and repacked the court. They added a tenth justice (in 1863), squeezed the number down to seven members (in 1866) and, finally, returned it to nine (in 1869). The Republicans also reached into the lower courts and rearranged the Circuits. The partisan reorganization of the courts -- the only sustained court packing in American history -- went on for most of a decade.

 

The lessons from Dred Scott echo down through to the present day. A declining political party only injured itself by using the courts to settle a fierce political controversy. Even more important, the Court’s plunge into the hottest issue of the era blew right back on the Court itself. Taney’s botched effort to settle the slavery issue sends a warning to every generation. There are distinct limits to the Court’s legitimacy in highly partisan times. Modest jurisprudence can protect the court. Overreach can cause all kinds of blowback.  

This essay is taken from Republic of Wrath: How American Politics Turned Tribal from George Washington to Donald Trump (Basic Books, September 2020) 

The Coming Election and the Political State of Fugue

Americans teeter on the brink of a state of collective fugue. A psychiatric state of mind, the fugue is caused by extreme distress in the aftermath of one or more cataclysmic events. The fugue state causes a person to fail to recall intrinsic identifying personal characteristics and to no longer remember what they believed in the past; those things they knew to be true no longer exist. This dissociative mental state erodes one's fundamental concept of self. Under Donald Trump's cataclysmic presidency, our collective memory and awareness of who we are as a people and our shared aspirations to perfect our union appear to be at the point of dissolution.

 

We are not at war with a conventional army, yet our nation is in chaos. Over 200,000 American lives have been lost to COVID-19, and with winter on the horizon, a second wave of the pandemic is emerging in Europe. The United States leads the world in the number of people infected with the virus and in the number of COVID-19 deaths, despite the nation’s exceptional biomedical and health related research and infrastructures. 

 

Further, despite our proselytizing instinct to lecture the world about minority rights and good governance, police brutality against Black people in the United States is dramatically displayed in media across the globe. The ensuing continental uprising for civil rights, far-reaching economic spasms, and crisis of governance are exacerbated by the reflexive responses of an unpredictable President. 

 

The transition from a unipolar to a multipolar world, the emergence of economic centers in the East and the ensuing erosion of Pax Americana, accelerated by Trump and his team, compound our unease and search for identity. These seismic shifts nationally and internationally perpetuate our state of heightened anxiety. 

 

The erosion of our centuries-old governmental institutions is particularly distressing. In the wake of the death of Associate Justice Ruth Bader Ginsburg, we are now forced to put aside our national mourning and deal with the political ramifications of her passing. We must reckon with the seemingly assured confirmation of Amy Coney Barrett as Ginsburg's replacement, a social conservative who would certainly erode the civil rights of minorities, the healthcare gains of the contemporary era, the procedural rights of ordinary Americans in the justice system and the bargaining power of workers. The death of Ginsburg, a champion of rights, portends a regression to a darker past.

 

Trump and his team claim to have a mandate from the American people and are preparing their Senate allies to complete the confirmation process within the span of a few weeks, prior to a presidential election. Mitch McConnell's pledge to support the President ignores a precedent he declared just four years ago -- not to confirm a Justice during an election year -- which we are now expected to erase from our collective memory.

 

In many instances, particularly at the level of the High Court, the American judicial system has been predictable in rendering judgments that follow the partisan political suasions of the Justices and other appointed federal judges. Nevertheless, the near balance of opposing forces within the Supreme Court provided stability and kept the scales of justice from tilting so abruptly that they overwhelm their point of balance and implode the judicial system.

 

The Republicans obstructed President Obama's appointments of federal judges during his tenure as a political maneuver, albeit with an underlying racial element. Yet, in under four years, Trump has appointed over 300 federal judges. It is troubling that judges view national issues through a political lens, and that a President who lost the popular vote and a Republican Senate whose members represent a minority of the nation have pushed the courts sharply rightward.

 

By abandoning their legal philosophy when it is no longer expedient and backpedaling from the position they took after the death of Justice Antonin Scalia just four years ago, Republicans gaslight the American people. They claim that what we see and hear is not true. They ask us to forget what they committed to in 2016, to question everything and to dispute the existence of what we know to be true.

 

Through a similar prism of purged memories, Trump and his team deny the existence of the systemic racism that underpins police killings of unarmed Black people, and deny the warming of our planet. They attempt to erase what is real from our memory. These unprecedented events have brought the nation to the precipice of a political state of fugue. Trump and his team are determined to push us off the cliff. When confronted with existential social and political crises, they foment political fugue with campaigns of disinformation.

 

Bob Woodward discloses that in February, President Trump was fully aware of the fatal potential of the coronavirus. Trump not only failed to share this information with the American public; he actively downplayed its deadly potential to the public and strongly encouraged his followers to ignore preventive measures. The President had promised prior to his election in 2016 to end American carnage. Paradoxically, his words foreshadowed what his legacy would be – the savaging of the American dream and reaching the milestone of hundreds of thousands of preventable American deaths during his presidency.

 

If re-elected, after another four years of a Trump presidency, the Justice Department, the Supreme Court, and other institutions of American democracy will not be recognizable. Our system of checks and balances, the foundation of American democracy, will be dismantled. Our identity as Americans and our aspirations to perfect our union will cease to exist. Our government will be so fundamentally altered from what we know it to be that we will have entered a collective political fugue.

 

Corporate Money Turns Democracy Upside Down in California Initiative Process

The California ballot initiative process, created in 1911 by the progressive movement to control the influence of corporations, has instead been turned upside down by massive corporate spending. Corporate interests now use referenda to defeat popular legislation and grassroots reform efforts.

 

The most glaring example is the current fight over Proposition 22 on the November ballot, which has attracted a whopping $190 million in spending from Uber, Lyft and DoorDash.  The measure, backed by the companies, would overturn a current California law defining their drivers as employees who must be paid a minimum wage and be eligible for unemployment insurance. 

 

According to Ballotpedia, an independent tracker, this has become the most expensive ballot proposition in California history. Unfortunately, this massive outlay is only the latest in a long line of corporate efforts to sway public opinion on initiatives that impact their bottom line.

 

For example, in 2018, the dialysis clinic industry spent $110 million to defeat a measure to regulate their operations. In 2016, the pharmaceutical industry doled out $109 million to defeat drug price controls and in 2006 the oil industry spent $91 million in a successful effort to kill an oil extraction tax. 

    

Hiram Johnson

 

If Hiram Johnson, the progressive governor who championed the ballot initiative process, were alive today he would be appalled. Johnson was first elected in 1910 on a platform of controlling “the interests,” specifically the Southern Pacific Railroad, which basically owned the legislature. Johnson was a liberal Republican, and a follower of “Fighting Bob” LaFollette, the progressive Republican governor of Wisconsin, who instituted an income tax, a railroad commission and a pure food law. 

 

Although California's proposition battles have attracted the most headlines, the Golden State was not the first to put initiatives on the ballot. South Dakota was the first to adopt statewide referenda in 1898, followed by Utah in 1900 and Oregon in 1902. By 1918, an additional 16 states had implemented the practice.

 

As the progressive movement ebbed in the Roaring Twenties, interest in the initiative process fell off. Between 1912 and 1969, fewer than three ballot initiatives per year, on average, appeared on the statewide ballot.

 

However, the power of the proposition for implementing major change jumped into national view in 1978 with the passage of California’s Proposition 13. This measure, sponsored by apartment owner Howard Jarvis, rolled back residential and commercial property taxes and limited the legislature’s ability to raise new taxes. 

 

Awakened to the potential power of the initiative, grassroots activists and corporate interests alike rushed to qualify initiatives for the ballot. Between 1978 and 2003 there were 128 initiatives on the statewide ballot.  

 

Easy to qualify 

 

Getting on the ballot is not that difficult. Currently, 624,000 signatures are required for a statutory initiative and 997,000 for a proposed amendment to the state constitution. The widespread use of paid signature gatherers makes the process easy for deep-pocketed interests. Attempts to ban the use of paid gatherers have been struck down by the courts.

 

With Californians facing a long list of ballot measures year after year, many are questioning the wisdom of the process. A 2013 survey of likely voters found that 67% said there were too many propositions on the ballot and 84% said that the wording on initiatives was “too complicated.” 

 

This November, an even dozen measures will be up for a vote. They include rent control, dialysis clinic staffing (a second time), eliminating cash bail, voting rights for 17-year-olds and ending the state’s 22-year-long ban on affirmative action. 

 

As for the $190 million spent on Proposition 22, Los Angeles Times business correspondent Michael Hiltzik recently observed, “no other initiative campaign in California history — given that California campaigns are the most expensive in the country, that means U.S. history — has come close to the gig companies' spending on Proposition 22, even accounting for inflation.”

 

The spending has manifested itself in a barrage of TV ads and mailers. The campaign features Uber and Lyft drivers (presumably volunteers) asking voters to “save my job.” The TV ads feature drivers (often young mothers and fathers) who state that the only way they can make ends meet is by driving for Uber or Lyft.  

 

Despite the ad onslaught, the ridesharing companies have an uphill climb to win approval. A poll released September 22 by the U.C. Berkeley Institute of Governmental Studies found that only 39% of likely voters would vote “yes” and support the ride-hailing companies, compared with 36% who would vote “no” to retain current law, with 25% undecided. Given that in most cases undecided voters ultimately vote “no” on ballot measures, Proposition 22 could wind up a costly defeat.

 

A number of studies in recent years have found that the cumulative effect of California’s multitude of ballot initiatives has been to reduce the power of the state legislature and make it difficult for cities, counties and school districts to raise funds. This, of course, is hardly what Hiram Johnson intended.

 

Can anything be done to restore fairness to the initiative process? 

 

Some legislators have proposed raising the signature requirement or increasing the filing fee (now just $2,000). But as long as the federal courts equate corporate spending with free speech, those measures would do little to discourage wealthy special interests from using this policy-making tool.

Paul Revere Made the Boston Massacre a Flashpoint for Revolution

Paul Revere's Engraving "The Fruits of Arbitrary Power, or the Bloody Massacre," from Henry Pelham's drawing, 1770

 

 

At this moment that feels like a hinge in history—when America will swing either toward authoritarianism or toward a more just and liberal democracy—the ghosts of history rise up and speak to us. Five of those ghosts lay in the snowy gutters of King Street, Boston, nearly two and a half centuries ago, and their dying gasps resounded into revolution.

On the snowy night of March 5, 1770, a band of citizens allied as Patriots taunted and harassed a lone British sentry, Private Hugh White, who was standing guard over the Custom House, the repository of the funds General Thomas Gage needed to pay and operate the two regiments of troops occupying the city.

Some of his senior staff had counseled him to station the troops outside the city at Castle William in the harbor to avoid provoking violence and stiffening resistance to the occupation, but Gage intended to stun the self-proclaimed Patriots with a demonstration of overwhelming force. Thus he “quartered the soldiers” in the city—a military way of saying he ordered them to take over private homes—a move that enraged even those among the population who professed loyalty to the king. It’s useful to remember what seemingly innocuous phrases really mean.

The troublesome Bostonians were refusing to pay the taxes imposed on them by a Parliament across the ocean to pay off the debt incurred fighting the Seven Years War against the French. Patriot gangs routinely blacked their faces and accosted Customs collectors in the nighttime streets. One of their most troublesome leaders was reputed to be a silversmith named Paul Revere. Gage wanted to teach them all a lesson.

The lone sentry at the Custom House belonged to the 29th Foot—an unruly and unreliable regiment, hardly the nimble, disciplined force needed to project power and yet avoid violence. At some point during the altercation, in which he was knocked to the ground, he cut one of the civilians with his bayonet. Suddenly the injured man’s cohorts raised a great hue and cry, and a mob formed. Captain Thomas Preston arrived with eight reinforcements, also from the 29th.

The mob pelted the soldiers with snowballs—some of them probably cored with stones—some men daring the soldiers to fire, others pleading with them not to. Someone at last did shout “Fire!”—or, according to later court testimony, it may have been Preston ordering “Hold your fire!” In any event, the first Brown Bess musket went off—then others followed. 

The Brown Bess, so sweetly named, was a formidable and reliable weapon, in use since 1722. It would remain the standard British Army firearm, with modifications, for more than a hundred years. It fired a one-ounce .71 caliber ball—gigantic by modern standards—that could, it was claimed, penetrate five inches of solid oak. 

Three men died instantly, two others died later of their wounds, and six additional civilians were hit. 

It’s fitting to remember the names of the dead: Samuel Gray, a rope maker; James Caldwell, a seaman; Samuel Maverick; and Patrick Carr. The fifth fatality is often described as a dockworker of mixed race, or “mulatto”: Crispus Attucks. Like Caldwell, he was hit twice. The autopsy, performed by Dr. Benjamin Church, a prominent Patriot who would later betray the cause and be exiled into oblivion by George Washington, records horrific wounds. The first ball broke the second rib an inch from his breastbone, blasted downward through his diaphragm, blew his liver and gallbladder to pieces, severed the aorta descendens just above the iliacs, then exited through his spine. The trajectory would suggest that he was already on his knees when he was shot. He was likely dead before the second shot punched him in the ribs.

If you've ever fired such a musket—the stock socking your shoulder hard, the powder flaming out in a long sheet a delayed and startling instant after you've pulled the trigger—you realize it is not a quaint museum piece but a killing instrument of awesome power.

Just so, peaceful protesters today are learning with (literal) physical shock that so-called “non-lethal” “rubber bullets” and “beanbag rounds” are hard, brutal projectiles that can horribly maim and even kill. Again, it matters what words we use to describe things in the world of conflict.

As Preston wrote later, “None of them was a hero. The victims were troublemakers who got more than they deserved. The soldiers were professionals…who shouldn't have panicked. The whole thing shouldn't have happened.”[1]

The soldiers were arrested and jailed—their actions were clearly a matter for the bar of justice. The trial, winding up seven months later, was thorough—John Adams for the defense. Paul Revere—notorious to the British occupiers as an instigator and rabble-rouser—provided key evidence: a pen and ink diagram of the kind familiar to contemporary juries. It located each of the shooters and victims on King Street with clarity and precision.

That was Revere’s second and far less famous pictorial representation of the event. The first was rushed into circulation when the blood was hardly dry on the frozen ground: an engraving of a Henry Pelham drawing titled, “Fruits of Arbitrary Power, or The Bloody Massacre Perpetrated in King Street.” Thus the event was publicly and for all time named a “massacre.” The engraving removed any ambiguity about who was at fault in the episode, depicting a line of soldiers volley-firing into an unarmed crowd on the order of an officer with raised sword, as a little dog watches the horror. It became the ubiquitous graphic account of the violence of March 5, 1770.

The Boston Massacre was just the most prominent flashpoint so far; Gage belatedly removed his troops from the city. There were other flashpoints, not as infamous from our historical remove but equally inflammatory—and they began to add up.

On February 22, 1770, just weeks before the massacre, a hated customs informer named Ebenezer Richardson retreated into his home after being harassed by a gang of boys throwing dirt clods and waving sticks. He grabbed a musket and fired through a broken window into the crowd outside, killing an eleven-year-old boy named Christopher Seider. Four days later, some 2,000 Patriots staged a public funeral procession that began at the Liberty Tree, symbol of resistance to the King of England.

Among the inscriptions on the boy’s casket was a motto that could serve for Black Lives Matter: Innocentia nusquam tuta—“Innocence is nowhere safe.” 

Seider continued to inspire resistance. On the one-year anniversary of the massacre, Patriots gathered for a silent memorial—and it was no accident that the site chosen for the demonstration was the home of Paul Revere, an acknowledged leader of the Patriot movement. Once again, he understood the power of the visual. He created a triptych of iconic—and lurid—images that filled three windows, calculated to appeal to the smoldering resentment and fervent patriotism of the crowd. As the Boston Gazette reported:

“In the Evening, there was a striking Exhibition at the Dwelling House of Mr. PAUL REVERE, fronting the Old North Square. At one of the Chamber Windows was the Appearance of the Ghost of the unfortunate young Seider, with one of his Fingers in the Wound, endeavoring to stop the Blood issuing therefrom.”

The portrait bore an incendiary caption:

            Seider’s pale Ghost fresh-bleeding stands,

            And Vengeance for his Death demands.

The Pelham-inspired print of the Boston Massacre filled the next window: “. . . the Soldiers drawn up, firing at the People assembled before them—the Dead on the Ground—and the Wounded falling, with the Blood running in Streams from their Wounds: Over which was wrote Foul Play.”

Revere understood how to fashion narrative through unifying the images: “In the third Window was the Figure of a Woman, representing America, sitting on the Stump of a Tree. With a Staff in her Hand, and the Cap of Liberty on the Top thereof—one Foot on the head of a Grenadier lying prostrate grasping a Serpent.—Her Finger pointing to the Tragedy.”

The exhibition worked its emotional magic, striking the thousands of assembled citizens to “solemn Silence” and “melancholy Gloom.”[2]

Two years after the massacre on King Street, Gen. Gage advised the Secretary of War, Viscount William Wildman Barrington, “Democracy is too prevalent in America, and claims the greatest attention to prevent its increase.”[3]

Gage’s lament seems to be the current mantra of the Republican Party, as it seeks to suppress voting and clear the streets of peaceful citizens assembled to petition the government for redress of grievances, a right explicitly—if inconveniently for those in power—enshrined in the Constitution.

As for Captain Preston and his grenadiers, a jury of non-Bostonians (chosen for their presumed lack of bias) took just three hours to acquit them of murder. Two were found guilty of manslaughter, but did not suffer the usual sentence of death. Instead, their thumbs were branded: should they ever commit another crime, the consequences would be dire indeed.

So what do the ghosts of that bloody history whisper to us now?

First, that language matters. The words with which we describe a thing can be accurate or misleading, are often fraught, and hardly ever are neutral. As soon as the event on King Street was popularly labeled a “massacre,” the Patriots had a rallying cry as potent as “Remember the Alamo!” It turned an event into a story with a clear moral, removed ambiguity, assigned fatal blame, and demanded justice. 

Likewise, it matters whether we describe a peaceful assembly as a “demonstration,” a “protest”— or a “riot.” The terms escalate in their degree of danger and violence. The first requires official forbearance, the second forbearance with caution against possible escalation, and the third warrants heavily armed police with shields and the apparatus of violence. 

The people assembled on June 1, 2020, in Lafayette Square, across from the White House, stood firmly in the first category. The police and National Guard were the rioters, instigating violence in a previously peaceful arena using tear gas—banned in warfare as inhumane under the Geneva Protocol. “Tear gas” sounds relatively benign, the kind of thing that will make your eyes water for a while. But it can damage the lungs, cause respiratory distress, and in this era of pandemic, fatally compromise the health of its victims.

“Batons,” so genteelly named to conjure images of drum majorettes, are actually clubs with which to beat people into submission. So-called “beanbag rounds,” fired from shotguns, have broken a man’s head open. “Stun grenades” or “flashbangs” routinely cause temporary hearing loss, have started fires, and have triggered heart attacks.

Second, whatever you bring to the event will get used. If Ebenezer Richardson, the Customs man, had not had a musket handy, an eleven-year-old boy would have lived to see another day. The gang of boys would likely have gotten bored and left.

The grenadiers on King Street—and grenadiers were recruited for their size and strength to serve as shock troops, not to finesse their way out of confrontation—had their own muskets, and sooner or later they were bound to be fired. When police march onto the scene of a demonstration geared up with heavy firepower and protective vests and shields, they will likely find the riot they are equipped for and use their arsenal.

Third, frame the situation accurately. Enlightened military planners do this routinely: What are we facing? What are the facts on the ground? What outcome do we want, and how best can we achieve it? 

I wonder, for instance, what Captain Preston hoped to achieve on that snowy night? Why didn’t he just pull the lone sentry indoors and let the weather eventually disperse the crowd before it became a “mob”?  For that matter, what did General Gage expect to happen when his 2,000 troops invaded the homes of ordinary Bostonians, most of them not part of the firebrand Patriot movement? His own officers warned him that such a provocation could only have a bad outcome, in fact might accomplish the opposite of his purpose by uniting the city against him and his troops.

Because fourth, the mindset of those in authority—and those they send to do their armed bidding—matters. Soldiers, like police, are trained to stand their ground. In the words of our own Secretary of Defense, Mark Esper, they must “dominate the battle space.” But crowds are not armies, and there is no battle space until it is created by confrontation with an opposing military force. Lexington and Concord were just peaceful farming towns until two armed forces determined to make them battlegrounds. Boston was just an unruly city, still part of a British colony.

And let’s be clear: American citizens protesting in American cities inhabit civic space—not battle space. There is no earthly need for a civic space to be cleared simply for the sake of clearing it and asserting dominance. Yet again and again, we see it happening exactly that way, because of the way an increasingly militarized police force is trained. From “To Serve and Protect” we seem to have evolved to a place of “Occupy and Dominate,” as if citizens were not the clients of police but their enemies in an occupied zone.

And as bad as the police mindset has too often become, the military is even worse as the guarantor of civic order—as many prominent military leaders have made clear. Troops are trained to subdue the enemy with force, and they are granted the extraordinary license to kill the enemy to make this happen—not the ideal recipe for guarding Americans’ constitutional right to petition for redress of grievances in the streets.

Fifth, real-life violence always comes as a shock to its victims. The eleven-year-old boy throwing dirt clods at the customs informer's house surely never expected to be torn apart by a lead musket ball. Crispus Attucks and the others on King Street were probably used to brawling—but they hardly expected to be ripped apart by volleyed musket fire in their own hometown.

They should not have been so surprised, because organizations behave according to their training and habits and use whatever tools or weapons they bring to the situation. When we witness the extraordinary and unprovoked violence unleashed on unarmed citizens by police and soldiers on the streets of America, we are shocked to discover the violence of their habits and training. Yet it was always there, like the tear gas and stun grenades in their lockers, waiting to be used. Previously it was used on a select vulnerable population, off-camera. Now it is center stage, happening on a grand scale in broad daylight to citizens of all races, ages, and backgrounds. It is happening to journalists even as their cameras are rolling on live TV.

Finally, the ghosts tell us, images are forever. Paul Revere's print, made from his engraving of Pelham's depiction, survives today as the definitive visual record of that event. What we are witnessing in the streets of America today is also a reaction to a horrific image—in this case a video of a slow-motion murder that plays out for almost nine agonizing minutes. That galvanizing image will forever haunt our nation. And like the Paul Revere triptych, it is woven into a narrative, connected to a train of other images, all of them frames in a dark movie about an America whose existence we have denied for far too long: the postcards of picnickers at lynching sites; Emmett Till's ruined face in his casket; Rodney King beaten and beaten forever by the side of a freeway; and now the myriad new images of police beating and shooting and tear-gassing our neighbors.

When the grenadiers on King Street were taunted and hit by snowballs, they defaulted to their basic training and showed their true colors: they were indeed willing to shoot and kill their American cousins—to treat them as the enemy. 

Even as the worst of our leaders repeat the blundering, provocative, divisive policies of General Gage, far too many of our police and National Guard have shown us their true colors. They are indeed willing to treat their fellow Americans as the enemy.

They have become the redcoats.

 

[1] The Boston “Massacre,” Historical Scene Investigation (H.S.I.), College of William and Mary. https://hsi.wm.edu/cases/boston/boston_documents.html#doc2. See also David Hackett Fischer's vivid account in Paul Revere's Ride, Oxford University Press, 1994, pp. 23-25.

 

[2] Quotes from The Boston Gazette and Country Journal, March 11, 1771. https://www.masshist.org/dorr/volume/3/sequence/458.

[3] Gage to Barrington, Aug. 5, 1772, cited in Fischer, p. 379.

The Lenin Plot: The Concealed History of The US-Led Effort to Overthrow the USSR

With the United States again accusing Russia of election interference, and Moscow again accusing Washington of rattling sabers on the Russian border, there’s been talk of a “new” Cold War. But what exactly does that mean? What happened to the “old” Cold War?

 

For years the prevailing narrative said that the Cold War against the Soviet Union started shortly after the end of World War II. Financier Bernard Baruch coined the term in a speech he made before the South Carolina legislature in 1947. He warned that a “new kind of war” was being fought “in which guns were silent; but our survival was at stake nonetheless.” As it turned out, the guns were not silent, and the Cold War went on to include hot wars against Soviet surrogates in Korea, Viet Nam, Cuba, and other trouble spots. And it wasn’t just a shooting war. It was also an attempt by each side to defeat the other side politically, economically, and culturally. The difference was that the Cold War was not an officially declared war, as both the world wars had been.

 

But the true origin of what President Kennedy called “the long twilight struggle” goes back much further, as I explore in my new Cold War history, The Lenin Plot: The Untold Story of America’s War Against Russia, published by Pegasus Books in New York and Amberley Publishing in the U.K. The Lenin Plot was a two-pronged operation, to (1) invade Russia and defeat the Red Army, and (2) stage a coup in Moscow, assassinate Soviet dictator V.I. Lenin, and get the country back in the war.

 

The plotting began shortly after Lenin seized power from the Provisional Government on October 24, 1917. Lenin called it the “Bolshevik Revolution” and the “Great October Socialist Revolution.” But it wasn’t a true revolution, a general uprising of the country. That had already occurred, in February 1917. Russian historians now see Lenin’s takeover as a military coup. 

 

The Western Allies were alarmed at the bloodbath the Bolsheviks were conducting against innocent civilians. The Provisional Government had declared political amnesty for all, but the paranoid and vindictive Lenin wanted his enemies exterminated. In the Western view, Lenin’s coup was actually a counterrevolution that returned Russia to old tsarist days of widespread terror, torture, and mass murder. 

 

The Allies were also alarmed by Lenin’s secret deal with Germany. Berlin had sent millions of marks to Lenin’s agents in Stockholm, and they laundered the money and passed it along to the Bolsheviks to finance a coup against the Provisional Government. In case of success, Lenin would take Russia out of the war, allowing Berlin to redeploy army divisions to the Western front, the main battleground. Speaking of this deal, Lenin said, “We would have been idiots not to have taken advantage of it.” 

 

Secretary of State Robert Lansing, a bored pacifist who usually sat doodling in Cabinet meetings with President Wilson, sprang into action and used the State Department as a bully pulpit to demand immediate action against Lenin. Lansing told Wilson that the United States had to stage a coup in Moscow and install an Allied-friendly “military dictatorship.” Lansing suggested U.S. funds be sent to the French and British as military assistance, and they could launder it for use in the plot against Lenin. 

 

“This has my complete approval,” Wilson told Lansing in December 1917. 

 

De Witt Clinton Poole, a young U.S. consul in Moscow and a former tennis star nicknamed “Poodles” at the University of Wisconsin, was sent on a secret mission to recruit a Cossack army in South Russia. But Poole found the generals down there too antagonistic toward one another to mount a coordinated attack on the Bolsheviks. Poole returned to Moscow without a new Caesar, but the fledgling Lenin Plot was not abandoned. It merely segued into 1918.

 

Poole became Washington’s spymaster in Russia. His chief field officer was Xenophon Kalamatiano, a University of Chicago track star who had sold tractors in Russia before the Allied embargo. Kal recruited dozens of assets, including the head of the Red Army’s communications office. Poole and Kalamatiano sent their reports to U.S. ambassador David Francis, a bourbon-sipping old Confederate who forwarded them to the State Department’s Bureau of Secret Intelligence, predecessor to the CIA and NSA. 

 

One of Kal's closest spy colleagues was Henri de Verthamon, a French saboteur who wore a black trench coat and beret, and slept with his explosives under his bed. Another was the impressively named Charles Adolphe Faux-Pas Bidet, who had worked the Sûreté's case against Mata Hari. The British Secret Intelligence Service (later MI6) was represented by Sidney Reilly, a freelance Russian adventurer and drug addict who had visions of himself as another Napoléon. The British Foreign Office sent Bruce Lockhart, a footballer susceptible to the charms of exotic women, one of whom, Maria Benckendorff, was a triple agent serving Britain, Germany, and the Soviets. Then Boris Savinkov, an experienced Socialist Revolutionary terrorist, was added to the Western plot. Savinkov was also a drug addict; he saw himself as a Nietzschean Superman immune to bullets. He and Reilly advanced the conspiracy from a simple capture of Lenin to an assassination plot.

 

Allied forces invaded Russia and fought the Red Army in an attempt to support the Moscow plotters. But Wilson and Prime Minister Clemenceau made a mistake in placing U.S. and French troops under British command. Most of the British officers were mental or physical rejects called “crocks” or the “hernia brigade.” They resented being shuffled off to a “sideshow” like Russia, and took out their anger on the American and French troops under them. The crocks arrived with 40,000 cases of Scotch whiskey, and their drunken incompetence caused Allied battlefield deaths. The Yanks and poilus retaliated by staging mutinies against the British. One doughboy walked up to a crock, told him to say his prayers, and shot him dead. Sanity finally arrived after the British commander was sacked and replaced with Brigadier General Edmund Ironside, a decorated officer from the Western front. The troops loved him. 

 

Lenin was shot and seriously wounded by Fanny Kaplan, a hardened Socialist Revolutionary terrorist. Allied agent Savinkov said he gave Kaplan her pistol. The shooting caused a dramatic escalation of the Red Terror, resulting in thousands of deaths. Thousands more casualties were counted in the combat zones. 

The Lenin Plot was a massive embarrassment for the Allies, and they tried to cover it up. The denial continued for years. President Roosevelt said a “happy tradition of friendship” had existed between the U.S. and Russia “for more than a century.” President Reagan in a television address said “our sons and daughters have never fought each other in war.”

 

 *   *   *

 

I first found out about the Lenin Plot when I was a student at Tulane. I met a gentleman at the university library who had known some young men in Paris in the twenties who served in the war against Russia. I’d never heard of that. I started checking.

 

The internet was a primitive tool at that time, so I consulted bound volumes and microfilms of the London Times, the French L’Illustration, and the Literary Digest, an American news weekly. Times coverage stood out because their Russian articles were written by historians and former military officers. The newspaper also published an encyclopedia, The Times History of the War, which covered the Russian campaign in detail.

 

A few other histories of Allied involvement in Russia had been written, and some were helpful in providing leads. But I learned to distrust a lot of “scholarly” research because many of the writers simply rewrote one another without verification. That’s hearsay, not original research.

 

Internet research is easier now. Old publications have been digitized and posted to the net, and I found a number of interviews that way. The Hoover Institution at Stanford and the national archives in Washington, Paris, and Kew, England, provided copies of many documents not available on the web.

 

But beware of censorship on the internet. The State Department on their website admits that certain documents have been “edited” before being posted, in order to “avoid impeding current diplomatic negotiations.” To get around that, I verified documents by using bound volumes of the State Department’s Foreign Relations of the United States, published long before the internet. Bound volumes of the Readers’ Guide to Periodical Literature also contained much valuable information not available on the web.

I shy away from websites with “wiki” or “.com” in the title. They, too, tend to run unverified information. I use “official” documents and the web only as a starting point, as a clue to what really happened. I recommend that researchers contact libraries, archives, and presidential libraries, and look for letters, notebooks, diaries, interviews, memoirs, autobiographies, photographs, and eyewitness accounts. You’ll find the truth only by persistent digging off the grid.

The Battle of Salamis Opened the Door for Ancient Greece’s Golden Age

Battle of Salamis, Wilhelm von Kaulbach, 1868

Twenty-five hundred years ago, in the Battle of Salamis (dated to September 480 BCE), the ancient Greeks defeated the invading Persians and paved the way for Greece's Golden Age of the 5th century BCE, a foundational period for Western Civilization.

 

By the late 6th century BCE, the Persians had come to dominate numerous peoples and reigned as the superpower of the era. At its height, the Persian Empire consisted of twenty provinces and stretched from the Indus River in the east to northern Greece and Egypt in the west.

 

At this time ancient Greece, or Hellas as the Greeks called it, consisted of some 1500 city-states spread across the Greek mainland, the Aegean Sea islands to the east, and Sicily and southern Italy to the west. The most important and powerful of these were Sparta, a highly regimented city-state (polis) with a mixed political system and an invincible army, and Athens, a democratic polis with the largest population and navy in all of Hellas.

 

As the Persian Empire expanded westward into Asia Minor (current day Turkey), it came to dominate a number of Greek city-states on its western coast and on the islands in eastern Aegean Sea. In 499 BCE, this domination became intolerable to some city-states and they rebelled, calling on other Greeks for assistance. Athens responded and provided support. Though the revolt was suppressed, King Darius of Persia never forgave the Athenians for their audacity in challenging him. Legend has it that at dinner he ordered a slave to say three times: “Master, remember the Athenians.”

 

Persia had launched two earlier expeditions, neither of which brought success. The first, in 492 BCE, proved disastrous. The second, in 490 BCE, ended in a stunning victory for the Greeks, led by Athens, at the Battle of Marathon. (Our current day marathon is 26.2 miles because this was the distance that the messenger, Pheidippides, ran from the battle site of Marathon to Athens to announce the victory.)

 

In 480 BCE, Persia, now led by Xerxes, renewed its campaign with overwhelming force. The ancient historian Herodotus indicated that 300,000 Persian allied forces crossed the Hellespont into northern Greece and faced Greek forces perhaps one-third that size. In his play The Persians, the Greek playwright Aeschylus, who fought in the battle, indicated that the Greeks had 310 ships facing a Persian allied fleet of 1207 ships. 

 

After defeating the Greeks, led by Leonidas and 300 valiant Spartans, at the Battle of Thermopylae, the Persian force marched south to Athens, now essentially evacuated, and sacked it. Most of the Athenians and other unconquered Greeks had withdrawn to the island of Salamis or manned the Greek fighting ships, the triremes. 

 

While the Spartans argued for withdrawal and the defense of the Peloponnesian Peninsula, the Athenian leader Themistocles won the debate on strategy. His plan for defeating the Persian navy was simple: lure the large Persian fleet northward into the narrow strait by feigning withdrawal, neutralizing its superior numbers, and then attack.

 

To set the hook, he arranged for a slave, Sicinnus, to give the Persians false information: The Greeks were squabbling and were in disarray. They planned to withdraw the next day. Eager for victory, Xerxes took the bait. 

 

On September 29, 480 BCE, the Persian fleet—its rowers already at their oars for 12 hours—advanced into the trap. In his play, Aeschylus relates the action at dawn:

 

 “…first there came from the Greeks the sound of cheerful singing, and the island rocks loudly echoed it. Fear struck all the Persians who had been disappointed in their hopes. For the Greeks were not singing their hymns like men running away, but like men confidently going into battle. The noise of the war-trumpet on their side inflamed them all.”

“It was possible too to hear shouting: ‘Sons of the Greeks, forward! Liberate your country, liberate your children, your wives and the temples of your gods, and the graves of your ancestors. The fight is for everything.’” 

He also paints the picture of the utter defeat of the Persians. 

 

“The sea was full of wreckage and blood. The beaches and the low rocks were covered in corpses. Every ship rowed in a disorderly rout, every one of the Persian fleet. … Wailing and shrieking covered the sea until dark night put an end to it. I could not finish telling you of the terrible happenings even if I were to relate them for ten days. Of the one thing you can be sure, never in one day did such a multitude of men die.”

Xerxes observed the action from the heights above the strait. Aeschylus envisioned his reaction to the disaster. 

 

“Deep were the groans of Xerxes when he saw this havoc; for his seat, a lofty mound commanding the wide sea, o’erlooked his hosts. With rueful cries he rent his royal robes, and through his troops embattled on the shore gave the signal for retreat.”

 

Salamis has come down to us as a key event in the early history of Western Civilization. If the Greeks had succumbed and come under the Persian “barbarian” yoke, ancient Greece probably would not have experienced its Golden Age in the 5th century BCE, with all its achievements: scientific inquiry into the natural world free from religion, philosophy, architecture, sculpture, mathematics, organized athletic competition, the realization of the world’s first democracy, and the enrichment of the idea of freedom.

 

Charles Freeman, in his book The Greek Achievement: The Foundation of the Western World, gives the Greeks due credit for the victory. However, he argues that it was the land Battle of Plataea, the following year, that was more decisive: “It had dislodged the Persian forces from Greece and sent them home in humiliation and so, possibly, had changed the course of European history.” This is true; however, without the decisive naval battle of Salamis there would have been no decisive land battle of Plataea.

 

The Greeks have been celebrating the anniversary of this battle this year, including a staging this summer of the play The Persians at the remarkable ancient amphitheater at Epidauros, which I was lucky enough to visit fifteen years ago.

 

Independent journalist John Psaropoulos witnessed the play and noted that the audience erupted in applause when the Persian queen Atossa asked of the Greeks, “Who is their master and commander of their armies?” The chorus leader answered: “They call themselves nobody’s slaves, nor do they obey any man.” 

 

Contributing editor Fred Zilian (zilianblog.com; Twitter: @FredZilian) teaches Western Civilization and politics at Salve Regina University, RI.

Winners, All: A Personal History of Soldiers at War


According to multiple, increasingly unimpeachable reports, President Donald Trump disparaged those who served in the military, including the wounded and the dead who defended the United States. He called them “suckers and losers.” He indicated they were fools to have served their country.

 

When I went to Vietnam in 1966 as bureau chief for NBC News, I thought my assignment would be straightforward: run the bureau, decide which stories correspondents and cameramen would cover, ship those stories on time, be the best I could be in competition with CBS and ABC, and keep an accurate set of books. My bosses in New York asked nothing less from me than to cover a war that consumed America. I had a rotating staff of Americans, Vietnamese, Koreans, Japanese, French, British and Germans.

 

Sometimes members of my staff got hurt, suffered minor wounds, or came down with illnesses that put them in the hospital for care and recovery. That meant I put on yet another hat, that of visitor to the American Army 3rd Field Hospital at Tan Son Nhut Airbase just northwest of Saigon.

 

Because of the generosity of the American military, and the respect given to NBC News, when someone on my staff got ill or hurt, I was able to secure for him a bed at 3rd Field. When I had someone in the hospital I always visited him, sometimes three times a week. On those visits I learned more about our fighting men than from anything I saw in combat. With as many as 1200 beds to serve Army, Marine, Navy and Air Force wounded in combat, 3rd Field was a microcosm of the men who fought and suffered in the war.

 

It was there that I got my real baptism of fire in the war. It was there that I saw the results of a war, mostly in the horrific wounds that we did not often report because we never got around to it properly. It was there that I learned about the unflagging spirit of young men who would never be the same because of their serious wounds. The first time came after visiting one of my staff who was recovering from an attack of malaria. As I was leaving, a nurse approached me and said, “We are really shorthanded today. Can you help us feed the men?” I wanted to say no but I could not refuse her request and said yes. There it was, a new role. I had become a volunteer helping the staff with their duties. It would become something I did after every visit to the hospital. Here is an excerpt of that first experience, as edited, from my oral history, “The Soldiers’ Story.”

 

“As we spoke, doctors, medics, and nurses were on the incoming ramp outside the hospital, receiving a large number of severely wounded men who had just arrived from an ambush near the Cambodian border. True, as usual, the hospital did not have enough staff. The recently heavy influx of wounded demanded the staff’s full attention. It needed help. I suddenly became part of what the hospital needed. 

 

"It was lunchtime, the hour to feed the men on the ward. A medic brought me a kitchen door. He gave me a small trolley loaded with food, and an apron and handed me a list of names to go with each tray. He then started me down a wide aisle with a long row of beds on each side. It felt like the inside of a World War II Hollywood movie—only this was real. One row of beds ran along the outside wall, which had large windows with white adhesive tape in crisscross patterns to prevent flying glass if bombs or rockets hit the building. The other row lined up against the inside wall, with a seriously wounded man in each bed. I planned to open their tray table, swing it up, around, and over their prone bodies, hand them the tray, and walk away. That proved unrealistic and impossible. Some of these men had no hands, no arms, no legs. They had so many serious wounds; they could not eat without help. It was the middle of 1967. I had been in Vietnam more than a year, and I had seen my share of horror. But being in the presence of so many wounded in one place was very difficult. As I marched down the aisle distributing trays of food, I saw that I had to feed many of the men. Some were patient; others were not. One man, more a boy of less than twenty, his body swathed in white bandages, lay unmoving. But his eyes were bright—they burned with life’s fire. And he could talk.

 

“Hey, man, over here. Don’t ignore me!”

 

"I stopped and turned to look at him. There seemed to be so little of him left, but he was still alive. Here was a young man who had held out for life when faced with almost certain death. The futility surrounding his future would come much later in his recovery. Now he was in charge, and he demanded service.

 

“Get that food over here. I’m hungry. I want to eat. Feed me.”

 

I moved over to him, unwrapping the tray as I approached his bedside. Wrapped in bandages and a plaster cast from his head to his toes, he resembled a mummy from a 1930s film. There were two black holes for his eyes, two black holes for his nostrils. His mouth was a larger black hole in his white-bandaged head. So I fed him. One spoonful at a time. Spoon by spoon. Slowly.

 

“More,” he said.

 

“Faster,” he said.

 

He demanded attention, and I readily complied. Then his tray was empty. There was no more food. His glass of water was empty. He could suck nothing more through his straw. There was nothing more for him to drink.

 

“Good, man,” he said.

 

He sighed deeply and was quiet. I moved away and distributed the rest of my trays. This was gut-real. War is mostly what is in front of you at the moment. War for me then was the seemingly hopeless situation of that blond-haired youth. But he was not helpless. I learned that, though badly wounded, these young men’s individual spirits were strong, and that they had an enormous gusto for life."

 

These wounded men, almost all young, did not make the war they signed up for. After seeing combat, many did not want to be in Vietnam any longer than they had to be. Many complained, but they stayed the course and finished their enlistments or the terms of their draft. The wounded men I saw at 3rd Field Hospital and talked to week after week, year after year, were something special. Most had an enormous spirit and a gift for life unlike anything I had ever seen. No matter how seriously wounded, they belong in every parade. They should never be out of sight. We must never forget who they were, and who they now are. Whether or not they knew it at the time, they were the spirit of America. Today they still are. Winners all, they were not then suckers, losers, fools or mugs.

Paris, City of Dreams: Napoleon III, Baron Haussmann and the Creation of Paris

Avenue de l'Opera, Camille Pissarro, 1898


It has been a long, long wait.  After years of dreaming and months of planning, your first trip, or perhaps an equally anticipated return trip, to Paris fell apart due to the global outbreak of COVID-19.  When you cancelled your airline tickets and hotel, the chance to walk the cobblestone streets of the “City of Light” vanished with a sigh and considerable disappointment.  Yet you are strong and remain hopeful for the future.  Imagine: a year and some months have passed, and a vaccine has allowed the world to rediscover its passion for international travel.  Emerging from the underground Metro near the Eiffel Tower, you contemplate your good fortune and brim with excitement.  You are not only alive…but you can now fully live again.

On an elevator to the top of the “Iron Lady” – a popular nickname for the Eiffel Tower – you glance at the mesmerizing views of the city and become aware of the diversity of your companions.  Surrounded by people from Brazil, the Netherlands, Saudi Arabia, Senegal, South Korea, Spain, Taiwan, Thailand, Venezuela and elsewhere, it seems the entire planet has convened in Paris to celebrate the end of the COVID-19 era.  After purchasing a delicious made-to-order crepe from an outdoor kiosk near the Louvre, you take a short stroll and wait in line for twenty minutes to sip one of the finest hot chocolates in Europe at Angelina – a world-famous chocolatier and tea house established in 1903 on the Rue de Rivoli.

While gazing at the chic Parisian elite at Angelina, two questions come to mind: 1) When was modern Paris constructed? and 2) Who was responsible for planning the streets and developing the distinctive white apartment buildings that define this stunningly beautiful city?  In her new monograph, Paris, City of Dreams: Napoleon III, Baron Haussmann and the Creation of Paris (2020), the historian of France Mary McAuliffe delivers a highly engaging and enjoyable narrative on how the “City of Light” achieved its striking architectural and structural character – a book recommended for anyone visiting Paris.

Visions of Grandeur: A New Nation, A New Paris

In early 1848, waves of revolt and revolution swept through the German states, the Italian states, the Austrian Empire, Denmark, France and elsewhere across Europe.  Fossilized monarchical regimes, some of them centuries old, were besieged with demands for popular representation in government.  On 10 December, the newly proclaimed Second Republic witnessed the election of Louis Napoleon – the nephew of Napoleon Bonaparte – to the presidency.  Rather than a majestic metropolis, the Paris of the day was defined by contaminated water, dirt, suffocatingly narrow streets and dilapidated, overcrowded housing teeming with restless citizens.  At the outset of his term, the French president developed a strategic vision to transform the capital into a worthy symbol of the nation by recreating the city to enhance the beauty and the lives of its proud inhabitants.  Near the conclusion of his constitutionally mandated single term in office, Louis Napoleon orchestrated a coup d’état in 1851, ending the Second Republic, and soon after seized power as Emperor Napoleon III.  As McAuliffe deftly notes, the president-turned-monarch began the recreation of Paris one year later by partnering with two Jewish brothers, Emile and Isaac Pereire, and their banking house – Crédit Mobilier – a rival to the financial empire of the Rothschilds.  In the Pereires, Louis Napoleon secured the requisite means to underwrite the reconstruction of Paris through bond issues to raise capital.

To accomplish the vast undertaking of revamping Paris and lifting more than 600,000 of its residents out of squalor (out of a population of one million), Louis Napoleon selected the Prefect of the Gironde, based in Bordeaux – Georges-Eugène Haussmann.  Louis Napoleon and Haussmann fostered a complementary working relationship and set about bringing to fruition the urban dream Napoleon Bonaparte had once famously proclaimed: “I intend to make Paris the most beautiful capital in the world.” (p.51) From chapters three to thirteen, McAuliffe follows the three “systems” of development unleashed by Haussmann.  The first phase or “system,” which focused on the nucleus of Paris, commenced with the elongation of the Rue de Rivoli and continued with a significant reconfiguration of the Latin Quarter, the design of an exquisite park (the Bois de Boulogne) and the establishment of large markets at Les Halles.  In his attempt to add kilometers to the Rue de Rivoli, Haussmann saw progress halt due to the disparate, steep grades of the streets.  He remained undeterred and promptly overcame the conundrum by elevating “most of the surrounding neighborhoods” to the same level with large-scale engineering tactics.  In 1858, Haussmann launched his second system – a vast and bold undertaking to sweep away considerable portions of the city and turn his urban vision of grandeur into a quotidian reality for Parisians of the nineteenth century and beyond.  On both the Right Bank (the northern side of the Seine River) and the Left Bank (the southern side of the Seine), Haussmann lengthened and widened streets, demolished old buildings, installed new sewage and water systems, remade the Île de la Cité – the small island in the Seine containing Notre-Dame cathedral (est. 1260) – planted new trees and added water fountains around the Avenue des Champs-Élysées and redrew the boundaries of the arrondissements – the districts of Paris.  Most significantly, Louis Napoleon’s grand architect redefined Paris by constructing a plethora of simple yet elegant, off-white apartment buildings that still stand and exude the romance of the city today.

 

Apartment Block on Boulevard Haussmann

Despite a public backlash against the steep cost of the tripartite project by the time of the Third System in 1861-62, Haussmann doubled the size of Paris, increased its population, engineered a far more livable city for its residents and ultimately won over many of his critics with his aesthetically inspiring designs.  For a number of artists and intellectuals, however, Haussmann symbolized empty notions of progress at the expense of Parisian communities with ties to the ancient past. (p. 116-171)

New Artists in Old Paris

Through each chapter, McAuliffe intersperses the architectural remaking of Paris with biographical pastiches of a new generation of artists and literary savants.  The novelist Victor Hugo, who had sided with the forces of order against the masses during the large-scale revolt by workers in June 1848, changed course upon the brazen consolidation of power by Louis Napoleon in a successful coup.  Beyond launching a “small resistance committee,” Hugo picked up his talented quill, issued a virulent broadside and accused the duplicitous monarch of being someone who “lies as other men breathe.” (p.10-14, 36-37) Beneath the façade of the newly remade, bourgeois Paris, Hugo published the first segment of a literary masterpiece on the lives of the destitute and the working poor – the largest segment of the Parisian population – on 3 April 1862.  By the end of June, copies of Les Misérables had sold out at booksellers across the city. (p.166-171) His tale of the struggles of Parisians to survive on a day-to-day basis offered a riveting and eloquent contrast to the Paris of Louis Napoleon, Haussmann and the elites.

In reaction to the triumph of Louis Napoleon and his authoritarian regime, Hugo, the novelist George Sand, several emerging artists, including Édouard Manet, Claude Monet and Berthe Morisot, and a coterie of young intellectuals gathered in Montmartre (an area largely untouched by Haussmann) and on the Left Bank near the Sorbonne and produced a vibrant, countercultural alternative to the new order defined by kleptocratic power and crass materialism.  Indeed, their brilliant, soul-fulfilling work continues to flourish in academe, on theatrical stages and in the art world today.

Conclusion 

In Paris, City of Dreams: Napoleon III, Baron Haussmann and the Creation of Paris (2020), Mary McAuliffe has written a superb historical synthesis of scholarship on the period and thus provides a near-perfect introduction to the “City of Light” in the mid-nineteenth century.  Future visitors to Paris will also profit from reading two previously published books by the author – Dawn of the Belle Époque: The Paris of Monet, Zola, Bernhardt, Eiffel, Debussy, Clemenceau, and Their Friends (2014) and Twilight of the Belle Époque: The Paris of Picasso, Stravinsky, Proust, Renault, Marie Curie, Gertrude Stein, and Their Friends through the Great War (2017) – to fully explore the evolution of Paris and of French society and culture from 1848 to 1914.

It has been a long, long wait.  Hopefully, the world will receive a viable vaccine against COVID-19 in the coming months and Paris – a true “City of Dreams” – will once again become a reality for millions of excited, historically curious travelers.

Like Lincoln, Biden at Gettysburg Urges Reunification


On October 6, Joe Biden gave a 22-minute speech near the famous battlefield of Gettysburg, Pennsylvania. He began it succinctly: “On July 4, 1863, America woke to the remains of perhaps the most consequential battle ever fought on American soil. It took place here on this ground in Gettysburg. Three days of violence, three days of carnage. 50,000 casualties wounded, captured, missing or dead. Over three days of fighting.” In November 1863, President Lincoln came to the battlefield to deliver the Gettysburg Address, which historian James McPherson called “the most famous speech in American history . . . only 272 words in length and took two minutes to deliver,” short enough to be reproduced on the walls of D.C.’s Lincoln Memorial.

On his website Biden displayed the necessary humility, referring to his own speech as only “remarks,” not suggesting that they rose to the level of Lincoln’s Address. About the latter Biden said, “His words here would live ever after. We hear them in our heads, we know them in our hearts, we draw on them when we seek hope in the hours of darkness.” And yet, even though Biden’s “remarks” did not match the oratorical greatness of Lincoln’s Address, they were significant--and timely. 

Timely because Biden put himself forward, as he has consistently done this year, as the leader best equipped to unite our fractured nation. Of the many problems facing us, many exacerbated by President Trump, the extreme division separating the Trump supporters from the rest of us is certainly central. 

More than any of the other Democratic candidates earlier in 2020, Biden stressed the need to heal our extreme and festering political divisions--sometimes, as occurred already in 2019, even to the point of angering other Democrats for being too compromising. The proper balance between political passion, tolerance, and compromise is certainly difficult to strike. But if Biden is correct that this divisiveness (and sometimes even hatred) is a central danger to our nation, then it could be argued, as I have done, that more than anyone else, “Biden has a better chance of unifying our nation and delivering positive long-range results.”

In his Gettysburg speech, alluding to Lincoln’s House Divided Speech of 1858, Biden stated that “once again, we are a house divided. But that, my friends, can no longer be.” He warned of our shipwrecked state being “on the shoals of anger and hate and division.” 

Again citing Lincoln’s words, this time his Second Inaugural--“With malice toward none, with charity for all, with firmness in the right as God gives us to see the right, let us strive on to finish the work we are in, to bind up the nation’s wounds”--he pledged to “work with Democrats and Republicans,” to “work as hard for those who don’t support me as for those who do.” For our times of bitter rancor, he offered the balm of trying to “revive a spirit of bipartisanship in this country, a spirit of being able to work with one another.” (For lists of the large numbers of Republicans already opposing Trump and supporting Biden, including many conservative columnists, see here and here.)

Although Biden did not mention Barack Obama, the leader and friend he worked so closely with for eight years, his remarks also reflected the spirit of the former president. A spirit demonstrated in his keynote address at the 2004 Democratic National Convention, when he was still an Illinois state senator, in which he called for overcoming Red-state-Blue-state divisions, for overcoming “those who are preparing to divide us.” A spirit also demonstrated frequently as president, for example during his 2010 commencement address to University of Michigan graduates, when he told them, “We can't expect to solve our problems if all we do is tear each other down. You can disagree with a certain policy without demonizing the person who espouses it.” 

Unfortunately, however, this pragmatic president, temperamentally so well equipped to work with Republicans to achieve the common good, discovered little reciprocity from the likes of John Boehner and Mitch McConnell.

 

After Donald Trump succeeded Obama, matters got worse, in large part due to Trump’s belligerent style, so amply demonstrated in his first debate with Joe Biden. It is ironic that many conservatives support Trump, yet Biden seems to grasp far better than he does the truth of the words of one of the fathers of U.S. conservatism, Russell Kirk (1918-1994): “The prudential politician . . . is well aware that the primary purpose of the state is to keep the peace. This can be achieved only by maintaining a tolerable balance among great interests in society. Parties, interests, and social classes and groups must arrive at compromises, if bowie-knives are to be kept from throats.”

 

Proceeding further in his speech, Biden linked many of our other most pressing problems to our national divisiveness, to our extreme partisanship. One of these problems is racial injustice, “the product of a history that goes back 400 years, to the moment when black men, women, and children were first brought here in chains.” Recalling recent “peaceful protests giving voice to the calls for justice,” Biden also mentioned “examples of violence and looting and burning that cannot be tolerated.” But unlike President Trump, who stresses only law and order and not racial justice, the former vice president stated that “we can have both,” and that our country needs “leadership that seeks to deescalate tensions, to open lines of communication, and to bring us together.”

He also linked the over 200,000 coronavirus deaths we have suffered to “the deep divisions in this country.” Wearing a mask, social distancing, testing, and developing a vaccine should “follow the science,” he said, and not be politicized. Echoing the rhythm of Obama’s 2004 Democratic Convention keynote address, Biden added, “The pandemic is not a red state versus blue state issue. The virus doesn’t care where you live or what political party you belong to.”

Finally, Biden targeted “the divisions in our economic life that give opportunity only to the privileged few. America has to be about mobility,” the type that enabled Lincoln, a child of the frontier, to “rise to our highest office.”

Throughout Biden’s speech a can-do, optimistic spirit prevails. It emulates not only Lincoln’s words, but also those of Franklin Roosevelt and Obama.

In his first inaugural address (1933), coming near the height of the Great Depression, FDR said, “This great Nation will endure as it has endured, will revive and will prosper. So, first of all, let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.” 

Similarly, in his first inaugural address (2009) in the latter stages of the Great Recession, Obama spoke of being in the midst of crises that included a “badly weakened” economy, lost homes, “jobs shed, businesses shuttered,” costly health care,  energy policies that “threaten our planet,” a “sapping of confidence across our land,” and “a nagging fear that America's decline is inevitable, that the next generation must lower its sights.” But Obama assured the nation that these challenges “will be met,” that our nation will choose “hope over fear, unity of purpose over conflict and discord.”  

More than a decade later with our inner political conflict and discord worsened by eight years of Trumpism, Biden at Gettysburg urged us to “talk to one another,” to  “respect one another,” to “love each other.” He promised to be a president that would “embrace hope, not fear. Peace, not violence. Generosity, not greed. Light, not darkness.” A president that followed the example of “Lincoln and Harriet Tubman and Frederick Douglass,” that represented an America that “welcomed immigrants from distant shores,” and broadened opportunities for women, minorities, and gays. A president that embraced “the dreams of a brighter, better, future.”

Near the end of his speech Biden once again echoed the spirit of Obama’s 2004  keynote address at the Democratic National Convention. “We can,” said Biden, “end this era of division. We can end the hate and the fear. We can be what we are at our best: the United States of America.”

My Wish for Trump

Steve Hochstadt is a writer and an emeritus professor of history at Illinois College.


Trump has COVID. What do I wish for him?

 

That question provokes a variety of verbal contortions. Official Democrats wish him and Melania well. The conservative Ross Douthat plays defense, blowing up a bit of evidence into an assumption that any other President would have made the same early mistakes. The more liberal Nicholas Kristof wants to abstain from offense, saying the main thing to do now is to avoid snark. David Barash at Daily Kos is refreshingly brutal. He won’t wish Trump well, and he’s right in everything he says. I’m sure there will be many more efforts to publicly acknowledge the emotional, moral, and political battle between our better and worse angels.

 

We have been schooled to believe we should always wish the best to everyone, even to a man who epitomizes hatred toward his opponents. When Hillary had pneumonia, Trump mocked her. Our President in a time of plague is the greatest source of public misinformation about it, say Cornell University researchers who studied 38 million articles about the pandemic. If anyone deserved to get coronavirus, it’s Donald Trump. But let’s still play nice.

 

I won’t play nice, but I don’t want Trump to die, or even become deathly ill. That would not just be bad for him, but bad for my wishes for our national future. I want Trump to get well, to live many years beyond the end of his Presidency.

 

The last thing we need is for the Republicans to be able to validate his pose as the ultimate victim, so they can transform Trump into a martyr, even if it is to his own stupidity.

 

I am gleeful at the prospect of dozens of tell-some books by those who were present for Trump’s outrageous behavior, who heard what he said. Publishers will dangle millions of dollars in front of people who have thus far demonstrated little spine or conscience. Some of what they say will stick to his image like obscene Post-Its.

 

I look forward to countless court cases around Trump, a later life spent defending his whole life thus far. Eventually the accumulation of evidence and judgments will prove to any reasonable person that he was and is a crook, a fraud, a failure in everything but inherited privilege. I recognize how many unreasonable people there are in America, who could never be convinced of any truth about the object of their idolatry. Their fantasies will disappear into the dustbin of history, then reappear in some other guise as another generation of deluded souls gets taken in by the latest con. But some of the Americans who were duped by the greatest con man of our lives will eventually realize that they had no idea what was really going on. The history books will paint a damning portrait of Trump.

 

I look forward to the Republican Party explaining how it became Trump’s slave and where it is going now. The Never Again Trumpers still have a lot of squirming to do about their role in creating such low-hanging fruit for their most dangerous adherents. We all need to confront our participation in systemic racism, but most of all the systematically racist Republican Party. That could bring us a little closer to a just society.

 

That could only happen if the ultimately privileged Mr. Trump has a mild case of this disease, and recovers quickly enough to continue his reign of terror on the country of his birth for just a few more months. Then I look forward to the crash.

 

I want Trump to get better, but I don’t wish him well.

 

Steve Hochstadt

Jacksonville IL

October 6, 2020

Trump's Opportunities to Thwart Democracy


Americans keep asking, “Can it possibly get worse?” and each day we discover that it can. During the first presidential debate, Donald Trump refused to commit to a peaceful transfer of authority if he loses in November, arguing that he did not have to abide by the results of a “rigged election.” This can no longer be dismissed as Trump being Trump. This is a threatened coup.

 

Five times the United States faced similar crises, and in four cases the “losing” candidate and his party accepted the decision, placing nation over political power. However, one case led to civil war, and in at least two of the cases the nation suffered from the result.

 

In 1800, Thomas Jefferson was elected President as a result of the 3/5 clause giving added electoral votes to Southern slaveholding states. President John Adams, who was seeking reelection, accepted the result. Adams’s decision not to contest the election led to the first political transition in the United States and established a precedent the country has largely followed for over two hundred years. The other problem in 1800 was that, as the Constitution was originally written, the leading candidate became President and the second-place candidate became Vice-President. Jefferson and his Vice-Presidential candidate Aaron Burr had the same Electoral College vote total, so the outcome had to be sorted out in the House of Representatives, which chose Jefferson over Burr. This fiasco led to passage of the 12th Amendment to the Constitution, which established that Presidential and Vice-Presidential candidates be designated in advance.

 

The 12th Amendment also established that if the election is thrown into the House of Representatives, each state delegation polls its members and casts a single vote. If the 2020 election ends up in the House of Representatives, California’s 53 members representing 39.5 million people will have the same voting power as Wyoming’s single member representing a little over 500,000 people. Some commentators speculate that getting the election thrown into the House may be part of Trump’s reelection strategy because although the Democrats will likely have a clear majority of members, Republicans may control a majority of the state delegations.

 

In 1824, four candidates split the electoral vote, so none received the majority needed for election. Although Andrew Jackson had the largest electoral and popular vote totals, the House of Representatives, under the provisions of the 12th Amendment, selected the runner-up, John Quincy Adams. Adams won because the 4th-place candidate, Henry Clay, who was no longer eligible, disliked Jackson and threw his support to Adams. Jackson’s supporters in the emergent Democratic Party later swept the 1828, 1832, and 1836 elections.

 

But in 1860, crisis and collapse could no longer be avoided. Abraham Lincoln, the candidate of the Republican Party, was a regional candidate who did not even appear on the ballot in ten Southern states. Lincoln secured under 40% of the popular vote but 60% of the electoral vote because the Democratic Party was split and also nominated regional candidates. In response to Lincoln’s election, eleven Southern states, anxious to protect slavery, tried to secede from the federal union and plunged the United States into civil war.

 

The 1876 election result was corrupted when competing slates claimed victory in three Southern states where former Confederates were trying to regain local control and to throw out Reconstruction governments committed to protecting the rights of formerly enslaved African Americans. Although the Democratic Party candidate Samuel Tilden appeared to have strong majorities in both the popular and electoral vote, a special committee appointed by Congress with seven Democrats and eight Republicans awarded all the disputed electoral votes and the Presidency to Republican candidate Rutherford Hayes. In exchange for Democratic Party acquiescence, Republicans agreed to end post-Civil War Reconstruction, effectively abandoning Southern Blacks to Jim Crow white-controlled governments and laws for the next 100 years. 

 

In 2000, a Republican majority on the Supreme Court voted 5-4 to block a recount in Florida, making George W. Bush President. In his concession speech, Democratic Party candidate Al Gore simply said, “Let there be no doubt, while I strongly disagree with the court’s decision, I accept it . . . for the sake of our unity as a people and the strength of our democracy, I offer my concession.” The 2000 decision may serve as a precedent in 2020 for the Supreme Court, which will have a Republican majority, to decide the outcome of the election.

 

The Trump campaign’s unsubstantiated challenge to the legitimacy of mail-in ballots is an attempt to throw the election into the courts if he loses. A rightwing Republican majority on the Supreme Court could then support his claim and allow state governments controlled by Republicans to throw out disputed ballots or delay the results past the December 14, 2020 Electoral College deadline so small-state Republicans can throw the election to Trump in the House of Representatives. In either case, it would constitute a betrayal of democracy and an electoral coup.

 

This would set the United States up for four more years of Trump’s and the Republicans’ contempt for democracy and majority rule. What would this mean? There is a frightening parallel: in Germany in 1933, Adolf Hitler and the Nazi Party took control of the German Parliament and government as a minority party, and within eighteen months established a one-party dictatorship.

Life during Wartime 522

The Roundup Top Ten for October 9, 2020

The Plot Against Whitmer Won’t Be The Last White Supremacist Threat

by Kathleen Belew

I'm very concerned that more violence is imminent, and that these ideologies pose an imminent threat to our democracy and to people going about their everyday lives.

 

Yes, Mike Lee, America is a Democracy

by Jonathan Bernstein

Mike Lee's insistence that the US is "a republic" and not "a democracy" is a petty distinction that ignores the historically interchangeable usage of the terms in American politics in order to justify undemocratic rule by a minority party. 

 

 

The Overlooked Queer History of Medieval Christianity

by Roland Betancourt

An attentive reading of the record shows that same-sex intimacy, gender fluidity, and diverse sexual identities were prevalent among early Christians, contrary to the claims made by some fundamentalists today that these represent deviations from historical norms. 

 

 

Why Heller is Such Bad History

by Noah Shusterman

Antonin Scalia's opinion in District of Columbia v. Heller ignored the actual history of the early American militia in order to invent an individual right to gun ownership.

 

 

What White Power Supporters Hear Trump Saying

by Alexander Hinton

Donald Trump's attacks on "political correctness" aren't calls for intellectual openness or academic freedom; they are coded messages invoking white grievance politics, including the longstanding idea that multiculturalism is part of a genocidal attack on the white race.

 

 

The Root of American Power

by Megan Beyer

"October is National Arts and Humanities Month. Observing what happens in America when we fail to protect them, invest in them, and recognize their value, is the best case that could ever be made for the Arts and Humanities."

 

 

A Brief History of the Taxpayer in Chief

by Margaret O'Mara

The revelation, at the height of the Watergate investigation, that Richard Nixon had abused deductions to avoid nearly all of his tax obligations initiated modern interest in presidential candidates' tax returns. 

 

 

Trump's Call for Freelance Poll-Watchers Summons a Dark History

by Nicole Hemmer

In 1981, the Republican National Committee used threatening signs and deployed off-duty officers to polling places in Black and Latino neighborhoods to help win the New Jersey governorship. This is the first presidential election year since the resulting consent decree against such tactics expired, making Trump's call for supporters to "watch the polls" ominous. 

 

 

Trump’s Attacks on Refugees Expose the Inadequacy of the Current System

by Carl J. Bon Tempo

The Refugee Act of 1980 is the law allowing the President to set an annual ceiling for refugee admissions to the United States, and is in urgent need of revision by Congress. 

 

 

Coronavirus Can Afflict the Powerful. Yet Food Workers Remain the Most Vulnerable.

by Angela Stuesse

The rollback of workplace protections under a generation of conservative state and federal administrations has made low-wage service workers acutely vulnerable to COVID. 

 
