To His Followers, Trump is a Folk Hero

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

 

There is nothing new in trying to figure out Trump. His appeal and his personality have been the subject of countless analyses and speculations since long before he ran for President. Yet the mysteries continue. Why do people like him? Why does he act so badly?

 

Charles Blow of the New York Times produced a thoughtful explanation of Trump’s popular appeal a couple of weeks ago in an opinion column entitled “Trumpism Extols Its Folk Hero”. Blow believes that Trump has become a “folk hero”, that rare person who “fights the establishment, often in devious, destructive and even deadly ways,” while “those outside that establishment cheer as the folk hero brings the beast to its knees.” Because the folk hero engages in the risky David vs. Goliath struggle against the awesomely powerful “establishment”, his personal sins are forgiven: “his lying, corruption, sexism and grift not only do no damage, they add to his legend.”

 

Thus the persistent belief among Trump’s critics that exposing his manifest dishonesty will finally awaken his base to reality is mistaken. His ability to get away with every possible form of cheating is part of his appeal, because he is cheating the establishment, the elite, the “deep state”, the “them” that is not “us”.

 

For his fans, the Mueller report is only the latest example of this extraordinary success. Despite years of investigation, Trump skates. It’s not important whether he was exonerated or not. What matters is that he can claim he was exonerated and go on being President, no matter what the report says, no matter what he actually did.

 

Wikipedia provides a list of folk heroes, every one a familiar name, including Johnny Appleseed, Daniel Boone, Geronimo and Sitting Bull, Nathan Hale and Paul Revere, all people who really were heroic. The key early elements of the Robin Hood folklore, developed hundreds of years ago, are that he fought against the government, personified in the Sheriff of Nottingham, and that he was a commoner, who gave his ill-gotten gains to the poor.

 

That is one way to become a folk hero, but not the only one. Neither politics nor morality determine whether someone can become a folk hero. Wikipedia also tells us that the “sole salient characteristic” of the folk hero is “the imprinting of his or her name, personality and deeds in the popular consciousness of a people.” It would be hard to find anyone who has done a better job of doing just that for decades than Trump.

 

Villainy unalloyed by any goodness has also propelled many people, almost all men, into the ranks of folk heroes, like Jesse James, Butch Cassidy, and Bonnie and Clyde. These criminals captured the popular imagination, not despite being bad, but because of it. They were great in their villainy, outlaws in both the legal and social sense, stealing other people’s money for their own benefit, but that does not detract from their appeal. 

 

Enough people love bad boys that they can achieve legendary status, or, even more rarefied, a TV series. The popularity of series with villains as heroes demonstrates the broad appeal of bad people. “Breaking Bad” attracted giant audiences and was honored by Guinness World Records as the most critically acclaimed show of all time.

 

Since he first came into the public eye, Trump has reveled in being the bad boy. He grabbed women at beauty contests and bragged about it. He delights in his own running racist commentary on people who are not white. He lies when he knows he’ll get caught, and then keeps repeating it. He celebrates himself in his chosen role as the bad guy. Meanness was at the heart of his role in “The Apprentice”, where his greatest moments were saying “You’re fired!”

 

One writer recently asked, “Why does Trump fall in love with bad men?” Trump says nicer things about the world’s most notorious political thugs than would be normal for speaking about the leaders of our closest allies. After meeting North Korea’s Kim Jong Un, Trump told a rally, “Then we fell in love, okay. No, really. He wrote me beautiful letters. And they’re great letters. We fell in love.” Trump met President Rodrigo Duterte of the Philippines in November. The White House said they had a “warm rapport” and a “very friendly conversation” on the phone. Trump said, “We’ve had a great relationship.” Duterte sang the Philippine love ballad “Ikaw” to Trump at a gala dinner.

 

The prize goes to Trump’s open admiration for Vladimir Putin. During his campaign, Trump said he had met Putin and “he was nice”. He later said, “I never met Putin. I don’t know who Putin is. He said one nice thing about me. He said I’m a genius.” Putin never said that, but for Trump that made Putin “smart”. He claimed a “chemistry” with Putin. Here’s what Trump cares about: “He says great things about me, I’m going to say great things about him.”

 

Trump’s attraction to this international rogues’ gallery is personal and emotional. He wants the exclusive club of dictators, macho men, tough guys, to love him and to accept him as one of them. Donald Trump’s foreign policy is his attempt to become the leader of the bad boys of the world.

 

But at the heart of the connection between bad boy folk hero Trump and his adulating base is a fundamental misunderstanding. Trump is not fighting the establishment. Trump is not using his powers to help his angry supporters. Trump is screwing them.

 

He attacks their health by eliminating rules which reduce corporate air and water pollution. He hasn’t stopped his repeated attempts to cut their health insurance by cutting Medicare, Medicaid, and Obamacare. He is dismantling the bank and lending regulations overseen by the Consumer Financial Protection Bureau. Nothing good will come to average Americans from the foreign members of Trump’s club. These are all assaults on the standard of living, present and future, of non-elite America.

 

The 2017 tax cuts are the best example of how Trump betrays his base. Poor and middle-income Americans got small tax cuts, but also inherit gigantic future deficits to pay for the enormous cuts in corporate and income taxes for the very wealthy.

 

Trump is good at what he does, but that is bad for everybody else, especially for those who cheer him on.

The Value of Rating Presidents

A screenshot from the latest C-SPAN Presidential Historians Survey.

 

The performance of modern presidents is continually addressed in myriad public opinion polls as they serve. Presidential leadership surveys are intended to take longer views of our chief executives, while also providing comparisons with their predecessors. These surveys can be seen as the progeny of historian Arthur M. Schlesinger, Sr., who made the initial foray into the field in 1948 for the then widely-popular Life magazine. As historian Douglas Brinkley notes in his introduction to C-SPAN’s 10th book, The Presidents, Schlesinger's survey, conducted soon after Franklin Roosevelt's impactful presidency and at the dawn of the broadcasting age, demonstrated "a tidal wave of interest in this new way of looking at all presidents simultaneously," judging by the "bags of colorful letters" Schlesinger received about his results.

 

C-SPAN followed in Dr. Schlesinger’s footsteps beginning in 2000 with our initial Historians Survey on Presidential Leadership, which ranked every president from best to worst. It was envisioned as a more formal academic follow-on to a year-long C-SPAN biographical television series, American Presidents: Life Portraits.

 

Just as Dr. Schlesinger's project had in 1948, C-SPAN's 2000 survey attracted much attention in the media and among historians, and so it continues. C-SPAN has now conducted three such surveys—in 2000, 2009, and 2017—with ninety-one historians participating in our most recent ranking. We ask presidential historians to rank individual performance across 10 qualities of leadership. Each survey is conducted just as a sitting president leaves office, with the aim of creating a first assessment for history. In 2000, Bill Clinton, only the second president to be impeached, nonetheless debuted with an overall rank of 21st among the then 42 men. In 2009, George W. Bush entered the field on the heels of a global financial crisis and an ongoing war; historians ranked him 36th overall. In 2017, Barack Obama's history-making presidency debuted in our survey in an impressive 12th place.

 

Our leadership qualities were developed with the guidance of three presidential historians: Douglas Brinkley of Rice University; Edna Greene Medford of Howard University; and Richard Norton Smith, biographer of Washington, Hoover, and Ford. Purdue University political scientist Robert X. Browning, executive director of the C-SPAN Archives, tabulates the survey results.

 

The 10 attributes are:

  • Public persuasion
  • Crisis leadership
  • Economic management
  • Moral authority
  • International relations
  • Administrative skills
  • Relations with Congress
  • Vision/setting an agenda
  • Pursued equal justice for all
  • Performance within the context of the times

 

C-SPAN's three surveys take their place among a few other contemporary rankings of presidents, such as the six-time Siena College historians poll. No matter who's conducting the survey, the top three places seem cemented among the pantheon of Lincoln, Washington, and FDR. For the other presidents each survey is but a snapshot in time, judgments rendered, to borrow a phrase from Donald Rumsfeld, based on "known knowns," but without the benefit of "unknown unknowns." Much has changed in the nearly two decades since C-SPAN ran its initial survey—three more presidencies; newly opened archival records; new history books written. Our base of historians has also changed with retirements, deaths, and new hires. Importantly, our society continues to transform, impacted by demographics, technology, and evolving sensibilities. These many factors make the changing assessments of the presidents fascinating to watch.

 

It's particularly interesting to see how presidents' rankings change after their debuts. Bill Clinton advanced six places between 2000 and 2009, to an overall rank of 15, where he remained in 2017, with his highest marks in "economic management" and "equal justice." In 2017, with the benefit of eight years' hindsight, George W. Bush edged out of the bottom 10 to an overall rank of 33, with his highest ranking (19) in "equal justice." Perhaps unsurprisingly, his lowest numbers are in "economic management" (36) and "international relations" (41). When our next survey is done following the Trump presidency, Bush 43's numbers will be worth keeping an eye on.

 

There are other notable changes: Andrew Jackson, for instance, held the 13th spot in our 2000 and 2009 surveys. By 2017, he had fallen five places, the most of any president. Jackson placed just 38th in "equal justice," driven largely by increasing disapproval of his policies toward Native Americans. Another interesting change is the ascension of Dwight Eisenhower into the top five. Ike has moved up four spaces since 2000—the most of any president in the top 10—as the "hidden hand" theory of Eisenhower's leadership gains wider acceptance.

 

Acknowledging that presidential lists are simple reductions of complicated realities, we're still pleased that they ignite popular interest in history. Presidential rankings play nicely into our national numbers obsession—they are quick, fun, and highly shareable on social media.  They foster lively debate about contemporary presidents and even help resurface some of our most obscure presidents; think Millard Fillmore (#37), or James Buchanan (dead last at #43).

 

Leadership surveys also serve a more substantive purpose. Our historians' rankings create useful data points for deeper analysis of a presidency, and those "snapshots in time" become lasting benchmarks against which to track evolving assessments of presidencies. For C-SPAN, these leadership rankings also add important context to our ongoing public affairs coverage. Our network's archives store nearly every significant public moment by American presidents since Ronald Reagan—literally thousands of hours of video. The historians' rankings, particularly the individual leadership categories, complement that video, providing valuable metrics for today's and future generations to assess presidential effectiveness.

 

In this highly partisan age of President Trump, people invariably ask us how he fares on the 10 leadership attributes. We won't know for sure until the end of his presidency when, once again, C-SPAN will ask historians to formally evaluate him in the context of his predecessors. You, however, can use the ten metrics to form your own judgment of his performance at this midway point of his first term.  And as campaign 2020 gets underway, the metrics also provide a solid starting point for voters to assess the leadership skills of the long lineup of individuals vying to replace him.

 

The History of the Meaning of Life

 

Life's but a walking shadow, a poor player
That struts and frets his hour upon the stage
And then is heard no more: it is a tale
Told by an idiot, full of sound and fury,
Signifying nothing.

 

Thus, William Shakespeare in Macbeth. As the French novelist Albert Camus said, life is “absurd,” without meaning.

This was not the opinion of folk in the Middle Ages.  A very nice young Christian and I have recently edited a history of atheism.  We had a devil of a job – to use a phrase – finding people to write on that period.  In the West, Christianity filled the gaps and gave life full meaning.  The claims about our Creator God and his son Jesus Christ, combined with the rituals and extended beliefs, especially about the Virgin Mary, meant that everyone’s life, from prince to pauper (to use a cliché), made good sense. The promises of happiness in the eternal hereafter were cherished and appreciated by everyone, and the expectations put on a godly person made for a functioning society.

Then, thanks to the three Rs, it all started to crumble. First the Renaissance, introducing people to the non-believers of the past. Even the great Plato and Aristotle had little place for a Creator God.  Then the Reformation tore into established beliefs such as the importance of the Virgin Mary. Worse, the religious schism suggested there was no one settled answer.  Finally, the (Scientific) Revolution showed that this planet of ours, Earth, is not the center of the universe but a mere speck in the whole infinite system. This system works according to unbreakable laws – no miracles – and God became, in the words of a distinguished historian, a “retired engineer.”

There was still the problem of organisms, whose intricate design surely had to mean something.  Blind law just leads to rust and decay.  And yet, organisms defy this.  If a clever technician set out to make an instrument for spotting things at great distances, the eye of the hawk is built exactly as one would predict. There had to be a reason.  As the telescope had a telescope designer, so the eye surely pointed to The Great Optician in the Sky. Along came Charles Darwin with his theory of evolution.  He showed, through his mechanism of natural selection – the survival of the fittest – how blind law could indeed explain the eye. Thanks to population pressures, there is an ongoing struggle for existence, or, more importantly, a struggle for reproduction.  Simply, those organisms with better proto-eyes did better in this struggle, and over time there was general improvement.  The hawks with keener sight had more babies!  They were “naturally selected.”

Darwin did not disprove God. When he wrote his great Origin of Species in 1859, he still believed in a deistic god, a god of unbroken law.  But he made it possible not to believe in God and to be, in the words of Richard Dawkins, an “intellectually fulfilled atheist.” More importantly, Darwin suggested that the deity is like the common perception of the God of Job, indifferent to our fate.  Thomas Hardy, novelist and poet, put it well.  He could have stood a God who hated him and caused untold misery. It was an indifferent God who crushed him.

Crass Casualty obstructs the sun and rain,
And dicing Time for gladness casts a moan. . . .
These purblind Doomsters had as readily strown
Blisses about my pilgrimage as pain.

Christianity continued, of course, in the post-Darwinian era.  There were and are many believers who embrace Darwinism, although, as is too well known, in America especially there is a vibrant evangelical branch that denies evolution and embraces a literal reading of Genesis.  But what of those of us like Hardy?  He and I were raised as Christians and – I think I can speak for him too – all our lives we have been in our ways deeply religious.  An indifferent or absent God does not mean we stop worrying about questions of meaning and purpose.  In fact, we probably worry more, not so much because we are scared but because, as humans, these are important issues for us.  Was Shakespeare right?  Is life no more than a tale told by an idiot, signifying nothing, full of sound and fury?

“The Lord gave, and the Lord hath taken away; blessed be the name of the Lord” (Job 1:21).  Darwin’s theory of evolution has done its fair share of taking away.  Can it also do some giving?  Many evolutionists think it can. In the 150-plus years since the Origin appeared, we find that people who worry about these sorts of things – and there are many who do – embrace one of two approaches to finding an evolution-influenced meaning to life.

First, there are those who might fairly be called the “objectivists.”  Just as Christians think there is an external reality that rules our lives, so these Darwinians – although they have no promises of an eternity – believe that their theory imposes order and meaning on our lives.  This lies in the progressive nature of the evolutionary process – from blob to human, or (as they used to say in the past) from monad to man.  Evolution is not a slow meandering process going nowhere.  Despite setbacks it is on an upwards trajectory, ending in our species.  Order and meaning come out of this.  It is our duty to aid the evolutionary process and help it forward, or at least not to drop backwards.  

Today’s most enthusiastic evolutionary progressionist is the Harvard ant specialist and sociobiologist, Edward O. Wilson.  He has devoted the last thirty years of his life to the preservation of the lands where he did so much of his original research: the Amazonian rain forests. Wilson’s Darwinian commitments uncover his reasons. He is not interested in the rain forests as such, but rather as an aid and resource for humankind. They fill the heart with beauty and awe, an essential human emotion.  More practically, they still conceal many natural drugs that may prove of great value to humans.  Hence, we should preserve them.

The trouble with this approach is that Darwinian evolutionary theory is not progressive, at least not in the needed absolute sense. The key mechanism of natural selection says that anything may help the possessors in the struggle for existence.  This can lead to improvement or progress.  Those hawks with keener eyesight outbred those with lesser eyesight.  But it’s all relative.  A mole burrowing underground does not need keen eyesight.  Indeed, functioning eyes might be a problem, both because they are not used and take up limited physiological resources, and because they are prone to infection and consequent ill-health.  It is the same with the supposed superiority of humans.  Relatively, we may do well.  We were cleverer than the Neanderthals, and look at who is around today. Absolute progress is another matter. Large brains are high maintenance and alternative strategies may be preferable.  In the immortal words of the paleontologist Jack Sepkoski: “I see intelligence as just one of a variety of adaptations among tetrapods for survival.  Running fast in a herd while being as dumb as shit, I think, is a very good adaptation for survival.” Cow power rules supreme!

Is there a subjectivist alternative?  One that starts with evolution?  I believe there is, a position I (somewhat grandiosely) call “Darwinian existentialism.”  Sartre said that the key to existentialism is that God’s existence is beside the point. We humans are thrown into the world and must make sense of it ourselves.  Sartre went on to say that there is no such thing as human nature. On this, as an evolutionist, I disagree strongly. Human nature, Darwinian human nature, certainly exists.  Above all, Homo sapiens is a social creature. We have evolved to need our fellow humans and to be with them.  The great metaphysical poet, John Donne, hit the nail on the head.

No man is an island,
Entire of itself,
Every man is a piece of the continent,
A part of the main.
If a clod be washed away by the sea,
Europe is the less.
As well as if a promontory were.
As well as if a manor of thy friend's
Or of thine own were:
Any man's death diminishes me,
Because I am involved in mankind,
And therefore never send to know for whom the bell tolls;
It tolls for thee.

That is the secret, the recipe, for a meaningful life in this Darwinian world.  First, family and the love and security that that brings.  Then society, whether it be going to school, shopping at the supermarket, or simply having a few drinks with friends, and sometimes strangers.  Third, the life of the mind. Shakespeare’s creative works are about people and their relationships – happy (Twelfth Night), sad (Romeo and Juliet), complex (Hamlet), doomed (Macbeth), triumphant (Henry V).  This is the life of meaning.  Take life for what it is.  Enjoy it to the full, realizing that the secret of true happiness is being fully human, taking from and giving to others.  And stop worrying about the future.  There may be one. There may not.  There is a now.

The lover of life knows his labour divine,

And therein is at peace.

The lust after life craves a touch and a sign

That the life shall increase.

 

The lust after life in the chills of its lust

Claims a passport of death.

The lover of life sees the flame in our dust

And a gift in our breath.

         (George Meredith 1870)

Slavery and the Electoral College: One Last Response to Sean Wilentz

Map of the Electoral College for the 2016 presidential election

 

 

In his History News Network response to Akhil Reed Amar’s argument that the Electoral College was conceived as part of the defense of slavery in the United States, Sean Wilentz welcomed continuing the debate on the relationship between slavery and the Constitution. In his essay, Wilentz reiterated his view, first presented in a New York Times op-ed, that the Electoral College, while deeply undemocratic, was not part of a constitutional defense of slavery and he argued that Amar’s position is deeply flawed. In fact, he dismisses Amar’s position as “illogical, false, invented, or factually incomplete.”

 

As evidence to support his position, Wilentz points out that at least two of the leading slave states preferred an executive chosen by Congress to the Electoral College. Of course, the South would have more power in such a system because the 3/5 Clause boosted the South’s representation in Congress. Wilentz asserts “proslavery concerns had nothing to do with the convention’s debates over what became the Electoral College.”

 

Actually, as explained by Pierce Butler in Madison’s Notes on the Constitutional Convention, the selection of the Executive was part of a four-pronged defense of slavery. Butler, a South Carolina rice planter, was one of slavery’s strongest defenders and one of the largest slaveholders in the United States. He introduced the Fugitive Slave Clause into the Constitution, supported the constitutional provision prohibiting regulation of the slave trade for twenty years, demanded that the entire slave population of a state be counted for Congressional apportionment, and championed an Electoral College with voters selected by state legislatures. Such a system permitted wealthy white men to represent white women, landless whites, and enslaved Blacks.

 

Wilentz is correct that some Southern representatives thought an Executive chosen by the national legislature would more effectively defend state prerogatives including slavery and that some Northerners feared empowering a broad electorate. But that does not change the fact that, with the 3/5 Clause in place, both proposals for selecting the Executive, whether by the national legislature or through a separate Electoral College, were designed to ensure the continued existence of slavery.

 

In the New York Times op-ed, Wilentz claimed that the Electoral College would not have effectively helped the slave states in Presidential Elections. But between 1801 and 1837, every President except one was a slaveholder from a slaveholding state. The South held the Presidency for 28 of the 36 years. Despite Wilentz’s dismissal of the numbers, without the 3/5 Clause and the Electoral College, John Adams, not Thomas Jefferson, would have been elected President of the United States in 1800. Even if the intent of the Electoral College was not to support slavery, though I believe the evidence shows it was, as Garry Wills argued in “Negro President”: Jefferson and the Slave Power (2003), it definitely entrenched slave power in the United States.

 

William Lloyd Garrison dramatically expounded a very negative view of the United States Constitution on July 4, 1854 when he publicly burned a copy at a rally sponsored by the Massachusetts Anti-Slavery Society. For Garrison the Constitution’s tolerance of enslavement condemned it as “a Covenant with Death, an Agreement with Hell” and he and his followers refused to participate in the electoral process because it gave legitimacy to the illegitimate.

 

In an 1832 editorial in The Liberator, Garrison elaborated on his position with greater detail. “There is much declamation about the sacredness of the compact which was formed between the free and slave states, on the adoption of the Constitution. A sacred compact, forsooth! We pronounce it the most bloody and heaven-daring arrangement ever made by men for the continuance and protection of a system of the most atrocious villainy ever exhibited on earth.” “Such a compact,” according to Garrison “was, in the nature of things and according to the law of God, null and void from the beginning.”

 

In March 1849, Frederick Douglass, who later modified his views, called the United States Constitution “a most cunningly-devised and wicked compact, demanding the most constant and earnest efforts of the friends of righteous freedom for its complete overthrow.”

 

The debate, during the Abolitionist battle to end slavery in the United States, over the meaning of the Constitution and its origins as a pro-slavery document gives us some clues to its deeper meaning as well as weapons in our own struggle to preserve what may be a fragile democracy in the United States today. As historians and public figures, we have an obligation to defend democratic institutions and expose vestigial anti-democratic elements like the Electoral College that threaten democracy, a task that includes a careful examination of their origin and history.

The Unseen Significance of Jefferson’s Natural-Aristoi Letter to Adams  

On October 28, 1813, Thomas Jefferson crafted a letter to John Adams in a delayed reply to several other letters, written by Adams on his views of aristocracy, which cried out for a reply. Adams had written extensively on the subject in Discourses on Davila and “Defence of the Constitutions of the Government of the United States of America,” had continually complained thereafter that he had been much misunderstood, and had asked for Jefferson’s views. After several letters by Adams, Jefferson, months later, finally replied.

 

Jefferson’s reply (28 Oct. 1813) to Adams is too well-known to need introduction—almost every Jeffersonian biographer has something to say concerning it—yet too few recognize the philosophical significance of it. Jefferson’s letter is an eloquent and luculent summary of a political philosophy, as rich in substance as it is profound in its simplicity. As he believed that the truths of morality were few and straightforward, so too he believed that the principles of good governing were few and straightforward.

 

After some preliminary thoughts concerning interpretation of a passage on selective breeding by the Greek poet Theognis, Jefferson begins what amounts to a polite refutation of Adams’ views on aristocracy. His refutation underscores key differences between the two men’s views of thriving republican government. Both believed that republican government would thrive when the best men (Gr., aristoi) governed. Adams, however, insisted that the best men were those of wealth, good birth, and even beauty. “The five Pillars of Aristocracy, are Beauty[,] Wealth, Birth, Genius and Virtues. Any one of the three first, can at any time over bear any one of or both of the two last,” writes Adams in a prior letter. In support of that claim, Adams appeals to history. People have always preferred wealth and birth, and even looks, to intelligence and morality.

 

“I agree with you that there is a natural aristocracy among men,” Jefferson coltishly, and perhaps insidiously, concedes. He then proceeds to a distinction between “natural aristoi” and “artificial aristoi,” which amounts to disambiguation of the former through an attempt at a precise account of it. Pace Adams—for whom beauty, wealth, and birth individually or conjointly can trump talent and virtue—this natural aristoi for Jefferson comprises only the virtuous and talented. Jefferson adds, “There is also an artificial aristocracy, founded on wealth and birth, without either virtue or talents; for with these it would belong to the first class.” Jefferson’s phrasing here is cautious. Virtue and talent are sufficient to place one among the natural aristocracy. Lack of virtue and talent (more precisely, lack of either) is sufficient to exclude one. Jefferson’s distinction aims to refute Adams, and that refutation underscores the essential difference between Jeffersonian and Adamsian republicanism.

 

Note, too, Jefferson’s use of “natural,” which is neither discretionary nor incautious, and which is often overlooked by biographers. Nature (i.e., God) has foreordained, as it were, that the wisest and most moral ought to preside among men, if only as stewards—that is, primus inter pares. Consequently, the centuries-old practice of the aristoi as the wealthy and wellborn is a contravention, and corruption, of the dictates of nature.

 

Jefferson says more. He offers this rhetorical question. “May we not even say, that that form of government is the best, which provides the most effectually for a pure selection of these natural aristoi into the offices of government?” Nature has foreordained that genius and morality are the defining features of aristoi, so the best government is that which selects the aristoi. There is no place for the wealthy and wellborn in politics, unless they are also endowed with genius and moral sensitivity.

 

The rhetorical question leads naturally to other, relevant issues. All concern establishment of a sort of government, genuinely republican, in which a system is in place that selects for the true, natural aristoi. Who are to be the selectors? “The best remedy is exactly that provided by all our constitutions, to leave to the citizens the free election and separation of the aristoi from the pseudo-aristoi, of the wheat from the chaff.” The vox populi is not infallible, Jefferson acknowledges, but the people “in general … will elect the real good and wise.” Thus, the aristoi will for the most part assume political offices, though they will be watched and will serve for short terms.

 

How can we be sure that the people will “in general” select wisely?

 

On the one hand, the people have the same moral sensitivity as, if not better than, those who are fully educated. Why? Moral “judgment” for Jefferson is immediate and sensory, and corrupted by the input of reason (e.g., TJ to Thomas Law, 13 June 1814). Those schooled in morality in the main corrupt their sensory moral faculty by infusion or intervention of thought. A class on morality for Jefferson is as sensible as a class on hearing. One need not be schooled in hearing. One merely hears. That is why Jefferson was insistent that his nephew Peter Carr should eschew formal education in morality. It would likely be of more harm than of good. He writes to Carr (10 Aug. 1787): “I think it lost time to attend lectures in this branch. He who made us would have been a pitiful bungler if he had made the rules of our moral conduct a matter of science.”

 

On the other hand, Jefferson maintains here, and elsewhere, that hale republican governing, entailing selection and overseeing of governors by the people as well as governmental decisions consistent with vox populi, requires wholesale and systemic educational reform: public or ward schools for general education of all citizens, male and female; higher education for future politicians, educators, and scientists; and education of an intermediate sort (grammar schools) to take men from the ward schools to, say, the University of Virginia. For republicanism to thrive, all people need a general education—comprising reading, writing, and basic math, and perhaps also some history. Jefferson proposed such structural reform in his 1779 bill, Bill for the More General Diffusion of Knowledge, which failed to pass the Virginia legislature due to resistance from Virginia’s wealthy and wellborn citizens, who refused to be taxed to educate the masses. Virginia’s wealthy and wellborn already had access to quality education through private tutoring or private schooling, and that access, in Jefferson’s eyes, allowed for the perpetuation of their monopoly on governing.

 

And so, even though Jefferson championed thin government, he also and always championed such “infrastructure,” such internal improvements in affairs of wards, of counties, of states, and of the nation, that would most facilitate freedom of all citizens in their pursuit of happiness. Thus, he championed systemic educational reform. Thus, he championed laws eradicating entails and primogeniture. Thus, he championed religious freedom to eradicate “the [unnatural] aristocracy of the clergy.”

 

Jefferson too and most significantly championed science, understood then much more broadly than it is today understood. It was a patronage for which he would be criticized throughout his life. “Science had liberated the ideas of those who read and reflect and the American example had kindled feelings of right in the people.” He continues in a buoyant, rhetorical tone: “An insurrection has consequently begun, of science, talents and courage against rank and birth, which have fallen into contempt. … Science is progressive, and talents and enterprise on the alert.”

 

Jefferson adds before ending his exposition, “I have thus stated my opinion on a point on which we differ, not with a view to controversy, for we are both too old to change opinions which are the result of a long life of inquiry and reflection; but on the suggestion of a former letter of yours, that we ought not to die before we have explained ourselves to each other.” That addendum shows other key features of Jefferson’s political philosophy, reducible to Jefferson’s views on morality: Conciliation and friendliness are always preferable to confrontation.

 

It is often presumed today that Jefferson’s political philosophy—with a focus on thin government, agrarianism, self-sufficiency, full participation by all citizens insofar as talents and time allow, elected officials for short terms as stewards and not tyrants, free trade and amicable relations with all nations, and so on—is bewhiskered: that is, that its tenets cannot be instantiated because they are passé. Proof of that, most cavalierly assert, is the fact that we have gone politically much more in a Hamiltonian than in a Jeffersonian direction. Yet that argument is a fallacy of fatalism. That we have gone in a certain direction means neither that we could not have gone in another direction nor that we cannot still go in another direction. Jefferson’s political philosophy is not bewhiskered. It ought to be studied, reconsidered, and revitalized as an alternative to the thick, intrusive government practiced today.

 

 

We Judge Presidents in part by Who Precedes and Follows Them

 

I’ve been interested in the presidency since I was 10 years old. On WWGH 107.1 FM, Marion, Ohio, (Home of Warren G Harding), I comment weekly on political events and the host, Adam Lepp, has granted me a title that does not really exist, although I think it should be a word—presidentologist. 

 

I am also fascinated by how one evaluates and ranks presidents. Previously, I’ve written about presidential ranking changes over time. Now, I want to discuss a different element of how historians and the public rank presidents: the “company they keep.” Presidents are often judged in comparison to who preceded and followed them. Some Presidents are fortunate to come before or after a President who was not a great success, so they seem like a better president in comparison. Others follow or precede a President who was perceived as successful, which dims their reputation.

 

For example, Thomas Jefferson was in office after John Adams and before James Madison. In the latest C-SPAN poll ranking of presidents, in 2017, scholars consider Jefferson a major success, ranking him 7th, while Adams was ranked 19th and Madison 17th. John Adams also served after George Washington, who was rated 2nd, so he suffers even more in historical assessment. Of course, the fact that he was defeated for reelection does not help his comparative reputation.

 

Then, we have the case of Abraham Lincoln, who followed James Buchanan and was succeeded by Andrew Johnson. Historians consider both Buchanan and Johnson the absolute bottom of the Presidency—they rank them 43rd and 42nd, respectively. That just adds to the stature of Lincoln, who is judged in most scholarly assessments as our greatest President, and is first in the C-SPAN poll of historians. Ironically, Lincoln had far less government experience than either Buchanan or Johnson.

 

William Howard Taft was in office after Theodore Roosevelt and before Woodrow Wilson, and therefore suffers by comparison. While Teddy is ranked 4th by scholars and Wilson 11th, Taft is ranked 24th. Taft’s reputation is helped a bit because he later served as Chief Justice of the Supreme Court, but he also suffered the greatest reelection defeat in American history.

 

George H. W. Bush served between two charismatic Presidents, Ronald Reagan and Bill Clinton, and he is rated lower by comparison. While Reagan is ranked 9th and Clinton 15th in the latest C-SPAN poll, Bush was 20th. Of course, Bush’s juxtaposition with these presidents is only part of it: he suffered the second greatest reelection defeat in American history, only surpassed by Taft.

 

Finally, Barack Obama is already rated 12th by scholars just two years after leaving the Presidency.  But he is also extremely fortunate, and will continue to be so in the future, as he succeeded George W. Bush and preceded Donald Trump. At this point, the second Bush is rated 33rd and Trump is rated down in the basement with James Buchanan and Andrew Johnson. This will shape the image of Obama for the long run of history.

 

So life is unfair, and that certainly applies to Presidents, who would surely prefer to be the success between two failures rather than the lesser figure between two giant figures in the Presidency.

The Middle East, Land Disputes, and Religious History

 

 

The entire Middle East region that bridges three continents has historically been defined by change: changing people, politics, religious ideas and ideologies. People with power have come and gone, but the land remains and still presents the international community with one of its most challenging conflicts.

 

For centuries Jerusalem has been an interfaith, intercultural, and international city where different faith communities have converged, comingled, and coexisted in relative peace. This diversity and inclusivity are characterized by such landmarks as the site of King Solomon’s First Temple, the Church of the Holy Sepulcher, and the golden Dome of the Rock in the centuries-old mosaic of Jewish, Christian, Armenian, and Muslim quarters that constitute the city of Jerusalem to this day.

 

Jerusalem may have had a Jewish identity during the Biblical period of the Kingdom of Israel (930-720 BCE), but the greater region of the Middle East has been occupied and settled by many people both before and after: the Egyptian Merneptah Stele, 1200 BCE; the Neo-Assyrian Empire 722 BCE; the Neo-Babylonian Empire 586 BCE; the Achaemenid Empire 538 BCE; the Macedonian Greeks 332 BCE; the Hasmonean Kingdom 165 BCE; and the Romans in 64 BCE. They all occupied the region for a time.  The area subsequently became increasingly Christian after the 3rd century and increasingly Muslim after it became part of the Muslim state in 638 CE. The Crusader states in the Levant controlled Jerusalem between 1099 and 1291, the Ayyubid Sultanate ruled from 1171 to 1260, the Ottomans between 1517 and 1917, and the British from 1917 until 1948, when the Jewish State of Israel was proclaimed.

 

In most of the previous cases, Jerusalem was annexed to existing imperial states. In 1948, however, at least a part of Jerusalem became part of the independent state of Israel. This was partially legitimized on the basis of a scriptural claim that god had promised the land to the progeny of Abraham. What are often overlooked here are the revisionist interpretations of these scriptures.

 

The varied scriptural reinterpretations of the myth of the Promised Land align with and contribute to the historical claims and controversies from the Hebrew Bible of 500 BCE to the more than 20 braided and polished versions that bridge the Septuagint Bible of 250 BCE with the 1611 King James Bible.

 

According to the Hebrew tradition, G-d promises the land of Canaan to the descendants of Abram as He brings them out from Ur of the Chaldeans (Genesis 15:7) and after the Exodus from Egypt (Deuteronomy 1:8). G-d promises this land to all eight of Abraham’s sons by his three wives: Hagar, Sarah, and Keturah.

 

Then, somehow, the scriptural narrative changes as though revealed by a different god.  God becomes a historical revisionist and revokes the covenant with Abraham, as if he didn’t have the foreknowledge of what was to come. This god excludes the other seven sons of Abraham and promises the land only to the descendants of Jacob (Jeremiah 3: 3-34); and the land is conveniently re-named Israel. God also forgets that, in addition to Abraham, he had specifically made a covenant with Hagar, the mother of Ishmael, when the angel of God called on her: “Lift the boy up and take him by the hand, for I will make him into a great nation” (Genesis 21:18). These promises (Genesis 21:18, 17:23, and 17:26) are all forgotten, and her descendants are now doomed to slavery (Galatians 4:25).

 

This god not only orders the ethnic cleansing of the land from its existing inhabitants, “the Kenites, Kenizzites, Kadmonites, Hittites, Perizzites, Rephaite, Amorites, Canaanites, Girgashites and Jebusites” (Genesis 15:18-20), he also participates in it, “and I will drive out the Canaanite, the Amorite, and the Hittite and the Perizzite, the Hivite and the Jebusite” (Exodus 33:2). Furthermore, the performance of religious rites becomes a requirement to live on the land: “He that sacrificeth unto any god, save unto the Lord only, he shall be utterly destroyed” (Exodus 22:20).

 

There are obvious scriptural and moral inconsistencies here. The tribes seemingly arbitrarily targeted for expulsion from the land in Genesis and those in Exodus do not match. Neither do the boundaries of the Promised Land described in Genesis 15 and Numbers 34:1-12 match those in Ezekiel 47:13-20. But more importantly, this is a conflicted god who destroys his own creation through ethnic cleansing, institutionalizes religious tribalism, and has a weird sense of ‘divine justice’ as he takes land from one group of people and gives it to another, without compensation, transaction, or exchange, at a time when 95 percent of the earth’s surface was uninhabited.

 

This idea of a land identified with people rather than with an empire came in handy centuries later when the "Westphalian" doctrine of states led to the formation of 19th-century European nation states. When secular European states were defined in ethnic terms, the religious identity of the European Jews became an anomaly, and the ‘descendants’ of Jacob, victims of centuries of racial discrimination, found themselves stateless. If before a Jewish person didn’t have a place in a European neighborhood, now the Jewish nation had no place on the European continent.

 

European Jews had two choices. They either had to follow the 18th-century “Jewish Enlightenment” movement, the Haskala, which urged Jews to assimilate into Western secular culture, or they had to follow the Zionist movement, which called for the establishment of a Jewish state in Palestine. Zionism was defined by its inherent paradoxes. On the one hand, due to the integrated racial and religious identity of the Jews, Zionism was a racial and religious movement that was diametrically opposed to European nationalism, which was racial but secular. The second paradox, which would become more visible and pronounced in due course, was Zionism’s call for the establishment of a racially and religiously exclusive state in a racially inclusive and religiously diverse Middle East.

 

In spite of these contradictions, the call for the establishment of a Jewish state in Palestine came in handy for the British. During WWI, Lord Balfour, the British Foreign Secretary, used both the British design on the disintegration of the Ottoman Empire from within and scriptural revisionism to “favour the establishment in Palestine of a national home for the Jewish people,” with the clear understanding “that nothing shall be done which may prejudice the civil and religious rights of existing non-Jewish communities in Palestine.”

 

The establishment of a Jewish state outside of Europe may have resolved the European racial conflict, but in reality, it has simply been transferred to the Middle East where every European Jewish settler turns a Palestinian into a refugee.  A hundred years later, the racial divide and religious hatred persists, in a role reversal where today’s Palestinians live the lives of yesterday’s Jews.

 

To make an already bad situation worse, the Trump administration recognizes Jerusalem as ‘the eternal capital’ of Israel. To claim Jerusalem only as a Jewish city in biblical/scriptural terms is based on two false premises: 1) that a certain people would remain faithful to the same belief system throughout history, and 2) that if a fraction of that community happens to change or revise their belief system, as people often do, they would be disqualified from inheriting the land.

 

Can a land be claimed on the basis of the scriptural myths of one religion with implied exclusion of all other religious communities? Can ethnic cleansing be ‘divinely’ sanctioned as stated in Genesis 15:18-20 and Exodus 33:2?  Such a revelation is incomprehensibly incompatible with democracy and human rights in the age of reason.  

 

Resolving these inconsistencies, biblical, historical, political and otherwise, demands nothing short of a paradigmatic shift in our attitude and approach to a problem that is at the heart of the Middle East crisis. In view of the current state of regional and global politics, such a shift may seem unattainable and certainly not to everyone’s liking, but in its absence, the future looks bleak for all parties concerned.

 

The current asymmetrical warfare is fought on the battleground of the womb and the tomb, fueled by both incentivized, accelerated Jewish migration and a reactive Palestinian population explosion, in a geography where water and land cannot sustain such growth much longer. Will there arise responsible leadership in any of the Abrahamic faith communities with a vision to see people’s humanity before their racial or religious identities? Only time will tell.

Civil War History Brought to Life  

Animating the American Civil War from My Colorful Past on Vimeo.

 

Matt Loughrey, the founder of My Colorful Past, lives on the west coast of Ireland. His work exists somewhere between history and art, using technology as the storyteller. Above, you can watch his latest project. Below, Elisabeth Pearson interviews him about the value of his work. 

 

1. Were you always interested in combining art and history? How did your interest in reanimation begin?

Digital art has always been in the background for me. My first experience of that was in 1991 and getting myself accustomed to Dan Silva's 'Deluxe Paint' software on the Commodore Amiga. It was a very defining time in light of my own creativity. Almost three decades later, art software is still as groundbreaking and archives of historical material are accessible through the internet, not least the ability to communicate freely with archivists the world over. It always made sense to combine both interests.

 

2. Has living in Ireland influenced your work? If so, how?

In the west of Ireland there's a very real sense of creative community. Creativity is widely accepted and fostered by those that appreciate it. I think that overall acceptance and the peaceful surrounds of home are positive for work.

 

3. How did you discover that these frames could be inadvertently reanimated?

I've spent many years stumbling upon what I thought could be duplicate frames that were uncatalogued in different archives. For the most part I dismissed them, until it became more and more obvious to me that it might be worthwhile looking at closely. I looked at a couple closely and it was like finding treasure in plain sight.

 

4. Your Instagram, My Colorful Past, has taken on quite a following. How do you manage that account? Are you pressured to come up with new content? What are some of your favorite images you’ve posted?

People with specific interests look for quality content, provided I keep that in mind then the account becomes somewhat self managing. It's hard to say what is a favorite image, albeit I've always enjoyed American history as it has been so very visually documented, it makes the experience relatable almost. The Gold Rush, The Dustbowl, The Civil War, Ellis Island...

 

5. What do you hope your viewers get out of the documentary and or the work that you do?

The aim is always to invite a completely new sense of relatability and the opportunity to learn a little more.

 

6. How has this project changed or developed since you first started back in July 2018?

The project itself has been realized; in this instance, that was the main priority. So long as it exampled what is possible then I was going to be happy with the outcome. The interesting part on a professional level are the expressions of interest from libraries and museums that see its potential as an aid to their visitor experiences.

 

7. Do you remember the first time you saw the reanimation of an image? Which image was it and did it provoke any feelings that may have inspired this mini documentary?

The first portrait I animated authentically was of George Custer when he was a Captain. In that moment he was 'alive' and I fast realized the potential as well as importance of seeing the project through. It was very surreal in the sense of discovery, that I do remember well.

 

8. What do you hope to accomplish moving forward with the reanimation of frames?

The integral part of this project is realizing that it is all about preservation and discovery combined. The end goal is to see these animations, and hundreds of others, displayed in the correct learning environment. They are ideal as an engaging visual support for classroom learning in the digital age. What better, than to see in motion, the very people or places you are reading about.

A History of Huntington Disease and Beyond

 

In 1974, young neurologist Dr. Thomas Bird founded the first clinic for adults with neurogenetic diseases in the United States. For more than 40 years, he directed this clinic at the University of Washington where he saw thousands of patients and conducted pioneering research on conditions such as cerebellar ataxia, movement disorders, hereditary neuropathy, muscular dystrophies, and familial dementias. Over his career, he has been honored with numerous national awards and lauded for his discoveries about the genetics of hereditary neurological disorders including Alzheimer and Huntington diseases. 

 

To his initial surprise, patients with the cruel and incurable Huntington disease became a prominent part of Dr. Bird’s practice in the early years of his clinic. Huntington’s is a progressive, inherited disease that perniciously and ruthlessly devastates the brain. It can cause incoordination, jerkiness, confusion, impaired judgment, emotional instability, depression, anxiety, social disinhibition, hallucinations, and other problems. And no two patients are alike in terms of their signs and symptoms of the disorder.

 

Dr. Bird addresses this perplexing disease and its many permutations in his groundbreaking new book for general readers and professionals alike, Can You Help Me?: Inside the Turbulent World of Huntington Disease (Oxford University Press). The title comes from a Huntington sufferer’s plea for help from his prison cell, and this desperate call reflects the desire of so many of Dr. Bird’s patients who, through the years, sought his unique understanding and care.

 

In his book, Dr. Bird vividly describes Huntington disease, traces its history and, at the heart of his book, shares dozens of accounts of his own patients in lively prose that evokes the engaging writing of renowned doctor-authors such as Oliver Sacks, Richard Selzer and Atul Gawande. He recounts the physical, cognitive and emotional challenges of his patients and the complex situations that patients and their families face every day. There are wrenching stories of neglect and abuse of vulnerable Huntington sufferers as well as stories of hope and courage and the unselfish—and vital—support of families and friends. These very human accounts come from Dr. Bird’s decades of meeting and treating Huntington patients of all ages, from early childhood to the nineties, and from all walks of life.

 

Physicians are still struggling to understand the clinical manifestations of this condition. No Huntington’s patient is “typical,” as Dr. Bird’s case studies demonstrate. One patient may exhibit jerky movements only, while another may be emotionally explosive with poor judgment but without an obvious movement disorder. Some may experience both severe physical and behavioral problems, especially as brain degeneration progresses. Some patients may alienate their caregivers and some may refuse care and some may lack the financial and other resources to receive care and to survive in today’s complex world.

 

Can You Help Me? reflects Dr. Bird’s compassion and care for patients of this dreaded disease as he offers support and treatment grounded on his trailblazing research into the genetics of neurological diseases. In offering understanding and empathy to each patient, he emulates the admonition of the legendary physician Sir William Osler: “Care more for the individual patient than the special features of the disease.”

 

Dr. Bird is a University of Washington Professor (Emeritus) of Neurology and Medicine (Medical Genetics). In addition to directing the UW Neurogenetics Clinic for more than 40 years, he was also chief of neurology at the Seattle VA Medical Center for 12 years and is presently a retired Research Neurologist in Geriatrics at the VA. 

 

Although retired from clinical practice, Dr. Bird still actively researches genetic diseases of the brain and neuromuscular system; collaborates with molecular biologists and others on genetics projects; and mentors physicians in training and research fellows. He earned his M.D. from Cornell Medical College and is board certified by the American Board of Psychiatry and Neurology. He lives in Lake Forest Park, WA, just outside Seattle, with his wife Ros.

 

Dr. Bird sat down at his University of Washington Medical Center office and generously responded to questions about his career and the history and human stories of Huntington disease.

 

Robin Lindley: Thank you, Dr. Bird, for talking with me about your distinguished career and your new book on Huntington disease. I'd like to first ask you about your own story. When you were a child, did you dream of becoming a doctor?

 

Dr. Thomas Bird: I grew up in a small town in upstate western New York, in the Finger Lakes area. My maternal grandfather was a country doctor in that small town. I didn't know him really. He died when I was five or six years old, so I only have a few vague memories of him. But our house where I grew up was just down the street from where he lived, and that was the family home on my mother's side, and my mother's brother, my uncle lived in that house, so I was very familiar with the house.

 

My aunt and uncle kept my grandfather’s old office intact. I remember wandering through it as a kid and seeing the examination chair and a little side laboratory with a microscope and shelves loaded with pill bottles.  I was very impressed.

 

So, I had this knowledge of my grandfather, even though I didn't know him. My mother clearly adored him so, when I was growing up as a kid, it was very clear to me that if you wanted to be the best you could be, you would become a doctor. That was never said explicitly, but it was the aura that I grew up with.  Later I got really interested in chemistry and thought I wanted to be a chemist. So it wasn't like I directly wanted to be a doctor. But when I went to college, it was certainly in the back of my mind because I became a premed major. 

 

Robin Lindley: What inspired you to specialize in neurology?

 

Dr. Thomas Bird: There were a lot of lines that led to that. I'm sure that having a brother with mental retardation made a difference in how I viewed people and how I viewed medicine and the things I was interested in. Having a brother that I lived with 24/7 my whole childhood life who had something not right with his brain impacted me a lot. And, when I went to college at Dartmouth, I actually majored in psychology and who knows exactly all the influences for why I did that, but my brother was probably part of it.

 

I also was just fascinated by human behavior. Fortunately for my future, the psychology department specialized in biological psychology so the faculty were very interested in the neuroscience brain piece of psychology. We had Skinner boxes where we did mouse and rat and pigeon experiments on behavior, and I took one course where we dissected a sheep brain and then a human brain. I think that made a difference. My mentor was a clinical psychologist who was actually in the department of psychiatry at the medical school as well as in the department of psychology. He taught me a lot about human behavior and our interests in that topic matched nicely.

 

So, when I went to medical school, I thought my trajectory would be to either be a family doctor, like my grandfather, or a psychiatrist because I was really interested in human behavior. When I got there, I didn't particularly like psychiatry. It wasn't a neuroscience or brain-oriented department of psychiatry so I lost interest in it.

 

Then, mostly by chance, I ended up having my summer project with the new chairman of the department of neurology at Cornell, Fred Plum. I knew nothing about neurology, but I wanted to stay on campus and work with somebody doing research in medicine. So I got hooked up with Dr. Plum. He turned out to be a very dynamic, aggressive, energetic person who eventually became world famous. He wrote a bestselling textbook in the 1960s called Stupor and Coma, and he became one of America's leading neurologists. In the beginning I had no idea who he was or what I was getting into, but it turned out to be terrific.

 

I started with a clinical project in coma. I learned how to do EEGs [electroencephalograms], and I was going around the hospital with a mobile EEG machine and doing EEGs on people in comas. So I got to see all kinds of neurology and I got to see it up close and personal. Then I started going to neurology grand rounds on a regular basis. I just became fascinated with the brain and with neurology. 

 

Back in those days, you didn't have to decide what you wanted to specialize in until you got into your internship. My internship was here at the University of Washington, and I already knew I was interested in neurology. In my internship, I had two neurology rotations and I just loved it and had a great time. The head of the neurology department asked me if I'd like to be in the neurology program, and I said, "Sure." And that was it. That was before the days of matching for residency programs and before the days of signing a contract. It was just on a handshake.

 

Robin Lindley: And you then further specialized in genetics and neurology.

 

Dr. Thomas Bird: So I was in neurology, which I really loved. Then, in my last year of the neurology residency, I learned of the Medical Genetics Clinic here that had been started by Arno Motulsky who was one of the very earliest and most prominent medical geneticists in the country. He started medical genetics here at the University of Washington in 1957 and, along with Johns Hopkins, that was the first program in the country looking at the genetics of human disease with a medical orientation. He was very farsighted in doing that and initiated a clinic that saw adults with genetic diseases.

 

So, as a senior neurology resident, I started going to that Medical Genetics Clinic and seeing the kinds of patients that I'd never seen before. They were all considered rare diseases--back of the textbook experiences. I'd go to my general neurology clinic and I'd see migraines and back pain and stroke and things like that, which were kind of interesting but I didn't consider them fascinating. And I'd go to this genetics clinic and I'd see Huntington's disease and cerebellar ataxia, muscular dystrophy, and Charcot-Marie-Tooth neuropathy. Things I'd never seen before. And I was fascinated by it. I discovered they had a fellowship training program associated with the clinic. I applied for the fellowship and was able to get some funding.

 

After I finished my residency, I did two years of fellowship in medical genetics and that became my career. I spent the rest of my career specializing in genetic diseases of the nervous system. Nobody did that back then for adults. It was a new area and it turned out to be very timely. I didn't realize it, but I was right on the cutting edge of a revolution in human genetics.

 

Robin Lindley: I'm very impressed by your extensive background. You're a pioneer in the medical genetics of neurological diseases, including Huntington disease, the focus of your new book, Can You Help Me?

 

Dr. Thomas Bird: I started a clinic in adult neurogenetics as a fellow in 1974, 45 years ago. At that time, in the 1970s and 1980s, the most common neurological disease seen in the medical genetics clinic was Huntington's disease. I had no idea that was the case until I got to that clinic. It was considered a rare disease, and yet there were people coming in with it every week. Because it was so common in that clinic, it became something that I couldn't avoid. And I found it extremely fascinating. So I saw what eventually developed to be hundreds of families with Huntington's disease over the next several decades.

 

Robin Lindley: I realize it's complicated, but what is Huntington disease or Huntington's chorea, as it was once called?

 

Dr. Thomas Bird: It was called Huntington's chorea for a long time. In a nutshell, it's a degenerative disease of the brain that's genetic. Those are the key things to know. So it's a brain disease. It's degenerative, so it's progressive and causes a deterioration in the brain. And it's inherited in what's called a dominant fashion. So, if someone has it, each of their children has a 50/50 chance of inheriting it whether they're a boy or a girl, and that's each time they have a child. And it's so progressive that it eventually is fatal. But it's slow, so the typical duration of the disease is about 15 years.

 

The manifestations of the disease primarily fall into three categories. One is trouble with their coordination and [patients] develop movements that they can't control. They have these jumpy, jerky, uncoordinated movements, and it can affect the hands, or the arms, or the legs, or the face. It can affect their whole body. So they develop these jumpy, jerky movements, and when they're walking, they almost look like they're dancing in an uncoordinated way. And that's why it got called chorea. Chorea is the Greek root word for dance, as in choreography, and chorea means dancing. And these people sometimes look like they're doing a dance.

 

Second, they also can develop a kind of dementia. So their judgment and their ability to solve problems can become mildly to moderately to severely impaired.

 

And third, they can have problems with behavior with disinhibition where they're unable to inhibit socially inappropriate activity. And their thinking can become disoriented or disarranged. They can become manic or they can become depressed or they can have delusions or even hallucinations. Their behavior can become quite abnormal.

 

Any of those things can happen to a person with the disease. And someone can just have mostly one symptom or a combination of two or all three. The dramatic piece that people notice is the chorea, and that's why it was called Huntington's chorea. But it became clear to doctors and investigators that there were people who had the disease with little or no chorea and, to be more comprehensive in terms of the name, it was called Huntington's disease rather than Huntington's chorea because Huntington's chorea implied that everybody had chorea and not everybody with it has chorea.

 

Robin Lindley: How common is Huntington's? It's thought of as very rare.

 

Dr. Thomas Bird: Everything is relative. It is rare. If you relate it to Alzheimer's or Parkinson's or cancer or diabetes, it's much more rare than those diseases. It's on the order of 10 cases per hundred thousand population. So that's not a lot, but it's actually more common than ALS (amyotrophic lateral sclerosis) or Lou Gehrig's disease, which people have heard a lot about. And it's more common than some other genetic diseases. So, it's not common, but it's not as rare as you might think.

 

We've seen hundreds and hundreds of families in Seattle. In my career, I know I've seen more than a thousand people with the disease. I don't think of it as rare. I think of it as uncommon, which is somewhere between rare and common if that makes sense.

 

Robin Lindley: There is no cure for Huntington disease and it's fatal. 

 

Dr. Thomas Bird: Right. So a couple of things to put it in context. First of all, it's called a fatal disease because people with it have a shortened lifespan and they get worse, then they die with the disease, usually of the things that happen to people who can't take care of themselves. It's the same as what happens when you get end stage Alzheimer's or end stage Parkinson's or end stage ALS. It's not really the disease that kills you, but you can't walk, you can't talk, you can't swallow, and you develop malnutrition and pneumonia and that's what you die of. But there's no question that lifespan is shortened, and that's why it's called a fatal disease.

 

But I always remember a woman who was very famous in the world of Huntington's disease. Her name was Marjorie Guthrie. She was Woody Guthrie's wife. When people said this is a fatal disease, she would get a little upset about that word fatal. She would look you in the eye and say “life is a fatal disease” because she didn't like HD being labeled that way. She said everybody dies of something sooner or later so let's not get too down about this disease. Let's be more optimistic and move forward.

 

And there is not a cure for this disease. What does that mean? That means that once it starts, there's nothing that stops its progression and there's nothing that prevents it from developing, and people continually go downhill with it. So, in that respect, there is no cure.

 

Robin Lindley: As you detail in your book, genetic testing is available for Huntington's. Understandably, people who are aware of ancestors with the disease are often reluctant to undergo testing.

 

Dr. Thomas Bird:  In 1993 medical science developed the ability to have a simple blood test to identify the mutation causing HD. That dramatically changed the field.  Now people at risk for the disease could actually find out if they had or had not inherited the HD mutation. Obviously that testing decision is fraught with all sorts of complications.  Because there is no effective treatment, most people decide not to be tested.  Those that do get tested may experience the gamut of emotional reactions from elation to serious depression.  In my book I have an entire chapter devoted to this amazing and often unpredictable range of responses. 

 

Robin Lindley: In looking at the history of the disease, Woody Guthrie is perhaps the most well-known Huntington disease sufferer and a vivid example of a patient. What do we know about his disease course?

 

Dr. Thomas Bird: We know a lot about him for many reasons. One, he was famous so a lot was known about him and he wrote his own autobiography. I've tried to read that. I haven't read it cover to cover. It's a strange story. In the first hundred pages, he goes into great detail about growing up as a kid in Oklahoma. And he talks about his friends and he talks about the games they played, and about the town he grew up in, and about the tricks that they played. He talks about the trouble they got into and he goes on and on about those things.

 

Of course, he didn't know that his mother had Huntington's disease. But once you get into his book, he talks about his mother and how he loved her. And he didn't understand why she would behave the way she behaved. She would lose her temper and she would scream and yell and she would throw dishes and she would run out of the house. And she was clumsy and she was always breaking things. He had no idea why, and he loved her dearly, but he recognized that there was something wrong with her.

 

And so, you learn that as his background, and then he became this famous folksinger who was highly popular. Then he began to develop the disease and his behavior changed and his thinking changed and he developed chorea. It became obvious that he had the same disease that his mother had. He got the diagnosis of Huntington's chorea and he eventually was institutionalized. He died in a state institution.

 

Robin Lindley: You detail the history of Huntington disease, which was not identified until the late 19th century, although it certainly had affected humans for thousands of years.

 

Dr. Thomas Bird: Yes. It's called Huntington's because that's the name of the young doctor that first described it best. In terms of history, it's a fascinating story. George Huntington, who the disease is named for, grew up in a little country village, East Hampton, Long Island, in the middle of the 19th century. His father and his grandfather were family doctors in that town. He went around with his father on rounds to see patients. He described going with his father in a buggy, and riding to the homes of people who had this disease. There was a family of people that had chorea and it ran in their family, so he knew about them as a kid because he had seen them with his father. He had grown up knowing of the characteristics of this family.

 

Then Huntington went to medical school, just like his father and grandfather, and became a family doctor. After medical school in New York City, he briefly moved to Ohio to try out a practice. While he was there, the local medical society asked him to present a paper. That was a professional organization and probably every week or every month they had one of their members present a paper. They asked him to do it as a new member. He presented a paper on chorea and I think it's partly because he knew this family, so it was something he felt comfortable writing about.

 

So, he presented this paper on chorea to the local county medical society and it was so good, he wrote it up and published it. He talked about all different kinds of brain diseases that can produce chorea.  At the end, almost as an afterthought, he said he'd like to mention this family he’d known for decades in his hometown because they're so interesting. Then he goes through a very accurate, careful description of the family in terms of their movements, their behavior, their loss of judgment and dementia. He described the fact that the disease was progressive and fatal. The fact that they had an increased frequency of suicide. The fact that males and females both got it. The fact that it was passed down from generation to generation. And if somebody had a parent with it, but that child lived into late adulthood and never showed signs of it, then it didn't show up in their branch of the family. So, Huntington really had cued into the genetic piece of it before medical genetics, and [before geneticists] knew about dominance. He had described dominant inheritance and he didn't even know what he was describing.

 

He published this paper about chorea in 1872 when he was just 22 years old. Over the next 20 to 30 years, other people in the US and in Europe realized they were seeing similar families. And they would write them up. And when they referred to them, they would always say we've seen a family with chorea and it's like this family that Huntington described and they would cite his paper. And so very quickly it got to be called Huntington's chorea because he was the one that first described it. And that was in 1872.

 

Robin Lindley: In discussing medicine and genetics, you mention a resurgence of interest in Gregor Mendel’s work in genetics a couple of decades later.

 

Dr. Thomas Bird: Yes. In 1900, biomedicine rediscovered Gregor Mendel's laws of inheritance. Mendel, the monk with his pea plants in what's now the Czech Republic, figured out inheritance of genes. Genes weren't actually known at the time, but that's what Mendel was studying.  He found out that things could be inherited in a dominant or recessive manner. And he very accurately and carefully described that and it was pushed aside and unrecognized and nobody thought anything about it for 30 years. And then in 1900, his papers were rediscovered, and people not only realized that it was relevant to the plant world, but they said, "Oh my goodness. Human diseases are inherited in the same way."

A scientist named Bateson in England was looking around for human diseases that he could say were dominant or recessive, like Mendel's pea plants. He came across the publications on Huntington's chorea and he looked at the pedigrees of families with Huntington's chorea and said, "Oh my goodness. This is autosomal dominant inheritance." This is what Mendel was describing in his pea plants as dominant inheritance occurring in the same way in a human disease.  Bateson started promoting that idea and Huntington's chorea suddenly moved to the front of the book in human genetic studies because it was considered a classic example of dominant inheritance. Even though it was rare, it became very well known in the human genetics field because it was very clear that it was an autosomal dominant disease. 

 

Robin Lindley: Was Parkinson's disease described by then too?

Dr. Thomas Bird: Parkinson's definitely was already described, but nobody thought it was genetic, so it wasn't part of the human genetics literature at all. Same with Alzheimer's. Alzheimer's was described about 1906 or 1907, and was a well-recognized disease, but no one really thought it was genetic. But Huntington's was special because it was clearly a genetic disease.

 

Robin Lindley: How this whole field has developed is fascinating. And the gene for Huntington’s wasn't identified until about 1993?

Dr. Thomas Bird: Yes, the gene was found in 1993.

 

Robin Lindley: It's incredible that Huntington was so farsighted.

Dr. Thomas Bird: The key advantage he had was realizing [the disease] was genetic, realizing that it was inherited. And he knew that because he had lived in the context of his family and his community for his whole life. He had seen several generations of this family and had no doubt it was inherited. As I mentioned he described the disease when he was 22 years old and never wrote another paper. He wasn't an academician at all. He never did research.

 

Robin Lindley: Huntington was more of a country doctor then?

 

Dr. Thomas Bird: He wanted to be a country doctor and he was a country doctor.

 

Then a couple of things happened with Huntington's that are of historical interest. One, a psychiatrist, I believe in Connecticut, saw families with this disease. He thought that the families in New England that had this disease were all related to each other, and that they had come over as migrants from England in the 1600s. And he thought he had evidence that they were persecuted as witches in New England in the 1600s and 1700s. He wrote about that, and that became a very popular theme about Huntington's disease that made people very uneasy.

 

Robin Lindley: Did this psychiatrist connect his view of Huntington’s with the Salem witch trials?

 

Dr. Thomas Bird: He thought so, but he wasn't able to quite make that connection. But it turns out he was wrong. He was wrong that they were all related to each other. He was wrong that they came over at the same time. And, as far as anybody can tell, he was wrong about them being persecuted as witches. But for years, Huntington’s had this context of being associated with witchery, for whatever that's worth. It was unfortunate and it also was not true.

 

Robin Lindley: And the eugenics movement wanted to rid the US of Huntington’s disease—by sterilizing patients. 

 

Dr. Thomas Bird: Yes. Because of the behavior these people have and because of how abnormal they look and how deteriorated they get, Huntington's got tossed into this pot of diseases we want to get rid of, especially because it was hereditary. So it got thrown into the eugenics movement in the first half of the 20th century. The eugenics movement in this country was focused at Cold Spring Harbor on Long Island. It's still a very prominent biomedical research center even to this day. But back then, it was run by Charles Davenport, America's most prominent eugenicist. It's ironic that Cold Spring Harbor, Long Island, is only a few miles from East Hampton, Long Island, where Huntington lived.

 

Davenport's idea was that "bad diseases" are genetic. He said alcoholism is genetic. Mental illness is genetic. Mental retardation is genetic. Criminality is genetic. Prostitution is genetic. Dementia is genetic. And, Davenport said, we need to eliminate these from our society and we need to do it by sterilization. And he and the eugenics movement included Huntington's chorea as one of those diseases along with alcoholism and mental illness and criminality. So, Huntington's got a bad rap when it was thrown in with these diseases that were bad for the society, and they wanted to get rid of it by sterilizing people.

 

Robin Lindley: So eugenics was supposedly aimed at improving the health of the society, and there was also an element of class and racial discrimination.

 

Dr. Thomas Bird: Yes. And Huntington's became part of that. Being part of the eugenics theme and being part of the witch theme really gave Huntington's disease a very bad aura. It became a stigma for the communities. It became a stigma for the patients. It became a stigma for their families. So it was something they didn't want to face up to. Patients didn't want to talk about it; it was the kind of disease you'd hide in the closet. And it was really difficult to get a handle on this disease in the first half of the 20th century. Plus, because of their behavior and because of the fact that it hits you in your early and midlife, they'd lose employment and often become poverty stricken. They frequently ended up in mental institutions. So, it wasn't surprising that they got thrown into this pot of mental disease that we wanted to get rid of.

 

Robin Lindley: Was there evidence that the Nazis in Germany euthanized people with Huntington's as part of their T4 eugenics program, Hitler’s program to "eliminate" the disabled, those labeled as "life unworthy of life"?

 

Dr. Thomas Bird: I don't know if any of them were actually euthanized. I don't know if that's documented or not, but when you look at the lists of diseases that the Nazis wanted to get rid of, Huntington's clearly appears on those lists.

 

Robin Lindley: You mentioned too, and this touches on our regional history, that there tend to be more West Coast cases of Huntington's disease than in other parts of the country. Do you think that has something to do with migration patterns?

 

Dr. Thomas Bird: I think so. I can't argue too strongly for that because the statistics just aren't there. But when you go back and look at the population numbers for states (and this was mostly done by death certificates which aren't terribly accurate), there are certain states that were noted to have more families with Huntington's than others.  If you look at those statistics HD seemed to be more prominent in Washington, Oregon and California.

 

When we started seeing families with Huntington's in the state of Washington, we wondered what was going on because we were seeing quite a few and we thought maybe they were all related. Maybe one family had moved to this area a hundred years ago, 120 years ago, and we were seeing the descendants of this one family. But we could look at the family trees of the families we were seeing and that clearly was not true. We were seeing very different families and hundreds of different families that were not related to each other. None of them were native western Washington people because there aren't very many native western Washington people. We're an area of migration, so it was clear that these families had come from the East and from the Midwest.

 

We could see that the people with Huntington's that we saw were from families that had moved here from the East and the Midwest. When you looked at these death certificate reports, the Western states had a lot more Huntington’s than the Midwest states per population, so we thought that there was a migration factor and they were getting to the West Coast and couldn't go any farther, so they settled down.

 

Why would they migrate? In my mind, one of the reasons would be that people with Huntington's disease tend to be loners. They tend to want to go off by themselves. And they tend to be shunned by their communities because they look different, they act different, and they have social behavioral problems. They were not getting along in their local communities. So they moved. In our country when you move, more often than not, you move west. I think that's what happened to a certain degree.

 

Robin Lindley: That makes sense to me. I think Washington State has a reputation for attracting outcasts, misfits and loners. You also address the history of different approaches to treating Huntington's, including the use of lobotomy in the 20th century.

 

Dr. Thomas Bird: I didn't want to emphasize that very much, but it intrigued me because it became apparent to me that, particularly in the first half of the 20th century, it was common for people with Huntington's disease to get committed to state mental institutions. It's not so much now because there are fewer institutions and their populations have gone down but, up until the 1970s, it was very common for patients with Huntington's to be admitted to state mental institutions. And because they had a progressive disease that didn't get better, they tended to stay there for a long time, sometimes for the rest of their lives.

 

I ran into quite a few people with Huntington's disease in our state institutions and there was no good treatment for it then and there still isn't a good treatment for it. And, in the 1940s and 50s, Walter Freeman developed frontal lobotomy as a treatment for mental illness and it became extraordinarily popular. It was done mostly in mental institutions. Walter Freeman actually traveled around the country and showed psychiatrists and neurologists how to do frontal lobotomies. He went from state hospital to state hospital to state hospital doing that. And then [his trainees] would do it.

 

From 1938 to about the late 1960s, it was done on thousands of Americans in state institutions. I wondered, was it ever done on somebody with Huntington's disease? I actually had never seen anybody with Huntington's who had had that procedure, but it seemed to me, knowing they were in state institutions, knowing there was no good treatment for it, and knowing that this frontal lobotomy had become very popular in the fifties and sixties, the chances were that some people with Huntington's were getting lobotomies. I wondered if I could actually document such a case.

 

I went back to Walter Freeman's original textbooks on his procedure. He wrote two editions of his textbook and they [describe] the procedure. He [included] long lists of patients that he did the procedure on by number. He would give them a case number and then he'd just talk about them a little bit. I quickly realized that he didn't particularly use this procedure for certain kinds of mental illness. He thought lobotomy might be good for almost any mental illness, so he did it on all kinds of patients. He was doing it for schizophrenia and for depression. He was doing it on mania, he was doing it on dementia.  In essence anybody that misbehaved he thought was a prime candidate for a frontal lobotomy.

 

I looked in his index of one of his volumes of his books to see if he listed Huntington's disease. And sure enough, he did. So, I found a case in his records of a patient with Huntington's disease that he had done a frontal lobotomy on. In my mind, that confirmed in fact that this was being done on people with Huntington's disease. I have no idea how many, but knowing that he saw nothing wrong with doing it on Huntington's and knowing that he showed hundreds of doctors how to do it, and knowing that it was fairly commonly done all over the country, my guess is that probably at least a hundred people with Huntington's had that procedure done and maybe even more.

 

Robin Lindley: And Freeman's lobotomy was such a crude procedure, done with an icepick and a hammer.

 

Dr. Thomas Bird: Yes. It was very crude and it was not controlled and it was not done in any careful scientific manner. And Freeman was an evangelist for it and he was self-promoting both himself and his procedure. It definitely was out of control. I don't want to emphasize it, but I think it is part of the story of what happened to people who had this disease.

 

Robin Lindley: And you discuss the role of the asylum movement in the history of Huntington’s.

 

Dr. Thomas Bird: Yes. Asylums were built for that kind of person. Actually, the asylum movement was a positive, compassionate approach to help the community and to help the patients with severe mental illness. If you look at the people who were promoting asylums in the 19th century, they were trying to help by doing two things. Number one, they were trying to treat these very sad people who were very difficult to help. They also were clearly trying to remove these people from society so that they would be separated out. They thought that made society safer and it made the patients safer. The problem was, once you put them away, nobody paid much attention to them. They could be abused and no one would know it. There was no mechanism for getting them back into society once they disappeared. And nobody wanted to pay a lot of money to take care of them.

 

And that of course is still a problem today. It's very expensive to take care of people in institutions. And by and large, the states and the communities don't want to put a lot of money into it. They complain bitterly about both the patients with these diseases and the institutions, but they don't want to fund them to a level that that will actually be effective.

 

Robin Lindley: Terrible problems developed with deinstitutionalization, by the 1970s, I believe, and many patients who were discharged from institutions wound up on the streets without support.

 

Dr. Thomas Bird: Yes. And Huntington's is one of many diseases where they commonly put patients into state institutions. And when the deinstitutionalization happened, they were put back out in the community, but nobody was paying much attention and they didn't really get the care that they ought to have. And today even, there are people with Huntington's who are homeless and are not getting good care because they seem to misbehave and they have no financial resources. They have what's called denial. They don't think there's anything wrong with themselves so they often refuse treatment and they refuse to take medications and they flounder in society.

 

Robin Lindley: The issue of lack of insight in the disease seems prominent.

 

Dr. Thomas Bird: It's very common. It was called denial, but they're not really consciously denying it. They really are not aware of their disability and lack of insight is a good way to put it. It’s lack of awareness. They're not aware of their behavioral abnormalities and they're not aware of their incoordination and involuntary movements, so they don't think they need help.

 

Robin Lindley: As your book demonstrates, you're a master storyteller. Can you tell me about your interest in writing and telling stories? Is it a matter of course for you as a physician to write about your cases?

 

Dr. Thomas Bird: I see patients in my mind. I see patients as human stories. I always talk to my patients. I always find out from my patients, who are they? What kind of work did they do? What was their occupation? Where did they live? Where were they born? Where did they go to school? Did they play sports? Did they have hobbies? Did they have any kind of talents? Who were their parents? Who were their brothers and sisters? What has their life been like? What got them to the office today? I always think of my patients that way. They're all human stories.

 

Robin Lindley: Did you keep case notes with those kinds of detailed descriptions?

 

Dr. Thomas Bird:   In my clinic notes I always dictated the background of the patient. Where they were born, where they had lived, where they went to school, what their occupation was. I always thought that was part of their story and I thought of people that way. 

 

And of course, the Huntington’s people would often have these problems with their occupations and with their marriages and with their families and with their behavior. They often were doing surprising things that you didn't expect and that would become part of their story. And sometimes they were recurrent and, month after month, I would see them and they were always having one problem after another. And some of them you couldn't forget because their problems were so complicated and sometimes so strange and sometimes so unusual and sometimes so difficult to deal with that you just couldn't forget them. 

 

In fact, when I retired, I couldn't get these people out of my mind. I had seen dozens and dozens of them that I couldn't forget and I kept thinking about them. I thought one way to help me deal with that would be to get it down on paper. That's when I started writing their stories.

 

Robin Lindley: You vividly and compassionately describe your patients and share dozens of fascinating stories about them. You describe a man who had a compulsion to steal, and that seemed a part of his disease. He was in prison when he wrote you for help. Was his compulsion related to Huntington’s?

 

Dr. Thomas Bird: That individual's story generated the name of my book. I got a totally unexpected letter in the mail. It was a handwritten letter and it began "Dr. Bird, Can you help me?" It turned out, as I read the letter, that it was from a prisoner in the state penitentiary in Walla Walla. He knew he had Huntington's chorea because his mother had died with it. He didn't say why he was in prison, but he asked if we had a clinic that followed people with Huntington's chorea. I wrote back to him and said that we did, and I just put it aside. I didn't think I'd ever hear from him again because I knew he was in prison and you don't go to the Walla Walla state penitentiary for minor crimes. I figured he would probably be there for decades.

 

And then a few months later, he turned up in my clinic. I was actually a bit surprised. It turns out that the prison then was badly overcrowded and he had complained about having this disease, which was obvious because of his movements. The warden had told him that, if he could document that somebody would follow him for his disease on the outside, they would release him. So as soon as he got my letter, he showed it to the warden and they released him from prison on parole. 

 

When he was out on parole, I saw him and followed him. I got a call one day from his parole officer who said he'd stolen a sweater from Nordstrom's. And the parole officer said,  “He's on parole and, if I report him for that, he'll go right back to prison.” And he said, “I like the guy and I don't want to send him back, and I know the pen is overcrowded anyway, so I'm going to let this go, but would you please tell him to stop stealing things?” And so, the next time I saw him, I did. I said, you know, don't do that or you'll end up back in jail. And he said he'd take that under consideration, but added that he couldn't help it. That was the way he put it. I think that's why he was in prison originally, because he had burglarized places over and over again.

 

A few months later, I got a call from the parole officer who said, “Sorry to tell you this, but he's back in the Walla Walla pen.” I asked, “Why?” And he said, “He burglarized a home, and the mistake he made was that he burglarized the home of a very wealthy, well known person.” It turned out to be the home of the owner of the famous Seattle restaurant, Rossellini's 410. And he was the brother of a former governor of the state. So it was a very prominent family. When they found out that the guy who had burglarized their house was on parole, they said he's got to go back. And so he did.

 

I talk in the book about whether his repetitive stealing had anything to do with his disease. That's sort of a leap. What's the proof that the two are related? Maybe he was just a burglar who happened to have Huntington's disease. But, if you look at the literature on Huntington's, it's very clear that one of the themes of the behavior problem can be obsessive compulsive illness. They can do things over and over again that they don't have any control over. They can become cigarette smokers. They can become alcoholics. They can become gamblers. They tend to be obsessive about a lot of things. Not always, but frequently. I think that's part of the disease. I suspect that this guy was a compulsive stealer because of his brain disease. I can't prove it, but I think it's very likely.

 

Robin Lindley: Doesn't this get into the neuroscience of addiction?

 

Dr. Thomas Bird: It gets into the organic brain foundation of mental illness. There’s this tendency to classify diseases as organic biologic diseases or mental diseases, and say they're not the same. If somebody has a mental illness, that's not like having cancer or diabetes. That's somehow different. But when you try and think that through, where does behavior come from? And that comes from their brain, right? It's not magic. Your language comes from your brain. Your vision comes from your brain. Your speech comes from your brain. Your walking, your talking, and your thinking come from your brain. So, doesn't mental illness, if we assume that there is mental illness, and I do, doesn't that come from your brain? And if you say that schizophrenia is a mental illness or manic-depressive or bipolar disease is a mental illness, doesn't that imply that it's a brain disease? So, if somebody is in a state institution or in a prison for abnormal behavior, how much of that is because their brains are not functioning properly, and is the right approach to that to just throw him in a cell and ignore him?

 

I think as a community we dropped the ball when we tried to deal with people with severe, difficult to control behavior. And I think we need to recognize at least some of that is being driven by abnormalities of the brain. Of course, environmental things are playing a role too. Your diet plays a role. Your parents play a role. Your peers play a role. Your occupation plays a role. Head trauma plays a role. All of those things are involved.

 

Robin Lindley: Are there a couple of other striking cases you'd like to mention?

 

Dr. Thomas Bird: I think people often don't realize that Huntington's has a juvenile piece. I have a case in the book that I call the "Princess in Pink" about this little girl who was in elementary school. She was a good student. She played soccer and kickball and she got along fine. She was just a really cute, nice kid. Then she started to have trouble. She couldn't run around as well anymore. And then she couldn't keep up with her peers in classwork, in reading and writing and arithmetic. She fell behind. And she was living with her grandmother because her mother died with Huntington's disease. Her family was aware of juvenile Huntington's, and they wondered, is that possible? Is our little girl developing juvenile Huntington's disease?

 

And it turned out to be exactly the case. When she was seven or eight years old, she actually began to deteriorate because she had developed this progressive disease. The nice thing about the story is that her teacher realized what was going on and she was particularly outgoing and kind to her and brought her classmates into her social welfare. And so, her classmates realized that she had a disease, that she was getting worse, but she was still this really nice girl that they'd known for several years already. And so, the teacher and the classmates formed this very effective safety net and support group for this little girl. And they wrote a class book about her called "Princess in Pink.”  It's really a very compelling story.

 

Robin Lindley: You included much of the text of their lovely book for this girl, and it's very moving.

 

Dr. Thomas Bird: Yes, it was a lovely class project. An awful lot of it had to do with the teacher, Ms. Perry, for whom I have tremendous regard. She stayed the girl's teacher for three years in a row. They developed a really good relationship and the students were really kind to the girl. She did very well for several years and stayed in school, but eventually became quite disabled and died with the disease, I think when she was 14. Her teacher and some of her students went to her funeral. So it's a sad disease, but it's a very nice story about what loving care she got from her social community.

 

Robin Lindley: That’s a touching chapter of your book. You also have the case of a man with Huntington’s who shot his roommate and he didn't know why.

 

Dr. Thomas Bird: I think that's another one of those lack of awareness kinds of things. He was just watching TV and he had a handgun and he pulled out the handgun and shot and killed his roommate and he didn't know why. He had Huntington's disease and he eventually went to prison.

 

And then I had the opposite case, where a man with Huntington's disease was killed by his roommate.  That's a striking example of how vulnerable people with Huntington's are. I think he got in with the wrong crowd. He had no idea what a miserable guy he got attached to as a roommate.  The roommate just decided to kill his friend who was obviously disabled with this disease. My guess is [the roommate] probably robbed him. So, on the one hand, a guy with Huntington's committed homicide, but on the other hand, a guy with Huntington's was very vulnerable and he was a victim of homicide. So, it can go either way.

 

So people with HD become very vulnerable. They can't take care of themselves and it's obvious to the community that there's something wrong with them either because of their behavior or their movements. And their judgment is very poor, so they can't figure out who's a good colleague and who's not a good colleague. And so, they often are abused by other people in the community because they're seen as disabled and vulnerable.

 

One of my favorite stories is about the young man who kept all his money in his shoe. He was homeless and he kept getting arrested. When he was in jail, he took the money out of his shoe and his cellmate noticed the money. His cellmate told him that he was a financial advisor and that, when they got out, if he turned his money over, he would invest it for him and make him a pile of money. And this guy with Huntington's was getting a VA pension, so he was getting monthly money. He met this "financial advisor" and started giving him all his money. He got the VA pension and of course his former cellmate was just stealing the money from it, and he ended up with nothing. Again, that shows how vulnerable he was.

 

Robin Lindley: You mention a tendency of Huntington patients to suffer head injuries. 

 

Dr. Thomas Bird: Yes. I also have a picture in the book of an MRI of a person with a subdural hematoma. When they fall or hit their head, they bleed into the area between their brain and their skull. People with Huntington's are also vulnerable to mild head trauma and tend to get these subdural hematomas. There's a story in the book about a woman who fell down the stairs and she eventually died of subdural hematomas. We just recently had a man with HD who died last month with subdural hematomas. Falls are bad news with this disease. It's a real problem.

 

We think what happens is that, with Huntington's, the brain tends to shrink from the degeneration so this space between the brain and the skull gets enlarged. There's more space there and that puts stress on the veins. And when a patient just hits his or her head against the wall or has what we would think of as mild trauma, the brain bleeds and there's a hemorrhage into the space between their skull and the brain. They're more vulnerable to that because their brain is shrinking.

 

Robin Lindley: Thanks for explaining that brain anomaly. I thought the head injuries were from movement problems and falling.

 

Dr. Thomas Bird: Yes, there is that. They hit their heads more often because they are falling and bumping into things. But the kind of bump that wouldn't bother us can be very serious for them.

 

Robin Lindley: And there’s the vexing problem of suicide.

 

Dr. Thomas Bird: Yes. It's not so surprising, particularly for those who have awareness of their disease and especially if they've seen some other family member go through the full brunt of the disease. They don't want that to happen to them. So they can become very depressed and suicidal.

 

Robin Lindley: How can you treat or otherwise help these Huntington’s patients?

 

Dr. Thomas Bird: There are things you can do to help people and improve their lives. If you look at that context, the thing that helps the most is the helping community that you put around those people. If their families help them, their friends help them, their doctors help them, their nurses help them, their social workers help them, they do better. So what you need is a team that's focused on helping these people live their lives as best they can. And that's what helps them the most.

 

If they have certain symptoms, sometimes there are treatments for those symptoms. So if somebody is depressed, you can treat them with an antidepressant or you can do talk therapy and try to help them that way. There are some things that improve the movements. Some drugs slow down the movements. Of course, they have side effects, and sometimes it's a tradeoff. You slow down the movements, but you develop side effects. The same if they have delusions or psychotic behavior. There are drugs that are antipsychotics, and that sometimes improves that behavior. If they have severe anxiety, there are things that can improve anxiety.

 

So, there are things you can do to help people with Huntington's, but you don't cure the disease.

 

Getting back to it being a fatal disease, that brings up the issue of when it develops and the fact that it has this huge range of onset. It usually develops in the thirties or forties--those are the typical ages when you get it. And you may live for decades--three, four, five decades with no symptoms at all, and then develop the disease, and then live for 15 or 18 years. But there is a juvenile variety where children develop the disease. And there's a late onset variety where people don't develop it until they're in their seventies. So, if you develop it when you're 70 and 10 years later, you die of cancer, it wasn't really a fatal disease, right? So sometimes it is not as severe as it seems. As I say in my book, I've seen children who've had this disease when they were in an elementary school and I've also seen people who were in even their early nineties with the disease.  The age range is surprisingly large.

 

Robin Lindley: You describe how the health care system often fails people with Huntington's, and they don't get the care they need. What changes would you like to see with our health care system?

 

Dr. Thomas Bird: So, people without financial and social resources need more help than they can provide for themselves. If we really want to care for them compassionately, we've got to provide some resources for them. And that depends on what their problems are. If it's a medical problem, they need doctors and nurses and medication. If it's a social problem, they need housing, they need an occupation, they need social workers, they need an appropriate diet. So, I think we need to put more resources into caring for people who, through no fault of their own, don't have those resources.

 

For people with progressive mental illness, particularly those that have lack of awareness, it's hard to care for them because they don't think there's anything wrong with them. But we have to be careful not to compromise their autonomy. People in our country should know that they're free to behave in a wide range of ways as long as they don't hurt other people. But if they're injuring themselves or injuring people around them in some way, we have to try and get better care for them.

 

And if we can't put them in a state institution, then we need facilities in the community that are locally easily available that can care for them. And that's not easy if they don't want much care. But I think we need to provide as much as we can locally. For some people that means they need a controlled environment, at least for some period of time. It might be a week or a month or a couple of months, but there are some people that, for their own safety and health and for the safety and health of the community, need to be in a controlled environment for a while.

 

Robin Lindley: Wasn't the hope of those who advocated for deinstitutionalization that the mentally ill would have alternatives to institutions in their communities? 

 

Dr. Thomas Bird: Unfortunately, too much emphasis was placed on saving money. What everybody saw was that deinstitutionalization would save money. They could close state hospitals or they could reduce their size and that would save millions of dollars. There was also this idea that new medications had been discovered that would so effectively treat these diseases that patients would not need close monitoring. You would just give them a pill and they would be fine. That idea turned out to be terribly naive.

 

Not enough resources were put into local facilities. There were some, but it wasn't anywhere near what was needed to really care for these people. And not only do you need a physical facility and not only do you need medications, but you need professionals who can care for patients and follow them and monitor them. That means doctors and nurses and medical staff and social workers who are trained and dedicated to care for these people who are difficult to care for. They're not simple. And so those professionals are expensive. The fact that you close down a state mental hospital doesn't mean that the adequate care of these people is going to be cheaper.

 

Robin Lindley: What are we learning from recent genetics research? Will it be possible in the future that some of the degenerative diseases like Huntington's disease can be prevented or somehow addressed with gene editing?

 

Dr. Thomas Bird: Yes. That's the hope right now. There's a very strong hope that the disease can be attacked from the genetic therapeutic standpoint and that there are ways to shut down the effects of the abnormal gene and basically turn it off. Whatever it's producing that's abnormal, you would stop and shut down and that would prevent the disease from progressing or from even developing. There's a lot of enthusiasm about doing that for Huntington's disease, and also all genetic diseases.

 

There have been a couple of successes in other diseases using that kind of approach. There's a disease of children called spinal muscular atrophy, a very severe condition. They've used a genetic approach to turn off that abnormal genetic mechanism and have the correct one work properly. It has been a huge benefit to these kids who otherwise would have died. There's a lot of enthusiasm for that.

 

And there is a study of Huntington's going on right now where they're using that kind of therapeutic approach and hoping that it will work. Probably in a year or two we'll know the results of that study. They're doing the same thing for genetic forms of ALS.

 

Robin Lindley: Is Alzheimer's in that category too?

 

Dr. Thomas Bird: Alzheimer's is more complicated because there are so many different causes of Alzheimer's and most Alzheimer's is not purely genetic. Huntington's is always purely genetic. Alzheimer's is usually not purely genetic, but there are some rare forms of Alzheimer's that are genetic diseases caused by a single mutation in a single gene. So in those rare kinds of Alzheimer's, that sort of approach is being considered.

 

Robin Lindley: Do you have any other comments for readers or anything to add about Huntington's or your research?

 

Dr. Thomas Bird: When I think back about Huntington's, the things I like to emphasize are that, for a medical science kind of person, it's a fascinating disease. It's also a tragic disease. And it's an important disease. Even though it's uncommon, I think it has important implications for all degenerative brain diseases and for mental illness. So I think its importance is way out of proportion to its uncommon frequency in the population. So it's fascinating. It's tragic. And it's important.

 

Robin Lindley: Thank you for your words, Dr. Bird, and congratulations on your vivid and informative new book on Huntington's disease. Your compassion and devotion to your many patients with this perplexing and cruel condition are inspiring.

]]>
Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171788 https://historynewsnetwork.org/article/171788 0
Presidential Personality and Politics

 

 

America’s founders researched and valued the lessons of history. This is evident in The Federalist Papers, the series of columns that Alexander Hamilton, James Madison, and John Jay wrote for New York City newspapers to justify the new American Constitution and appeal for its ratification.  Since copies of newspapers were commonly kept in taverns for patrons to read, it should not be assumed that these essays reached only elite circles.  

 

Are we as thoughtful as that generation?  In his latest book, Joseph J. Ellis, formerly the Ford Foundation Professor of History at Mount Holyoke College and the author of many books about that early period of American history, argues that the founding period offers many lessons for present-day America.  Yet, for all its virtues, no one will mistake this book for the equivalent in our day of The Federalist Papers.  

 

Like so many modern books on history, this book is driven by a concern for personality and politics.  Ellis argues that the approaches of Thomas Jefferson, John Adams, James Madison, and George Washington to political issues can hold lessons for modern politics. Yet the book concentrates on these men’s lives and gives little attention to the societies that surrounded them, and Ellis offers little to evaluate what each man could have done differently politically, or what mistakes they made that modern Americans should try to avoid, beyond the most general sense of learning from their flaws in character. While personality and judgment are emphasized, Ellis does not really analyze how leadership functions in our society to foster discussions of policy.  

 

For example, Ellis argues that Thomas Jefferson played an active role in trying to limit slavery, especially in the new territories but also in his native Virginia. Jefferson was rebuffed, and although he still opposed slavery, he was temperamentally uncomfortable with being a center of controversy if he could avoid it.  So he just gave up.  In his old age he opposed abolitionist schemes in the North because he believed abolitionists would seek to produce mixed-race communities, which he felt were not feasible. In his view, abolitionism would just exacerbate sectionalism.  The author claims that Jefferson’s racism illuminates present-day racism, but he does not explain whether he thinks racism is diminishing or increasing, nor does he adequately explain terms like “structural racism.”  

 

His discussions of John Adams, James Madison, and George Washington follow the same pattern, discussing their personalities more than the choices the nation faced, then as now.   

For example, in retirement, Thomas Jefferson and John Adams exchanged friendly letters that rebuilt a friendship frayed during their earlier period as political rivals. In analyzing these exchanges, Ellis emphasizes the temperamental differences between the two: the naïve optimism of Jefferson (except regarding slavery, of course) and the fearful pessimism of Adams.   

 

Regarding “Our Gilded Age,” he provides a good summary of the circumstances that led to severe maldistribution of income, not only in the United States but in places with similar economies, such as Europe.   But by limiting the intellectual discussion of how to deal with this state of affairs to a recapitulation of the debate between John Adams and Thomas Jefferson, or, in broader terms, between the Federalists, who favored a strong federal government, and the anti-Federalists, who favored a weak one, Ellis leaves out much detail about what particular actions would be useful at both the local and national levels.  Instead, the general mythology that all of American history is nothing more than a replaying of the debates between the Federalists and the anti-Federalists is reinforced. 

 

The author portrays James Madison as a skillful politician and highlights his role in organizing the Constitutional Convention and the ratifying conventions that followed. Madison changed his political alliances over time, moving away from the Federalist party and joining Thomas Jefferson when he felt that the Federalist administration had become arrogant and even abusive.  More than John Adams, and even more than Thomas Jefferson, James Madison reacted to circumstances and changed his opinions. Ellis argues that Madison’s political skills clarify what the founding fathers considered the original intent of the Constitution. What Ellis really shows is that Madison had good political instincts, while his philosophy of government remains unclear. 

 

Ellis seems to admire George Washington most of all.   He details how Washington sought to conduct foreign affairs in an honorable and reasonable fashion. For example, Washington negotiated the Treaty of New York (1790) with the Creek Nation, which would have transferred western Georgia, northern Florida, southern Tennessee, and most of Alabama to the Creek Nation.  But a flood of settlers on the Georgia frontier refused to be bound by the treaty, and the legislature of Georgia rejected it.  Nonetheless, Ellis uses this example to argue that George Washington had a temperament for honorable foreign negotiations that many politicians throughout American history have lacked.   He illustrates this by arguing that American foreign policy since the fall of the Soviet Union has been characterized by half-baked moral crusades against greatly exaggerated threats, a claim that seemingly undercuts his argument for the worthiness of our foreign policy establishment.

 

These are all interesting stories, but they contain few revelations and only very general lessons for the present. Ellis hopes that from Thomas Jefferson we can learn lessons on American racism, from John Adams lessons on economic inequality, from James Madison lessons on understanding constitutional law, and from George Washington lessons on foreign policy.  Yet mostly what we learn is that a leader should be thoughtful and gracious, and should understand the many ramifications of the problems under discussion, and that leaders tend to be limited by their own prejudices and the prejudices of their times.   Whether or not we share the values of these Founding Fathers, their policy options are not necessarily the same as our policy options.  World trade, the threat of international war, the ecological crisis, and automation in the workplace are all issues without 18th-century equivalents. Yes, this book is a start, an enjoyable start, toward guiding citizens in their future political decisions.  But those decisions require policy choices that cannot be handled merely by stories about the Founding Fathers.  Nevertheless, we can take away good lessons, especially on personal character, from the start that Ellis provides.   

]]>
Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171789 https://historynewsnetwork.org/article/171789 0
Do Those Closest to Trump Think He's Fit for Office?

 

Several times since 2016, I have criticized the performance of our current President, Donald J. Trump.  This little essay is, one might say, “a horse of a somewhat similar color.”  Like many observers in our concerned society, I have speculated on the “mental capacity” and the job performance of the man who will apparently be in the Oval Office until 2020—and, who knows, maybe longer.

So far, I have reviewed five books about Trump in the White House by individuals who have really studied “the nature of our President’s mentality.” I have gone back to see how those authors felt about the mental condition of the one who has appointed such weird and unsuitable individuals to high office. The authors are male and female, politically inexperienced and veteran, and all have received major media attention. What can we learn from these books? Let’s start.

The jacket of Omarosa’s 334-page “insider’s account” UNHINGED boasts, “Few have been a member of Donald Trump’s inner orbit longer than Manigault Newman.” An Assistant near the Oval Office, she ended her friendship with Donald with distaste. In a chapter entitled “I Think Our President Is Losing It,” she wrote openly about our President’s “level of paranoia” and found “… something real and serious was going on in Donald’s brain.  His mental decline could not be denied….  I knew something wasn’t right.” Even more, she revealed the make of three guns he proudly owned and said that at least once in the primaries he carried a gun. 

Something of a pioneer in judging Trump is Michael Wolff, whose book FIRE AND FURY created a stir.  Boasting “deep access to the West Wing,” Wolff casually writes toward the end of the book that “staffers” were concerned that Trump’s rambling and alarming repetition of the same sentences had significantly increased. Further, his ability to stay focused, never great, had noticeably declined and the staffers worried this would be noticed by the general public. Wolff also makes frequent references to “Trump’s stupidity.”

At the time of reviewing, I found this short paragraph mid-book upsetting:  

“Trump’s extemporaneous moments were always existential, but more so for his aides than for him.  He spoke obliviously and happily, believing himself to be a perfect pitch raconteur and public performer, while everyone with him held their breath. If a wackadoo moment occurred on the occasions—the frequent occasions—when his remarks careened in no clear direction, his staff had to go into intimate  method-acting response.  It took absolute discipline not to acknowledge what everyone could see.” p. 137  

Even more, Trump had “contempt for other people’s expertise that was understood by everybody in his billionaire circle.” He was openly contemptuous of both the Bush and Obama families, and he didn’t back down from his disgusting critique of the prisoner of war and later national hero John McCain. (That gross misconduct has been quite amazing!)

At more than 400 pages, Bob Woodward’s FEAR is “drawn from hundreds of hours of interviews with participants and witnesses to these events.”  The President refused to be interviewed by the famous reporter and author.  Maybe it was just as well. Woodward argues that President Trump has “anger issues” and problems making apologies, and that he is erratic, impulsive, and “an emotionally overwrought, mercurial and unpredictable leader.”  The executive power of the United States has come to experience a “nervous breakdown.” (p. xxii)   

It can be downright frightening to read over 400 pages about life in the White House and the views of aides about their powerful leader. Rational lifelong leaders who had seldom shown fear now did. “The senior White House staff and national security team were appalled,” Woodward wrote. “They didn’t know what the president might say or do.” Staff Secretary Rob Porter said, “A third of my job was trying to react to some of the really dangerous ideas that he had and try to give him reasons to believe that maybe they weren’t such good ideas.” Lawyer Dowd’s version of Trump’s characteristics (rooted in a 47-year legal career) is virtually unprintable: “he’s going to say ‘I don’t remember’ 20 times.  And I’m telling you, Bob, he doesn’t remember.” “[T]hese facts and these events are of little moment in his life.” “I told you he was a goddamn dumbbell.” 

Trump asked, if Kim Jong Un had a nuclear button on his desk “at all times,” why, “Will someone from his depleted and food starved regime please inform him that I too have a Nuclear Button, but it is a much bigger & more powerful one than his and my Button works!”  

What has happened to “patient diplomacy”?  Those of us who fearfully speculate that a North Korean missile just might get steered toward San Francisco or Honolulu are jumping up and down.  The prospect of an oddball individual in a redecorated Oval Office thinking he can casually and happily provoke someone like Kim to fire off a missile if the mood strikes, free of any consequences, is much too much. Actions DO have consequences, no?  

Which brings us to James Comey: tarred by a difficult political decision he made late in 2016, but a decent long-time leader of our FBI. To me, he has every right to place on his book’s cover words like its provocative title, “A HIGHER LOYALTY: TRUTH, LIES, AND LEADERSHIP.”  I believe him when he says in summary that D.T. insisted on his loyalty.  When Trump didn’t get that promise of loyalty in advance, he barged ahead and fired the FBI leader!  Earlier, when Comey was invited to dinner by Trump, he hardly got a word in, as the realtor/builder/golf course creator from NYC dominated the conversation for the entire meal.  Writes Comey, “None of this behavior, incidentally, was the way a leader could or should build rapport with a subordinate.” Agreement is easy.

Elsewhere, Comey resented the bizarre conditions placed on him at the time of his “release” from a lifetime with the FBI. He was 3,000 miles from his office and colleagues when fired, and he was given no opportunity to say “goodbye” to subordinates.  Reading the Comey book arouses real sympathy for the stalwart FBI leader on the way out.

This writer finds that it is beyond his capabilities to mold into this short essay any account of the renderings—little more than educated guesses—about the mind of President Trump offered so far by physicians who lack firsthand interviews/examinations of him. In his book, the psychiatrist-author Justin A. Frank at a major university expresses apprehension over the nature of our President’s mind.  (Of course, a book like that can be written about anybody.)  I do have to say here, however, that the number of times since Inauguration I have heard—or said myself—“He must be crazy” is out of the ballpark.  Donald J. Trump is “different” in so many ways:  so often, unapologetic, in public, well, yes, disgusting.

On the other hand, these profiles of Trump don’t include what many perceive as his successes. His daughter contends, “He has the heart and mind of a leader.” His TV show The Apprentice was highly successful.  He has magnified the wealth given him by his father.  He made his name in tough New York City.  His friends and acquaintances seem to include movers and shakers.  He has not been thwarted by overseas ventures.  Daunted by one, then a second, marriage, he sought and found another at “his level.” 

Medically impaired or not, Donald J. Trump is engaged in splitting and impairing the United States of America.  Something must be done about it. From his daily expression as he comes and goes in 2019, he may indeed be ailing.  Or maybe he just doesn’t enjoy the life surprisingly granted him in our White House and prefers the tropics.   And can’t help showing it?   May one hope that some remarkable change is in the offing, so that before too long “everything will somehow work its way out all right”?

Surely, we all deserve a happy ending: a rational federal government in all three branches, and a respected place as an organized people, living contentedly in this ever-changing world of ours.  If Donald J. Trump, our President, can’t or won’t be part of that somewhat idyllic system and help meet our needs, something will have to be done about it.

]]>
Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171768 https://historynewsnetwork.org/article/171768 0
Roundup Top 10!  

When Slaveowners Got Reparations

by Tera W. Hunter

Lincoln signed a bill in 1862 that paid up to $300 for every enslaved person freed.

 

Why you don’t need to be French or Catholic to mourn the Notre Dame fire

by Kisha G. Tracy

The cathedral is an important part of our shared cultural heritage.

 

 

How historians got Nike to pull an ad campaign — in under six hours

by Megan Kate Nelson

The multinational corporation dropped its “Lost Cause” ads after historians pushed back.

 

 

Immigration, Race, and Women’s Rights, 1919 and Today

by Arnold R. Isaacs

The comparison couldn’t, in many ways, be grimmer or more telling.

 

 

How California is dumbing down our democracy

by Max Boot

It is a matter of national concern that the California State University (CSU) system is on the verge of further diluting its already inadequate history and government requirements.

 

 

Elizabeth Warren’s historically sound case against the filibuster

by Julian Zelizer

The Senate rule has long been used as a weapon against civil rights and other progressive legislation.

 

 

The return of ‘reefer madness’

by Emily Dufton and Lucas Richert

Both supporters and opponents of legalization are quick to use sensationalism to prove their points, stunting the pursuit of real research needed to determine cannabis’ social effects.

 

 

Why Democratic Presidential Candidates Should Make Climate Change Their #1 Issue

by Walter G. Moss

Nothing else—including medical care, the economy, income inequality, immigration, racism, or the gender or race of a candidate—is more important.

 

 

Why Trump Won’t Stop Talking About Ilhan Omar

by Jamelle Bouie

The president is following a Republican playbook that is now nearly two decades old.

 

 

Join my Nato or watch critical thinking die

by Niall Ferguson

A new red army is out to silence debate. We must rise up and resist it.

 

 

Niall Ferguson isn’t upset about free speech. He’s upset about being challenged

by Dawn Foster

Powerful people used to express their views on others unopposed. Now their targets fight back, they find it intolerable.

 

]]>
Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171779 https://historynewsnetwork.org/article/171779 0
American Jews Versus Israeli Politics Steve Hochstadt teaches at Illinois College and blogs for HNN.

 

Knesset chamber

 

 

Benjamin Netanyahu just won a record fifth term as Prime Minister of Israel. He has dominated Israeli politics for ten years. His reelection shows the widening gap between the ideas and politics of American and Israeli Jews.

 

The Israeli Attorney General announced at the end of February that Netanyahu will be indicted for bribery and fraud. Just days before the election, Netanyahu said that Israel would annex Jewish settlements on land in the West Bank taken in the Arab-Israeli War of 1967. About 400,000 Israelis live in West Bank settlements. He said, “I will impose sovereignty, but I will not distinguish between settlement blocs and isolated settlements. From my perspective, any point of settlement is Israeli, and we have responsibility, as the Israeli government. I will not uproot anyone, and I will not transfer sovereignty to the Palestinians.”

 

Netanyahu’s electoral opponents were a new coalition of centrist and conservative Israeli politicians. Thus the choice for voters was between a continued hard line against Palestinians and Netanyahu’s even harder line. His victory demonstrates the preference of Israeli voters for an ethically dubious politician who offers no path toward peace with the Palestinians, only continued seizure of formerly Arab land.

 

In 2009, Netanyahu made the following programmatic statement about the most pressing issue in the Middle East: “I told President Obama in Washington, if we get a guarantee of demilitarization, and if the Palestinians recognize Israel as the Jewish state, we are ready to agree to a real peace agreement, a demilitarized Palestinian state side by side with the Jewish state.” Since then he has gradually been moving away from this so-called two-state solution. In 2015, he employed harsh anti-Arab rhetoric during the last days of the election campaign, for which he apologized after winning. He seemed to move away from support of the two-state idea, but said after the election that this idea was still viable.

 

The election of Donald Trump pushed Israeli politics further right. Although Trump repeatedly claimed to have a bold plan to create a peace settlement between Israelis and Palestinians, in fact, he has openly supported Netanyahu’s movement away from any possible settlement. A year ago, Trump announced that the US officially recognized Jerusalem as the capital of Israel. Trump announced last month that the US recognizes Israeli sovereignty over the Golan Heights, seized from Syria during the 1967 war. Netanyahu used giant billboards showing him shaking hands with Trump.

 

To support his election bid this time, Netanyahu offered a deal to the most radical anti-Arab Israeli parties, which had thus far failed to win enough votes to be represented in the parliament, the Knesset. He orchestrated the merger of three far-right parties into one bloc, the “Union of Right-Wing Parties”, and promised them two cabinet posts if he won. One of those parties, Jewish Power, advocates the segregation of Jews and Arabs, who make up 20% of Israelis, and economic incentives to rid Israel of its Arab citizens. Jewish Power holds annual memorials for Baruch Goldstein, who murdered 29 Muslims at prayer in 1994. Imagine an American politician allying with a party which celebrates the murderous accomplishments of Dylann Roof.

 

Netanyahu recently said, “Israel is not a state of all its citizens,” but rather “the nation-state of the Jewish people alone.” That makes a “one-state solution” impossible, because non-Jews would automatically be second-class citizens. Netanyahu’s victory shows that the creation of a Palestinian state is less and less likely, as the land for such a state is increasingly seized by Israel.

 

While most Israelis also say they support a two-state solution, their real politics makes this support meaningless. A poll of Israelis in 2017 showed Jews leaning heavily to the right and extreme right. A more recent poll showed greatly increasing support for annexation: 16% support full annexation of the West Bank with no rights for Palestinians; 11% support annexation with rights for Palestinians; 15% support annexation of only the part of the West Bank that Israel currently fully controls, about 60% of it. About 30% don’t know and 28% oppose annexation.

 

Meanwhile, the uprooting of Arabs and confiscation of their land continue as Jewish settlements expand. While the West Bank is highlighted in the news, the Israeli policy of expelling native Arabs from their homes has also been taking place for decades in the Negev desert in southern Israel. Bedouin communities, many of which predate the founding of the Israeli state, have been systematically uprooted as part of an Israeli plan of concentrating all Bedouins into a few towns, in order to use their land for Jewish settlements and planned forests. The Bedouin communities are “unrecognized”, meaning that the Israeli government considers them illegal. Illegal Jewish settlements in that region have been recognized and supported, while much older Bedouin communities have been labeled illegal and demolished or slated for demolition. Essential services, like water and electricity, have been denied to the agricultural Bedouin villages in order to force their citizens to move to the new urban townships.

 

American Jews are overwhelmingly liberal. Polls since 2010 show over two-thirds supporting Democrats for Congress, rising to 76% in 2018. This long-standing liberalism meant broad support among American Jews for the civil rights struggle during the 20th century. Now the open discrimination against Arabs by the Israeli state, which in some ways resembles the former South African apartheid system, reduces sympathy for Israel.

 

Surveys of American Jews have demonstrated a consistent support for a two-state solution. Since 2008, about 80% of American Jews support the creation of a Palestinian state in Gaza and the West Bank. 80% also agree that a “two-state solution is an important national security interest for the United States.” Many factors have been moving American Jews away from support of Israel. The close family connections between Jews in America and Israel after World War II have diminished over the past half-century. The continued dominance of Israeli politics by ultra-Orthodox religious policies has worn out the patience of more secular American Jews in Conservative and Reform congregations.

 

In fact, the greatest support for hard-line Israeli policies has not been from American Jews, as Ilhan Omar recently implied, but from evangelical Christians who support Trump. After Netanyahu talked about annexing West Bank land, nine major mainstream American Jewish groups wrote to Trump asking him to restrain the Israeli government from annexation, saying that “it will lead to greater conflict between Israelis and Palestinians.”

 

The drifting apart of American Jews and Israelis is a tragic development, but perhaps an inevitable one. As Jews gradually assimilated into American democracy, they congregated at the liberal end of the political spectrum, feeling kinship with other minorities which experienced discrimination. American Jewish religious politics affirmed the traditional Jewish ethical ideas of justice, truth, peace, and compassion. Israeli Jews have faced a radically different environment. Although many of the early Israeli settlers and leaders came from the leftist European labor tradition, decades of conflict with Arab neighbors, in which both sides perpetrated countless atrocities, have led to hardening attitudes of self-defense and hatred for the other.

 

Jews in Israel support politicians and policies that I reject as abhorrent. That is a personal tragedy for me. The larger tragedy is that there appears to be no solution at all to the Israeli-Palestinian conflict.

]]>
Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/blog/154204 https://historynewsnetwork.org/blog/154204 0
I Stuck with Nixon. Here’s Why Science Says I Did It.

Richard Nixon surrenders to reality and resigns, August 9, 1974

Rick Shenkman is the former publisher of the History News Network and the author of Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books, January 2016). You can follow him on Twitter. He blogs at stoneagebrain. This article was first published by the  Daily Beast.

Will Donald Trump’s supporters ever turn on him? I think I know the answer. It’s partly because I’ve been in their place.

During Watergate I was a die-hard Nixon dead-ender. I stuck with him after the Saturday Night Massacre in the fall of 1973 and the indictments of Nixon aides H.R. Haldeman and John Ehrlichman in 1974. Not until two months before Nixon resigned did I finally decide enough’s enough.

What was wrong with me? I’ve been haunted by that question for decades. 

I can clear up one thing immediately. I didn’t support Nixon out of ignorance. I was a history major at Vassar during Watergate and eagerly followed the news. I knew exactly what he’d been accused of.

The fact is the facts alone didn’t matter because I’d already made up my mind about him. My fellow Vassar students—all liberals, of course—pressed me to recant. But the more they did, the more feverish I became in my defense. I didn’t want to admit I was wrong (who does?) so I dreamed up reasons to show I wasn’t—a classic example of cognitive dissonance in action. 

A pioneering study by social psychologist Elliot Aronson conducted in the 1950s helps explain my mental gymnastics. Young college women invited to attend a risqué discussion of sexuality were divided into two groups. One group was put through a preliminary ritual in which they had to read aloud a list of words like “prostitute,” “virgin,” and “petting.” The other group had to say out loud a dozen obscenities including the word “fuck.” Afterwards, the members of both groups were required to attend a discussion on sex, which is what had been the draw. But it turned out they had all been duped. The discussion wasn’t risqué. The subject turned out to be lower-order animal sexuality. Worse, the people leading the discussion spoke in a monotone voice so low it was hard to follow what they were saying. 

Following the exercise the students were asked to comment on what they had been through. You might expect the students who went through the embarrassing rite of speaking obscenities to complain the loudest about the ordeal. But that isn’t what happened. Rather, they were more likely to speak positively about the experience.

The theory of cognitive dissonance explains why. While all of the subjects in the experiment felt unease at being duped, those for whom the experience was truly onerous felt a more compelling need to explain away their decision to take part. The solution was to reimagine what had happened. By rewriting history they could tell themselves that what had appeared to be a bad experience was actually a good one. Dissonance begone.

This is what I did each time one of my Vassar friends pointed to facts that showed Nixon was lying. 

Neuroscience experiments in the 21st century by Drew Westen show what happens in our brain when we confront information at odds with our commitments. In one study, supporters of President George W. Bush were given information that suggested he had been guilty of hypocrisy. Instead of grappling with the contradiction they ignored it. Most disturbing of all, this happened out of conscious awareness. MRI pictures showed that when they learned of Bush’s hypocrisy, their brains automatically shut off the “spigot of unpleasant emotion.” (It’s not a uniquely Republican trait; the same thing happened with supporters of John Kerry.) 

In short, human beings want to be right and we want our team to win. But we knew all that, right? Anybody who’s taken a Psych 101 class knows about confirmation bias: that humans seek out information that substantiates what they already believe; and bounded rationality: that human reason is limited to the information sources to which we are exposed; and motivated reasoning: that humans have a hard time being objective. 

But knowing all this isn’t enough to understand why Trump voters are sticking with Trump.

What’s required instead is a comprehensive way to think about the stubbornness of public opinion and when it changes. Until a few decades ago no one had much of a clue what a comprehensive approach might look like. All people had to go on was speculation. Then scientists operating in three different realms — social psychology, neuroscience, and political science — began to delve into the workings of the human brain. What they wanted to know was how we learn. The answer, most agreed, was that the brain works on a dual-process system, a finding popularized by Daniel Kahneman, the Nobel prize-winning Princeton psychologist, in the book Thinking, Fast and Slow.

One track, which came to be known as System 1, is super-fast and happens out of conscious awareness, the thinking you do without thinking.

There are two components to System 1 thinking. One involves what popularly is thought of as our animal instincts, or what social scientists refer to, with more precision, as evolved psychological mechanisms. Example: the universal human fear of snakes. The other involves ways of thinking shaped by habit. The more you perform a certain task, the more familiar it becomes and the better you get at it without having to think about it.

Donald Trump likes to say that he goes with his gut. What he’s saying, likely without knowing it, is that he has confidence in his System 1. This is not exceptional. Most of us trust our instincts most of the time. What distinguishes Trump is that he seems to privilege instinct over reason nearly all of the time.

The second track, System 2, is slower and allows for reflection. This mode, which involves higher-order cognitive thinking, kicks in automatically when our brain’s surveillance system detects a novel situation for which we aren’t prepared by experience. At that moment we shift from unconscious reaction to conscious thinking. It is System 2 that we rely on when mulling over a difficult question involving multiple variables. Because our brain is in a sense lazy, as Kahneman notes, and System 2 thinking is hard, our default is System 1 thinking.

One thing that’s worth noting about System 1 thinking is that our brains are essentially conservative. While humans are naturally curious about the world and we are constantly growing our knowledge by, in effect, adding books to the shelves that exist in our mind’s library, only reluctantly do we decide to expand the library by adding a new shelf. And only very rarely do we think to change the system by which we organize the books on those shelves. Once we settle on the equivalent of the Dewey Decimal System in our mind, it’s very hard to switch to another system. This is one of the main reasons why people are almost always reluctant to embrace change. It’s why inertia wins out time and time again.

But change we do, thanks to System 2. But what exactly triggers System 2 when it’s our politics that are on the line? Social scientists finally came up with a convincing explanation when they began studying the effect of emotion on political decision-making in the 1980s.

One of the pioneers in this research is George Marcus. When Marcus was starting out as a political scientist at Williams College, he began to argue that the profession should focus more on emotion, something it had never done, mainly because emotion is hard to quantify and political scientists like to count things. When Marcus began writing papers about emotion, he could not find editors who would publish them. 

But it turned out his timing was perfect. Just as he was beginning to focus on emotion so were neuroscientists like Antonio Damasio. What the neuroscientists were learning was that the ancient belief that emotion is the enemy of reason is all wrong. Rather, emotion is the handmaiden of reason. What Damasio discovered was that patients with a damaged amygdala, the seat of many emotions, could not make decisions. He concluded: The “absence of emotion appears to be at least as pernicious for rationality as excessive emotion.” 

If emotion is critical to reason, the obvious question became: which emotion triggers fresh thinking? Eventually Marcus and a handful of other political scientists who shared his assumption that emotion is important to decision making became convinced that the one that triggers reappraisals is anxiety. Why anxiety? Because it turned out that when people realize that the picture of the world in their brain doesn’t match the world as it actually exists, their amygdala registers a strong reaction. This is felt in the body as anxiety.

Eventually, Marcus and his colleagues came up with a theory that helps us understand when people change their minds. It became known as the Theory of Affective Intelligence (later: the Theory of Affective Agency). The theory is straightforward: The more anxiety we feel the more likely we are to reconsider our beliefs. We actually change our beliefs when, as Marcus phrases it, the burden of hanging onto an opinion becomes greater than the cost of changing it. Experiments show that when people grow anxious they suddenly become open to new information. They follow hyperlinks promising fresh takes and they think about the new facts they encounter.

How does this help us understand Trump supporters? It doesn’t, if you accept the endless assertions that Trump voters are gripped by fear and economic anxiety. In that case, they should be particularly open to change. And yet they’re as stuck on Trump as I was on Nixon.

The problem isn’t with the theory. It’s with the fear and anxiety diagnosis. 

Humans can hold multiple feelings at odds with one another simultaneously, but research shows that only one emotion is likely to affect their politics. The dominant emotion characterizing so-called populist voters like those attracted to Trump is anger, not fear. This has been found in studies of populists in France, Spain, Germany, and Britain, as well as the United States. 

If the researchers are right that populists are mostly angry, not anxious, their remarkable stubbornness immediately becomes explicable. One of the findings of social scientists who study anger is that it makes people close-minded. After reading an article that expresses a view contrary to their own, people decline to follow links to find out more information. The angrier you become, the less likely you are to welcome alternative points of view. 

That’s a powerful motive for ignoring Trump’s thousands of naked lies.

Why did I finally abandon Nixon? For months and months I had been angry over Watergate. Not angry at Nixon, as you might imagine, but angry at the liberals for beating up on him. Nixon fed this anger with repeated attacks on the people he perceived as his enemies. As long as I shared his anger I wasn’t prepared to reconsider my commitment to his cause. 

But eventually there came a point when I stopped being angry and became anxious. 

I would guess that what happened is that over time Nixon’s attacks came to seem shopworn and thin. Defending him became more of a burden than the cost of abandoning him.

If I am right about the circuitous path I took from Nixon supporter to Nixon-basher, there’s hope that Trump supporters will have their own Road to Damascus epiphany. Like me, they may finally tire of anger, though who knows. Right-wing talk radio and Fox News have been peddling anger for years and the audience still loves it.

It took me 711 days from the time of the Watergate burglary to my break with Nixon, when I resigned from a committee defending him, to come to my senses. As this is published, it has been 812 days since Trump became president. And there’s little indication that Trump voters have reached an inflection point.

Any of a number of disclosures could disillusion a substantial number of them. We have yet to read the full Mueller report. Nor have we yet seen Trump’s tax returns, which might prove politically fatal if they show he isn’t really a billionaire or if they prove his companies depended on Russian money. (As Mitt Romney suggested, the returns likely contain a bombshell.) 

If such disclosures suggest to his supporters that they were chumps to believe in him, his popularity no doubt would begin eroding. And already there’s evidence his support has weakened. In January, 51 percent of GOP or GOP-leaning voters said they considered themselves more a supporter of Donald Trump than of the Republican Party.  Two months later the number had declined to 43 percent. If this slippage is because more supporters are embarrassed to come out as full-blown Trumpies, he may be in trouble come election day.

In the end, politics is always about the voters. Until now, Trump has made his voters by and large feel good about themselves by validating their anger. But there remains the possibility that in the coming months disclosures may make them feel that they have been conned, severely testing their loyalty. If the anger they feel either wears off or is redirected at Trump himself their amygdala should send them a signal indicating discomfort with the mismatch between the known facts and their own commitments.

This presupposes that they can get outside the Fox News and conservative talk bubble so many have been living inside. Who knows if they will. It is worth remembering that even in Nixon’s day, millions remained wedded to his lost cause even after the release of the smoking-gun tape. On the day he resigned, August 9, 1974, 50 percent of Republicans still supported him even as his general approval dropped to 24 percent.

To sum up: Facts finally count if enough loyalists can get past their anger to see the facts for what they are. But people have to be exposed to the facts for this to occur. And we can’t be sure that this time they will be.

 

 

]]>
Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/blog/154203 https://historynewsnetwork.org/blog/154203 0
The Sorrow of Watching Notre Dame Burn

 

On the 20th day of Brumaire during the Second Year of the revolutionary order, a spectacle was held inside the newly consecrated Temple of Reason. Upon the altar of what had once been the magnificent cathedral Notre-Dame de Paris, at the very heart of the greatest city in Christendom, the religious statues were stripped away (some decapitated like the heads of the overthrown order) and the whole building was turned over to a festival for what the most radical of Jacobins called the “Cult of Reason.” In the hope that this atheistic faith-of-no-faith would become the state-sponsored religion of the new regime, the revolutionaries staged their own observance, with young girls in tri-colored sashes performing a type of Morris dance about a statue of the Goddess Reason.  Such was just another occurrence among the competing factions of the Revolution, which saw the dechristianization of France, including not just the iconoclasm of smashed stained glass and whitewashed images but the execution of perhaps 30,000 priests. Less than a decade later, Mass would once again be celebrated upon Notre-Dame’s altar. 

Within the shadow of its spire – which as of today no longer stands – the great Renaissance essayist Montaigne would have walked. By the massive rose window which filtered natural light into the ring of cobalt blue and emerald green, solar yellow and fire red, Rene Descartes may have contemplated his Cogito. By its gothic flying buttresses and underneath its simultaneously playful and disquieting gargoyles the novelist Victor Hugo celebrated her stone walls and arches while advocating for her 19th-century restoration. In 1323 the scholastic theologian John of Jandun would write of the cathedral that she “deservedly shines out, like the sun among stars.” And through it all, over a millennium of Parisian history, the cathedral stood guard from its island in the Seine.  Which is not to say that the cathedral hadn’t been ravaged before, or that it won’t be again. Notre-Dame withstood the Wars of Religion which burnt across France during the sixteenth century and Hitler’s orders to leave not a stone of Paris standing when the Nazis retreated at the end of the Second World War, and yet the cathedral endured. Since the twelfth century Notre-Dame has survived, and while we watch with broken hearts as her spire collapses into the burning vaulted roof during this mournful Holy Week, we must remember that Notre-Dame will still be standing tomorrow. 

Sorrow for the destruction of something so beautiful, so perfect, must not obscure from us what a cathedral is. A cathedral is more than the granite which composes her edifice, more than the marble which lines the nave. More than the Stations of the Cross and the statues; more than the Crucifix which punctuates the altar. A cathedral is all of that, but it is also an idea; an idea of that which is more perfect than this fallen world of ours. More mysterious, and more powerful, and more beautiful. When we see push notifications alerting us to the fire of this April 15th, when we see that tower which points to the very concept of God collapsing above her nave, it can feel as if civilization itself is burning. As if watching the Library of Alexandria be immolated on Facebook live, or reading the live tweeting of the dissolution of the monasteries. In this age of uncertainty, of rage, of horror, and of violence; of the decline of democracy and the heating of the planet; it can feel as if, in Notre-Dame’s fire, we are watching the very world itself be engulfed. Which is why it’s so important to remember what a cathedral is, what Notre-Dame is. 

Skeptics can reduce that which is associated with the phrase “High Church” to an issue of mere aesthetics, as if in our post-Reformation, post-secular world the repose of a cathedral is simply a mood or a temper and not a profound comment in its own right. An allegiance to the sacredness of silence, of the holiness of light refracted onto a cold stone floor. Minimalism makes its own offers and promises, and requires its own supplication, and the power of simplicity and thrift should not be dismissed. But a cathedral makes its own demands – a cathedral is beautiful. The intricacy of a medieval cathedral is not simply an occasion for art historians to chart the manner in which the romanesque evolved into the gothic, or for engineers to explicate the ingenuity of the flying buttress. Notre-Dame isn’t simply a symbol of Paris, nor a landmark by which a tourist can situate themselves. A cathedral is larger than the crowds which line up to take selfies in front of it; a cathedral is more significant than the gift shops and food trucks which line the winding cobble-stoned streets that lead up to it. A cathedral is an argument not only about God but also about humanity and the beauty of which we are sometimes capable. 

Tomorrow the world will be less beautiful than it was this morning, and this in a world that has precious little beauty to give up. That Notre-Dame should be burning this April evening is a calamity, a horror. It is the loss of something that is the common treasury of humanity, which belongs not entirely to the people of France, nor only to those who are Roman Catholics, but which rather sings of those yearnings of all women and men, living in a world not of our own creation but trying to console each other with a bit of beauty, a bit of the sacred. To find that meaning in the cathedral’s silence, in that movement of light and shadow upon the weathered wooden pews and the softness of the grey walls. The 17th-century English poet George Herbert wrote of “A broken ALTAR… Made of a heart and cemented with tears,” as indeed may describe the crowds gathering along the Seine and singing hymns to our burning cathedral this spring night. Herbert’s poem is an apt explanation of what a cathedral is. A cathedral is a person. Her spine is the nave, and the transept her arms; the window her face, and the spire her head – the altar a heart. And though a cathedral is as physical as our finite bodies, threatened by incendiary and crowds, by entropy and fire, its soul is just as eternal. 

If there is something to remember, it’s that in the era before steel and reinforced concrete an anonymous mason would begin work with his brothers on a cathedral that his children would most likely never see completed. Perhaps his grandchildren would never live under its full height either. To work on a cathedral was a leap into a faith that we can scarcely imagine in our era, to work towards a future you’d never see, and yet to embrace that which is greater, more sublime, more perfect than you are. Our attitude of disposable consumerism and exploitive capitalism makes such an ideology a foreign country to us, yet if we’re to solve any of those problems that face us today – from climate change to the restoration of democracy – it must be with the faithful heart of a medieval mason who toils with the knowledge that a spire will rise above Paris – again. 

]]>
Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171724 https://historynewsnetwork.org/article/171724 0
Benny and Joon and a Good Look at Schizophrenia

 

How do you stage a charming musical about schizophrenia? Was there ever a dimmer, sadder, more troubling topic for a play?

Just ask the folks who run the Paper Mill Playhouse, in Millburn, New Jersey, where the new musical Benny and Joon opened on Sunday. It is a delightful look at modern schizophrenia, with three stars who are not only entertaining but also work hard to examine schizophrenia and talk about it on stage with candor, and with smiles, too.

Benny and Joon is the musical version of the 1993 movie of the same name that starred Johnny Depp. Schizophrenia was such a controversial topic in that year that the word schizophrenia was never mentioned in the script. Now, thankfully, it is.

Benny and Joon are brother and sister (he’s in his early twenties and she’s 20 or so). Joon suffers from schizophrenia and was a handful for her parents. They were killed in a car crash when Benny was 18. Now, with them gone, Benny, who runs a car repair shop, has to raise her. All of a sudden, after a bad night playing poker, Benny has to provide room and board for a kooky young man, Sam, who comes to live with them. Sam, who wears an odd-looking hat, sees himself as the reincarnation of Charlie Chaplin and Buster Keaton and mimics them. In one nice bit he uses dinner rolls as people and has them dance.

It starts with Sam’s arrival. Joon is relentless in her schizophrenic behavior and although Benny loves her to death, she drives him crazy. He faces the very real possibility of putting her into a group home with other mentally ill people. Joon, of course, wants to keep living with him and continue her amateurish career as a painter. He does not know what to do and consults Joon’s psychiatrist. 

The story of the play is Benny’s fear of putting Joon into a home or, later in the play, a mental institution. Throughout the story, Joon exhibits numerous schizophrenic tendencies. She is moody, very happy and then very sad, convinced people are trying to hurt her, fearful of what will happen to her. She’s impulsive. She doesn’t listen to people. She’s argumentative, possessive. There is no typical schizoid, but Joon exhibits the qualities of many people seen as such.

Yet, through all of this, you love her.

Kirsten Guenther wrote the book, and the music and lyrics are by Nolan Gasser and Mindi Dickstein. They use their words and songs to suggest that while Joon might need help, she may not need all of the help that people suggest. They also get you to root for Joon. Isn’t she like so many quirky people we all know? Don’t put her away, people will say, just put up with her.

Sam, as he bops around the stage in a very goofy way, starts to admire, and then love, Joon. It’s an improbable relationship, to be sure, but so what? Where will they live, Benny asks his sister? The answer, as she frets, is well, who knows. We’ll get by.

Big brother Benny is scared to death. He is so, so worried about his sister and needs to protect her. What is he going to do?

The success of the play is the work of the three stars, Claybourne Elder as Benny, Hannah Elless as Joon and Bryce Pinkham as the slightly nutty but thoroughly adorable Sam. They play their characters as lovable people trying to ward off schizophrenia.

The story is not about schizophrenia itself, but about how it affects the families of its victims. It is the story, too, of how all mental illnesses affect families. We need more of these stories. There are tens of thousands of moms and dads, brothers and sisters, who have to live with and care for mentally ill people, and there have been throughout history. It is a struggle, and Benny and Joon shows that in a majestic way. You need to love and support the victims of mental illness, not just toss them into a group home.

The music in Benny and Joon is OK, but none of the songs are memorable. Together, though, they create a nice atmosphere for the story. Some of the songs are painful, as they help to tell the story of the brother and sister and their wacky friend Sam.

The show’s director, Jack Cummings III, gets fine work from his stars, Elder, Elless and Pinkham, but also gets fine performances from the other actors in the play - Colin Hanlon, Paolo Montalban, Conor Ryan, Natalie Toro, Jacob Keith Watson, and Tatiana Wechsler.

Schizophrenia is a relatively new diagnosis, not named by doctors until 1908. Schizophrenics are popularly, if imprecisely, described as people with split personalities. They are, in general, wildly eccentric, believe other people are trying to get them to do things, feel slightly paranoid and see themselves embattled against just about everybody.

Benny and Joon, in the end, is both a sobering look at schizophrenics and a wonderful look at a pair of siblings who fight and feud, with the troubles of schizophrenia added, but, through it all, love each other.

We need more plays like this one. And more Bennys and Joons in this world, too.

 

PRODUCTION: The play is produced by the Paper Mill Playhouse. Scenic and Costume Design: Dane Laffrey, Sound: Kai Harada, Lighting: R. Lee Kennedy, Choreography: Scott Rink. The play is directed by Jack Cummings III. It runs through May 5.

]]>
Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171750 https://historynewsnetwork.org/article/171750 0
The Electoral College and the Myth of a Proslavery Ploy

 

As the New York Times at present lacks the proper format for a debate over an Op-Ed of mine that it published on the origins of the Electoral College, “The Electoral College Was Not a Pro-Slavery Ploy,” I am grateful to History News Network for giving me the opportunity to reply to Akhil Reed Amar and his Op-Ed, “Actually, the Electoral College Was A Pro-Slavery Ploy.” I look forward to continuing the debate on slavery and the Constitution, which certainly includes the inception of the Electoral College but also involves the larger and, I believe, more important historical issues raised in my recent book, No Property in Man: Slavery and Antislavery at the Nation’s Founding.

    

Earlier this month, I wrote an Op-Ed piece for the New York Times disputing claims that the Electoral College originated as a slaveholders’ ploy at the nation’s founding in 1787. The issue has become important recently as part of a larger debate about the Electoral College. In the wake of two presidential elections, in 2000 and 2016, in which the electoral system overruled the popular will, many Americans, especially inside the Democratic Party, have declared that the system ought to be seriously amended, if not eradicated in favor of direct popular election of the president. I have long believed the Electoral College at least needed fixing. The point of my Op-Ed, though, was a different one, having to do with history.

Last September, I published a book that challenges the prevailing wisdom about the role of slavery at the Federal Convention in 1787. Almost as an aside, the book briefly discusses how the framers created the Electoral College and argues that, on this matter at least, the prevailing wisdom is correct: the Electoral College, I wrote, arose out of Southern delegates’ efforts to give as much extra power as they could to the Southern slaveholding states. Since then, in part because of the current public debate over the Electoral College, I closely re-examined the issue and concluded that, like most of my fellow American historians, I have been wrong about slavery and the Electoral College. 

In advance of the publication of a paperback edition of my book this fall, I duly prepared a new preface that explained my change of mind. As the general debate about the Electoral College heated up, though, and as the Electoral College’s opponents began decrying its origins in slavery, I thought I ought to write an essay on the subject in a more public venue, lest anything I wrote in my book be taken to support a claim I no longer believe.  I wrote that essay and the Times published it – and very quickly thereafter, the Times published another Op-Ed by Akhil Reed Amar disputing and dismissing my piece. 

Not surprisingly, I think that Amar’s response is mistaken. His account turns the origins of the Electoral College into a simple story that fits well with commonplace current views of slavery and the U.S. Constitution.  Disarmingly straight-forward, it seems almost self-evidently true. But in order for his claims to hold up, Amar’s story has to ignore a great deal that either does not fit or that flatly contradicts him: for example, that the Electoral College emerged as an alternative to another proposed system that also protected slavery; or the fact that the leading pro-slavery states actually voted against what would become the Electoral College, even though it promised to give more protection to slavery than the system they supported.  

On closer examination, though, Amar’s response is even more deeply flawed, and in fundamental ways. But before I get into those defects, let me provide a basic chronological narrative of how the framers created the Electoral College.

 

*************************************************

 

On June 2, 1787, shortly after the convention to frame the new Constitution opened in Philadelphia, the delegates endorsed the creation of a national executive, the president, who would be elected by the national legislature and serve a single seven-year term. In mid-July, though, when the delegates began to favor making the executive eligible for re-election, the consensus for congressional election of the president faltered. 

From the early days of the convention, some delegates had argued in favor of having the executive elected directly by the people at large – a "people" then restricted to white men who met certain property qualifications, but still many orders of magnitude larger than the as-yet only vaguely envisaged Congress. Previously marginalized in the convention's debates, the advocates for direct election returned to the fray, including such distinguished delegates as Gouverneur Morris, who represented Pennsylvania. American democracy, these men insisted, had reached a point where ordinary voters could and should choose the president. The Congress would be much more vulnerable to bribery and other forms of corruption than what Morris called "the people at large…the freeholders of the County."

The convention debated the matter sharply.  At one point, Hugh Williamson, a non-slaveholding delegate from North Carolina, mentioned that direct election would hurt Virginia, the state with the most slaves, because enslaved men “will have no suffrage.” The preponderance of the objections, though, from southerners as well as northerners, echoed the disdainful observation of George Mason – another distinguished delegate and an actual Virginian -- that having the people at large choose the president would be like referring “a trial of colours to a blind man.”  On July 17, the convention defeated a motion for direct election by 9 states to 1.       

A third group of delegates, however, supported an intermediate elector plan, as a middle ground that would provide a more attractive alternative to legislative selection than direct election.  Their basic idea was to give the authority for electing the president to independent electors, possibly chosen by the people, possibly by the state legislatures. The germ of such a plan had appeared in the convention debates much earlier, when James Wilson of Pennsylvania realized that a direct election proposal he was offering would fail. But after the dismal defeat of the direct election proposal on July 17, the supporters of an electoral system, including some like Wilson who had previously spoken in favor of direct election, found their collective voice. 

On July 19, William Paterson of New Jersey, who happened to be a critic of slavery, offered a proposal "that the Executive should be appointed by Electors to be chosen by the States"; Wilson, who was more keenly antislavery than Paterson, chimed in that it was now the "unanimous sense" of the convention that the executive be chosen by "an election mediately or immediately by the people." James Madison of Virginia, the slaveholder later known as the father of the Constitution, gave a speech that, while it praised the defeated idea of direct election, instead backed an electoral system. A direct system, he observed, for all of its strengths, would weaken the slaveholders' power in a way that the electoral system under consideration would not. Momentum for independent electors grew. A motion to replace legislative election of the president with an electoral system passed easily, 6 states to 3, with one state divided.

It is important to pause here and explain the place of slavery in the different proposals for choosing the president. The congressional selection plan gave the slaveholding states a singular advantage as the convention had already approved the notorious three-fifths clause. This clause stated that apportionment of the seats in the lower house of Congress would be calculated by including three-fifths of the total of each state’s enslaved population. As the three-fifths addition would apply if Congress was selected to choose the president, the congressional mode had a powerful appeal to the South. By contrast, as Madison pointed out in his speech, a direct election system would mean that the non-voting slave population would count for nothing in selecting the president, which Madison said would prevent any Southerner from winning the presidency. The electoral system under consideration, however – a detail that Madison did not mention -- would count all of each state’s inhabitants, slave and free, toward apportioning electors, which in principle would give the slaveholding states even more additional votes than the congressional system gave them with the three-fifths rule.

The really crucial point to remember here is that the electoral system began gaining support as an alternative not to direct election but to the congressional plan, which also protected slavery. Because, except to its strongest supporters, direct election seemed dead and buried, the convention did not face a choice between a system that favored slavery and one that did not, but between two systems that favored slavery in different ways. This fact is essential to understanding the convention's decision to adopt the electoral system and everything that followed.

Although it was momentarily the convention majority’s choice, the electoral system still had some formidable foes, chiefly in the three most ardently proslavery states, North Carolina, South Carolina, and Georgia. That these states were so opposed to an electoral system is, to say the least, ironic given today’s conventional historical wisdom, but those states had their reasons. Lower South delegates instead favored what had been the approved system, selection of the president by Congress. Their calculations had nothing to do with protecting slavery; indeed, the electoral system promised to offer them additional votes for the president above and beyond what their favored congressional system did. Rather, they scorned the electoral system on elitist grounds, charging that the independent electors, unlike congressmen, as Hugh Williamson put it, “would not be the most respectable citizens” but men of inferior rank, open to bribery and other forms of corruption. And when the convention approved the electoral system, the only three states that voted against it were North Carolina, South Carolina, and Georgia. The lower South states, though, would not easily abide their loss, and they joined in mounting a counterattack. 

Five days after it approved the electoral system, the convention, with the full support of the lower South, reversed itself and rejected the electoral system – seemingly for good – and restored the choice of the executive to Congress. The day after that, James Madison, who now believed an electoral system was forever doomed, dissented from the convention's switch and strongly endorsed direct popular election, in the hope that the system crushed a few days earlier might yet be revived.

For more than a month thereafter, the convention continued to support congressional election of the president, but a major dispute broke out over procedure; and so, in the convention’s waning days, a special committee of eleven, appointed to settle the convention’s still unfinished business, offered a comprehensive plan of its own. It was this committee that revived the electoral system idea and effectively invented the Electoral College we know by proposing, for the very first time, apportioning the electors according to each state’s combined representation in the House and Senate. 

Slavery would seem to have been irrelevant to the special committee's concerns, as its proposal offered the South an inflated proportion of electors just as the southerners' preferred congressional system did. As Gouverneur Morris, a member of the committee, explained to the convention, the group was motivated by alarm at "the danger of intrigue & faction" should the legislature be authorized to choose the president. Yet Morris also remarked that, with nobody "satisfied with an appointment by the Legislature," "many" committeemen "were anxious even for an immediate choice by the people." (These members almost certainly included Morris himself.)

Morris’s speech raises intriguing questions and possibilities about the committee’s private discussions. With appointment by Congress on the ropes, did Morris seize the opportunity to resuscitate his arguments in favor of direct popular election? Might James Madison, who was also on the committee, have replied that, although direct election was the “fittest” system, its disadvantages to the Southern slave states recommended an electoral system instead? Might another committee member, the New Yorker Rufus King, have restated his reasons, shared in by William Paterson, for endorsing direct election? In the absence of more detailed evidence, it is impossible to know. What is clear is that, as a majority of the committee opposed direct election, it was no more in the cards now than it ever had been. 

At all events, the convention at last approved the committee’s recommendations, with modifications, eleven days before the convention completed its work. The only opposition came from North Carolina and South Carolina, fighting to the bitter end for election of the president by the Congress. 

 

*************************************************

 

Akhil Amar’s response to my Op-Ed evades virtually all of the substantive points I made about this history. Instead, inside a brief space, Amar offers three assertions to support the view that the Electoral College was a proslavery ploy. First, he says, James Madison explained to the convention that a direct popular vote for president was a “non-starter” for the South because, “as slaves couldn’t vote,” the South would lose every time. Second, Madison’s “political calculation” is why the convention rejected a direct vote system. Third, in lieu of a direct vote, the framers considered an indirect electoral system which counted slaves – a system, in Amar’s words, that “might sell in the South.” “Thus were planted,” Amar claims, “the early seeds of an Electoral College system.”

Practically everything in this account, however, is either illogical, false, invented, or factually incomplete. Let’s start with Amar’s first assertion that James Madison supposedly described a popular vote system as a “political nonstarter.”  Amar is referring to the speech that Madison gave on July 19, the day the convention approved – temporarily, as it turned out – an electoral system. In that speech, Madison indeed recommended an electoral system, noting that the defeated direct election system he still deemed the “fittest” would hurt the slave South “on the score of the Negroes.”  “The substitution of electors,” Madison said, would correct for these problems.  

But to pluck that speech out of context, as Amar does, is to distort not just Madison’s thinking but the purport of what he said. Recall that five days after Madison delivered that speech, the convention reversed itself and rejected the electoral system. Then recall that, a day later, Madison, now believing the convention would never approve an electoral system, dissented from the convention’s switch and strongly endorsed direct popular election, still hoping that it had a chance -- the system that Amar says Madison ruled out completely because it was unacceptable to the South. 

Slavery, as it happened, was not the only thing or even the main thing on Madison’s mind.  Having Congress elect the president might give the slaveholding South extra votes for president, but he believed it would also invite corruption, which overrode whatever benefits the system might afford the slaveholding states. By contrast, he believed a direct vote system was preferable, even if it diminished Southern power. If it came down, in his mind, to a choice between the health of the republic and power considerations for the slaveholding states, he would choose the former; or, as he put it, “local considerations must give way to the general interest.” As a southerner, he concluded, “he was willing to make the sacrifice.”

Amar tells a different tale. According to him, Madison dismissed the direct voting system as a "nonstarter" because it hurt the slaveholding South. The evidence shows this is false. Although he would have preferred an electoral system for reasons having to do with slavery, Madison hardly rejected direct voting because it was a "nonstarter" for the slaveholders or for any other reason. As soon as an electoral system seemed to be off the table, he returned to supporting a direct system, despite the long odds against it, rather than support a congressional system that was favorable to slavery but also vulnerable to corruption.

Amar's second assertion is more consequential but equally wrong; indeed, it is an invention. By his account, Madison's speech of July 19 explained to the convention why direct election would be a "dealbreaker" for the slaveholding South, and the convention subsequently rejected direct election of the president as a result. The trouble is, as the basic narrative shows, by the time Madison delivered this speech, he could not have been warning the convention against direct election. The reason is simple: the delegates had already soundly defeated direct election two days earlier.

Here’s what really happened. On two separate occasions, the convention crushed proposals for direct election: the first time, on July 17, by nine states to one; the second much later, on August 24, by nine states to two. On the first of these occasions, the North Carolinian Williamson made his stray remark about how a direct system would hurt the largest slaveholding state, Virginia, but this was the only time on either occasion that the issue of slavery arose. There is no evidence that the northern states which voted “nay” did so out of deference to any slaveholder’s “dealbreaker” objections, explicit or perceived. On the other hand, as we have seen, there is plenty of evidence that the northerners – and many if not most southerners as well -- regarded direct election, in the words of Elbridge Gerry of Massachusetts, as a “radically vicious” system, in which an uninformed people “would be misled by a few designing men.” 

Amar’s third assertion is illogical, incomplete, and invented, all at the same time. He asserts that, at Madison’s prompting, the delegates or some of them, began wondering: “if slaves could somehow be counted in an indirect system, maybe at a discount (say, three-fifths), well, that might sell in the South.” Here, Amar claims, begins the real story of the inception of the proslavery Electoral College. Yet the scheming that Amar imputes to unnamed delegates about an indirect system is pure fiction. Moreover, as we have seen, the proslavery lower South actually rejected the idea of an electoral system despite its relative advantages to slaveholders, preferring a system of congressional election based on a formula that would have provided the slaveholding states a smaller number of electors. 

In all, Amar wants us to believe that the delegates sowed the proslavery seeds of the Electoral College when Madison explained to them that the South would never agree to a direct election system – a system the convention had, in fact, already defeated. He would have us bypass the fact that the most vociferous proslavery states, instead of rallying to that plan, resisted it in favor of another. And he would have us overlook that the plan the proslavery states favored promised to give them a smaller proportion of the vote for president than the electoral plan they opposed.     

To be sure, there are some traces of truth in Amar’s argument. First, because the framers tolerated slavery where it existed from the very beginning of their deliberations, slavery touched and often distorted many aspects of the new federal government. In the case of electing the president, the framers’ toleration led to the Southern slave states getting extra power, derived from the three-fifths compromise struck early on in the proceedings. Second, during the debates over the mode of electing the executive, two Southern delegates did note that a system of direct election of the president would hurt the slave states. But all of this put together is still a far cry from demonstrating that the Electoral College originated as a proslavery ploy. 

A good way to summarize what actually happened inside the convention is to recall the place of slavery in the different plans that the convention considered about electing the president.  The delegates weighed three options: the president would be selected by direct popular vote, by Congress, or by electors who would be chosen either by the people or the state legislatures. Direct election failed, but not because it was intolerable to the slaveholders, as Amar maintains. It failed because it enjoyed little support in the convention, for reasons that had nothing to do with slavery. The real choice for the framers was between the congressional method and the electoral method. Both methods gave the slave states an extra measure of power in selecting the president; so once direct election was scrapped, the convention was bound to grant the slave states some sort of bonus. But this was because the great majority of the convention did not trust in the people at large to choose the president. There was no slaveholders’ ploy.

Another way of putting this is to concede that the full story of the framers, the Electoral College, and slavery shows that proslavery concerns did indeed arise at the crucial point when the convention decided to reject direct popular election of the president. But the role they played amounted to twelve words in an insignificant speech by Hugh Williamson, and, at a great stretch, some remarks by James Madison after the direct voting plan had been defeated. Beyond that, proslavery concerns had nothing to do with the convention's debates over what became the Electoral College; two northern critics of slavery, William Paterson and James Wilson, opened the debate that led to the temporary adoption of an electoral plan; proslavery delegates resisted that plan and helped get the convention to abandon it; and when the convention finally settled the issue, it agreed at the last minute to scrap what had long been the lower South's preferred arrangement, selection by the Congress, in favor of a system of electors. The claim that the Electoral College originated as a proslavery ploy is a myth that can be sustained only by misreading the evidence or by simplifying in order to manipulate it.

Amar offers some additional criticisms of my Op-Ed’s discussion of the effects of the Electoral College after 1787. These chiefly involve a paragraph on the election of 1800-01, which argues that the Federalists’ interference with the electoral vote in Pennsylvania offset the extra votes that Thomas Jefferson received as a result of the three-fifths compromise. My point was simply that it is badly mistaken to say that the compromise unfairly handed Jefferson the presidency. Amar interprets this as an attempt on my part “to erase the ugly fact that the South had extra seats in the Electoral College because of its slaves.” His imputation is offensive as well as mistaken. To describe how the evil of slavery prevented the outrageous theft of a presidential election is not to evade or apologize for slavery and the three-fifths clause. It is to describe a terrible irony.           

In the 21st century, the Electoral College has twice thwarted the popular will. The debate over its future cannot be helpfully advanced by distorting its complex origins, in which slavery’s role was not central but incidental.

President Donald Trump, HIV/AIDS, and Black Lesbian and Gay Activism

ACT UP Protestors in New York

 

 

In his State of the Union address on February 5th, 2019, President Donald Trump surprisingly included a plan to eliminate HIV/AIDS in his budget: “My budget will ask Democrats and Republicans to make the needed commitment to eliminate the HIV epidemic in the United States within 10 years. Together, we will defeat AIDS in America.”  The inclusion of HIV/AIDS in his address came as a surprise to many because one of President Trump’s first actions upon arriving at the White House was firing all 16 members of the Presidential Advisory Council on HIV/AIDS.

Though President Trump reinstated this council 15 months later, his initial actions were indicative of his longer record on HIV/AIDS. The AIDS Coalition to Unleash Power (ACT UP)--New York held many direct-action protests, including one at Trump Tower in October 1989. Roughly 100 protestors gathered to demonstrate against the $6.2 million in tax abatements Trump received to build the mixed-use, high-rise property at a time when those stricken with AIDS were increasingly vulnerable to homelessness. Protestors saw Trump Tower as a symbol of corporate greed and argued that state monies could have been used to build more housing facilities for those impacted by AIDS.

Creative writer, activist, and scholar Sarah Schulman has written that the rise in sudden deaths of gay men during the early era of AIDS hastened gentrification in New York City—their absences from rent-controlled apartments and their partners’ lack of access to inheritance claims accelerated the conversion of these apartments to market-rate rents. The early AIDS crisis facilitated changes in the constitution and character of New York City neighborhoods, linking it to larger trends in gentrification that have shifted the racial demographics of inner cities from ethnically and class diverse to more homogenous, middle-class, and increasingly white enclaves. 

Trump's plan to end AIDS within this decade also came as a surprise given his abandonment of his mentor Roy Cohn after rumors spread publicly that Cohn was dying of AIDS. It was Cohn's ruthless business tactics and genius maneuverings around legal loopholes that helped Trump secure the tax abatements to build Trump Tower. Cohn had cut his teeth in politics as Senator Joseph McCarthy's chief counsel during the Army-McCarthy Hearings in 1954. Cohn became a power broker in local New York City and federal politics, and in 1971 represented Trump when he was accused of violating the Fair Housing Act in 39 of his properties. Trump's organization was accused of quoting different rental terms and conditions and asserting false claims of "no vacancy" to African Americans looking to rent apartments in his Brooklyn, Queens, and Staten Island properties. Under Cohn's direction the Trumps countersued the government for $100 million for defamation, and were able to settle the lawsuit against the Trump corporation by agreeing to stipulations that would prevent further discrimination, thereby avoiding an admission of guilt.

 

 

Trump's record on AIDS and racial and sexual discrimination makes his 10-year plan even more surprising since the face of the U.S. AIDS epidemic is primarily black and Latina/o, especially gay, bisexual, and transgender blacks and Latina/os. In January 2019, the Black AIDS Institute (BAI), a Los Angeles-based, national HIV/AIDS think tank focused on black people, expressed their dismay when the Trump Administration proposed a change in "protected class status" under Medicare, which has allowed people living with HIV to access better medical care. In their response to his State of the Union address, BAI questioned President Trump's intentions, since he has repeatedly sought to cut the President's Emergency Plan for AIDS Relief, better known as PEPFAR, a multi-billion-dollar initiative which has been credited with saving 17 million lives around the world. Moreover, they indicted the President for his racist and homophobic rhetoric, which has fueled an increase in violence against black and LGBTQ communities. One of the suggestions BAI made to move Trump's plan from words to action was to center leadership from communities most impacted by HIV.

Some of the earliest leadership from communities impacted by HIV/AIDS emerged from black lesbian and gay artists and activists during the early era of AIDS. Beginning in the late 1970s, black lesbian and gay arts and activist movements—which political scientist Cathy Cohen has identified as the first stage of AIDS prevention efforts in black communities—placed collectivity, self-determination, creativity, and radical love at the center of their political practice. They saw the elimination of racism, homophobia, and economic inequality as essential to the elimination of AIDS in black communities. In 1986, the Philadelphia-based black gay journalist, creative writer, and activist Joseph Beam published the editorial "Caring for Each Other" in Black/Out magazine, the official publication of the National Coalition of Black Lesbians and Gays. The essay is a meditation on placing community responsibility ahead of reliance on the state. Beam believed that the state had never been concerned about the lives of black people. State apathy, he argued, extended to black gay men and IV drug users dying of AIDS: "it would be a fatal mistake if we were to relinquish our responsibility for AIDS in the black community to such an external mechanism."

Indeed, Trump's proposal to end AIDS by targeting geographic and demographic "hot spots" in seven states, 48 counties, Washington, D.C., and San Juan, Puerto Rico, comes as part of a budget plan that would eliminate funding for global AIDS programs, slash expenditures on the Centers for Disease Control and Prevention, and transfer the management of Medicaid to the states through block grants, amounting to an overall cut to spending on health and human services. This plan proposes to end health inequalities at the local level while threatening to reproduce broader social inequalities at the state, national, and global levels.

Though Trump's plan of action challenges Beam's narrative of state apathy by continuing the contradictory record of state action that began with President Ronald Reagan when AIDS first appeared, Beam's caution suggests that our efforts to end HIV/AIDS in poor communities and communities of color across the globe must not depend solely on federal or state bureaucracies. Instead, this history suggests that plans to eliminate HIV/AIDS must be centered on community care and responsibility, and political action aimed at transforming the conditions of structural inequality that President Trump has perpetuated throughout his career.

Health Care for All – A Cautionary Tale from the 1970s

 

With the 2020 presidential election around the corner, both parties appear headed, once again, for a train wreck on health care. While scores of Democrats in Congress and on the presidential campaign trail advocate a single-payer health care system for all Americans immediately, other Democrats embrace the idea of universal coverage as the ultimate goal, but believe it should be achieved incrementally. To some this seems like a repeat of the late 1970s, when Democrats allowed the perfect to become the enemy of the good, and nothing was done on health care---for another 30 years. Meanwhile the unrelenting opposition of Republicans to the Affordable Care Act suggests that the GOP has no serious interest in offering an affordable health care plan. The voters punished them for it last year. "Those who cannot remember the past are condemned to repeat it," George Santayana famously said, offering an immutable truth that should be embedded in the mind of every member of Congress.

 

Health care coverage in the United States has had a compelling but sometimes fraught history that is essential to understand before it is reconsidered. Theodore Roosevelt first proposed national health care in his 1912 platform, but he lost that election. Subsequent Democratic presidents including Franklin Roosevelt, Harry Truman, and John Kennedy supported the idea, but it was Lyndon Johnson who achieved Medicare for seniors with the Medicare Act of 1965. At last every American 65 and over became eligible for federal health insurance regardless of income or medical history; the law also included coverage for low-income Americans in the form of Medicaid. It was a landmark achievement, made possible by a unique moment in history and the tenacity of Democratic presidents in keeping the Republican Roosevelt's 1912 idea alive.

 

The next Democratic president, Jimmy Carter, was in step with his predecessors as he wanted to extend health care to all Americans, but the economic conditions of that time were very different from 1965. While both houses of Congress were Democratic in 1977-78, inflation was out of control and the economy as a whole was weak, straining the resources of the federal budget. Carter had been a progressive governor of Georgia but also a fiscal realist; he believed the country couldn't afford such an enormous cost at that time without serious economic consequences.

 

While Carter embraced universal coverage as the ultimate goal, he believed it should be achieved incrementally, not only for affordability but also for feasibility. An incremental approach, Carter contended, would aid the federal government’s ability to digest and administer such a huge and complex new system. Additionally, proposing a stepped approach would make it more likely to attract bipartisan support, which he believed was important for its long-term sustainability. 

 

Not everyone agreed. Eight years after Johnson’s Great Society was enacted, there were still pent-up demands among congressional Democrats for new federal spending.  Senator Edward M. Kennedy (D-MA) was the most vocal spokesman, and he was also, many suspected, planning to challenge Carter for the Democratic presidential nomination in 1980, using national health care as a defining issue.   

 

In 1977 Carter's White House reached out to Kennedy to find a middle ground. It became clear early on that there was a significant difference between the two camps. Over many months, the two parties tried to compromise, but the talks eventually faltered over the specific phasing-in of Carter's proposal. The unbridgeable gaps were fully revealed at the final meeting between Carter, Kennedy, and their staffs in the Oval Office on July 28, 1978. When they first appeared, Carter, according to one participant, told Kennedy, "It will doom health care if we split . . . I have no other place to turn if I can't turn to you . . . I must emphasize fiscal responsibility if we are to have a chance." Kennedy left the White House and soon announced he couldn't support whatever the Administration offered on health care and he would write his own comprehensive bill, which he unveiled on May 19, 1979.

 

A month later, Carter delivered a message to Congress calling for catastrophic coverage for all Americans so that families who incurred severe and costly injuries or illnesses would not be financially destroyed. He also called for "comprehensive" coverage of 16 million low-income Americans (Medicaid). It was a thoughtful, generous, and responsible proposal, and it won significant early support on Capitol Hill, not least because many Democrats saw it as an essential step toward universal coverage.

 

The previous fall, in 1978, Kennedy had addressed the Democrats' mid-term convention in Kansas City and thrown down the gauntlet to Carter: "There are some who say we cannot afford national health insurance . . . But the truth is, we cannot afford not to have national health insurance." Tensions between the two men, already high, came to a boil when Kennedy formally announced his candidacy for president on Nov. 7, 1979. With no major issues dividing the candidates -- save for the timing but not the goal of universal coverage -- Kennedy's campaign got off to a faltering start. It was apparent he needed strong support from the more liberal trade unions, and some unions did sign on with Kennedy, including the United Auto Workers, which had been a long-time supporter of national health care. The UAW's leadership pledged it would use its clout to see the plan enacted. Even after Carter captured sufficient delegates to win the nomination following a brutal series of primaries, the UAW would not back down from its all-or-nothing position. Neither would Kennedy.

 

The hard-fought contest took its toll on both candidates and, tragically, on the issue of health care. In short, the dynamics of the 1980 primary campaign inevitably precluded the kind of legislative process that might have enabled universal catastrophic coverage to become law. An important opportunity was lost; the American people would have to wait another 30 years for major health care reform.

 

It finally arrived in 2009 when President Barack Obama unveiled the Affordable Care Act as his highest legislative priority. The ACA or, as it became known, Obamacare, bore a striking resemblance to Carter's proposal three decades before. New to the presidency, Obama was sometimes hesitant in his leadership and failed to articulate a strong and consistent public case for his proposal, an omission that made passage more difficult. At a joint session of Congress in September 2009, the president read an endorsement written by Senator Kennedy before his death the month before. Obama rallied the congressional Democrats and, with the indispensable help of Speaker Nancy Pelosi, the ACA finally became law in 2010. It was an historic achievement, representing the most significant regulatory overhaul and expansion of coverage since 1965.

 

With few Republicans supporting Obamacare, GOP leaders made its repeal their rallying cry for nearly a decade. Yet they failed even when Republicans controlled both houses of Congress and the White House. With Democrats now in control of the House of Representatives, the ACA finally appears secure--except that President Trump's Justice Department is trying to overturn it altogether.

 

Republican control of the Senate and White House makes it a prohibitive time to attempt any major expansion of health care.  There is nonetheless an opportunity for Democrats -- and hopefully Republicans -- to prepare for the future by working together during the next two years to fix and strengthen the ACA so that it actually delivers the care it is meant to deliver.  They should also come together to significantly reduce the cost of medications, for which there is an undeniable bipartisan public mandate. Who knows where this could lead?  If led by serious people on both sides, it could yield yet more success stories like criminal justice reform and conservation of public lands.  Whatever it is, it’s better than polarized stalemate.

 

Thus, if the ultimate goal is to expand affordable health care to every American, history offers important lessons. It tells Democrats that in the next two years they must be politically savvy, and in some instances, uncharacteristically restrained, if they want to be poised to offer a viable form of expanded health care in 2021. They must be honest that 2021 is the first time a plan realistically can be considered. Before then, they must avoid the public perception of "over-reach," a deadly political sin that punishes politicians who appear to offer grand proposals that are hugely expensive, complex, and unwieldy. "Medicare for All" comes to mind as something many people already see as over-reach. Voters have finely attuned antennae, and most can tell when they're being played by a slogan.

 

On the other hand, Americans will respond favorably to reasoned proposals even for aspirational goals, as they did in 2018. They will do so again if a plan is couched in language they can understand, such as supporting a proposal for 2020 that offers "affordable health care for every American regardless of income or existing conditions." At the same time, liberal Democrats should resist the siren song of ideological purity and embrace instead a pragmatism that will assure ultimate success. The run-up to 2020 will be better than the 1970s unless Democrats take their eye off the ultimate goal and again allow a deep division within the party to preclude the outcome most Americans seek.

 

As for Republicans, history tells them that if they want to help shape America's health care of the future, they should 1) accept the legitimacy, if not every detail, of the ACA, which is, after all, a direct philosophical descendant of the thinking of the conservative Heritage Foundation, as well as the first cousin of Republican governor Mitt Romney's plan for Massachusetts, and 2) abandon their blind opposition to any expansion of health care. They should engage in a constructive and serious conversation with Democrats so that by 2021 we will have something approaching a national consensus on how to care for our health.

Ilhan Omar is a Reconstruction Reformer

 

I stand with Ilhan Omar. As a historian of Reconstruction, I must. 

 

Omar embodies the best of Reconstruction-era reformers. She articulates a robust and inclusive vision of civil rights. She is a vocal advocate for the dispossessed and an outspoken opponent of racism and bigotry. She opposes Donald Trump’s nativist and Islamophobic “Muslim ban” and supports paid family leave and raising the minimum wage. In fact, she even co-sponsored the “Never Forget the Heroes Bill” that would permanently authorize the September 11th Victims Compensation Fund.

 

I did not run for Congress to be silent. I did not run for Congress to sit on the sidelines. I ran because I believed it was time to restore moral clarity and courage to Congress. To fight and to defend our democracy.

— Ilhan Omar (@IlhanMN) April 13, 2019

 

That last part might come as a surprise to those who know Omar primarily from the wave of race-baiting unleashed by conservative politicians, press, and agitators. Indeed, the president himself has repeatedly Tweeted lies about Omar paired with images of the 9/11 attacks obviously designed to make Omar out to be a terrorist. 

 

WE WILL NEVER FORGET! pic.twitter.com/VxrGFRFeJM

— Donald J. Trump (@realDonaldTrump) April 12, 2019

 

But we should recall that this has been a Republican strategy for quite some time now. The Republican Party of West Virginia implied that Omar was a terrorist last month, suggesting that Americans, by electing a Muslim, had “forgotten” the 9/11 attack. Again, this wasn’t some far-Right website. It was the WV state Republican Party. 

 

Nor is Omar the first woman of color to be targeted by Trump. Last year, Trump launched similar attacks against California Congresswoman Maxine Waters. These and other racist and Islamophobic attacks on Omar and Waters have inspired death threats against both women.  

 

As a scholar of Reconstruction, I am worried by this recent surge in racist propaganda. It is precisely the tactic that conservatives used to subvert Reconstruction-era reforms. They publicly targeted politicians in their newspapers and incited violence as a tool to regain political power after having been defeated during the Rebellion.

 

I recently wrote about an eerily similar campaign of terror against Victor Eugène Macarty, an Afro-Creole politician, for the Journal of African American History. Like Omar, Macarty was an outspoken advocate for equality. He had attended the voting rights convention on July 30, 1866 at the Mechanics Institute in New Orleans when it was attacked by police. He escaped death by hiding under the porch while New Orleans police officers, at the head of an angry mob of whites drummed up by the local press, attacked members of the convention and mangled their corpses.

 

I became interested in Macarty while researching his time as a member of the Orleans Parish School Board as part of a project examining the impact of racial science on state institutions after slavery. But the more I read about Macarty—who was singled out by the white-supremacist New Orleans Bulletin as "extremely offensive to the white people of this city"—the more I became intrigued by his story. During an era when the white press was reluctant even to print the names of African Americans, the Anglo papers in New Orleans routinely targeted Macarty, almost begging readers to attack him. They did.

 

After he confronted a white woman fired from her teaching position for supporting the White League—a white supremacist terrorist organization—the Bulletin repeatedly called for Macarty’s head. When the woman’s brothers attacked and left him for dead on September 16, 1875, the paper cheered the outcome and warned that the other Black school board members should “rememb[er] the fate of Macarty.” His attackers pleaded guilty and were “sentenced to each pay a fine of Ten Cents or one minute in the Parish Prison.” The court system in New Orleans functioned as an institution of racial control, letting Macarty’s attackers off the hook while signaling to African Americans that they would find no justice before the law. The continued media campaign and threats against Macarty played an outsized role in his political life and eventually led him to leave the city.

 

Macarty was not alone as a victim of media-initiated racist attacks. The white press regularly named targets for white vigilantism. White elites pioneered this form of racist terrorism after emancipation as a means of controlling African Americans and subverting working-class politics.

 

The consequences of the media campaign against Macarty should give us pause as the president and large portions of our national media engage in blatant race-baiting against Ilhan Omar and Maxine Waters. Indeed, it is hardly a coincidence that following this highly public, racist coverage, both Omar and Waters received death threats. As an activist and citizen, I find it terrifying to see the resurgence of this Reconstruction-era tactic of racial oppression today.

 

What frustrates me as a scholar is that we’ve created a historiographic landscape in which African American contributions to American history are overlooked. We too often take a teleological approach to Reconstruction and spend too little time allowing ourselves to be surprised by the profound commitment to equality made by many of the era’s reformers. This act of intentional mis-remembering strengthens the foundation of white supremacy in our country. As we’re seeing right now, that’s incredibly dangerous. 

 

Macarty was a revolutionary figure about whom little was known until my recent article, despite his having brought the first lawsuit against segregated seating in federal court in 1869. In fact, the same few lines had been written and rewritten about Macarty since James Trotter’s 1880 Music and Some Highly Musical People, published the year before Macarty’s death. 

 

We need to better remember the stories of African American reformers and visionaries to counterbalance a field that remains plagued by Lost Cause categories, periodization, and imagery. We need to know more about those who led prior movements for equality. We need to celebrate their martyrs and understand the cause we inherit from them. And perhaps most crucially at this moment, we must become intensely aware of the tactics that their white supremacist opponents used to subvert equality.

 

Biography helps us accomplish these ends and we should pursue it vigorously and unapologetically. My friends and family are consistently surprised when they learn about my research into Macarty and his contemporaries. This cannot be the case, at least not if we hope to live in a society that values justice and equality.  

 

Biography is a key pillar of historical instruction from grade school through high school. It helps students recognize themselves in historical figures large and small. Well-executed biographies allow them to better understand the debates of the past and relate them to those of the present. They also enable students to approach the past with humility and to see that our forebears grappled with many of the same issues we face today. This is one of the central “lessons of history” and among the most important that we can offer. 

 

Further, biographical approaches to historical actors not only show African American resistance to white supremacy, but also avoid flattening African Americans into vehicles of resistance. Indeed, the view that African American liberty implies a rejection of (white) authority is a core belief of white supremacists. By telling the stories of African American men and women as whole persons, we can combat this racist lie.

 

In researching Macarty, I realized the need for more African American biographies in Louisiana and, I suspect, throughout the 19th-century U.S. At least in south Louisiana, I came across many prominent African Americans about whom little or nothing is known. Take T.M.J. Clark, who, after having been enslaved, taught himself to read and became the president of the State Insane Asylum. Or John Gair, who helped write the Louisiana Constitution of 1868 and survived numerous threats and an assassination attempt before being gunned down while in police custody in 1875. Our histories have either completely ignored these radicals or, in cases where they've been mentioned in passing, gotten them almost entirely wrong.

 

Moreover, like Macarty, Gair and Clark were subjected to race-baiting coverage in the media that effectively ended their careers. The white press slandered and vilified both men and each of them suffered brutal attacks by white supremacist vigilantes. Like Macarty, Gair and Clark demanded equality. It was the cause for which Gair was martyred and Clark forced to flee for his life, a permanent exile from his hometown.

 

This wave of media-inspired white supremacist violence effectively ended Reconstruction. No one was ever held accountable for the massacre of voting rights activists in New Orleans in 1866. Macarty’s attackers, after nearly beating him to death, faced no consequences. And though Gair was assassinated while in police custody in 1875, none of his attackers were ever charged. It was this failure to hold the race-baiting press, politicians, and vigilantes responsible that undermined any semblance of equality for more than 100 years. 

 

Politicians like Macarty, Gair, and Clark took incredible risks and made enormous sacrifices to fight for equality 150 years ago. Their contemporaries failed to hold their attackers responsible. We cannot make that same mistake.

 

 

Oh, What a Beautiful Piece of American History

 

Oklahoma!, one of the great musicals of show business history, often loses its own history amid all of those gorgeous Richard Rodgers and Oscar Hammerstein songs. The play is a straightforward, and yet very complex, story of ranch hands and their women on farms in the bustling Oklahoma territory in 1906, just before Oklahoma became the 46th state. The simplicity and beauty of that life is the basis for the marvelous, new and different version of the play that opened last week in New York at the Circle in the Square Theater at 1633 Broadway.

The play starts with ranch hand Curly, played superbly by the multi-talented Damon Daunno, a cowboy star in the Oklahoma territory who is desperately infatuated with farm girl Laurey. He stands up and, with a gorgeous voice, sings one of the signature songs in the musical, Oh, What A Beautiful Morning. It kicks off a play that is full of new romances, busted romances, patched up romances, a lot of violence, dark conversations, threats and a wild and wooly battle for the middle of America in a very divided country (sound familiar?). It is the men vs. the women, the good vs. the bad and the cowboys vs. the ranchers, all scrambling for a piece of the Oklahoma territory just after the turn of the century, in 1906, and all of the promises and dreams within it.

This new version is pretty much the same as all the other plays and movies (the 1955 film version won three Oscars) and yet, at the same time, it is distinctly different. The others were grand sprawling sagas with lots of props, such as the time-honored surrey with the fringe on top, farmhouses and barns. There are none of them in this new play, majestically directed by Daniel Fish. All the director gives the audience here is an empty stage with chairs, some spectators on the periphery, a small orchestra (all happily wearing cowboy boots) placed carefully in a shallow pit and that luscious music that drifts through the air and soothes the hearts of everyone in the theater.

The story (Hammerstein also wrote the book) develops nicely. Curly wants to take Laurey to the local dance but she had already promised to go with Jud Fry, a menacing, malevolent cowboy whom nobody likes. She only did it, she tells friends, to spite Curly. This sets off a battle between Curly, Jud and Laurey, in addition to the fight between cowboy Will Parker and traveling salesman Ali Hakim for the hand of the boisterous cowgirl Ado Annie. There is a lot of back and forth and the plot is told with the wonderful songs as well as dialogue. Those tunes include Oh, What a Beautiful Morning, The Surrey with the Fringe on Top, People Will Say We're in Love, Kansas City, I Can't Say No, and the rousing, burn-down-the-barn title song, Oklahoma!

Even though this is a barebones show, it has some marvelous special effects. At one point, Curly and Jud are arguing over Laurey with some pretty dangerous and threatening dialogue. Curly even suggests that Jud hang himself. The whole scene is presented in the dark, so that you hear only the voices of the two men. Part of that confrontation is a huge, haunting, slightly out of focus film of Jud talking. It fills the stage wall.

Many of the conversations in the story are done with dark lighting and stirring music to add a sense of foreboding to the drama. There is some gunplay, pretty authentic for the era. An anti-gun theme is evident around the walls of the theater, where over a hundred rifles stand in wall racks, ready to be fired at any moment if there is trouble somewhere in the territory of Oklahoma.

The story of the land and the people battling over it, the tale of yet another new frontier in U.S. history, is absorbing and the same story that developed in every other U.S. territory, whether it was Arizona, Alaska or Oklahoma. The play tells the tale of an America that, out there in the cornfields, is bursting at the seams. And, at the same time, it tells the story of Oklahoma, ranchers, cowboys and city folk.

In the play you learn about all the hard work the cowmen and ranchers put in to make their ranches successful, the social customs of Oklahoma and the Midwest in 1906, the dances, the dating, the generational battles, and the wonder country folks feel toward city folks, told so well in the tune Kansas City.

Amid all of this history is the story of the young people, helped and guided by the older ones, as they try to find their place in Oklahoma, America, and the world. It is a saga nicely told within all of those memorable tunes.

Stetsons off to director Fish for not just re-staging, but re-inventing this classic musical. He used all of his genius to create a sensational new play out of an equally sensational old one. He gets significant help from a gifted group of actors, including Daunno as Curly, Mary Testa as Aunt Eller, who holds the chaotic life of the prairie together through all of its storms, Rebecca Naomi Jones, a fine singer and whirling dervish of a dancer as Laurey, James Davis as the stoic, hunkered down Will Parker, Ali Stroker as his beloved girlfriend Ado Annie, Patrick Vail as the villain Jud Fry, Anthony Cason as Cord Elam, and Will Brill as salesman Ali Hakim.

The play started its musical journey in 1931 as Lynn Riggs's Green Grow the Lilacs. It wound up with Rodgers and Hammerstein, who in 1943 made it into the first of their many shows. In 1944 it won a Pulitzer Prize. The play was a huge commercial hit and ran on Broadway for more than five years. Revivals of it over the years have won numerous Tony Awards. The 1955 movie, starring Gordon MacRae, Shirley Jones and Rod Steiger, garnered three Oscars.

The folks connected to the original play really should have taken some time to give people in the audience a little history about sprawling, ever-green and inviting Oklahoma that was so central to the show. The big push for statehood started in the 1889 Oklahoma Land Rush, in which 50,000 energetic settlers raced across the territory's plains in wagons, carriages and on horseback to claim two million acres of free land, a race into history sanctioned by the U.S. government as a way to populate the huge piece of Midwestern landscape. As the new settlers developed it, the need for statehood grew. Ironically, after the success of the play, the state of Oklahoma named the title song of the musical as its official state song.

I’m sure they voted for it on a beautiful morning at the start of a beautiful day.

PRODUCTION: The play is produced by Leve Forward, Eva Price, Abigail Disney, others. Scenic Design:  Lara Jellinek, Costumes: Terese Wadden, Lighting: Scott Zielinski, Sound: Drew Levy, Choreography: John Heginbotham. The play is directed by Daniel Fish. It has an open-ended run.

   

When Women Ran Hollywood

 

Hold on. When did women—who produced only 18 percent of the 100 top-grossing movies of 2018, whose screenplays constituted a mere 15 percent, and who directed a microscopic 4 percent—ever run Hollywood?

 

Here’s how I found out about this little-known history. While researching a novel set in 1919 about vaudeville, the live variety shows that were America’s favorite form of entertainment at the time, I learned its demise was caused in part by the growing success of silent movies. The obvious question was, what could make silent movies—with their melodrama, bad acting and, you know, silence—more desirable than a live performance of, for instance, a regurgitator who could swallow and then upchuck items in the order the audience determined? (Audiences loved regurgitation, by the way; also sword swallowing and fire breathing. In addition to a wide variety of acts, there was a lot of ingesting stuff that one just shouldn’t.)

 

What was so great about silent movies? I soon found myself wandering down a succession of internet rabbit holes. (When it comes to research, most writers can get so rabbit-y we practically sprout long floppy ears.) First, I sampled a few films and found that they were more complex, well-acted, and creatively-filmed than I’d expected. Mary Pickford’s movies or Gloria Swanson’s, for instance, are surprisingly subtle.

 

But far more fascinating was the fact that women were a driving force in early filmmaking. Up until about 1925, “flickers” weren’t considered terribly respectable, so if you could get a real job, you avoided a career based on these flights of fancy. Conversely, if you were shut out of most employment because of, say, your gender, Hollywood beckoned. 

 

Consider the following:

  • Women worked in almost every conceivable position in the industry, from “plasterer molder” (set construction) to producer.
  • There were a few popular actors, but actresses were the stars.
  • In 1916 the highest salaried director was female.
  • In 1922 approximately 40 production companies were headed by women.
  • An estimated half of all screenplays produced before 1925 were written by women. 
  • For over twenty years, the most sought after and highest paid screenwriter was female. 

 

Why had I never heard of any of this? I was familiar with directors D. W. Griffith and Cecil B. DeMille, producers Sam Goldwyn and Jack Warner, writer-director-actor Charlie Chaplin, but had heard of almost none of the following sample of brilliant and powerful women.

 

Studio Chiefs

Alice Guy-Blaché began as a secretary for a French motion picture camera company. Fascinated by the medium’s possibilities, in 1896, she asked her boss if she could make a short story film—the first ever!—to promote the camera. He agreed as long as she didn’t shirk her secretarial duties. By the time she moved to America in 1907, she had produced 400 such films. She founded a new studio, Solax, served as president, producer, and chief director, and produced 300 more films by the end of her career. Her feature-length films were quite sophisticated, focusing on subjects of social import such as marriage and gender identity.

 

Mary Pickford, best known as “America’s Sweetheart,” was the most successful and highest paid actor of her time. She was also a shrewd business woman. Along with D. W. Griffith, Douglas Fairbanks, and Charlie Chaplin, she co-founded United Artists in 1919, and was arguably the most financially astute among them. Chaplin recalls that at a meeting to form the studio, “She knew all the nomenclature: the amortizations and the deferred stocks, etc. She understood all the articles of incorporation, the legal discrepancy on Page 7, Paragraph A, Article 27, and coolly referred to the overlap and contradiction in Paragraph D, Article 24.”

 

Screenwriters

Known for her sharp wit and snappy dialogue, Anita Loos’ career as a screenwriter, playwright, and novelist spanned from 1912 to the late 1950s. Douglas Fairbanks, as much an athlete as an actor, relied upon her to accelerate his career and devise an ever-expanding list of “spots from which Doug could jump.” Most famously, she adapted her bestselling novel Gentlemen Prefer Blondes as a silent film in 1928, which was the basis for the 1953 version starring Marilyn Monroe.

 

Frances Marion is hard to top even by today’s standards of output and success. Until 1935, well after women’s influence in Hollywood had waned, she remained the most sought-after and highest-paid screenwriter in America, male or female. She acted, directed, produced, and is the only woman to have won two Academy Awards for Best Original Screenplay. She was mega-star Mary Pickford’s preferred writer (and best friend) and saved many careers in the tumultuous years when the industry was converting to sound. Above all, Frances was a generous collaborator, hosting famous “hen parties” at her house as a sort of support group for Hollywood’s female filmmakers.

 

Directors

Lois Weber was also a studio head, producer, screenwriter, and actress, but as a director she was as well-known as D. W. Griffith and Cecil B. DeMille. In 1916 she was the highest paid director, male or female, earning an unprecedented five thousand dollars a week. She was the first woman member of the Motion Picture Directors Association, with 138 films to her name. They were often morality plays on social issues such as birth control, drug addiction, and urban poverty, particularly as these affected the plight of working-class women.

 

Dorothy Arzner, the most prolific American female director of all time, started in the scenario department in 1919 typing up scripts. In the fluid Hollywood work environment, she quickly progressed to cutting, editing, and writing, and by 1927 had directed her first film. With the advent of sound, she invented the boom mic to allow actors to move about the set without bumping into sound equipment. Arzner was gay and fairly open about her personal life, wearing men’s clothing, and living with choreographer Marion Morgan for 40 years. Despite her gender and orientation, she was able to work steadily as a director until she retired in 1943.

 

Renowned early film historian Anthony Slide has said, “Women directors were considered equal to, if not better than, their male colleagues.”

 

Actresses

Florence Lawrence is credited with being the world’s first movie star. In 1908 she was making 100 flickers a year with D. W. Griffith at Biograph, the world’s top studio at the time. However, she was known only as “the Biograph Girl” because the studio didn’t want to increase her burgeoning fame, and thus her ability to demand a higher salary, by naming her. She moved to upstart IMP studios, and was involved in perhaps the first wide-scale publicity stunt. The studio quietly fed the papers a story that she’d been killed by a streetcar, then took out ads declaring “WE NAIL A LIE,” claiming that other studios were trying to ruin her career. Fans went crazy for the story, and a public appearance shortly thereafter resulted in mayhem as a huge throng rushed her, pulling buttons from her coat and the hat from her head.

 

Mabel Normand was a brilliant comic actress, starring in approximately 200 films, most at Mack Sennett’s Keystone studio, and was the first actor to be named in a film’s title (e.g. Mabel’s Lovers in 1912). She also directed many of her own films, including those in which she was featured with a young Charlie Chaplin. Though he erroneously claimed directorship on several of them, Mack Sennett has said that Chaplin “learned [to direct] from Mabel Normand.”

 

Every one of these women was a multi-talented powerhouse, committed to the success of her films, the industry, and other female filmmakers. And for each of those named above there were many, many more.

 

Unfortunately, with a few notable exceptions, their careers were generally over by the end of the 1920s. As Hollywood historian Cari Beauchamp said, “Once talkies arrived, in the late 20s, budgets soon tripled, Wall Street invested heavily, and moviemaking became an industry. Men muscled into high-paying positions, and women were sidelined to the point where, by the 1950s, speakers at Directors Guild meetings began their comments with ‘Gentlemen and Miss Lupino,’ as Ida Lupino was their only female member.”

 

Their names may no longer be widely recognizable, but these were among the many women who built and ran early Hollywood, shaped the industry in myriad ways, and influenced what we see on the silver screen even today.

 

Will women ever “run” Hollywood again—or even advance to relatively equal numbers as studio heads, producers, directors, and screenwriters? This remains to be seen, of course. But powerful leaders, like executive producer, showrunner, and director Shonda Rhimes, Amazon Studios head Jennifer Salke, Disney TV Studios and ABC Entertainment chair Dana Walden, producer-director Ava DuVernay, and director Patty Jenkins, among many others, offer hope. 

 

“Demanding what you deserve can feel like a radical act,” Rhimes has said. Radical, perhaps, but not new. All it would take is a return to the good old days of early Hollywood.

 

Trump’s War on Civil Rights and Beyond: A Conversation with Acclaimed Political Analyst and Civil Rights Historian Juan Williams

 

 

Republican presidential candidate Donald Trump urged black voters to ditch the Democratic Party and “try Trump” at a campaign rally on August 19, 2016, in the predominantly white suburb of Dimondale, Michigan. He said of black Americans: "You're living in poverty. Your schools are no good. You have no jobs. Fifty-eight percent of your youth is unemployed.” Trump then asked, “What the hell do you have to lose?"

            

As it turned out, African Americans—among others—are losing a great deal under President Trump, as acclaimed commentator, journalist and historian Juan Williams argues in his timely and illuminating new book, “What the Hell Do You Have to Lose?”: Trump’s War on Civil Rights (Public Affairs). 

 

Mr. Williams contends that Trump’s now infamous campaign speech and other statements on race have conveniently ignored African American history and progress in the decades since the passage of the 1964 Civil Rights Act and the 1965 Voting Rights Act. He denounces the president’s ingrained tendency to intentionally distort history to fuel racial tensions for his political advantage.

 

In “What the Hell Do You Have to Lose?” Mr. Williams deftly weaves the remarkable story of the struggle for civil rights into his account of how the Trump Administration has been bent on turning back the clock and undoing or threatening advances in voting rights, school integration, equal employment, fair housing, and other areas. He describes the unprecedented threat to civil rights under Trump as he chronicles the president’s personal and family history of discriminating against people based on race and his record of hostility to African Americans, including President Barack Obama.

 

In describing the losses for African Americans under Trump, Mr. Williams also provides glimpses from the struggles of heroic pioneers who fought for civil rights and for a better life for all Americans. He shares the stories of activists such as Bob Moses of the Student Nonviolent Coordinating Committee, who braved the violent Jim Crow South to register African American voters; James Meredith, a US Air Force veteran, who became the first black student to enter the University of Mississippi in 1962 in the wake of bloody riots at “Ole Miss”; A. Philip Randolph, a union leader who made strides for equal employment rights in the Jim Crow era; and Robert Weaver, who championed fair housing programs and became the first black cabinet secretary as the head of the Department of Housing and Urban Development. 

 

Mr. Williams takes pains to explore the past in the belief that knowledge of history is the key to understanding the present and to shaping the future as he explains how the principles of equality, tolerance, and justice today are at stake for all citizens.

 

Mr. Williams is an award-winning journalist, political analyst and historian who has covered American politics for four decades. He has written several other books, including Eyes on the Prize: America’s Civil Rights Years 1954-1965; Thurgood Marshall: American Revolutionary; This Far by Faith: Stories from the African American Religious Experience; My Soul Looks Back in Wonder: Voices of the Civil Rights Experience; and Enough. His articles have appeared in the New York Times Sunday Magazine, Time, Newsweek, Fortune, The Atlantic Monthly, Ebony, Gentlemen’s Quarterly, and The New Republic. Mr. Williams is currently a columnist for The Hill, and was a longtime correspondent for The Washington Post and NPR. He also cohosts the Fox News Channel’s debate show The Five, and appears on other Fox shows where he regularly challenges the orthodoxy of the network’s right-wing stalwarts. 

Mr. Williams generously spoke by telephone about his new book, his work, and his commitment to sharing historical context when discussing current events. Following our conversation, he added this opening update for readers on his historical perspective and recent events.

 

Juan Williams: I want to thank Robin for the opportunity to talk to history lovers on the History News Network. When I wrote “What the Hell Do You Have to Lose?: Trump’s War on Civil Rights,” my goal was to answer the question that then-presidential candidate Donald Trump posed to Black America: ‘What do we have to lose from a president who doesn’t care about African Americans?’

 

My book dissects Trump’s unprecedented assault on everything America has achieved over the last half century to move forward on race relations--from voting rights to integrated schools to equal opportunity in employment and fair housing. These changes were achieved by people who made sacrifices: they put themselves at risk of being expelled from school, losing jobs, losing their mortgages, and facing constant threats of violence, and some even lost their lives.

 

I tell stories of these courageous civil rights heroes so that we can better understand that progress came at great cost. Starting from that baseline helps the reader to understand how much the nation has gained, and how much we have to lose from Trump’s effort to return to the past or, in his infamous words, “Make America Great Again.”

 

Since I finished writing What the Hell Do You Have to Lose? in 2018, very little has changed. The president continues to tell lies about blacks, Latinos, and immigrants. He makes racial minorities and immigrants out to be a threat to America; we become the enemy, all lumped together as barbarians who commit crimes, take advantage of social programs, and abuse affirmative action policies.

 

These lies are aimed at the ears of white America at a time when pollsters report that large numbers of older whites are anxious about the growing number of black and brown people, and immigrants of all colors, in the USA.

 

Trump’s most frequent refrain is that life is better for minorities with him as president. He dismisses talk about increasing racism and anti-Semitism as overwrought. Even FBI reports on the increase in hate crimes since he has been president are waved away as liberal nonsense. Instead, he frequently tells interviewers, for example, that the black unemployment rate is currently “the lowest in the history of the country.”

 

This is a distortion.

 

First, black unemployment under Trump has never reached its lowest point in history. Though it did hit 5.9 percent last May, Labor Department data indicates that black unemployment dropped to 4.5 percent in 1953. According to the Washington Post Fact-Checker, this distortion earned Trump three out of four Pinocchios for his unfounded claim.

 

In addition, the president fails to mention that black unemployment has been increasing. As recently as February 2019 it reached 7 percent. And throughout, black unemployment has remained more than double white unemployment.

 

Unfortunately, these are the kinds of distractions from the truth about race relations that Americans--black and brown Americans in particular--have come to expect from our president.

 

He’s a man who couldn’t condemn the unique horrors of white supremacy that resulted in the death of a woman in Charlottesville in the summer of 2017.

 

Trump won’t talk about the white supremacy that led to the death of Heather Heyer in Charlottesville, of eleven Jews in Pittsburgh, and of fifty Muslims in New Zealand. But he couldn’t be happier to talk about Congresswoman Ilhan Omar, whose recent treatment by Trump and the Republican Party has less to do with condemning anti-Semitism than with a political ploy to silence an immigrant, black and Muslim woman who dares to wear a hijab in Congress and speak her mind about controversial subjects.

 

He’s a man who, hours after it came out that a white supremacist in New Zealand slaughtered fifty Muslims during their Friday Prayers, said that white nationalism was “not really” a major threat, even as the killer’s manifesto described Trump’s 2016 victory as “a symbol of renewed white identity and common purpose.”

 

Indeed, Chicago Mayor Rahm Emanuel, even after condemning the courts for dropping the charges against disgraced actor Jussie Smollett, slammed Trump for speaking on the issue, ordering him to “stay out” because “the only reason Jussie Smollett thought he could get away with this hoax is because of the environment President Trump created.”

 

Previous Republican Administrations made good faith efforts to improve relationships with African Americans.

 

Presidents Reagan and Bush made a point of speaking at the NAACP, seeking out advice from prominent black intellectuals, and appointing African Americans to the highest positions in government. And under President Obama black and white members of both parties were willing to start having the messy, yet necessary conversations about issues that continue to prevent us from moving forward on race as a nation.

 

On the other hand, President Trump has just one African American in his Cabinet. Despite agreeing to some criminal justice reform measures, Trump has failed to deal with issues of police brutality that have led to persistent tensions with black America and the creation of the Black Lives Matter movement. Instead, he ran a campaign, and now a government, fueled largely by white American fears that the country is being stolen from them by ungrateful African Americans, undocumented immigrants, and radical Muslim terrorists.

 

According to Trump, the problem is not the harsh, unfair reality of high levels of segregation in neighborhoods, schools, and jobs. The problem in his eyes is a football player, Colin Kaepernick, kneeling in protest during the playing of the national anthem.

 

Trump is also quick to anger when prominent black people challenge his policies. He goes out of his way to tongue-lash black critics, insulting LeBron James, Steph Curry, Jay-Z, and other black celebrities. He regularly disparages black women in Congress who disagree with his policies. 

 

To get away from the day-to-day static around Trump’s mishandling of racial issues, the American people need to know about the civil rights heroes like Bob Moses, James Meredith, A. Philip Randolph, and so many others, because we need to understand how much blood, sweat, and tears it took to create the thriving Black America of today and protect us from those who, like President Trump, couldn’t care less.

 

 

Robin Lindley: Congratulations Mr. Williams on your powerful new book on Trump’s war on civil rights. You take pains to weave history into your reporting, and you are a historian in your own right with your acclaimed books such as Eyes on the Prize, a study of the Civil Rights Movement, and your renowned biography of Justice Thurgood Marshall. In your new book you share the story of civil rights advances that are now threatened under Trump. Your efforts as a journalist and historian are refreshing in this era of fake news. 

 

Juan Williams: I love history. I find it eye-opening because it not only tells me so much about the present but also gives me a structure for thinking about the future. For me, history has always been a revelation. Even when I was a child, when I learned about the past, I thought, Oh, my goodness. Who knew?

 

Robin Lindley: Did you have training in history when you went to school or did history just naturally come into your writing when you were a reporter?

 

Juan Williams: No, my love of history is an extension of my interest in the news, a fascination I had from my days as an immigrant child in a city with close to a dozen newspapers, New York. I found newspapers and daily journalism on radio and television to be a reason to look into history. The rest of the story, the back story if you will, was the history of the characters and events, and the ideas that animated the politics of the day. I would see something that happened in a prior period in American life and I would go to the library in New York City, where I grew up, and I’d read a book to investigate the story and to understand how we came to the point where we were then and how that article that I was reading in fact was representative of a larger and longer vein of history.

 

Robin Lindley: There's a new twist in the news every day concerning our history, and particularly about race. Attorney and Trump “fixer” Michael Cohen called Trump a racist, a con man, and a cheat at a public Congressional hearing. I don't think that was news to many of us, including the Republicans on the committee. You certainly delve into the history of Mr. Trump's racial insensitivity as well as his lack of historical knowledge as he attempts to erase the past.

 

Juan Williams: I write of the reality of the sacrifices, even people giving their lives, to accomplish racial justice in this country. I'm not suggesting this book is a complete telling of the civil rights movement; I structured the book to include the history as an introduction to the background for young people and a reminder for people who may have forgotten the past. My premise is that we have traveled such a distance on race, going back to our origins as a nation with legal slavery and then government-enforced legal segregation that extended well into the 20th century.

 

I had written some of that story in my first book, Eyes on the Prize. More of it is in my second book, a biography of former Supreme Court Justice Thurgood Marshall.

 

That brings me to this book and why I was offended by Trump telling white audiences that black people had nothing to lose by voting for him. The quote, "What the hell do you have to lose?" came from him during the 2016 election campaign. He argued to whites that these black people live in such bad neighborhoods in terms of the violence and crime, with bad schools and a lack of jobs. And, some white person driving through a troubled black neighborhood might say, "Well, it looks like he has a point."

 

There's so much missing context in terms of that distorted picture of black American life. First, no fake news, just the facts: The nation’s black population is doing better than ever before by so many measures in terms of income, education, business ownership, occupying political office, and the like. I could go on. But Trump doesn't seem to plug into that part of the story. Instead, he takes a perverse delight in poverty and crime among blacks, Latinos and immigrants. Again, this is why I think the history is so important. 

 

The history of progress for American minorities is needed to inform someone hearing Trump’s indictment so they are not fooled. With history in mind they will know what the hell striving minorities in this country have to overcome and a history lover knows how far minorities and immigrants have come despite those obstacles.

 

That indictment of black people by Trump is undermined by the history of all the struggle and sacrifices made to bring black people to this point. And also, it opens eyes to the idea that the African American community is not all poor and poorly educated. In fact, black America in 2019 is at historic heights in terms of income and education. Almost 40 percent of black people make between $35,000 and $100,000 per year. Another 11 percent are earning between $100,000 and $200,000. So that's half of the black population living in the American middle class. And then you have the reality of black executives who have led very successful American companies like Time-Warner, Merrill Lynch, American Express, and Xerox. 

 

Those stories of black achievement are not part of Trump telling whites that blacks have nothing to lose. An informed listener will know they are being misled by Trump because they know the history of black trailblazers, beating the odds to make new paths in American society, a society that not only enslaved black people but legally segregated them and still discriminates against them.

 

And once voters – including Trump voters – are aware of that history, I think their attitude might shift from contempt to admiration. They might say, ‘Oh gosh, look at these previously disenfranchised people who have made their way. Wow.’

 

We should celebrate these people who have remained loyal Americans and hardworking people who believe that they can make it in this society and that they can achieve the American dream. But to the contrary, they're vilified and made out to be a bunch of people with nothing to lose by the man who then becomes President Donald Trump.

    

Robin Lindley: It seems too that old stereotypes have re-emerged under Trump. By old, I mean before the Civil War, such as the recent controversy about the governor and attorney general of Virginia appearing in blackface as young men. How do you see this issue of political leaders who engaged in this racist mockery?

 

Juan Williams: Part of "Make America Great Again," Trump's campaign slogan, was to create nostalgia for some time before the civil rights movement, before the women’s movement, before America became more diverse, an earlier social hierarchy in which white people, especially white men, were at the top and people of color were below them. Black people fit into that picture as happy-go-lucky people, singing and dancing, with even white people feeling free to put on blackface and mock black Americans in minstrel shows. Apparently, we are to believe racial minorities were happy before all these northern agitators came down here. This is a modern version of segregationists telling each other that “Our black folks are happy folks.”

 

That was delusional thinking on the part of racists who didn't want to hear anything about equal rights or civil rights. So, when you look at this generation of white leadership in Virginia, which held the capital of the Confederacy in Richmond, you see that they were in school [after the Civil Rights Movement]. You look at Governor Northam, and he was [wearing blackface] in the 1980s, and then you look at the attorney general, and again, that was also the eighties. So even educated white men in the 1980s felt free to join in the mockery of their fellow Americans.

 

For young white men in Virginia and in fraternities, it was just acceptable behavior to replicate old Jim Crow dancing happily, making themselves fools for the entertainment of other whites. Blacks were portrayed as less intelligent, less hardworking, less trustworthy, as liars and cheats--people that you wouldn't want to be around, except to laugh at as fools. Certainly not anybody you would trust as an employee or as a public official.

 

So here you have, in the modern America of the 21st century, a reminder of how even the best-educated whites were also party to this longstanding dehumanization of black people by putting on blackface. It speaks again to the power of history to inform our understanding of who we are and who we elect today. Remember, both of those Virginia officials won the black vote in Virginia. What's curious about this is that Gov. Northam has continued to receive support from black Virginians who say, “that's just the way it is, and let's look at his policies now,” and who hope, in fact, that this might raise the race issue to the point where he and others feel as if they have some responsibility to make amends.

 

Robin Lindley: I can't help but think too that this ties also to the eugenics movement in early-twentieth-century America and white supremacy. Attorney Bryan Stevenson, who founded the Equal Justice Initiative and the Legacy Museum in Alabama, said that he believes that the worst legacy of slavery isn’t involuntary servitude but the legacy of white supremacy that has echoed through time.

 

Juan Williams: That's my point. So, I say amen to Mr. Stevenson because this extends not just to what you describe as the legacy of white supremacy in terms of our political institutions. It extends also into our assumptions about what we accept as normal in America. And I wrote this new book to open our eyes to see discrimination and inequality across the years so we can better understand it in the current context. 

 

You think about the contemporary American standards of success, standards of beauty, the exercise of power, the standards of intellectual achievements. President Trump, without any sense of history, acts as if black people are not contributing to the country, not up to these standards. 

 

Even if Trump wants to focus on the higher level of poverty among blacks today, he can’t fool people who know their history. They can say to themselves, “Hey, wait a minute. Black people were kept out of our institutions, beginning with schools. Black people were kept from buying real estate in neighborhoods that would have allowed them to amass wealth through the value of homes and property. Black people were kept out of American business and had no access to bank capital. Black people were kept out of the American military. Black people were kept out of our sports.”

 

For a young person today, you have to think about all these things when you hear someone like Trump putting down black people. He is appealing to what Mr. Stevenson said is the legacy of white supremacy.

 

Robin Lindley: I don't know if it matters whether we call Trump a racist, but he is a person who has a personal and family history of discriminating against people based on their race, which you document in your book.

 

Juan Williams: Yes, I think it's very important for people to know Trump’s personal history. You can inform people about the power of American history, but when you talk about it in terms of individual history you see the roots of his kind of leadership; the basis of so much of a person’s thinking as they grow to adulthood and then into power. That is a very revealing and illuminating backdrop for readers of American history. I think that's why biography, by the way, is such an important branch of American history. 

 

So, when I talk about Trump, I start with his father and his father was arrested at a Klan rally. We don't know if he was a Klan member, but we know he was arrested at a Klan rally in New York in the early part of the 20th century. 

 

 And then you come forward to the Trump family business agreeing to a deal with the US government that had charged them with housing discrimination in New York City in their housing units. In fact, in this book, I write about Woody Guthrie and others who were writing songs about the rank discrimination at Trump properties in New York City in mid-20th century. 

 

Once people have an understanding of Trump's upbringing and experience with race, they then might also come to understand why he's the guy who, when five black and Latino boys were charged with beating and raping a white woman in Central Park, leapt to assume their guilt. Subsequently, when they were exonerated of this crime on the basis of DNA evidence, Trump did not recant or apologize for having run a full-page newspaper ad calling for the death penalty for these boys. He said nothing.

 

And Trump of course engaged in the whole birther argument against President Obama, trying to diminish the first black president by making him an illegitimate president. Some people might say, well, you know, what's the big deal, where is the birth certificate?—without understanding that it fits into this ongoing pattern in Trump’s life of appealing to racist sentiments that vilify people of color as strange, alien, foreign, dangerous, and coming to take your job or to disrupt your neighborhood. This is part of who Trump is and it's part of American history.

 

Robin Lindley: Trump's efforts to destroy any of the legacy of President Obama seem pathological. He and his party have tried to erase everything that President Obama accomplished. What's your sense of this obsession?

 

Juan Williams: A lot of the anger at President Obama again comes back to the idea that Obama was the first black president and did not rise to success through established white business, military or even political hierarchy. 

 

The argument, especially from a lot of the Trump people is, who is this guy? How did he get here? Who were his patrons? They derided him as a “community activist.” And even more, they expressed fear that Obama was going to take a special interest in caring for black Americans—that he was going to be the black president and neglect white America or even get revenge on whites.

 

That is a very interesting twist on reality. Historically it was black people who were excluded by white politicians from programs like the GI Bill, the programs to get people into schools and to help them buy housing. Even parts of the New Deal were not open to workers in unskilled, non-union jobs dominated by black people. This is the history of the country. Yet now we are told to focus on white, working class fear that the blacks are being given something for nothing. This is ridiculous if you know history.

 

Of course, President Obama's response to this psychological twisting by some people was interesting too because he was always on guard against the notion that he was only the president of the black people – not the president of all the people. Black activists on the far left often criticized him for not doing enough for black folks. It was kind of a Catch-22 in my mind.

 

But back to the Trump perspective. Again, it's the idea that if you undo Obama policies, what you are doing is making America great again by reorienting all the policies back to big business, and taking them away from trying to make amends for prior discrimination or high levels of inequality in American society. It's less about raising up those who have been left behind and more about rewarding those who are in power or who are at the top of the economic ladder. 

 

To me, it has as much to do with symbolism as it has to do with the actual undoing of the policy. If you think about things like reversing environmental regulation or refusing to put in place consent decrees to deal with police accused of being brutal in their treatment of blacks, it is unbelievable. It is striking that Trump is able to convey to his political base of support that, in order to make America great again, you don't want to lift up those who have been left behind, especially people of color. 

 

We haven't mentioned this, but I think it's very important to include immigrants, people of all colors but especially immigrants of color whom Trump infamously described as coming from “shithole” countries. Right from the start of his campaign, he demonized immigrants and specifically Mexicans as rapists and criminals. And again, the idea is these people are coming and taking advantage of the USA, when in reality these people are oftentimes working in industries that can't find workers. And these are people who are trying and striving so hard to achieve their American dream.

 

Robin Lindley: The hate speech has been deafening. The events in Charlottesville had to be shocking to you. I never thought that I would see Nazi and KKK rallies in 21st century America, and you now have a president and a political party, the Republican Party, silent about this sort of racism and violence.

 

Juan Williams: Yes. Let me just say when Trump talks about fine people on both sides, the most forgiving interpretation you can give to him is to say that he sees these self-identified white supremacists as fine people who are simply standing up for Confederate statues and monuments to soldiers who fought to break up the United States. The phrase “fine people” assumes that these neo-Nazis were generally good Americans, just with different points of view about historical markers. In fact, they were celebrating racists and traitors to the American flag.

 

History can help you stop and look through the fog of words from the president. You will see that the history and tradition being celebrated is, first, one of the Confederate Army attempting to secede from the United States and break apart our country, and, second, one of defending slavery and imposing that kind of oppression on human beings in the United States of America, a country based on the proposition that all men are created equal.

 

And again, only history can inform you of this distortion, which might fly by your ears while listening to the president of the United States. Fine people on both sides. Well, no. These are people who are celebrating monuments that are in fact intended as reminders of that legacy of white supremacy. Even if you were to say, and in many cases I can understand someone saying, that the Confederacy is part of American history, like it or not, and that it's important we know about it, for better or worse.

 

Absolutely. It's also true that in Germany they do not celebrate with markers and monuments to the Third Reich.

 

Robin Lindley: I was thinking about the German example too. Trump and the Republican Party support the rollback of the Voting Rights Act and the undermining of democracy with voter suppression, trumped-up investigations of voter fraud, and gerrymandering. How do you see these efforts and why are Republicans virtually unwilling to contest Trump's racism and other faults?

 

Juan Williams: I expect it's a matter of pure politics. To understand the president and his support from self-identified Republican voters in a contemporary context, you have to recognize that any Republican who challenged Trump's racist rhetoric and his other flaws would lose the Republican base. It has become the case that the Republican Party of 2019 is truly better described as the Party of Trump.

 

And, when it comes to issues like race, I think Republicans don't see much benefit in standing up and speaking honestly about Trump's racial attitudes, despite the fact that 49 percent of Americans think the president of the United States is a racist, according to a Quinnipiac poll last year. It's incredible that nearly half of us would make that statement to a pollster. But that's repeatedly been the case. The same poll has 22 percent of Republicans saying this president has incited white supremacist behavior and actions in the country.

 

I'm very disappointed in where we are in terms of our nation’s civic morality and commitment to the historical premise of our country, equal opportunity for all. Somehow tribal political allegiances are failing to hold to those values. We seem to be going in the other direction, intent on exclusion instead of inclusion. You can't not see it if you open your eyes, but [Republicans] choose to ignore it, and in some cases to use it. 

 

You mentioned Michael Cohen’s testimony in which he called the president a racist in front of the Congress and the nation. And there was a congressman who then introduced a black person who was a friend of the Trump family and was then promoted into a position in government by Trump. And he had this person stand up, as if her presence were evidence that Trump is not racist. Well, again, this requires that you lose sight of not only Trump's personal history but our nation's history. Trump appeals to elements with a grievance against minorities and uses that to elevate himself into power. So you'd have to ignore all of that. But again, using somebody as a prop to excuse it may be worse even than ignoring it. 

 

Robin Lindley: That was a chilling moment, wasn't it? As that Republican member of Congress displayed this young black woman, it reminded me of antebellum images depicting slaves displayed on auction blocks.

 

Juan Williams: I hadn't thought of that one. Gosh.

 

Robin Lindley: If Trump has done anything positive, it seems he has sparked a conversation on slavery, cruel Jim Crow laws, racism, mass incarceration and more, and you have added to the dialogue. I may be in a bubble, but I've heard much more about this history in the past couple of years.

 

Juan Williams: I think so, and we even see it with the candidates campaigning for president on the Democratic side. They were in Selma, Alabama [observing the 54th anniversary of Bloody Sunday, the Selma march for voting rights], looking at America and the civil rights era and how we've come out of it. Again, the history informs our understandings about what's going on now and why there's such concern about white nationalism and negative racial attitudes flourishing under Trump.

 

Robin Lindley: You provide context in your book by weaving the civil rights history into your discussion of the rollback of programs in the past two years. You focus on activists such as Bob Moses, who risked his life to register black voters in the brutal Jim Crow South.

 

Juan Williams: I have respect for people who stand up for principle, but to sacrifice their lives is unbelievable. People don't understand the kind of courage it took to go up against segregationists who had guns and the authority to put you in jail or beat you without consequence if you said that all Americans should have the right to vote. It almost sounds ridiculous that you would have to fight for such a thing.

 

Bob Moses is still alive and living out a kind of second chapter of his life and his legacy as a civil rights pioneer. Now he's involved with something called the ‘Algebra Project.’ It teaches math skills to young people of color, giving them a chance to gain equality in terms of preparation for this high-tech economy. I wanted to focus on his work in Mississippi and in the South in general on voting rights because there were people, even civil rights heroes, who would not have been as courageous as Bob Moses in going down and directly challenging the Southern white segregationist power structure for failing to allow black people to vote. And it wasn't just challenging the white power structure. He had to challenge black people in a way that they had to put themselves on the line to stand up against that power structure. They faced the risk of violence, risk of jailing, and risk of loss of their jobs and mortgages, and all the rest, in order to take part in what Bob Moses was promoting. There's a lot of people who deserve similar credit for leading that effort and that movement.

 

But when we come to today and the Republican Party, you start to see that they view minority voting, and especially black voting, as a threat. Then you come to understand the historical roots, and you start to look at what is often identified as voter suppression efforts by today's Republican Party, and you say we've lived through this as a country before, in a different form, in terms of outright denial of the vote. But now we come to efforts to push people off the voting rolls, limit the polling places available in minority neighborhoods, and limit the time available to vote at those polls. And you're reminded of things like literacy tests for the right to vote and limited time to register to vote for blacks. There were crazy tests [in Jim Crow states] about how many bubbles are in a bar of soap, or how many marbles are in a jar. These tests were all intended to deny black people the right to vote in earlier African American history.

 

Robin Lindley: Under President Trump, former Attorney General Jeff Sessions was curtailing enforcement of civil rights laws.

 

Juan Williams: This is such an interesting history. Now of course, that goes beyond Jeff Sessions, who is gone as Attorney General. 

 

Sessions didn’t want to enter into consent decrees with local police departments, agreements that encouraged making peace between local black communities, the Black Lives Matter movement, and police departments that found themselves charged with racism on the basis of young men being shot and killed, police brutality, and, of course, high rates of incarceration, and all that. On all of these counts, the idea was that the federal government under Obama tried to have a healing, salutary effect. Then, in comes the Trump administration with Jeff Sessions as Attorney General, and they say, “No, we're not interested in that.” What kind of signals does that send? To me, that's a pernicious signal. It's saying, "Trump supporters, Republicans, are not interested in that." 

 

In fact, they doubled down by giving police more military-style equipment. Again, this was a real signal of their antipathy towards racial peace in the country, in my mind.

 

Robin Lindley: You also comment on the Administration’s undermining of public education. You recount the history of civil rights and education with your story of James Meredith who enrolled as the first African American student at the University of Mississippi in 1962. You capture the extremely violent atmosphere at that time, and the story may surprise many younger readers in particular.

 

Juan Williams: Instead of simply writing a polemic aimed at how Trump has enabled racist sentiments to rise up, I think it's important to tell people exactly what we have to lose, again in reference to Trump's statement: "What the hell do you have to lose?" Part of that speaks directly to the story of James Meredith, the US Air Force veteran who enrolled at the segregated University of Mississippi. I want people to understand that it took the federal government to go down there and defend this one person who simply wanted to go to school--this one person, again, an Air Force veteran who had served his country. And this one person was the target of such animus that there were riots and people were killed. 

 

It's hard to tell this story without explaining the level of violence that was mounted, and mounted with the encouragement of the governor at the time, Ross Barnett, with the intent of building political support for the white segregationists who would stop a black person from attending a state-funded university. 

 

And federal troops were sent into Oxford, Mississippi, so James Meredith could enroll. There was a pitched battle. Segregationists attacked US troops. Some people did not survive the night. Even a reporter was killed. And again, for so many people today, it would be hard to understand.

 

And recently, at the University of Mississippi, some of the basketball players knelt during the national anthem in protest of a rally by white supremacists who had come to the campus.

 

Robin Lindley: I didn't know about that recent racial incident. That’s heartbreaking. You also detail cutbacks in funds for low-income and fair housing and how these housing programs are now imperiled under Trump. And you share the story of Robert Weaver, a pioneering advocate for fair housing and the first Secretary of the Department of Housing and Urban Development. 

 

Juan Williams: The HUD building here is named for Robert Weaver and lots of times I point this out when I have guests in town and they say they had no idea. 

 

He's a figure who was obviously not lost to history, but I think he's not valued or celebrated in the way that he should be, now especially given Trump's background as a real estate developer. It's an interesting point of contact that there was a civil rights pioneer in just that area who insisted on equal housing rights in a country that had not only practiced housing segregation but also engaged in legal tactics to exclude blacks, Jews and other minorities from owning property in certain neighborhoods. 

 

Robin Lindley: I did want to ask about your role on Fox News. We have cheap cable, so we don't get Fox News or CNN or MSNBC. I wondered how you decided to appear on Fox News after your work on PBS and your work as a journalist. I'm heartened to learn that you present a counterpoint to the right-wing zealots who dominate Fox News.   

 

Juan Williams: Fox News is the number one cable network in the country. They don't tell me what to say and they don't censor me. 

 

For me, what's important is that I speak to an audience that otherwise wouldn't hear a different perspective. They wouldn't hear the historical background that I bring to discussions of contemporary events. The current politics in America is so partisan and so polarized and I offer people trapped in their own ideological bubble a breath of a different kind of air.

 

I think of myself as a foil to some of Fox’s leading hard-right personalities, but it can be very difficult for me. And even with this book that we're discussing, What the Hell Do You Have to Lose?: Trump's War on Civil Rights, you have people on the far right who immediately attacked the book, even before it was published. Their aim was to undercut its value and its attractiveness to readers. I find it just so alarming that you can't even have an honest discussion, an honest debate, or people will try to silence you. Anyway, I'm up against a lot in many ways, but I think this is, for my time, the most important fight to be in.

 

Robin Lindley: That’s an act of courage now. I saw your Twitter feed and was stunned by the barrage of hateful and racist remarks you receive.                     

 

Juan Williams: Yes. That's what happens. You have these trolls and then the bots pick up and they never stop. They try to bury you alive in terms of American history. Can you believe that?

 

Robin Lindley: I’m sorry you’re the target of these vicious attacks.

 

Juan Williams: I just think that's an important point for you and for the History News Network to be aware of. In the current political climate, there are people who don't want to hear it and will kill off any attempts to raise up American history, to help us better understand who we are and where we are today. I think that's what happened in terms of what you saw on that Twitter feed with this book.

 

Robin Lindley: Were there efforts to kill the book?

 

Juan Williams: No. They couldn't stop me from writing the book and they couldn't stop the publisher from publishing the book, but what I'm saying is, when you see those remarks that you described on Twitter and the like, some of those came before the book was published. And then you see this onslaught of people who say they were reviewing it and they didn't even have the book. But they are so harshly critical because they don't want that message to be given any attention. They want to dismiss it out of hand or, as I was saying earlier, they want to kill it in the cradle. And that hasn't stopped. I just know that the bots or the trolls have been intent in trying to undercut this discussion of Trump and his policies on race.

 

Robin Lindley: I regret that I haven't seen your commentaries on Fox News. Are you able to talk about the war on civil rights and share your opinions on Trump and his administration?

 

Juan Williams: Typically, the racial issues come up time and again in American society. Recently, we talked about the blackface issue. We also talked about Jussie Smollett, the actor in Chicago who claimed that he was attacked by people wearing MAGA hats. And we talk about the spike in hate crimes since Trump has been in office, including the incident in Charlottesville.

 

My historical bent informs the comments that I make about these news events. But again, it's one voice and oftentimes people don't want to hear what I have to say and it becomes contentious. But, I'm in there and I'm trying to do my best.

 

Robin Lindley: I appreciate your unique perspective and your efforts to provide historical background in your work. Congratulations on your timely and informative book on the rollback of civil rights progress under Trump. 

 

Juan Williams: I am so grateful that you guys [at HNN] love history as much as I do. It doesn't have to be about race. On any subject, I think that the more people are aware of our history, the more they'll love this country and the more they’ll understand the true purpose of this country, which is opening doors, building bridges. We have a great country but we have to protect the ideals aggressively.

Teacher Pay, Presidential Politics, and New York’s Modest Proposal of 1818

 

With her recent pledge to raise teacher salaries, Senator Kamala Harris has guaranteed the issue will be a key part of the 2020 Democratic Party primary debates. Since the slew of teacher strikes began last year, the question of teacher pay has shoved its way to the front in American politics. We shouldn’t be surprised; teacher pay has always been one of the thorniest issues in public-school administration. The first generation of amateur administrators considered one solution that will seem shocking only to those who do not understand the desperate politics of school budgets.

 

The fact that teacher pay is a difficult issue should not come as any surprise. Public schools, after all, generate no profit and teacher salaries make up the biggest portion of budgets. Even a small decrease—or a reduced increase—in teacher pay can create a lot of wiggle room in a struggling school-district budget. It is always a temptation for administrators in straitened circumstances to dream of cutting salaries.

 

The most experienced and cynical teachers, then, won’t be surprised to hear of one plan from the earliest days of public education in the United States. Two centuries ago, well-heeled philanthropists of the New York Free School Society (FSS) operated two schools for the city’s least affluent children. 

 

Money was always tight. As the trustees noted in 1820, the FSS cobbled together funds from “the donations and Legacies of charitable Individuals, the bounty of the Corporation [i.e. city government] and the munificence of the Legislature.” Their funding was never guaranteed and it was never enough; the organization lurched from one financial crisis to the next. 

 

The dilettantish leaders had not expected this. They thought that a new plan of organizing their teachers would prevent such financial shortfalls. Under this plan, called “monitorial” or “Lancasterian” education, the charismatic English school founder Joseph Lancaster promised that thousands of children from low-income families could learn to read and write without expensive teacher salaries. One teacher would supervise “monitors,” student helpers who would do all the hard work of teaching. 

 

Experienced school leaders might have seen the problem coming, but New York’s administrators had more enthusiasm than experience. They were surprised to find that their monitors did not want to work for nothing. When the monitors were asked to do so, many of them simply left to open entrepreneurial schools of their own or to take jobs in other fields. The ones who stayed demanded salaries like the other teachers.

 

The FSS leaders were in a bind. Their budget was based on the free labor of monitors. They could not afford to pay more teacher salaries. In 1818, they wondered if they might import a traditional English solution. That October, the FSS board of trustees considered a plan to turn their young teachers—the monitors—into formal apprentices. The financial benefits would be enormous. 

 

According to the long tradition of apprenticeship, young people bound by indentures could be forced to work without salary until they turned twenty-one. In exchange for learning the art and mysteries of the trade of teaching, these young apprentices would be legally bound to serve the FSS for free until they aged out. 

 

The trustees created a model indenture form, one that would bind teachers until they turned twenty-one,

 

during all which time the said apprentice shall faithfully serve said [Free School] Society, obey their lawful commands and the lawful commands of such teacher and teachers and their agent or agents . . . for that purpose.

 

Beyond saving money on salaries and guaranteeing a committed, if temporary, workforce, the traditions of apprenticeship would have allowed the board to exert control over all elements of young teachers’ lives. As was common in apprenticeship agreements, the FSS indenture agreement forbade apprentices from having sex, getting married, gambling, drinking, or attending plays. Most important, the agreement legally prohibited apprentices from leaving.

In exchange, the FSS offered food, housing, and clean clothes. When the apprentice teachers aged out, they would be granted a leaving bonus—a parting gift of cash, with the amount to be determined later by the FSS. 

 

Temporary enslavement like this might seem shocking these days, but it had been a common practice among teachers in England. In 1813, for instance, Joseph Lancaster took on young men—never women—to serve as his teaching apprentices in his thriving London school. 

 

As was common in apprenticeship systems, parents often heartily supported the “masters.” One father told his grumbling son and Joseph Lancaster in 1813, 

Your master may dispose of you in any way he pleases—he has my hearty concurrence beg leave again to repeat—Sir do with my Child as you see fit my liberty and blessing you have For ever.

 

The power to deal with instructional staff in any way they pleased was enormously tempting for New York’s early school leaders. In the end, however, New York’s FSS trustees decided against turning their employees into apprentices. They realized that their young teachers would not likely accept the plan and could veto it, in practice, by simply walking out. 

 

The lessons of the early 1800s resonate with our teacher-pay politics today. The pressures on any school budget have always been intense. There is never enough money for every program and every need. When underpaid teachers threaten to walk away, school leaders have always considered desperate plans to fix desperate financial problems.

 


How did President Reagan Deal with Violent Radicals In His Own Party?

 

President Trump’s reaction to the tragedy in New Zealand was very much in line with his take on Charlottesville—a wink and a nod accompanied by a few ambiguous platitudes. That Brenton Tarrant, the terrorist who perpetrated the massacre, would explicitly acknowledge Trump as a “symbol of renewed white identity and common purpose” comes as little surprise. After all, Trump the candidate urged supporters to beat demonstrators at his rallies. The violent participation of explicitly neo-Nazi groups like the Rise Above Movement at pro-Trump rallies around the country has also been met with a conspicuous silence from the White House, as has the rise of racist violence globally.

Scholars of terrorism have long understood that there is a connection between the perception that national leaders condone or tacitly support violence and the decision of perpetrators to move from dreams of slaughter to the reality of cold-blooded murder on the grandest possible scale.

The tragedy in New Zealand and the series of church and synagogue shootings here bring to mind an earlier day and a very different president. In the 1980s, President Ronald Reagan entered the White House as the first explicitly pro-life president. To the rescue movement, the radical fringe of the pro-life community, the news was electric, and to some, a sign of Divine grace. Until then, the deeply religious rescue movement had followed Operation Rescue—the first national rescue organization, led by Randall Terry—in pledging non-violence at clinic-level demonstrations.

From the staid confines of the annual White Rose Banquet to the increasingly acrimonious confrontations at abortion clinics, the talk was of the new President and how he was with them, even if he could not say so openly.

The rescue movement soon became bolder, holding massive demonstrations in cities across the country. Violent confrontations with police soon followed, and the movement’s faithful—white, middle- and working-class churchgoers all—became acquainted with the realities of urban jails and the less than gentle ways police maintained order in many of these lockups.

Newer and more militant groups soon emerged with names like the Lambs of Christ and Missionaries to the Preborn. At the same time, demonstrations gave way to attacks, first on property as clinics were firebombed or rendered inoperable with chemicals like butyric acid poured through locks and keyholes. 

The violence turned deadly with the murder of Dr. David Gunn in Pensacola, Florida, in 1993. Other killings followed. In my interviews with members of the rescue movement and my fieldwork at clinic-level demonstrations, it became clear that the turn to lethal violence reflected a disillusionment with America, which many protesters had come to identify with Nazi Germany based on abortion statistics and the perceived support of the nation’s leadership for the rule of law over that of conscience.

Much of this disillusionment occurred when President Reagan strongly condemned attacks on abortion clinics. On January 4, 1985, for example, the Washington Post reported that the president “condemned such violence ‘in the strongest terms,’ and ordered an all-out federal effort to find those responsible for the ‘violent, anarchist activities.’”

The rescue movement could understand the political necessity of issuing statements against abortion violence, but unleashing the federal government against the movement sent a clear signal that violations of the law would not be tolerated. The federal effort was eventually successful in crushing the rescue movement when the Justice Department under the Clinton Administration used RICO statutes [the Racketeer Influenced and Corrupt Organizations law, which had been designed for the Mafia] to effectively drive rescuers from the streets.

By contrast, under President Trump the rule of law is useful only when he deems it personally convenient. Given his actions aimed at bringing the rescue movement to heel, it is safe to assume that President Reagan would never have tolerated either the extremists at Trump rallies or the Alt-Right extremists who perpetrate violence. Nor would President Reagan have had the slightest hesitation in expressing both sympathy and empathy for the Muslim victims of the murders in Christchurch.

In the end, what is left is the methodical workings of the law and the election cycle. Many years ago, the great historian Arthur Schlesinger noted the cyclical nature of American politics. The country goes through phases of activism, which unleash populist and leftist extremes, and retrenchment, which produces a tentative unity and a period of economic acquisitiveness. Recent election results have brought a Democratic majority with an investigative fervor to Congress. Trump policies from the border wall to the withdrawal from Syria are being increasingly challenged by both parties.

Moreover, the second Unite the Right rally, in August 2018, fizzled amid a wave of disgust over what occurred in Charlottesville. Two dozen marchers turned up, only to be dwarfed by hundreds of counter-demonstrators and a strong police presence.

All signs point to the wisdom of Schlesinger’s observations. Americans may for a time be entranced by the bromides of the populists or the voices of racism and division, but the spell soon passes, and a twenty-first-century version of Reagan’s ‘Morning in America’ will invariably follow.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171715
Is the Western World Declining and Russia Rising?

 

Is the western world declining and Russia rising? Yes, according to Glenn Diesen’s 2018 book The Decay of Western Civilization and Resurgence of Russia (hereafter Decay). Such a conclusion might come as a surprise, but it was a common one among nineteenth-century Russian nationalists. Yet, paraphrasing Mark Twain’s reputed comment about his falsely reported death, we might say that the report of the Western world decaying—either in the nineteenth century or today—is “greatly exaggerated.”

 

The main reason that Diesen thinks “the geoeconomic foundations for the West's primacy for the past 500 years is ending” is that the West has overemphasized the individual, rational, impersonal, and contractual to the detriment of community, which places more value on the irrational, instinctive, and spiritual. (“Western civilisation prospered by embracing liberalism and rationalism, yet condemned itself to decay by self-identifying solely by these values.”) In reaction to this overemphasis, Western right-wing populism has emerged, spurred on by resentment toward globalization and immigration and encouraged by the likes of Donald Trump and France’s Marine Le Pen.

 

The West’s modern-day ills, according to Decay, also reflect the influence of cultural and political postmodernism, which “produces devastating nihilism by discarding traditional identities, cultures, nations, religions, the family units, genders, and civilisation itself as arbitrary . . . . Tolerance is corrupted by being translated into an embrace of cultural and moral relativism.” Diesen also believes that identity politics has converted the melting-pot ideal into that of the “salad bowl.” Unlike historian Carl Degler, who viewed the latter term favorably because it did not reject diversity and pluralism, Diesen links it with a “divisive society,” the politics of “victimhood,” and evaluating equality as equal outcome rather than equal opportunity. Emphasizing “responsibilities towards sub-groups” rather than state loyalty “undermines the rule of law, legitimacy of government, and democratic rule.”

 

Diesen’s criticism of identity politics fails to acknowledge, as Jill Lepore has indicated, that such “politics, by other names, goes all the way back to the founding of the Republic.” And “the identity politics of slaveholders and, later, of the Klan, and of immigration restrictionists, had been the work of more than a century of struggle—for abolition, emancipation, suffrage, and civil rights.” Although some fault might be found with the identity politics of recent decades for diluting “Rooseveltian liberalism” (Lepore’s view),  Diesen’s simplistic critique lacks context and nuance and undervalues the virtues of pluralism.

 

He goes on to opine that Western decay has sped up since the end of the Cold War. As globalization and new technology “centred on communication, automation, and robotics” have quickened, so too have growing inequality, “loss of civility,” political polarization, immigration, and hostility toward immigrants.

 

He also believes that Russia is rising, partly because it has “repositioned to the heart of Eurasia,” which has allowed it to reduce its reliance on the West “while concurrently increasing dependence by others on Russia,” and partly because it has balanced economic development with “culture and traditions to address the innate and imperishable in human nature, and to maintain a distinctive identity in a globalising world.”

 

Although he barely mentions nineteenth-century Russian nationalist thinkers, Diesen’s ideas are similar to theirs in many ways.  Many of them, including Dostoevsky, thought that the West (by which they generally meant Western Europe and the United States) overemphasized rationalism, individualism, and money-making. Conversely, Russia was more religious and appreciative of non-rational elements and a spirit of community. 

 

In the mid-nineteenth century, the historian M. P. Pogodin wrote that the USA was “no state, but a trading company” that “cares solely for profit . . . She will hardly ever bring forth anything great.” In his book Russia and Europe (1869) the botanist Nikolai Danilevsky claimed that European civilization was not the only type of civilization and that there were no universal values, but that different historical-cultural types existed and that Europe and Russia belonged to two very different types. Western Europe, he believed, was decaying and an emerging Slavic civilization led by Russia was the great future hope. 

 

In his Winter Notes on Summer Impressions (1863), Dostoevsky criticized the West for its individualism and materialism. Later, in an 1880 speech he gave on the Russian poet Pushkin, he suggested that Russia might aid the West in helping it to regain a more spiritualistic basis for society.

 

Diesen states that Russia offers the possibility of doing something similar today. He asks, “Will Western decadence result in completely eviscerating the West as a civilisation . . . or does Russia intend to assist in rejuvenating the traditional and spiritual in the West?” In a December 2013 address President Putin declared that “today, many nations are revising their moral values and ethical norms,” destroying “traditional values,” accepting “without question the equality of good and evil.” 

 

Diesen notes that shortly after this speech, American conservative Pat Buchanan wrote “Is Putin One of Us?” Buchanan suggested that in “his stance as a defender of traditional values” Putin is very much in tune with U.S. conservatives. He added, “Peoples all over the world, claims Putin, are supporting Russia's ‘defense of traditional values’ against a ‘so-called tolerance’ that is ‘genderless and infertile.’ . . . Putin is not wrong in saying that he can speak for much of mankind.” Later, columnist William Pfaff wrote that “the resemblance of President Putin's ambitions for his Russia to those of the neoconservatives in the contemporary United States bear a striking formal resemblance.”

 

According to Diesen, Russia’s appeal to Western conservative “populists” is its “bold and unapologetic commitment to preserve traditions, national culture, Christian heritage, national identity, and the family structure.” Besides Trump and France’s Le Pen, the author mentions other leaders, as well as Western parties, who generally share “a remarkable degree of empathy towards Russia and the belief that they have a common cause.” The group includes 

the Brexit-advocating UK Independence Party, the Alternative for Germany (AfD) party, and right-wing populist leaders who have come to power in Hungary, Poland, the Czech Republic, and Austria. In a late 2018 interview, Diesen reaffirmed this view of Russia as a defender of traditional values, asserting that “Russia has returned to its pre-communist role as the go-to country for Western classical conservatives.”

 

Despite some nineteenth-century Russian nationalists’ predictions of Western decline, few people in the western world before World War I (WWI) believed the West was declining. Prior to 1914, nine western European countries controlled over four-fifths of the earth’s lands, and the United States had also expanded the lands it controlled by annexing Puerto Rico, the Philippines, and Guam and making Cuba a protectorate. In his memoir, The World of Yesterday, the Austrian Stefan Zweig wrote of the widespread late-nineteenth-century belief in progress—e.g., electric lights, telephones, automobiles, improved sanitation and medical treatment, expanded voting, justice, and human rights, reduced poverty, and even the hope for more peace and security.

 

WWI, however, changed such optimism. Over 15 million Europeans, soldiers and civilians, lost their lives in what seemed to many a senseless war. France regained some territory it had lost to Germany in the Franco-Prussian War of 1870-71, but 3 of every 10 Frenchmen between the ages of 18 and 28 paid for the gains with their lives. In addition, the peace treaties that followed the war were unsatisfactory to many, especially in Germany, which helped give rise to Hitler. The mere title of the German Oswald Spengler’s Decline of the West (1918-1922) was just one of many signs that a fundamental change had occurred in Western confidence.

 

The rise of communism in Russia, with Stalin eventually succeeding Lenin (d. 1924) as the Soviet leader; the rise of Italian fascism in the 1920s; the worldwide Great Depression of the early 1930s; Japanese aggression in Manchuria; and Hitler’s assumption of power in 1933 further sapped confidence in Western progress. Membership in western communist parties increased during the Depression, and not a few Western intellectuals, often fooled by Soviet propaganda, believed that more hope lay in communist Russia than in the capitalist West.

 

But, as happened before and has happened since, Western decline was only temporary. From 1933 until 1945 President Franklin Roosevelt (FDR) brought the United States out of the Depression and led a coalition of powers that defeated Germany, Japan, and other nations. In 1955, one of the twentieth century’s most astute political philosophers, Britain’s Isaiah Berlin, had this to say about the 1930s: “The most insistent propaganda in those days declared that humanitarianism and liberalism and democratic forces were played out, and that the choice now lay between two bleak extremes, Communism and Fascism . . . . The only light that was left in the darkness was the administration of Roosevelt and the New Deal in the United States. At a time of weakness and mounting despair in the democratic world Roosevelt radiated confidence and strength. . . . Mr. Roosevelt’s example strengthened democracy everywhere.”

Contrasting today with late 1932, when Depression gloom was at its height, it is difficult to believe the Western world is now in worse shape than back then. Is a Russia led by an opportunistic Vladimir Putin, allied with conservatives who have groveled to the likes of Donald Trump, really one of the world’s greatest future hopes? Can the West still not produce leaders of FDR’s caliber? Can it still not reinvigorate values that once helped make it great, like freedom, democracy, equality, social justice, and tolerance?

History has demonstrated enough zigs and zags, stops and starts, setbacks and advances for anyone assuredly to predict the future. The nineteenth-century Russian exile Alexander Herzen was closer to the truth than Danilevsky, Spengler, Diesen, and others who thought they could foresee the future. In an essay on the Utopian socialist Robert Owen, Herzen wrote that “nature and history are . . . ready to go anywhere they are directed,” and “having neither program, nor set theme, nor unavoidable denouement, the disheveled improvisation of history is ready to go along with anyone. . . . A multitude of possibilities, episodes, discoveries in history and nature, lies slumbering at every step.” 

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171718
The Spies' Marathon before Patriots Day

The Boston Marathon

 

Long before there was the Boston Marathon, a couple of British spies in Massachusetts stumbled into one of their own. Captain John Brown and Ensign Henry De Berniere were sent by British General Thomas Gage on a long journey to survey the countryside outside Boston in February 1775. Gage was planning to move troops against the rebellious colonists, but needed intelligence on roads and towns for the dangerous mission. Things were tense between the armed Massachusetts colonists and the occupying British forces in Boston.

Brown and De Berniere disguised themselves to look like the typical Massachusetts resident of the time: brown clothes and reddish handkerchiefs. Their mission was on foot through eastern Massachusetts. These spies had their own colonial version of the Boston Marathon, traveling through Suffolk and Worcester counties. They needed lots of carbohydrates.

So Brown and De Berniere, along with their servant John, stopped to eat at a tavern in Watertown early in their route. They of course wanted to go unnoticed, finish their meal, and rest for the night. But when the restaurant’s waitress kept eyeing them “very attentively,” this was not a good sign for the spies. The two men tried to make casual conversation with the waitress, remarking on the fine land of Massachusetts. The waitress replied, “So it is, and we have got brave fellows to defend it, and if you go up any higher you will find it so.”

Uh-oh. That is the word spies never want to say while on duty. Getting recognized a few minutes into your mission is a tough start for an undercover operation. De Berniere wrote in his account, “This disconcerted us a good deal, and we imagined she knew us...” They conferred with their servant John, who overheard that indeed the waitress had recognized the British officers. Their mission was compromised and they were in danger. They decided not to go back to Boston, though, because they would look foolish.

The British spies did cancel their plans to stay overnight at the tavern. It was not clear if they left a tip for the waitress. Brown and De Berniere continued on to a safer inn. They eventually made it all the way to Worcester, “very much fatigued” from their journey. But as they continued on their “secret” tour of Massachusetts, more people were taking notice of them. They did keep collecting information from all the towns.

On the way back to Boston a classic Nor’easter arrived, adding snow and rain to their obstacles. De Berniere wrote “it snowed and blew as much as ever I see it in my life.” Welcome to Massachusetts! They walked very fast, fearing pursuit. This was a race for their lives. De Berniere wrote that they were exhausted “after walking thirty-two miles . . . through a road that every step we sunk up to the ankles, and it blowing and drifting snow all the way.”

Gage should have given them a medal for their marathon spy mission when they returned to Boston. He received good intelligence from his officers. They showed courage and endurance under tough conditions. In March he even sent the pair on a sequel to gather intelligence on a town called Concord. We know what happened next. On April 19, 1775, the first shots of the Revolutionary War were fired at Lexington, followed by the fight at Concord Bridge.

We celebrate Patriots’ Day to commemorate the battles of Lexington and Concord. The Boston Marathon is run on that day. Hopefully not in the snow that Brown and De Berniere endured.
Patriots’ Day can also be a celebration of the peaceful relations between Britain and the U.S. after many years of war. The building of that peace was a marathon in itself, stretching over many decades and treaties. That is something we should be proud of, and we can hope it stands as a symbol of peace for all nations.

If you are running the Marathon, or exercising on your own, you can use the Charity Miles app to raise funds for the World Food Program or Save the Children. This will help bring food and comfort to children who are living in war zones. The Boston Marathon is a great race and can rally support for this and many other social causes. Happy Patriots’ Day!

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171713
The Israeli Elections and What We Can Learn from History

 

The results of the Israeli elections that took place on April 9 have shown that history can be a good guide for assessing political processes, but not necessarily for predicting how events will unfold.

On two previous occasions, a center-left coalition was able to defeat the center-right in Israeli elections. 

In 1992, the Labour Party, headed by former Chief of Staff of the Israeli Defence Forces (IDF), Itzhak Rabin, managed to defeat the governing Likud Party, headed by Itzhak Shamir. 

In 1999, another former Chief of Staff of the IDF, Ehud Barak, heading a political alignment centered on the Labour Party, secured a comfortable election victory against the incumbent Prime Minister, Benjamin Netanyahu. 

Many in Israel thought that these two historical precedents might be repeated in the elections Tuesday. 

The logic of the argument ran like this: For any political challenge mounted by the center and center-left against the governing Likud Party to succeed, a former military leader would have to lead it. In a country beset from its inception by security-related threats, the aura of a distinguished military career could be a vote-winner. 

The above could be proved empirically. After all, both in 1992 and in 1999, the governing Likud Party lost to two former generals, heading a center-left coalition.

However, in 2019, a newly-created, centrist political grouping, Blue and White, led by a former Chief of Staff of the IDF, Benny Gantz, and a triumvirate of leading figures, two of them also former Chiefs of Staff of the IDF, Moshe “Bogy” Yaalon and Gabi Ashkenazi, was defeated at the polls by Benjamin Netanyahu and his Likud Party.

Why?

To answer this question we would need to stress, first and foremost, the singular circumstances surrounding each electoral event. 

For instance, in 1992, Rabin was thought to be a very experienced politician, having already served as Prime Minister between 1974 and 1977, and as Defence Minister between 1984 and 1990. In 2019, Gantz had no political credentials at all. He had served in no ministerial post, nor had he even been a Member of the Knesset (the Israeli parliament).

Further, whereas Rabin in 1992 faced a serving prime minister, Shamir, whom many considered uncharismatic, Gantz in 2019 wanted to unseat Netanyahu, one of the most charismatic leaders Israel has ever known. Shamir might have been respected by his followers; Netanyahu was adored by them.

In 1999, Barak, the most decorated military leader in Israel’s history, wanted to unseat a young prime minister, who, after having served for three years as prime minister, was seen as inexperienced and rather unsuccessful: Netanyahu. 

Certainly, what the examples of Rabin (1992) and Barak (1999) demonstrate is that a rather hawkish platform, coupled with a distinguished military career, can help a center-left candidate in securing political support among center and center-right voters in Israel. 

Beyond that, voting in Israel follows, more often than not, a deep-seated sociological trend. Most Israeli Jews of European descent tend to vote for center and center-left parties; most Israeli Jews of North-African and Asian descent are inclined to vote for center-right parties. The more affluent an area in Israel, the more its residents tend to vote for a center or center-left candidate. Certain cities in Israel are identified with either the center-left or the center-right: thus in these recent elections, cities like Tel Aviv voted overwhelmingly for opposition parties, whereas Jerusalem voted mostly for center-right and religious parties.

Therefore, apart from the personal and political characteristics of the leaders involved, and the concrete circumstances in which they have operated, to understand the Israeli electoral process one has to delve more deeply into Israeli society and the way it has evolved.

In Israel, one can learn from history by assessing these more profound sociological processes, while being careful not to reach unequivocal conclusions from similar events that have taken place in Israel’s history. 

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171692
Roundup Top 10!  

 

 

The Man Who Saw Trump Coming A Century Ago

by Ann Jones

A Reader’s Guide for the Distraught.

 

2 Minutes and Counting

by Oliver Stone and Peter Kuznick

Crises that seemed contained not long ago have now spiraled out of control—and the prospects for resolving them peacefully look depressingly bleak.

 

 

Harvard's Communist Uprising, 50 Years Later

by Daniel Pipes

That takeover and bust culminated my political education.

 

 

‘Not a racist bone in his body’: The origins of the default defense against racism

by Christopher Petrella and Justin Gomer

The rise of the colorblind ideology that prevents us from addressing racism.

 

 

What Donald Trump Doesn’t Get About George Washington

by Peter Canellos

“If he was smart, he would’ve put his name on it. You’ve got to put your name on stuff or no one remembers you.”

 

 

Anti-vaxxers are comparing themselves to Holocaust victims — who relied on vaccines to survive

by Helene Sinnreich

The comparison is offensive. It’s also historically wrong.

 

 

People Used to Hate the Electoral College for Very Different Reasons

by Justin Fox

A half-century ago, the House voted to replace the Electoral College with a direct vote and the Senate came close. The arguments made then are enlightening.

 

 

How Trump finally turned Republicans against McCarthyism

by Jonathan Zimmerman

After nearly 70 years, Republicans have stopped defending Joe McCarthy.

 

 

The End of the American Century

by George Packer

What the life of Richard Holbrooke tells us about the decay of Pax Americana.

 

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171712
Why the Second Summit Between Donald Trump and Kim Jong Un Failed

 

The second summit between U.S. President Donald Trump and North Korean leader Kim Jong Un in Hanoi failed due to a relatively unknown quantity: North Korea’s domestic politics.

 

More than sixty years after the armistice that halted the Korean War, the U.S. and North Korea seemed closer to peace than ever before. As the cases of post-World War Two Germany and Japan, and of communist Vietnam, demonstrate, the U.S. is no stranger to making former enemies into friends. However, few have asked how this thaw in hostilities affects North Korea’s messaging to its own people. It is too politically risky for Kim Jong Un’s regime to have friendly relations with Washington.

 

As an autocratic regime, North Korea remains politically stable by limiting the flow of information into and out of the country. This information blockade has allowed the Kim family regime to stay in power longer than the Soviet Union ever existed. North Korean propaganda has strategically positioned the “U.S. imperialist bastards” as the forever enemy of the Korean people since 1950. Propaganda posters feature North Korean missiles hitting the U.S. Capitol and hook-nosed U.S. soldiers brutally massacring Korean peasants. There is a museum in Sinchon, North Korea, dedicated to U.S. atrocities during the Korean War. Despite the fact that many of these atrocities are made up by the North Korean propaganda apparatus, schoolchildren take regular trips to this museum for political education purposes. At festivals, shooting games feature portraits of U.S. soldiers as targets, and children practice bayonetting U.S. soldiers at recess.

 

The U.S. boogeyman serves as a mobilizing force for the Kim family regime and inspires the North Korean people to endure harsh living conditions for the good of the nation. The North Korean people boast about their nuclear program and their ability to defend themselves against the much stronger and larger U.S. military. As North Korean propaganda explains, once the U.S. leaves the Korean peninsula, the golden era of Korea will begin. Peaceful reunification with the South and a strong economy await those who now sacrifice for the fatherland.

 

However, can the Kim family regime continue its brutal ways without the U.S. boogeyman? As the 19th-century military theorist Carl von Clausewitz explains, “primordial violence, hatred, and enmity” are a “blind natural force” in conflict and primarily motivate the people. Take this hatred away and what does a militaristic state such as North Korea have left to mobilize its people? The manufactured fear of a U.S. invasion has left North Korea in a near-warlike state for decades and permitted the hereditary dictatorship of the Kim family. This siege mentality and anti-American sentiment will not vanish once Trump and Kim sign communiqués. It is too politically dangerous for the North Korean leadership.

 

With more peaceful relations between Washington and Pyongyang, a lifting of sanctions and an increase in living standards for the North Korean people would likely occur.  This would most likely boost domestic support for the Kim family regime. However, the information blockade and authoritarianism of North Korea’s political system would continue. In the future, who does Pyongyang then blame for its troubles?

 

Recently, there have been signs in Pyongyang that anti-American propaganda is fading away. Much of this seems to be a spectacle for foreign journalists and tourists visiting the North Korean capital city. The regime cannot wholly remove anti-Americanism from the national consciousness without the fear of domestic instability. Once the U.S. boogeyman is gone, the people’s passions and grievances will turn toward the North Korean government that has kept its people impoverished and literally in the dark for the last several decades. The North Korean leadership cannot have that.

 

While friendlier U.S.-North Korea relations are less dangerous for the whole world, neither the autocratic nature of the Kim family regime nor its many human rights abuses should be forgotten. As long as the Kim family regime remains in power, anti-Americanism and human rights violations will continue north of the DMZ. It is the only path North Korea’s leadership knows and can permit. Rapprochement with the United States is too politically dangerous for Kim Jong Un’s regime.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171660
The Long History of Anti-Immigration Legislation and "Crimes Involving Moral Turpitude"

 

In light of the Supreme Court’s recent ruling on the Nielsen v. Preap case, a return to the history behind “crimes involving moral turpitude” and the concept’s unique relationship with immigration exclusion and deportation is useful. As many have noted, the Court’s ruling that immigrants who committed crimes and served their sentences (in some cases years or decades ago) can still be detained for deportation without bond hearings raises questions about the constitutionality of indefinite detention. But, as Justice Ruth Bader Ginsburg noted, the ruling also concerns the lingering and vague definition of “crimes involving moral turpitude” and its role in immigration.

 

In her dissenting opinion, Ginsburg referenced the dangers of interpreting crimes involving moral turpitude (or CIMT) as a means to curtail immigration. Under the Illegal Immigration Reform and Immigrant Responsibility Act of 1996, CIMT is a wide-ranging category of activities including serious crimes such as rape, incest, murder, and larceny, but also more minor offenses like petty theft or, as Ginsburg stated, “illegally downloading music or possessing stolen bus transfers.” 

 

While Ginsburg used modern examples of small crimes in her dissent, “moral turpitude” has been a staple in immigration law for over a century and has rarely been questioned as a useful tool for exclusion and deportation. In fact, the concept of “moral turpitude” and the litany of crimes that are evidence of an individual’s lack of “good character” (a requirement for naturalization rooted in the Naturalization Act of 1790) has historically applied primarily to immigration law. When an immigrant commits and serves time for a crime, immigration officials and courts can interpret that crime as one of “moral turpitude,” which indicates depravity, immorality, recklessness, or maliciousness on behalf of the perpetrator. 

 

The idea of using morality to target specific migrant groups first appeared in formal immigration policy in 1875 when Congress used its plenary powers to create an immigration law to exclude Chinese women suspected of being “undesirable” or engaging in prostitution—a response to a growing wave of anti-Chinese sentiment. “Moral turpitude” as an explicit phrase, however, made its first appearance in the Immigration Act of 1891. By the late nineteenth century, “new” immigrants from southern, eastern, and central Europe (as well as Mexico and the Caribbean) were beginning to arrive in larger numbers, as they fled political, social, and economic upheaval in their homelands and sought job opportunities in the industrializing United States. Many Americans, however, were alarmed by the arrival of a “horde” of immigrants who were not white Anglo-Saxon Protestants. Politicians and nascent anti-immigrant associations argued that the new immigrants were prone to criminal behavior and liable to become public charges. In response, the 1891 Act listed classes of migrants who were unfit to naturalize and therefore unfit to enter or remain in the United States. The list included “idiots,” “insane persons,” the diseased, “paupers,” polygamists, and those who had been convicted of a felony, misdemeanor, “or other infamous crime or misdemeanor involving moral turpitude…”

 

Congress continued to pass laws that solidified the idea that immigrants were held to a higher degree of morality than most American citizens. In response to more radical political and social movements including socialism, anarchism, and labor organizing during the early twentieth century, the Immigration Act of 1907 expanded the list of offenses that qualified for exclusion. The 1907 Act banned “persons who have been convicted of or admit having committed a felony or other crime or misdemeanor involving moral turpitude; polygamists, or persons who admit their belief in the practice of polygamy, anarchists, or persons who believe in or advocate the overthrow by force or violence of the Government of the United States, or of all government, or of all forms of law, or the assassination of public officials; persons coming for immoral purposes…” 

 

The Immigration Act of 1917 further consolidated the groups of “undesirables,” but added specific references to “constitutional psychopathic inferiority” and “abnormal sexual instincts” as reasons for exclusion as well as deportation. The 1917 Act allowed for immigration inspectors and other immigration officials to deny entrance to and call for deportation of confirmed and suspected homosexuals who fell under these categories at the time or committed sodomy, a “crime of moral turpitude.” Throughout the early twentieth century, “undesirability” and “crimes involving moral turpitude” reinforced one another and served as a means to expand immigration exclusion and deportation at a time of heightened nativism and xenophobia. 

 

On June 25, 1952, President Harry S. Truman issued his veto of House Bill 5678 (or the McCarran-Walter Act), a proposal to amend the United States’ immigration policies. Truman not only insisted that the bill did little to address the discriminatory quotas targeting migrants who did not hail from Western European nations, but he also argued that it made it easier to deport immigrants who might be an asset to the United States during the Cold War rather than a threat or liability. Under HB5678, refugees fleeing the Soviet Union or those already living within the United States could be excluded or deported if “convicted of a crime involving moral turpitude” beyond a “purely political offense.” 

 

The bill’s provision for excluding and deporting immigrants based upon their ties to subversive practices and organizations is not surprising considering Cold War tensions. However, the bill also made “crimes involving moral turpitude” usefully vague during a time when many politicians and government officials argued that the U.S. was under constant threat from Soviet influence. But Truman believed that America’s reputation as a beacon of freedom for the tired, poor huddled masses would be challenged by an unnecessarily harsh immigration policy. The bill was poised to criminalize immigrants whom the U.S. should be sheltering rather than excluding. 

 

“We have adequate and fair provisions in our present law to protect us against the entry of criminals,” Truman wrote in his veto memo. “The changes made by the bill in those provisions would result in empowering minor immigration and consular officials to act as prosecutor, judge and jury in determining whether acts constituting a crime have been committed.” With such leeway in interpreting crimes of moral turpitude, the power to discriminate against immigrants while denying their rights could fall into misguided hands.

 

Despite Truman’s objections, Congress overrode the President’s veto and HB5678 eventually became the Immigration and Nationality Act of 1952. Under the 1952 Act, an immigrant who committed a “crime involving moral turpitude” within five years of their admission to the U.S. was deportable (those who committed two or more CIMTs were deportable regardless of their admission date). The Immigration Act of 1996, however, added the ambiguous language on when immigrants can be detained after being released from custody and also allowed local law enforcement agencies, state-level officials, and immigration judges to interpret CIMTs broadly.

 

Truman’s warnings against criminalizing immigrants and allowing crimes involving moral turpitude to be interpreted widely ring true today under President Trump’s administration. Moral turpitude is a phrase that has a long legacy of being used for exclusionary and discriminatory practices in deciding who is and is not fit to be an American. Perhaps it is time for a more careful examination of this component of American immigration policy in light of the questions over detention and due process.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171665
A Response to Rebecca Spang's "MMT and Why Historians Need to Reclaim Studying Money"

 

 

Historian Rebecca Spang’s latest History News Network piece on MMT and history is both timely and thought-provoking. In addition to its biting critique of economic orthodoxy and other valuable insights, the essay sets into relief a productive ontological debate about money and its historical manifestations. Part of the present breakdown of the neoliberal consensus, the insurgent popularity of MMT in contemporary discourse has enlivened conversations about the nature of money and its role in shaping social life. As Spang rightly claims, this discourse requires historical context. As such, I welcome and applaud Spang’s intervention. However, I also wish to underscore some crucial differences between Spang’s vital work (particularly on the French Revolution and the rhetoric of inflation) and the historical work being done within the MMT movement.

 

In her concluding paragraph, Spang spells out how in her opinion, history proves the relevance of MMT to today’s politics. She writes: “MMT, along with the euro crisis and awareness of austerity’s social effects, has done much to open monetary and fiscal debates to wider audiences. Simply recognizing that money is political and historical (central, as Harvard Law Professor Christine Desan likes to say, to how a polity constitutes itself) is a difficult breakthrough for most people. On the other hand, seeing money in this way doesn’t—in a fractured polity characterized by demagoguery and high levels of inequality—make policy any easier to write or implement.” The opening of this paragraph is spot on, especially as Spang connects MMT with Desan’s constitutional history of money, a history that insists upon a legal foundation for monetary relations. 

 

(Shameless plug: we at the Modern Money Network (MMN) created an awesome episode for the Money on the Left podcast with Desan last year.)

 

Her concluding paragraph, however, also reveals a difference between her and many MMTers. More specifically, many following MMT’s insurgence in D.C. disagree with her conclusion that MMT doesn’t make policy “any easier to write or implement,” given the fractured, unequal and demagogic nature of this political moment. This is the case for a few reasons. One is that a central theme of MMT’s political and financial project is the introduction of a non-zero-sum rhetorical framework for legislative and social finance. As noted MMT economist Stephanie Kelton has repeatedly argued, MMT frees the Left from relying on rich people’s money. Instead, she argues that the left should mount its case for confiscatory taxation on moral, rather than budgetary, grounds. As well, MMT can change the perception that “taxpayer money” (often code for white people’s money) is what funds welfare and jobs programs for the disenfranchised, as MMT lawyer and legal scholar Raúl Carrillo has written. In insisting that fiscal allocations be labeled as “public money,” Carrillo and others challenge flawed neoliberal notions of money as not only private and scarce but also inherently white. All in all, this makes policy easier to imagine and implement because we can focus on what needs to be done rather than how it would be funded and who would oppose it. Instead of the zero-sum contests that presume “there’s no free lunch,” MMT says free lunches for everyone, as long as the food is producible!

 

Perhaps more important, Spang’s argument about the current political fracture in America betrays tacit assumptions that MMT's understanding of money seeks to problematize. In her excellent book, she argues that money represents a sort of performance of our ongoing social bonds. À la Judith Butler, Spang writes that money is “not fixed or made once and for all but something that exists thanks only to its repeated enactment (not one interpellation but a whole series of them).” Furthermore, she claims that “monetary transactions are therefore characterized by what we might call ‘involuntary trust’—a trust itself resulting from involuntary, even unconscious, memory.” (6) Putting a Butlerian twist on Enlightenment social contract theory, Spang defines money as a process of ongoing consent between issuers and users, as well as buyers and sellers, one which is malleable and contestable. 

 

I take a different approach and think some other MMTers do too. From an MMT perspective, money is an asymmetrical and ongoing legal obligation between government and society and not “involuntary trust” among creditors and debtors. Take, for example, Scott Ferguson’s 2018 book Declarations of Dependence: Money, Aesthetics, and the Politics of Care (Ferguson, along with Carrillo, are on the board of the Modern Money Network). In the book, he argues for money’s inalienable public nature. “A political relationship between centralized governments and people, money, according to MMT,” Ferguson writes, “is an inalienable utility ever capable of expansion and reconstruction. Money obliges the public to a political center, socializing productive and distributive processes rather than organizing them locally and privately.” (184) Rather than being an ongoing form of trust in a credit relation, as Spang argues, Ferguson claims that money is always a centralized political mechanism for provisioning asymmetrical and reciprocal public obligations. In other words, money actualizes the polity’s indebtedness to its governing authorities as well as those authorities’ indebtedness to their polity. 

 

Instead of imagining a polity as always-already connected in its participation in the public money relationship, Spang’s conception of money as ongoing consent leaves politics attempting to unite, through consent, a polity imagined as unrelated or fractured. Throughout history, such projects sadly often take the form of imagining some other universal, like culture, race, nationality, or sometimes (if we are lucky) liberal consensus, under which those who are fractured can become one again. As Spang’s book argues about the rise of Napoleon, that sort of social wrangling often takes on an authoritarian color.

 

However, when one brings together these two seemingly opposed ontologies of money, one begins to see a new place for consent in money. Spang’s work traces the crises of political contestation and authority during the French Revolution among both a government and a polity sharing in economic and legal obligations. It is for precisely this reason that her documentation of the rise and fall of the Assignats (the revolutionary paper currency) demonstrates the problem with her own Liberal assertion of consent as the initial basis of money. Instead of the crises being caused by political fracture, they were caused by the assumption that consent would be enough to cement the revolution. Therefore, in not recognizing the tax obligation as a prime factor in the maintenance of money’s social role, revolutionary France created precisely the fracture Spang laments today. 

 

This insight allows us to bring together these two ontologies of money to color the contemporary rise of MMT in a particularly interesting light. MMT introduces democracy (public consent) into the budgetary process. Rather than rely on economists who used to tell us that we can’t afford to care for people, MMT gives us the precise tools to do so. Therefore, our initial relation through money allows us to mobilize our consent for progressive or leftist ends.

 

For these reasons, I laud Spang’s call to develop and complicate MMT’s approach to history. Still, I argue that it is equally important to problematize historians’ unquestioned ontological assumptions about money and legal mediation through the MMT framework. If historians take this up in earnest, we might finally be able to overcome our austere imagination of money’s role in history.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171667
The Golan Heights: Its History and Significance Today

A UN-controlled border crossing point between Syria and Israel at the Golan Heights, Wikipedia Commons

 

Donald Trump’s decision to recognise Israel’s sovereignty over the Golan Heights might have serious repercussions for the future, but to understand exactly why, and how, it is necessary first to understand the past.

                     

The Golan Heights is an elevated plateau stretching across some 932 square miles that borders Lebanon, Syria, and Jordan. Several Jewish communities unsuccessfully tried to settle in the southern parts of the region during the late nineteenth century, and the pre-independence Zionist movement claimed the area at the post-World War I Peace Conference of 1919. Rather than emphasising a historical connection to the area, the movement’s leaders chose to stress its strategic importance as a barrier against invading forces from the east (citing Bedouin tribes as the main threat) and as a site of reliable irrigation sources for a potential Zionist polity. With the creation of the mandate system in the early 1920s, the area was placed under French custodianship and became part of Syria upon that state’s establishment in 1946.

  

Following the 1948 War, Israel and Syria battled over border delineation, diversion schemes for the Jordan River, and control of the Sea of Galilee. Artillery fire, infantry raids, and aerial dogfights became routine along the border during the 1960s, causing considerable casualties and widespread destruction of property. The Golan became heavily militarised and was riddled with bunkers, minefields, and outposts. At the same time, it was also a bustling province with a population that in 1967 reached almost 150,000 civilians who lived in 270 villages and towns. The population’s majority (85%) were Arab Sunni Muslims, and the rest belonged to a variety of ethnic groups such as Circassians, Turkmen, Maronite Christians, Bedouin, Alawites, Isma’ilis, and Druze.

 

Despite the area’s civilian life, the Syrian Plateau came to represent a threat to Israeli Jews. Significant pressure came from residents of the Kibbutzim (small agricultural communities) in northern Israel, which at the time held significant political power. They demanded that the “Syrian plateau”, as it was colloquially known then, be pacified. Syria played a minor part during the June 1967 war, but Israeli leaders decided that Syria’s involvement in the run-up to the war provided a pretext to occupy the Golan. The Golan’s occupation thus became part of an expansion strategy propelled by the perception (real or constructed) that Syria would continue to use the Golan to launch artillery attacks on Israeli settlements. This fear-fuelled geopolitical “Israelification” of the Golan was enacted through three main tactics.

 

First, the Israeli military displaced the Syrian population who remained in the territory after the fighting and prevented the return of civilians who fled during the war. Only 6,500 people from the Druze sect, which was historically considered “loyal” to Israel, were permitted to remain. The rest were either forced to leave or were not allowed to return to their homes. Second, the Israeli military, with supervision by archaeologists, architects, and rural planners, systematically demolished villages, farms, and houses. All structures were to be demolished except for those with architectural, archaeological or aesthetic significance.

 

The population’s displacement and remaking of the landscape through demolition enabled the third tactic: recreating the Golan as a tourist haven with a cool climate, open spaces, fertile soil, wineries, and attractions such as Mount Hermon, Israel’s only ski resort. Most Israelis do not live in the Golan (the total Jewish population of the region is about 20,000) but rather visit it: between 1.5 and 3 million Israelis visit every year. In the words of Israeli journalist Chemi Shalev, during the years in which the Golan Heights has been in Israel’s possession, it has become “more Israeli than Israel itself. An ideal version as we would like it to be: without Palestinians but with marvellous views, delicious wines, sympathetic residents, horses, crocodiles and ski sites.”

 

In 1981 Israel extended its civilian law to the Golan Heights, a move regarded as de facto annexation. The decision was part of the attempt to secure Israel’s territorial conquest of June 1967 at a time when then prime minister Menachem Begin was about to give back the Sinai Peninsula to Egypt as part of the peace agreement between the two countries. The extension of civilian law to the Golan was a political decision made to garner support for Begin from the Israeli right, who fumed over the withdrawal from Sinai.

 

Similarly, Trump’s decision to recognize Israeli sovereignty over the Golan is an attempt to bolster Benjamin Netanyahu, who is facing an uphill electoral battle in the coming elections in April. Both in 1981 and now, the pledge to the Golan was intricately connected to popularity contests and not to Israeli leaders’ overall confidence about the unquestioned “Israeliness” of the Golan. Indeed, Israeli policymakers have refrained from admitting that the Golan was officially annexed and, from the 1990s up to the outbreak of civil war in Syria in 2011, probed the possibility of giving back the territory. Thus, when Prime Minister Netanyahu lauds President Trump for recognising Israel’s sovereignty over the Golan, it goes against Israel’s strategy of leaving the question of the Golan’s formal status shrouded in opacity.

 

Of course, the general sentiment that the Golan is supposedly part of Israel predates Trump’s recognition. Jon Stewart’s The Daily Show is a case in point of this ignorance. A segment covering Israel’s 2014 military campaign in Gaza presented a map which did not encompass the Palestinian West Bank and Gaza Strip but depicted the Golan Heights as an integral part of Israel. The fact that Stewart’s team chose this map is, however, hardly coincidental, since many existing maps have erased the line separating the Golan. Can we blame Donald Trump, if even Jon Stewart “recognises” Israel’s control of the Golan?

 

Nonetheless, Trump’s recognition should have generated more attention than it got, especially because his earlier decision to recognise Jerusalem as Israel’s capital triggered violent protests and worldwide condemnations. Unlike Jerusalem, the connection of the Golan to Zionism’s historical and theological narratives is peripheral at best and never played a dominant part in the reasons for controlling it. Besides, the area was never part of Mandatory Palestine and was internationally recognised, including by Israel, as part of Syria. Much more than Jerusalem, the Golan represents a clear case of a territory belonging to one sovereign state that is illegally occupied by another in clear violation of international law.

 

So, what does the relative quiet with which Trump’s recognition has been accepted demonstrate? The silence can be attributed to the announcement’s timing, as it coincided with (another) violent escalation between Israel and Gaza and, perhaps more critically, the completion of the headline-grabbing Mueller Report. The brazen flouting of international law also utilised the tragedy of the Syrian civil war as cover. There is almost no one left in Syria to challenge Israel’s claim to the Golan (although we can be sure that Putin and Iran weren’t too pleased). At the same time, any Israeli policymaker who openly supports an attempt to cede the Golan risks being branded a lunatic. Fear mongering has always been a potent political currency in Israel, and the notion of ISIS or Hezbollah taking over in Israel’s absence has significant public gravity.

 

But as much as Israelis would like to claim that the Golan is needed for its strategic value or for its bucolic scenery, it cannot change the fact that there is no legal case for annexing the territory. Security threats cannot serve as a legitimate reason for annexing the territory of another nation. The position of some legal experts that the territory rightfully belongs to Israel since it was occupied as an act of self-defence stands on shaky ground. It was Israel that invaded Syria, which, until the moment of the incursion, did not play a significant part in the actual fighting of the June 1967 war. Even more so, a new historiography of the June war suggests that Israel’s decision to take over the Golan was part of a strategy developed years before the actual occupation. Trump’s decision is simply another validation of one of the most successful land grabs of the twentieth century, one based on displacement, demolition, and the paradoxical transformation of the Golan into a peaceful warzone. A piece of “Europeanesque” tranquillity in the heart of the Middle East’s most volatile region.

 

And this brings us to the question of what we can learn from the past for the future of the Israeli-Palestinian conflict. Some have suggested that Trump’s move is, in fact, a way to sweeten the bitter pills that Israel will have to swallow once his ostensible “deal of the century” is published. Trump, so the rumor goes, will force Netanyahu, or whoever will be Israel’s Prime Minister, to accept a Palestinian state, to relinquish land, and to evacuate settlers. But so far Trump, like the Golan, has proved to be more Israeli than Israel itself. The re-imposing of sanctions on Iran and the recognition of Jerusalem point to his complete lack of reservations about realising Israel’s wildest dream, which now might become even wilder. This is especially true since the Golan is already developing into a generic model for what the West Bank should “become.” Put differently, the Israelification of the Golan, which entailed massive population displacement, spatial demolition, and European rebranding, has now become a battle-tested template for how annexation could look in the West Bank.

 

Yet we should also note that Trump’s blatant flouting of international law in Iran, Jerusalem, and now the Golan is working at the moment, but it might also indicate America’s weakening position, as it can no longer claim to lead an international community that opposes the notion that military might should translate into political rights. When Britain and France tried to take over the Suez Canal in 1956, a scheme that also involved Israel’s active participation and occupation of the Sinai Peninsula, they were forced by a new world order to back down, ending their time as colonial empires. While it is hard to draw immediate comparisons, Trump’s move might put America on the same path. In other words, Trump’s Golan might eventually become America’s Suez.

Senator Chuck Schumer says corporations used to care; here's how historians responded Allen Mikaelian is a DC-based editor and writer. He received his history PhD from American University and served as editor of the American Historical Association’s magazine, Perspectives on History. The Political Uses of the Past Project collects and checks statements by elected and appointed officials. This is the second installment of what will hopefully become a regular feature of the project. Read more about the project here.

 

Sen. Chuck Schumer: American corporations used to believe they "had a duty not just to their shareholders but to their workers, to their communities, and to their country"

When more than 80 percent of corporate profits are going to stock buybacks and dividends, something is really wrong in the state of corporate America and the state of our economy. It wasn't always this way. From the mid-20th century up until the seventies and even into the eighties, American corporations shared a belief that they had a duty not just to their shareholders but to their workers, to their communities, and to their country, which helped them grow and prosper, along with our schools, our roads, and everything else. That created an extremely prosperous America for corporate America but also for American workers in the broad middle of this country. But over the past several decades, workers' rights have been diminished, and corporate boardrooms have been obsessed, slavishly, to shareholder earnings. —Sen. Charles Schumer, Stock Buybacks, Senate Floor, February 4, 2019

Historians say...

Bottom Line: Most historians who responded agree that Senator Schumer is on solid ground, but their caveats and the statements of the historians who strongly disagree should not be ignored, especially if we want to use this history to help formulate policy. Scroll down for links to the historians' full responses.

Senator Chuck Schumer delivered the above statement while discussing the Republican tax cuts; he charged that corporations are not using their tax savings to create jobs or pay higher wages, but are instead buying up their own shares. This can drive up stock prices by creating scarcity, and shareholders naturally love it. But the GOP's tax cuts were granted, we were told, to create jobs, not merely to further enrich investors.

Schumer proposes legislation to force corporations to do good—investing “in workers and communities first”—before they can buy their own stock. And to set the stage for his proposal, he points to a past in which American corporations had a heart. Maybe that history makes his idea seem not so radical. Or it raises hopes that maybe we don’t have to be in constant battle with corporate America. That maybe our expectations for more socially responsible corporations aren’t so unreasonable. Or perhaps even that the CEOs want to do the right thing but have to be legislated into it.

Regardless of why Schumer decided this piece of business history was a “useful past,” most of the historians who answered our request for input thought Schumer was on solid ground. However, we should not overlook their caveats or the dissents of historians who disagreed with Schumer’s view of history; these are perhaps more deserving of policymakers’ attention, if they really want to learn from the past.

Summary

Several historians responded by mentioning the stakeholder model that captured at least the imaginations, if not the actions, of many mid-twentieth century executives: “Two competing models of corporate ownership through stocks were evident in the twentieth century: shareholder and stakeholder. The former model asserts that the leaders of corporations must make decisions based solely on the best interests of people who actually own stocks, while the latter maintains that other interested parties like workers and their communities have an interest in corporate actions equal to those of shareholders” (Jason Russell). “Earlier in the twentieth century, some management scholars such as Peter Drucker argued that corporations had different stakeholders, including the community, employees, and consumers” (Gavin Benke). “They've always cared about the bottom line, but back then felt compelled to consider the needs of multiple ‘stakeholders’” (David B. Sicilia).

This was, of course, easier to do when the economy was booming. The strength of unions was also a factor—they were harder to ignore (Jonathan Bean)—and higher taxes made large investments in infrastructure possible (Rosemary Feurer). All this started to change by the 1970s at the latest (Benjamin Waterhouse; Jason Russell pegs it to the 1960s). And with this change came a large-scale shift in thinking.

Milton Friedman argued in the New York Times in 1970 that a corporation’s sole responsibility is to “increase its profits,” giving permission and intellectual heft to executives who, in the midst of globalization and declining profits, wished to focus on shareholders rather than stakeholders. Friedman was objecting “to a very real sense, both within and beyond business leadership circles, that corporations had a clear social responsibility” (Benjamin Waterhouse), but Friedman did not limit his thinking to profits and business culture: “He argued that ‘the cloak of social responsibility ... does clearly harm the foundations of a free society’” (David Hochfelder). And even further, he accused executives who took up social responsibility of “preaching pure and unadulterated socialism” and being “unwitting puppets” of the collectivist left.

Schumer may be on solid ground, but if we pay close attention to these historians’ caveats and to the historians who think he is dreaming of a “golden age” (Jonathan Bean), we might ask whether the CEOs who preached social responsibility were leading the charge or merely reflecting what the public expected and what legislation demanded. “Corporations thought in wider terms about stakeholders because regulations compelled them to do so” (Jason Russell). And insofar as some corporations “may have felt a sense of civic duty” and others contributed to the public good, “they did so to comply with the much more progressive tax code at the time” (David B. Sicilia). Schumer is right to offer legislation at the same time he speaks of a now-distant past when corporations did the right thing, but his case would be stronger if he made note of how corporate virtue had to be cajoled by legislation.

Schumer also leaves out a key aspect of the history of corporate responsibility, one that most of the historians here take up. Corporations had to contend with strong unions. This helped reinforce stakeholder responsibility when the morality of CEOs failed. The demise of unions was no accident, and it was not coincidental to the demise of corporate responsibility. While even the historians who agree with Schumer mention unions, the historians who disagree move unions to the center of the discussion.

Specifically, the rise of so-called right-to-work states lured corporations to the South and the Sunbelt, where their stakeholder responsibilities were far lighter (Jonathan Bean). Rosemary Feurer has questions about this: “Ask textile communities in the North how much corporations cared about the devastating effect of relocating. Ask African-Americans in Detroit how many of their jobs, newly won, were lost to corporate decisions of automakers to relocate jobs to the South.” She also has questions for Schumer, questions that could be turned into policy, if we are serious about returning to a time when corporate responsibility received at least lip service: “And ask Schumer what the Democratic Party did to stop this in this time. Instead, the party at the time sought to grow the economy without any intervention in this dynamic, despite the attempt of unions to gain some control over these relocations.”

In the past, corporate responsibility had much to do with Congress building strong guardrails. But it did not involve Congress alone. It was not merely big government legislating big business. It was also citizens and groups like unions that forced the issue. Schumer’s job will likely be easier if his attempt to blunt the harder edges of capitalism and return to stakeholder values also protects these groups of citizens, the actual stakeholders themselves.

Browse and download citations recommended by the historians below from our Zotero library, or try our in-browser library.

 

Jonathan Bean, Professor of Business History, Southern Illinois University

Rating: 1.5

This is the myth of a golden age of "corporate liberalism." While it is true that during that time period (circa 1945-1970s), CEOs were more likely to espouse a belief in "stakeholders" (beyond shareholders), it was mostly public relations. Read more...

Gavin Benke, Boston University, author of Risk and Ruin: Enron and the Culture of American Capitalism

Rating: 3.5

Earlier in the twentieth century, some management scholars such as Peter Drucker argued that corporations had different stakeholders, including the community, employees, and consumers. However, it would be wrong to look back on the mid-twentieth century as a period without discord in American business. Read more...

Rosemary Feurer, History Department, Northern Illinois University. Coauthor, with Chad Pearson, Against Labor: How US Employers Organized to Defeat Unions

Rating: 1.7

The main explanation Schumer gives for the postwar period is a myth. He seems to suggest that there was less concern for profits in this period, that CEOs cared more about their workers and community. He makes it seem a moral or personal decision, rather than acknowledging the key factor—unionization tamed some of the rapaciousness of capitalism in this period and created a middle class. Read more...

David Hochfelder, Associate Professor, University at Albany, SUNY, author of The Telegraph in America, 1832-1920

Rating: 3.9

Schumer’s statement is more or less true. Corporations often felt some obligation to their workforces and communities from the late 19th to late 20th centuries. Electric utilities had employee and customer stock ownership plans. Industrial firms provided health clinics, built parks and schools, financed mortgages for workers, etc. Read more...

Jason Russell, PhD, Empire State College—SUNY, author of Making Managers in Canada, 1945–1995: Companies, Community Colleges, and Universities (Routledge, 2018)

Rating: 4.8

One important point is that corporations thought in wider terms about stakeholders because regulations compelled them to do so. For example, laws like the Glass-Steagall Act, the Wagner Act, and the Fair Labor Standards Act established certain parameters for corporations. Read more...

David B. Sicilia, Henry Kaufman Chair of Financial History and Associate Professor, University of Maryland, College Park. Coauthor or coeditor of six books on business and economic history, including Constructing Corporate America: History, Politics, Culture

Rating: 3.6

Sen. Schumer’s comment captures the spirit of an important transformation in the second half of the 20th century but should not be taken too literally. His statement centers on a claim about motive (“a shared belief that they had a duty”) that is difficult to prove. But corporate behavior, especially toward workers and communities, certainly changed when and how Sen. Schumer suggests. Read more...

Benjamin C. Waterhouse, Associate Professor of History, University of North Carolina at Chapel Hill, author of Lobbying America: The Politics of Business from Nixon to NAFTA (2014) and Land of Enterprise: A Business History of the United States (2017)

Rating: 4.6

Broadly speaking, Schumer’s claim reflects the way business historians summarize changes in attitude among corporate managers and leaders. Naturally, it is impossible to say precisely what “corporate leaders” believed at any point, because that group is large and reflects many different opinions. Read more...

James Madison Responds to Sean Wilentz

 

Sean Wilentz, Sidney and Ruth Lapidus Professor of the American Revolutionary Era at Princeton University, recently announced in a New York Times op-ed that he has retracted his earlier opinion on the origin of the Electoral College. In NO PROPERTY IN MAN: Slavery and Antislavery at the Nation’s Founding, published by Harvard University Press in September 2018, Wilentz concluded that “the evidence clearly showed the Electoral College arose from a calculated power play by the slaveholders.” Now Professor Wilentz asserts he was mistaken. “There is a lot wrong with how we choose the president. But the framers did not put it into the Constitution to protect the South.”

 

If I understand Sean Wilentz's new position on the origin of the Electoral College, it, like slavery, was an undemocratic element of the new Constitution endorsed by writers from the North and South who feared slave insurrection, democratic insurgencies like Shays’ Rebellion, and popular government, who represented slave states (there was still slavery in most of the North) or commercial interests tied to the slave trade, and probably got a slaveholder elected President in 1800, but historians shouldn't conclude that they considered that the Electoral College, like the 3/5 clause, the fugitive slave clause, and the ban on banning the slave trade for 20 years, might protect slavery.

 

I don’t consider myself equipped to debate either the earlier or later positions taken by Professor Wilentz, but I thought James Madison might be, so I decided to consult his Notes of the Constitutional Convention.

 

Hugh Williamson, representing North Carolina, seems to have first introduced the idea of an Electoral College during discussion of an Executive on June 2, 1787. On Wednesday, July 25, 1787, the Constitutional Convention debated a series of proposals for selecting a national “Executive.”

 

Oliver Ellsworth of Connecticut moved “that the Executive be appointed by the Legislature." Elbridge Gerry of Massachusetts, who later refused to sign the Constitution, argued that “an election at all by the Natl. Legislature was radically and incurably wrong; and moved that the Executive be appointed by the Governours & Presidents of the States.” James Madison of Virginia noted that “There are objections agst. every mode that has been, or perhaps can be proposed. The election must be made either by some existing authority under the Natil. or State Constitutions — or by some special authority derived from the people — or by the people themselves. — The two Existing authorities under the Natl. Constitution wd be the Legislative & Judiciary.” Madison opposed the judiciary and legislative options as  “liable to insuperable objections.” According to Madison, “The Option before us then lay between an appointment by Electors chosen by the people — and an immediate appointment by the people. He thought the former mode free from many of the objections which had been urged agst. it, and greatly preferable to an appointment by the Natl. Legislature. As the electors would be chosen for the occasion, would meet at once, & proceed immediately to an appointment, there would be very little opportunity for cabal, or corruption.” Ellsworth’s motion that the Executive be chosen by the national legislature was then defeated by 4 to 7 with only New Hampshire, Connecticut, Pennsylvania, and Maryland voting in the affirmative.

 

Charles Pinckney (South Carolina), George Mason (Virginia), and Elbridge Gerry supported a motion to have the Executive selected by the Legislature as long as “no person be eligible for more than 6 years in any twelve years.” Gouverneur Morris of Pennsylvania spoke in opposition, considering “election by the people as the best, by the Legislature as the worst.”

 

The idea of an Electoral College was reintroduced by Pierce Butler, a South Carolina rice planter, one of the largest slaveholders in the United States, and one of slavery’s strongest defenders. Butler also introduced the Fugitive Slave Clause into the Constitution, supported the constitutional provision prohibiting regulation of the slave trade for twenty years, and demanded that the entire slave population of a state be counted for Congressional apportionment.

 

According to Butler, “The two great evils to be avoided are cabal at home, & influence from abroad. It will be difficult to avoid either if the Election be made by the Natl Legislature. On the other hand, the Govt. should not be made so complex & unwieldy as to disgust the States. This would be the case, if the election shd. be referred to the people. He liked best an election by Electors chosen by the Legislatures of the States.”

 

The issue of selecting an Executive was then referred to a special Committee of Eleven, also known as the Brearly Committee. On September 4, the Brearly Committee reported its recommendation that “Each State shall appoint in such manner as its Legislature may direct, a number of electors equal to the whole number of Senators and members of the House of Representatives, to which the State may be entitled in the Legislature.” Gouverneur Morris explained the committee’s reasoning: “No body had appeared to be satisfied with an appointment by the Legislature,” “Many were anxious even for an immediate choice by the people,” and there was “the indispensable necessity of making the Executive independent of the Legislature.” Pierce Butler defended the recommendation, acknowledging that the mode was “not free from objections, but much more so than an election by the Legislature, where as in elective monarchies, cabal faction & violence would be sure to prevail.” The motion was then put on hold while the committee considered objections, not to the selection of the Executive, but to the process for removal.

 

On September 6, the edited Brearly Committee report was brought to the convention again. Alexander Hamilton, who had a strong “dislike of the Scheme of Govt. in General,” announced “he meant to support the plan to be recommended, as better than nothing.” After continued debate and some amendments, the Convention approved the Brearly Committee’s recommendations for the organization of the Executive branch, including the Electoral College, and the finished Constitution, in Hamilton’s estimation “better than nothing,” was submitted to the states for approval.

 

Madison’s notes do not definitively prove either Wilentz’s earlier or his later position on the relationship between support for the Electoral College and defense of slavery. What I find most suggestive in the debate is the role played by Pierce Butler, one of the Convention’s greatest champions of slavery. The Electoral College may not have been expressly designed only to protect African slavery, but, based on Madison’s notes, it was the mode most preferred by pro-slavery forces.

Venezuela and the Birth of the American Empire

 

John Bolton, President Donald Trump’s national security advisor, took to Twitter recently to disparage the regime of Nicolás Maduro. In particular, Mr. Bolton vowed that the detention of opposition leader Juan Guaido’s chief of staff would “not go unanswered.” Roberto Marrero’s arrest was deemed “illegitimate,” and Bolton echoed the president, who has repeatedly warned Venezuela that “all options are on the table” concerning the conflict over that nation’s recent elections. This, the president confirmed, included the possibility of a military intervention. 

 

Maduro’s two biggest backers, China and Russia, have invoked the Monroe Doctrine in order to disparage the US’s efforts to oust Maduro and his United Socialist Party. Ted Galen Carpenter, writing in the National Interest, also invoked the Monroe Doctrine, but in a positive light, arguing that the US needs to invoke the old policy in order to curb Moscow’s foothold in Latin America.

 

With so many politicians and analysts invoking the Monroe Doctrine, you would think at least one would correctly understand the history of this controversial piece of American foreign policy. The inaccuracies permeating the recent analysis suggest Americans need a refresher on the nearly 200-year-old document.

 

First announced in 1823 during the administration of President James Monroe, the so-called Monroe Doctrine declared that the United States intended to protect its “sister republics” in Latin America from further European imperialism, specifically Spanish imperialism. Monroe and Secretary of State John Quincy Adams (the policy’s true mastermind) let Madrid, Lisbon, and Paris know that any return of European rule in Latin America would be viewed by Washington as “the manifestation of an unfriendly disposition towards the United States.” 

 

The policy sounded tough but was essentially toothless. The American Navy in 1823 had just sixteen vessels of war, five of which were deployed in the West Indies. This force could not deter any serious naval armada, so the British Royal Navy, which enjoyed trading relations with several Latin American nations, became the policy’s enforcer. The Monroe Doctrine was as much a British policy as an American one, and for the majority of the nineteenth century it benefited the British Empire more than it benefited the American Republic.

 

That changed in 1895. The reason? A border conflict between Venezuela and British Guiana. By 1895, the US fleet included fifty-five warships in total, with three brand new battleships commissioned that same year. While nowhere near the strength of a first-class European fleet, the US Navy was by that point a force to be reckoned with. The British found that out when London refused to accept international arbitration on the long-simmering dispute over the Schomburgk Line (named for the German-born explorer Robert Schomburgk). The Venezuelan government demanded territory as far east as the Essequibo River. The colonial officials in British Guiana, recognizing that this would strip them of about two-thirds of their territory, countered these claims by demanding 33,000 square miles of Venezuelan territory west of the Schomburgk Line.

 

For nineteen years, between 1876 and 1895, Caracas petitioned the United States to intervene on its behalf against the British. It took a new US Secretary of State, Richard Olney, to finally grant Venezuela’s wishes. Olney sent a letter to Thomas Bayard, the American ambassador to Britain, demanding that London settle the dispute by arbitration. Olney invoked the Monroe Doctrine to declare that the United States, which had “greatly increased in power and resources” since 1823, had an interest in protecting the status quo in the Western Hemisphere. British Prime Minister Lord Salisbury responded by telling Olney that international law did not recognize the Monroe Doctrine.

 

Rather than turn tail and save America from the possibility of fighting the world’s premier navy, President Grover Cleveland, a Democrat and a firm believer in limited government and even more limited US involvement abroad, sent the issue to Congress. Congress met specifically to talk about the formation of a boundary commission. However, behind the scenes, a few American wives of British statesmen (including Mary Chamberlain, wife of Joseph Chamberlain) told their husbands that Congress could declare war. Their urging, along with a growing crisis with the Boer republics in South Africa, convinced London to back down. By October 1899, the issue had been resolved, with the United States declaring that the border should follow the Schomburgk Line.

 

By that point the United States had a full-fledged empire in the Caribbean and Asia. Following the impressive victory against the Spanish in 1898, American troops occupied Cuba, Puerto Rico, Guam, and the Philippines. In the span of four years, America had gone from saber rattling on behalf of arbitration to overseeing what amounted to the British Empire in miniature. In 1904, the Roosevelt Corollary gave serious muscle to the Monroe Doctrine, adding that the US military now had a responsibility to police the Western Hemisphere. President Woodrow Wilson, a hated enemy of Theodore Roosevelt, would nevertheless echo his predecessor’s policy by adding to the Roosevelt Corollary an interest in promoting “good governance” in Latin America.  

 

Under this idea, small contingents of US Marines and sailors occupied Cuba (1906-1909, 1912), Haiti (1915-1934), the Dominican Republic (1916-1924), and Nicaragua (1909, 1912, 1927-1932). American military governments established order, balanced the books, and tried to depoliticize Latin America’s ever-restive militaries. Under President William Howard Taft, the US also promoted dollar diplomacy, whereby US loans were leveraged to promote the rise of pro-American leaders to power in Latin America.

 

None of these developments would have happened had the American government not managed to get the British to deescalate matters in Venezuela in 1895. This legacy is controversial, to say the least. American occupation tended to promote improved public hygiene and free and open elections; Veracruz, Haiti, and Nicaragua all benefited from US intervention. However, Richard Olney’s successors also left behind dictators like Rafael Trujillo and Anastasio Somoza García, and by the time of the Great Depression, the United States had become the “Colossus of the North”— an imperial power which gobbled up Latin American resources and raw materials with a voracious appetite. Detractors of Washington’s current approach to the crisis in Venezuela reference this imperial legacy when they invoke the Monroe Doctrine.

 

It remains to be seen what President Trump’s administration will do in Venezuela. Could it all be bluster, or is there a possibility of another Banana War-style invasion in the vein of Grenada in 1983 or Panama in 1989? Either way, it is clear that the Monroe Doctrine needs to be seriously studied rather than bandied about by anti-American powers and war hawks alike. Simply put, the Monroe Doctrine was a complicated policy that genuinely attempted to protect the Western Hemisphere from Spanish revanchism, and it changed once the United States became an economic and military power.

 

Today, it would appear sound not only to study the history of the Monroe Doctrine, but to study the traditions of realpolitik in general. For the first time in a long time, a non-American power (Russia) has sent troops and matériel into a Latin American country in order to prop up an anti-American government. This directly challenges American hegemony in the region, and the question for the US is this: do you follow the traditions set forth by the Roosevelt Corollary to the Monroe Doctrine, or do you follow a new approach not constrained by historic precedent?

The Propaganda Posters That Won The U.S. Home Front

 

In 1917, as the U.S. ramped up preparations to enter World War I, James Montgomery Flagg created his iconic Uncle Sam poster encouraging American men to join the war cause with the clear message, “I want you for the U.S. Army!” Even though this was not the first instance of propaganda posters being employed on behalf of a war cause, the visual medium proved effective in the military’s recruitment drives, and posters were routinely used to boost morale, encourage camaraderie, and raise esprit de corps. Posters were cheap, easily distributed, and fostered a sense of patriotism and duty. In World War II, the U.S. turned to artists once again in an attempt to influence the public on the home front. Today, these posters offer a glimpse into American society and the efforts to mold public opinion in the country.

 

Rolled out on a massive scale in World War I, posters only grew in popularity as propaganda during World War II. With the surprise attack on Pearl Harbor in 1941, the U.S. began mobilizing once again, and not just militarily. The U.S. government enlisted hundreds of artists across the country to deliver important messages through visual means. These included some relatively famous artists, such as the creator of Aquaman, Paul Norris, whose sketches were noticed by his superiors during his time in the military. The artists’ designs were not just focused on the rank and file of the military, either. The Office of War Information (OWI) believed that the “home front” was just as sensitive to enemy misinformation, and went to work creating a series of posters specifically focused on the population back home as the engine of the war effort in Europe and the Pacific.

 

The posters ranged widely in messaging and design. Even though quite a number of posters in the U.S. carried xenophobic or downright racist messaging and visuals, the majority centered around themes of tradition, patriotism, duty, and honor. This was further expanded on the home front with themes such as conservation, production, work ethic, buying war bonds, tending “victory gardens,” encouraging women in the labor force, and cementing a common enemy in the eyes of the American public.

 

A Common Enemy Emerges

 

 

Several U.S. propaganda posters employed a tactic known as demonization. This involved portraying the enemy as barbaric, aggressive, conniving, or simply evil. Demonization included derogatory name-calling, with terms such as “Japs,” “Huns,” and “Nips.” Several posters in the U.S. tapped into demonization by showcasing the Japanese with grossly exaggerated features and by recycling racist and xenophobic personifications.

 

 

This was often paired with pointed messaging: one anti-Japanese poster portrayed Emperor Hirohito rubbing his hands and saying, “Go ahead, please take day off!” The tactic was clear: motivate the working population at home to avoid taking sick days by stoking fear of an inhuman enemy who might attack the homeland at any moment.

 

Fear was a popular theme for artists, even in service of differing messages. In one poster, a giant Nazi boot is depicted crushing a small church above the words, “We’re fighting to prevent this.” Often, fear was used to encourage the purchase of war bonds: numerous posters portray children wearing gas masks or standing under the shadow of giant swastikas with the clear message, “Buy war bonds to prevent this possible future.”

 

 

Conservation and Production 

 

 

Some posters employed comedy as a way to break through while still tapping into the overarching fear of the enemy. For example, one poster, seemingly intended to encourage carpooling, depicts an outline of Adolf Hitler riding shotgun with a commuter above the message, “When you ride alone, you ride with Hitler.” Others encouraged high production output by likening slacking off to aiding and abetting America’s foreign enemies. Still others were more positive in nature, such as the famous Rosie the Riveter “We Can Do It!” poster, encouraging women in the workforce.

 

 

Interestingly, many posters encouraged conservation and “victory gardens.” In an attempt to counterbalance rationing, the Department of Agriculture encouraged personal home gardens and small farms as a way to raise the production of fresh vegetables during the course of the war. Some scholars, such as Stuart Kallen, believe that victory gardens contributed up to a third of all domestic vegetable production in the country during the war. Posters espoused popular sentiments such as “our food is fighting,” “food is ammunition,” and “dig for victory.” Coinciding with this, posters also touted the benefits of canning with messages such as “of course I CAN” and “can all you can.”

 

 

Loose Lips Sink Ships

 

Perhaps one of the most fascinating themes propagated on the home front concerned misinformation and “loose talk.” Some scholars have speculated that this theme emerged out of fear of domestic spies and foreign intelligence operations within the U.S. Others, however, maintain that the U.S. intelligence services had shut down foreign intelligence networks even prior to America’s involvement in WWII; by this account, these messages merely aimed to dispel rumors and prevent a loss of morale at home and abroad. Whatever the case may be, the government asked illustrators to discourage the population at home from casual chatter about troop deployments, movements, and any other sensitive information that could be picked up by enemy ears or spread on a large scale. The phrase “loose lips sink ships” emerged thanks in part to the work of Seymour Goff, whose poster depicts a U.S. ship on fire and sinking above the words, “Loose lips might sink ships.” Similar messaging was prevalent in Great Britain and Germany as well.

 

 

Just as World War II was fought with bullets, boats, tanks, and planes, the war at home was fought with information from sources such as movies, radio, leaflets, and posters. Artists suddenly became soldiers in the battle to win the hearts and minds of the American public. Propaganda posters offer an interesting insight into the objectives of the U.S. government and the wartime mission to create consensus at home.

 

The Secret Life of CIA Spymaster James Jesus Angleton

 

If you asked the average American to name a CIA agent, he or she would probably go blank. Some might list one of the Watergate burglars: E. Howard Hunt, James McCord, or Gordon Liddy. A few news junkies might be able to name the current CIA Director, Gina Haspel, or her predecessor, Mike Pompeo (now secretary of state).

Almost no one, however, would be able to identify James Jesus Angleton, despite his long tenure as head of the agency’s counter-intelligence (CI) operations during the height of the Cold War. Angleton had a profound impact on the agency’s operating procedures during its formative years. A fervent anti-Communist, he was obsessed with the KGB and what he falsely believed were its constant attempts to plant agents in the CIA. Operating without hard evidence, he wrongly accused many CIA colleagues of disloyalty and ruined dozens of careers.

The Ghost, a new biography of Angleton by Jefferson Morley, a Washington journalist, provides an intriguing look at this powerful, enigmatic Cold Warrior. He earned the nickname “the Ghost” because he was rarely seen outside his high-security office, yet had a major impact on the agency’s strategy and tactics. 

James Jesus Angleton was born in Idaho in 1917. He grew up in a secure, middle-class family. His father was a successful businessman who owned the National Cash Register franchise in Italy. Angleton spent several years in Europe and became fluent in Italian and German. 

As a student at Yale, he founded a literary magazine, Furioso, which published a number of avant-garde poets, including William Carlos Williams, E.E. Cummings, and Ezra Pound. He joined the U.S. Army in 1943 and was quickly assigned to the CIA’s forerunner, the Office of Strategic Services (OSS). (His father, then living in Italy, was already in the secret organization.) Angleton rose quickly; by the end of the war he was head of the OSS’s X-2 (counter-intelligence) division for Italy.

During the war, Angleton organized a number of secret missions and helped round up hundreds of enemy agents. He built a reputation as a genius who could interpret enemy strategies and discern the true loyalties of individuals.

After the war, Angleton stayed in the army until he joined the newly formed Central Intelligence Agency in 1948. He was soon appointed head of the organization’s CI division, a job he held until 1974. In this position he supervised hundreds of agents around the world. As a fellow CIA officer recalled, he enjoyed the role of “Delphic Oracle.” He was “seldom seen, but frequently consulted.”

Angleton came to believe that the KGB had mounted a massive disinformation campaign designed to mislead the Western allies. As Angleton saw it, the split between Khrushchev and Mao, which culminated in the Soviet Union suspending all aid to China in 1961, was a carefully orchestrated deception – a ploy to persuade the West to lower its guard. 

He relentlessly pursued nonexistent KGB “moles” whom he believed operated at high levels in the governments of the U.S. and its allies. At various times, he falsely labeled as KGB operatives such important figures as Averell Harriman, U.S. Ambassador to Russia and former New York governor, and two prime ministers, Harold Wilson of Great Britain and Lester Pearson of Canada.

He also ruined the careers of more than a dozen loyal CIA executives by accusing them, without any firm evidence, of working for the Soviets.  Because many of these men spoke Russian or had worked in the U.S. embassy in Moscow, their forced retirement critically weakened the CIA’s ability to collect intelligence on the Soviet Union.

Angleton’s obsession with uncovering spies in America led him to authorize one of the CIA’s first domestic spying operations, Operation Lingual. Beginning in 1955, all U.S. mail sent to and received from the Soviet Union was opened and copied at a secret facility just outside JFK Airport. This operation was never disclosed to Congress, and the FBI only found out about it by accident in 1957. FBI Director J. Edgar Hoover, angered at the invasion of his turf (domestic spying), said nothing publicly but demanded the agency share its findings.

A decade later, as student protests against the Vietnam War grew larger and larger, Angleton worked with the FBI to mount Operation CHAOS, which infiltrated the peace movement and surveilled many of its leaders. Angleton was convinced the KGB was inciting the protests with men and money. Still, it was a violation of the CIA’s charter, which prevented the agency from engaging in domestic operations against U.S. citizens.

In the wake of the Watergate trials and Nixon’s resignation, Congress began investigating FBI and CIA operations against American dissidents. On December 22, 1974, the New York Times published a story by Seymour Hersh that revealed the CIA’s extensive, illegal domestic spying operations. Angleton was mentioned by name and labeled an “unrelenting cold warrior” who had directed the operations. The CIA was deeply embarrassed by the revelations, and ordered Angleton’s immediate retirement. 

Morley’s account depicts a deeply troubled man. A chain-smoking workaholic and heavy drinker, Angleton would often arrive at the office at 10 and then retreat for a three-martini lunch, yet he frequently worked until two or three a.m. He rarely saw his three young daughters; his wife came close to divorcing him several times.

Unlike other senior CIA officers, Angleton never published a memoir. In retirement, he clung to his paranoid suspicions, defending them to former colleagues and to journalists in off-the-record interviews.

He died of lung cancer in May 1987, taking most of his secrets with him to the grave.

Angleton may have been ghostlike, but the agency he haunted was a highly structured, powerful organization that reported to the president. Morley’s biography falls short in providing context for Angleton’s rapid rise and sudden fall. Why was he allowed to condemn fellow CIA officers without evidence? Did the presidents he served under know what he was doing?

Angleton worked under six different administrations, those of Truman, Eisenhower, Kennedy, Johnson, Nixon, and Ford. But Morley’s book barely mentions these leaders and their different uses of the CIA. While spy agency aficionados will have read recent CIA histories (e.g., Tim Weiner’s Legacy of Ashes), those who are new to the subject may wonder how this eccentric, paranoid man could wield so much power.

The Original Border Wall

Ramón Murillo, Soldado de Cuero, 1804. (Archivo General de Indias, Seville: Mapas y Planos, Uniformes, 81; Wikimedia Commons).

 

Spaniards responded to the unfolding story of the American Revolution with a mixture of trepidation and schadenfreude. Britain was Spain’s dangerous imperial rival. Britain had humiliated France and Spain in the French and Indian War. So Spaniards much enjoyed England’s crisis. But in 1775, the Count of Aranda, the Spanish Ambassador to Versailles, presciently warned the government in Madrid that whether the Thirteen Colonies secured their independence or not, “we must view them as a rising power born to subjugate us.” In due course, he believed, whatever the outcome of the war, this terrifyingly bellicose population of mostly English-speaking, largely Protestant, northern Europeans would march across America, their Anglo-Saxon sights firmly fixed on the silver mines of Mexico.

 

Aranda preempted Manifest Destiny by three-score years and ten. He even identified the Red River as the likely invasion route, which would have led his imaginary army of alien heretics to the relatively prosperous community of 5,000 souls at El Paso, then famous for its aguardiente, literally “firewater.” Somewhere in New Mexico or Texas this phantasmagoric force would have been confronted by the first line of Spanish imperial defense, the garrisons of soldados de cuero, tough mounted border guards named for their leather armor.

 

In 1772, Charles III of Spain had ordered a major reorganization of the forts known as presidios and their garrisons of “leather-jackets” which were supposed to control and protect the northern reaches of the viceroyalty of New Spain (modern Mexico). The plan required a chain of presidios, each with its own well-equipped garrison, set one hundred miles apart. This “rampart” was to run from California to Texas following more or less the same line as the modern international border, with the exceptions of the outlying settlements in northern New Mexico and at San Antonio de Béxar, Texas. This was to be the frontier of Spanish occupied territory, beyond which lay Indian country which was nonetheless claimed by Spain.

 

Thus, King Charles’s Regulation of 1772 sought to establish an eighteenth-century Spanish predecessor to the contentious modern border wall. That ancestry is not without obvious irony, for the express purpose of this “Spanish wall” was, as Charles III stated, “to defend those borders and the lives and livelihoods of my vassals” in the modern Mexican border states of Sonora, Chihuahua, Coahuila, and Tamaulipas “from the barbarian nations” invading from the north. These “barbarians” were not hordes of British-Americans from the East, however, but Native Americans whose descendants are now US citizens. Northern New Spain had been devastated by highly mobile Apache and to some extent Comanche raiding parties from southern New Mexico and Texas that had repeatedly assailed its settlements. Their violence had caused large numbers of Hispanic settlers, mostly indigenous Mexicans or mixed-race castas who had emigrated from elsewhere in New Spain, to abandon their isolated farms and communities, causing massive depopulation and the dislocation of people across the region.

 

Behind this immediate threat from Native Americans, the distant but ever-present menace posed by Britain and the Thirteen Colonies did indeed lurk. The expanding population of British America created westward pressure on the Indian population that expressed itself as Comanche and Apache aggression when it came up against the Spanish world. Moreover, these Indian raiders acquired weapons and ammunition from trading networks that originated in the Thirteen Colonies, although they also bought some from traders in Spanish Louisiana, to the disgust of the viceroy in Mexico City. So, while the Spanish border “wall” was not conceived of as a defense against British or American invaders, it was a response to the consequences of American demographic growth and fast-developing commercial relations with Native Americans.

 

Moreover, as Aranda’s warning exemplifies, Spaniards believed that they would soon have to confront a direct British or American trespass on their territory. Before the Revolutionary War, that threat was most tangibly present in Louisiana (ceded to Spain by France in 1763), where British West Florida was just across the Mississippi. The government in Madrid sought to create a bulwark against British-American expansion by building a series of alliances with the different Native American tribes who controlled most of the territory on both sides of the Mississippi. In 1769, the acting military-governor of Louisiana, an Irish-born army officer called Alejandro O’Reilly, convened a great Indian Council at which he met with chiefs from almost all the tribes living within two-hundred miles of New Orleans. He smoked the peace pipe with them and listened to their professions of friendship. He extolled the benign but awesome power of Charles III and then hung gold medals bearing an image of the king around the necks of the nine most important chiefs.

 

 

Fears of a British or American invasion of Spanish North America convinced Spain to maintain an ambivalent approach throughout the Revolutionary War. Madrid gave the rebels just enough support to prolong the conflict in order to weaken both sides, until the Spaniards finally entered the war, not as allies of the rebels, but in order to secure control of the Mississippi and the Gulf Coast. That policy succeeded. In 1783, at the Peace of Paris, George III ceded both East and West Florida to Charles III, giving Spain control over the Mississippi and the Gulf of Mexico. 

 

The Peace of Paris also created a troubled and porous frontier with the newly independent United States along the Saint Marys River. In Florida, the disembodied specter of American invasion that had been invoked by Aranda manifested itself as real humanity on the ground. These were not the grand armies he had envisaged descending the Red River. In 1790, the new Spanish governor at Saint Augustine, Manuel de Zéspedes y Velasco, railed against “a species of white renegade, known as Crackers,” in a report sent to Spain. Their “wish to escape the authority of the law is so strong that they prefer to live in Indian... or Spanish territory rather than under the yoke of civilization.” They “are nomadic like Arabs and can be distinguished from savage [Indians] only by their complexions, their language, and the depraved superiority of their cunning and bad faith. As skilled as Indians when hunting, they will risk crossing great rivers on flimsy rafts, and can track man or beast through the thickest woods.” These “Crackers” trespassed across Spanish territory and occupied Indian lands, yet “far from opposing these land grabs,” Zéspedes complained, “the southern states of America encourage them, motivated by the desire to expand their frontiers and gain control over foreign lands.”

 

“America,” he might have said, “was not sending Spanish Florida its best.”

 

In 1785, the American general Nathanael Greene made a surprise visit to Saint Augustine, where he much enjoyed the liberal generosity of Manuel de Zéspedes’s table. Greene wrote his wife that perhaps “two hundred dishes of different kinds [were] served up in seven courses,” all washed down with a “variety of Spanish and... French wines.” After “five hours,” he confessed, “I was not unlike a stuffed pig.”

 

Greene did not visit Florida to have lunch, however enjoyable. Ironically enough, the ostensible purpose of his mission was to ask Zéspedes to help prevent Loyalist refugees from squatting and logging on Cumberland Island, Georgia, where Greene had recently acquired the property that gained renown as Dungeness. More ironic still, Zéspedes reported to Madrid that Greene had in reality come to Saint Augustine not to complain about Tory vagrants, but to tempt formerly British Floridians, newly subject to the Spanish Crown, to settle and work his new estate. Skilled tradesmen like the carpenter Thomas Stafford, later elected to the State Convention of Georgia, and his brother Robert Stafford did indeed answer Greene’s call.

            

Not only was America not sending Florida its best, it was stealing Florida’s best to boot. 

What I’m Reading: An Interview With Historian of Mexico Pablo Piccato

 

Pablo Piccato received his B.A. in History from the Universidad Nacional Autónoma de México in 1989 and his Ph.D. from the University of Texas at Austin in 1997. He is a professor in the Department of History at Columbia University, where he teaches about Latin America, Mexico, and the history of crime. His research focuses on modern Mexico, particularly on crime, politics, and culture. He has taught as visiting faculty at universities in Mexico, Argentina, Brazil, Italy, and France, and has been director of Columbia’s Institute of Latin American Studies. His books include Congreso y Revolución (1991); City of Suspects: Crime in Mexico City, 1900-1931 (2001); The Tyranny of Opinion: Honor in the Construction of the Mexican Public Sphere (2010); and most recently A History of Infamy: Crime, Truth, and Justice in Mexico (2017), which won the María Elena Martínez Prize for the best book in Mexican history from the Conference on Latin American History.

 

 

What books are you reading now?

 

Reading fiction is a basic necessity for me, even when I am in the middle of research or teaching seasons. Right now I am reading The Star Diaries by Stanislaw Lem, in the Spanish version. Science fiction creates worlds that are possible. They can be removed in time and space but they have a connection with our present. In a way, all science fiction is about colonialism. Lem is a great critic of science and politics: he plays with the encyclopedic knowledge of those possible worlds, and makes fun of our ridiculous anthropocentrism. His scientists of the future try to intervene in history to bring the world closer to their idea of perfection, but fail miserably because of bureaucratic intrigues. I’m also going slowly through the second volume of Robert A. Caro’s biography of Lyndon B. Johnson, Means of Ascent. I admire Caro’s narrative drive, but also his lack of concern about how long it will take to fully understand his subject.

 

What is your favorite history book?

 

I have to think about several books that were important for me at different times: at the beginning of college, Charles Gibson’s The Aztecs under Spanish Rule, showed me the power of old and neglected sources to reveal a social structure that survived conquest. I read it in Spanish translation, in a battered copy at my university’s library, and it made me want to become a historian of Mexico in the sixteenth century. Just before graduate school, William B. Taylor, Drinking, Homicide and Rebellion in Colonial Mexican Villages showed me the value of judicial sources and the way in which transgression could be woven into the fabric of an apparently stable system of domination. When I was in Austin I discovered the richness of modern urban spaces and sociabilities through Judith Walkowitz, City of Dreadful Delight, and Margareth Rago, Os prazeres da noite: two wonderful books about transgression and desire set in urban spaces where women challenged the privileges of male gaze. 

 

Why did you choose history as your career?

 

I’m not sure. I was finishing high school and I had to decide what career to follow at the Universidad Nacional Autónoma de México. I guess I applied to History instead of Philosophy because I felt I had to understand the reasons why I was there: my father had to leave Argentina because of the political repression of the mid-seventies and we all moved to Mexico. Both countries still represent the questions of history for me, one in short and recent episodes of crises, and the other as long-term processes of stability and transformation. In those years, Argentina went through a process of internal political fragmentation that led to a bloody military dictatorship. I discovered that Mexico had a rich pre-Hispanic and colonial history, a massive social revolution, and a regime that still welcomed exiles. Studying history may have been my way of honoring that complex combination in space and time that was Mexico City in 1982. I’m pretty sure I did not go into English Literature because I was afraid of spoiling the pleasure of reading fiction.

 

What qualities do you need to be a historian?

 

You have to be able to tell a story, but you also have to explain the past. Both require being attentive to the present. The reasons why people approach history are always changing, so the historian has to have one foot firmly planted in her circumstances, and to read other historians with that in mind. But she also has to be willing to go into the archive or the library or the interview and let the sources take her into unexpected places. You have to be meticulous in preparation but also ready for surprises, and keep a good database of your notes, because you never know how you are going to use the source you read today. Without a particular combination of this sense of wonder with a maniacal concern about detail, one cannot go very far as a historian. And you also need patience: to spend many hours reading sources that do not seem to be productive, to wait until your files are delivered by the archivist, to look for a book that seems to have disappeared from libraries.

 

Who was your favorite history teacher?

 

Again: different people come to mind at different times. A teacher of Mexican history in middle school whose name I’ve forgotten was the first to show me that the past can be explained with clarity, and told me to visit the National Library, at the time still in an old church building in downtown Mexico City. Arturo Sotomayor, in high school, spoke with passion about modern Mexican history, in a way that gave it a relevance I did not imagine until then, and that I am still trying to fully apprehend. In college, Eduardo Blanquel taught me how to read and discuss primary sources, and I’ve been following his model and trying to do the same in my classes ever since. My doctoral adviser in Austin, Jonathan C. Brown, showed me how to think and then write clearly while I was immersed in a project that could have easily gone out of control. He is my model of a graduate mentor because he knows when to be critical and when to let you be. 

 

What is your most memorable or rewarding teaching experience?

 

It came after a particularly difficult class trying to understand Descartes’s Discourse on the Method, in my Contemporary Civilization section at Columbia. We were tired and not sure where the discussion was going. A student smiled and laughed, I laughed, and the entire group started laughing out loud, so much so that I had to end the class right there. I guess we all agreed that we would never finish dissecting the book into its smaller parts, but also that the enterprise was foolish anyway. The lesson, I guess, is that you have to stop reading at some point and move on.

 

What are your hopes for history as a discipline?

 

That historians claim a stronger voice in the public sphere to talk about the present in the light of the past. This does not mean we have to become antiquarians who claim there is nothing new under the sun, or determinists who try to establish the laws of evolution. We can help shape public discussions that make sense of the present with a proper historical perspective. We can compare places and times, and remind the public that their horizon should not be an op-ed about the last five years, or a rigid school narrative about the last two hundred years. Today that operation is more important than ever: we need to understand the history of fascism and racism if we are going to appreciate both the radical threat and the unavoidable roots of Trump’s government. Yet none of this will be possible if we stop training historians who can do serious and deep research, organize large amounts of data, write coherently, and have a lasting impact as teachers and mentors.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

Not really, unless you count a small collection of 1940s Mexican comic books about Chucho Cárdenas, the reporter-detective. I enjoy looking at old objects in museums and libraries. They help me imagine how they were used, circulated, touched and valued in times past. Yet I have never had the impulse to own them. They should be in a public place where others can come close to them and imagine those uses for themselves.

 

What have you found most rewarding and most frustrating about your career? 

 

The most rewarding aspect has been my work with students and colleagues. It is tempting to think about the work of the historian as solitary and individualistic, as if we were authors inspired by rare epiphanies that only occur after long years of painful research. The reality is that our conversations in seminars, workshops, conferences, the cafés close to archives and libraries, and at the occasional bar play a decisive role in understanding the possible contribution of our work. Often when I write I imagine the text as a conversation with other historians who can criticize my arguments, be skeptical about my sources, but also, eventually, appreciate what I am trying to say. Co-writing has always been a good experience for me, and sometimes I wonder why we historians are so reluctant to write in teams compared to other scholars. Another reward of the job is to come across readers who understand, sometimes more cogently than I do, what my books and articles do.

 

I guess all institutions can be frustrating, even as they make possible the material conditions and the collaborations that are essential for our work. I have experienced the combination of mismanagement and petty authoritarianism of large institutions since college, but I have also seen the advantage of being patient and trying to change them from the inside. I guess I’ve been fortunate to have the option. But I am aware that biases permeate academic life, even if we refuse to recognize them. I am still learning how seemingly small interactions can have large consequences for people’s careers. Along with a wonderful group of colleagues from different disciplines at Columbia, I participated last year in the production of a report on harassment and discrimination in the social sciences (https://www.fas.columbia.edu/home/diversity-arts-and-sciences/ppc-equity-reports). The experience helped me understand moments in my own career that I had tried to forget, perhaps because they undermined my confidence as a young scholar. It also helped me appreciate how effective serious research and collective work can be when we try to confront the problems derived from bias and inequality in academic life. The committee work that I have to do, as do all of my colleagues, reminds me that institutions are not brands or buildings but people who come together with a purpose.

 

How has the study of history changed in the course of your career?

 

The fundamental change in recent decades has been a new ability of the discipline to synthesize methodologies and approaches that twenty years ago seemed to be isolated from each other. During graduate school, in the nineties, I saw the tension between cultural history and other subfields that defined themselves as “harder” in terms of their use of evidence and interpretive models. It was as if two fundamentally opposed paradigms of historical work were on a collision course. But if you look at the best programs in my field today, in Mexico and the United States, you can see that they have avoided the temptations of specialization and have encouraged historians to cross disciplinary divides, training students in a generous way. So, instead of a field divided between “postmodernists” and “positivists”, as many predicted twenty years ago, we have an explosion of work that engages social, cultural, economic, political, intellectual, environmental, migration and legal history, to name a few. We still have some colleagues who concern themselves with patrolling the boundaries of their area of expertise, but they do not have the influence they think they have.

 

What is your favorite history-related saying? Have you come up with your own?

 

I love “May you live in interesting times”: it might sound like an ironic curse, but now I hear it as a blessing of sorts. We all live in interesting times, whether we like it or not.

 

What are you doing next?

 

I am starting a project on poetry and politics in nineteenth-century Mexico. I am still far from producing anything of value, but I am enjoying the process of learning how to read poetry. Mexico and other Latin American countries had a rich literary life in the nineteenth century. Some authors have survived, particularly from the second half of the century, like Sarmiento, Martí or Machado de Assis, but I think few historians yet appreciate the creativity and intensity of that world of fiction and poetry, a world that was shared by many people, across classes, in oral and written form. It was a realm of cultural production where Latin American authors and readers could be as productive and as free from the legacies of colonialism as they wished to be. Poetry in particular was a central medium for political speech during the era that we can roughly classify as romantic.

 

I started this project with some trepidation, because I knew I would enjoy the research. I now understand Terry Eagleton when he writes that “Literary critics live in a permanent state of dread--a fear that one day some minor clerk, in a government office, idly turning over a document, will stumble upon the embarrassing truth that we are actually paid for reading poems and novels.” I already had a sense of dread when I was reading crime fiction for my previous book, but I managed to overcome it when I confirmed that narrative was a privileged source for understanding social ideas about crime and justice. Poetry is similarly promising if we read it as a medium that expands the communicative possibilities of words and images.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171183 https://historynewsnetwork.org/article/171183 0
Roundup Top 10!  

 

What if Churchill Had Been Prime Minister in 1919?

by Andrew Roberts

More than most, he understood the grave challenges facing the West at the end of World War I.

 

How Reconstruction Still Shapes American Racism

by Henry Louis Gates, Jr.

Despite its brevity, Reconstruction remains one of the most pivotal eras in the history of American race relations, and probably the most misunderstood.

 

 

The Electoral College Was Not a Pro-Slavery Ploy

by Sean Wilentz

Mr. Wilentz is the author, most recently, of “No Property in Man: Slavery and Antislavery at the Nation’s Founding.”

 

 

Are the Humanities History?

by Michael Massing

In the brave new world that is emerging, the humanities will have a critical part to play—provided that they themselves can adapt to it.

 

 

The Story We've Been Told About America's National Parks Is Incomplete

by Dina Gilio-Whitaker

The national park system has long been lauded as “America’s greatest idea,” but only relatively recently has it begun to be more deeply questioned.

 

 

Want to unify the country? A community organizer and a Klan leader showed us how.

by Jonathan Wilson-Hartgrove

In the midst of the identity crisis we face as a nation, the organizing tradition that Ann Atwater embodied is the strong medicine we need.

 

 

Waking Up to History

by Margaret Renkl

At new museums, the past is finally becoming more than the story of men and wars.

 

 

Can Bernie Sanders Exemplify the American Dream?

by Walter G. Moss

How can a socialist provide a unifying vision, one that will unite U.S. citizens?


 

The truth about the "campus free speech" crusade and its myths that won't die

by Jim Sleeper

While critics of college "snowflakes" prop up a fake crisis, even "good liberals" misunderstand student outbursts.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171669 https://historynewsnetwork.org/article/171669 0
The Red Scare: From the Palmer Raids to Joseph McCarthy to Donald Trump

 

In the immediate post-World War I era, Attorney General A. Mitchell Palmer was considering a bid to become the Democratic Presidential nominee in 1920 to succeed President Woodrow Wilson. To raise his profile, he claimed there was a massive wave of Socialists and Communists in America working to undermine the nation in the aftermath of the Russian Revolution of 1917. Palmer arrested and detained thousands of suspected radicals as part of the Red Scare. Many people were detained for months without trial or protection of their basic constitutional rights, until some were deported and others were released without charges.

 

Palmer had a very eager and zealous chief assistant, J. Edgar Hoover. Hoover performed so well under Palmer that Palmer recommended he become the head of the newly constituted Bureau of Investigation. President Calvin Coolidge indeed selected Hoover to lead the Bureau in 1924 (it was renamed the Federal Bureau of Investigation in 1935), beginning Hoover’s 48-year career as its head, which lasted until his death in 1972.

 

Like his predecessor, Hoover often used tactics that violated Americans’ civil liberties. Under his leadership, the FBI engaged in unconstitutional behavior, particularly in the post-World War II era as the U.S. fought Communism at home and abroad during the Cold War. Many in academia, in Hollywood, and in government were purged based on accusations that they were Socialists or Communists and were spying for the Soviet Union.

 

One beneficiary of this Second Red Scare in the late 1940s and early 1950s was a Republican United States Senator, Joseph Raymond McCarthy, who had been rated by one periodical as the least influential and accomplished of all US Senators. Thinking ahead to his reelection in Wisconsin in 1952, McCarthy decided to raise his profile by accusing people in government and all walks of life of being Socialists and Communists. McCarthy became extremely popular with about a third of the American population, and with the exception of a few journalists and US Senators who spoke and wrote against him, he was able to run rampant in the last years of the Presidency of Harry Truman and the first two years of the Dwight D. Eisenhower Presidency. He finally went too far in his accusations and was censured by the US Senate in 1954, a rare action in the history of the upper chamber.

 

During his nearly five years of power, from February 1950 to December 1954, McCarthy was aided by a zealous young man not all that different in character or motivation from the J. Edgar Hoover of three decades earlier. McCarthy’s chief aide was attorney Roy Cohn, who relentlessly attacked innocent people accused of being Communists (“Reds”) or soft on Communism (“Pinkos”). Many believed he lacked any sense of ethics or honor, and he was much feared. Even after McCarthy fell from favor and then died in 1957, Cohn remained prominent, spending the rest of his career as an attorney who often chose to represent reprehensible elements of society, including organized crime. He was also known for his wild social life.

 

Then Roy Cohn met a young real estate entrepreneur named Donald Trump. The two men became close friends, and Cohn taught Trump how to exploit others and play “hardball” to gain ever more wealth and public influence. As others have argued, Cohn was one of the most influential people in the development of Trump’s public persona and political views.

 

Trump learned to exploit his critics’ weaknesses and shortcomings to harm their reputations. He stoked fear to fuel his rise to power and his march to the Presidency.

 

Once in the White House, Trump worked to undermine civil liberties and civil rights, as A. Mitchell Palmer, J. Edgar Hoover, Joseph McCarthy, and Roy Cohn had done in earlier generations. Utilizing racism, nativism, and Islamophobia, Trump also exploited the issue of gay and transgender rights, following the lead of his Vice President, Mike Pence, in promoting the exclusion of gay and transgender people from the US military, despite their major contributions. He also undermined the equal treatment of gay Americans as a protected group, which Barack Obama had pursued during his Presidency.

 

Trump has also labeled his Democratic opposition, especially Democratic Presidential candidates Elizabeth Warren and Bernie Sanders, as Socialists who threaten American capitalism. Even as he stirs up fear, he works to undermine Social Security, Medicare, Medicaid, the Affordable Care Act, and environmental protections.

 

Trump is clearly using tactics similar to those Palmer, Hoover, McCarthy, and Cohn used to promote his agenda and undermine civil liberties and civil rights. But he is more dangerous than his predecessors, because only Trump has reached the pinnacle of the Oval Office. He has learned very well from their examples, and it requires daily vigilance and activism to cope with the threat he represents.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171640 https://historynewsnetwork.org/article/171640 0
ROOAARRRR: 66 Million Years of Tales about the Big, Bad T-Rex Dinosaur

 

In a stunning scientific announcement last week, Canadian scientists unveiled a 66-million-year-old Tyrannosaurus Rex that weighed 9.5 tons and is 65% intact. The discovery of the world’s largest T. Rex was hailed throughout the scientific world.

And, speaking of good timing…

* * * *

As you walk through the fascinating new exhibit on the history of the legendary Tyrannosaurus Rex dinosaur (T. Rex to his loyal fans) at the American Museum of Natural History in New York, you hear an endless THUMP, THUMP, THUMP. It is the sound of T. Rex charging through the jungle in pursuit of yet another animal as his prey. The big, broody T. Rex, in all of his 66 million years of glory, and a star on the silver screen for decades, is the subject of a new, brash and bold exhibit at the museum.

The exhibit, T. Rex: The Ultimate Predator, opened last week at the museum, at Central Park West and W. 81st Street. It tells the 66-million-year history of the T. Rex and explains how the huge, vicious, two-legged dinosaur with the short arms evolved from a smaller and far less dangerous jungle creature into the much-feared King of the Jungle.

The new T. Rex exhibit on the third floor is the crowning touch to the museum’s large dinosaur exhibits and helps, in an exciting way, to show visitors how all those movies and songs about T. Rex and his fellow dinosaurs emanated from the real thing over so many years.

Other dinosaur skeletons, fossils and mummies are housed in the Hall of Dinosaurs on the fourth floor of the museum. The dinosaur hall has a towering T. Rex, a Ceratosaurus, a Stegosaurus, a Triceratops, a Mammoth and a collection of fellow dinosaurs whose history stretches back 128 million years. The centerpiece of the hall is the 122-foot-long Titanosaur, found on a farm in Argentina a few years ago, a beast whose skeleton is so long that it could not be contained in one hall; its neck and head stick out into a second hall, like a cartoon character, and it is a daily tourist attraction for wide-eyed adults and giggling children.

The museum has done a nice job of setting up the T. Rex exhibit. They time the entries so the crowds are not large, and you select your time when you obtain your tickets. The first few display boards tell you how the T. Rex belonged to a family of some two dozen tyrannosaur species that evolved over millions of years. He was so big and heavy (around 9 tons) because all he did was eat and grow (he gained 140 pounds a month, and that’s without cookies). From there, you follow the story of how the dinosaur changed over the years. In the early days, the T. Rex was a much smaller animal. The meanest animal that ever lived, a really bad dude, was not mean in the beginning. In fact, don’t tell anybody, he was completely covered in feathers. Yes, feathers.

T. Rex now stands at the heart of the dinosaur exhibits at the museum. The reason is that the museum’s curator, Mark Norell, and his staff have done a painstaking job of making him easy to understand without taking the menace out of him. They have also turned this into a very family-friendly exhibit, one that adults and kids alike can enjoy. And yet T. Rex is still as scary as ever.

Museum President Ellen V. Futter was so excited about the T. Rex idea that she made it the first major exhibit in the museum’s 150th anniversary celebration.  “Dinosaurs, and tyrannosaurus rex in particular, are such an important and iconic part of the Museum and have been throughout our history,” she said. “So, it seems fitting to launch the…anniversary with a major new exhibition on the ever-intriguing King of Dinosaurs.”

T. Rex was one of the most intelligent of the dinosaurs and yet, at the same time, mean and nasty.

He was such a belligerent predator because his razor-sharp teeth could not only tear an animal to pieces but crunch so hard that they made the prey’s bones explode. The dinosaur even ate the bones of his prey in a few big gulps. He also had heightened senses of smell and hearing that let him easily hunt down animals, even those in hiding. No one could escape him.

The rage that made him so famous stemmed from injuries suffered throughout his lifetime (most T. Rexes lived until about 34). “He had to go into battle every time he wanted to eat something, and the wounds he suffered piled up and he was always in some kind of pain,” said a tour guide.

The best way to see the sprawling exhibit is with one of the tour guides you can find on the floor. The guides gather a dozen or so visitors and, at no charge, give comprehensive tours discussing the T. Rex. They answer any and all questions, and they are funny, too. Mine told a little girl that if a T. Rex gobbled her up she was so small that she would serve only as his “potato chips.” She laughed.

It is appropriate that the museum is presenting the T. Rex exhibit because the world’s first T. Rex find was made by the museum’s famous paleontologist and fossil hunter, Barnum Brown. He uncovered the first-ever T. Rex remains in 1902 in Montana.

The T. Rex is everywhere in American culture – dolls, stuffed animals, coffee mugs, posters and even rock and roll songs. T. Rex and other dinosaurs have been a staple of Hollywood movies for years. There have been more than 150 movies centered on dinosaurs, with the T. Rex, the Brontosaurus and others galloping through thick jungles and across meadows. They started in the silent movie era and picked up steam with the first King Kong movie in 1933. They gained wide popularity with all the recent Jurassic Park films (I saw Jurassic World on television just last Tuesday). Along the way there were all the Godzilla dinosaurs, The Land Before Time movies, the Lost Continent and Lost World movies, the Tarzan films, Mysterious Island, and of course, The Flintstones (yabba-dabba-doo).

The exhibit also includes the skeleton of a four-year-old T. Rex, a not so cuddly tyke, dinosaur teeth (pretty big and sharp) and a dozen or so re-created T. Rexes at different ages.

There is plenty to do at the exhibit besides stare at old bones. There is a “Roar Machine” on which you can listen to how the T. Rex sounded when he was angry. There is an “investigation station” that shows you various dinosaur fossils. There is a five-minute-long virtual reality show, viewed with a mask, in which you see the world as the dinosaurs saw it. There is a really neat, wall-sized animated movie of a T. Rex rumbling through the jungle. You walk or run in front of it and the dinosaur follows your motion and chases you, roaring like mad and thumping away with his huge feet.

If you love dinosaurs and/or being chased by one, this is the exhibit for you. Bring the kids.

RRRRRROOOOAAARRRR….

The American Museum of Natural History is at Central Park West and W. 81st Street. The museum is open daily, 10 a.m. to 5:45 p.m. It is closed on Thanksgiving Day and Christmas Day. The exhibit runs through August 9.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171638 https://historynewsnetwork.org/article/171638 0
The Temptations: Born in the Turmoil of the 1960s and Still Conquering

 

There have been a number of so-called “jukebox musicals” in theater over the years, but few, such as Jersey Boys and Beautiful: The Carole King Musical, have succeeded. That’s because the ones that failed had lots of memorable music but no story. Now, following in the footsteps of the ones that worked, comes Ain’t Too Proud: The Life and Times of the Temptations, an out-and-out hit and a great window on entertainment history. The play opened last week at New York’s Imperial Theater on W. 45th Street.

Everybody remembers the Temptations, voted the greatest Rhythm and Blues group of all time. They had huge hits with My Girl, Get Ready, If You Don’t Know Me by Now, Ain’t Too Proud, Papa Was A Rolling Stone and other tunes in a career that has stretched more than 50 years. They charmed you not only with their fine tunes and silky voices, but with that smooth, gorgeous choreography which served as a show in itself.

But there was also a lot of drama in the lives of the Temptations, all of them, and that is the heart of this terrific new play and the reason it works so well. The play, written with great style by Dominique Morisseau, starts in 1967, the time of the Detroit riots, as Detroit’s Otis Williams is trying to sign up singers for his new music group. He recruits singers from his high school and neighborhood: bass Melvin Franklin, Paul Williams, Eddie Kendricks and flamboyant lead singer David Ruffin. They went through several names before settling on The Temptations, grabbed their song sheets and headed for the stage.

The story of the Temptations mirrored the story of America in the 1960s and ‘70s. The black group could not stay in white hotels after appearances in the South, suffered through the assassination of Martin Luther King Jr. and debated, as all African American music groups did, what they should do, as high-profile people, to help the Civil Rights Movement. They put up with rickety old tour buses, argued with their early manager and thirsted for fame.

There are very funny scenes in the play. In one, their new, aggressive manager, madder than hell at them, flies off the stage in a gorgeous, gleaming convertible. In another, Otis follows Motown head Berry Gordy into a theater’s men’s room in order to meet him. A minute later, all four Temptations singers run into the men’s room, too.

Later in the show, during the Vietnam War era, producer Berry Gordy tells the Temptations that the Vietnam protest song, War, will not go anywhere and that they should not record it. Someone else did and the song became a monster hit (“So I’m not always right…” bemoaned Gordy on stage to laughter from the audience).

Gordy took them into his stable of talent that included Diana Ross and the Supremes. The Supremes, who performed a number of songs, were a delight, led by Candice Marie Woods as Diana Ross.

Writer Morisseau was wise to let Otis Williams be the narrator of the show. The musical (like a previous TV movie) is based on his autobiography. He was with the group all of its life, so he can tell the story as an eyewitness, and actor Derrick Baskin, as Otis, does a fine job holding the different segments of the tale together. Director Des McAnuff lets his actors flesh out the characters.

The performers in the show are all wonderful. Some of the standouts are James Harkness as the alcoholic Paul Williams, Jawan M. Jackson as the deep-voiced and flip Melvin Franklin, Jeremy Pope as the charming Eddie Kendricks, Ephraim Sykes as the bouncy, devilish David Ruffin, who brings the house down with his leg splits, body whirls and microphone flips, Jahi Kearse as producer Berry Gordy, Christian Thompson as Smokey Robinson, Shawn Bowers as Otis’ son Lamont and Nasia Thomas as singer Tammi Terrell, who died at 24.

All of the Temptations’ great hits are performed on stage, to the fabulous, out-of-this-world choreography of Sergio Trujillo. Ain’t Too Proud… is the story of the changes a group has to make, even when it is successful. The best example is the arrival and departure of sensational singer David Ruffin, later a star in his own right. He was a brilliant singer but an emotional mess who dragged the whole group down with him. Another singer, Eddie Kendricks, had the same problem and was dismissed. There were singers who left to bask in retirement, and one, Paul Williams, forced out after his alcoholism grew severe, later killed himself.

There are several themes in the musical that were common to all music groups in the 1960s and ‘70s. One: how do you keep a group together when its members are in turmoil? A number of groups collapsed because of inner friction in that historical era, but not the Temptations. Leader Otis Williams worked hard to find talented people to replace the talented people who were let go or quit, and the group survives today (24 different singers over the years).

Second, how does a group that is very successful in the ‘60s with one style of music (reflected by My Girl) shift gears and come up with new music in the ‘70s and ‘80s, such as Papa Was a Rollin’ Stone? Third, how much does a group and its members have to give in to the wishes of the producer, Berry Gordy? He was a genius and they knew it, but still fought for themselves.

Fourth, the Temptations were always competing against some other group. At Motown, they had the unenviable job of competing against the Supremes, not only because of the Supremes’ enormous popularity, but because Berry Gordy was in love with Diana Ross.

And at the same time, the Temptations, a black group, had to compete against white groups and constantly had to avoid the tag of a “black music” group and still perform crossover music for mostly white audiences.

They were a reasonable success despite all of that, led by the hard working Otis Williams, who, because of tours, did not see his wife and son very much and missed them.

The group’s story was at times glorious and at times difficult, as it was for all the ‘60s music groups who had to compete against each other and the changing times. The play is a great behind-the-scenes look at the history of the Temptations and American music. The Temptations did battle with the Supremes, the Beatles and other groups, and with their own personal demons, too.

If you see the show, remember, they’re gonna make you love them…..

PRODUCTION: The show is produced by Ira Pittelman, Tom Hulce, the Berkeley Repertory Theatre and others. Scenic Design: Robert Brill. Costumes: Paul Tazewell. Lighting: Howell Binkley. Sound: Steve Canyon Kennedy. Projection Design: Peter Nigrini. Fight Director: Steve Rankin. Choreography: Sergio Trujillo. The play is directed by Des McAnuff. It has an open-ended run.

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171639 https://historynewsnetwork.org/article/171639 0
MMT and Why Historians Need to Reclaim Studying Money

Pictured above: Campaign buttons from the 1896 election. Some monetary references are still obvious (“the money we want”); others, now more obscure (Bryan’s campaign advocated a “16 to 1” ratio between silver and gold). Supporters of a gold standard had been known as “goldbugs” since the 1870s; the silver beetle-shaped pins were made in response. 

 

 

MMT (Modern Monetary Theory—a form of post-Keynesian economics) is everywhere these days. Alexandria Ocasio-Cortez and Bernie Sanders embrace it; Paul Krugman and George Will write about it; the Financial Times, Forbes, and The Economist have all run columns about it. Even the men’s parenting website Fatherly had an article on it. Do historians have anything to add?

 

Historians know this is not the first time that American politicians, scholars, and ordinary people alike have asked fundamental questions about what money is, how it works, and who it benefits. The 1896 presidential election is famous for William Jennings Bryan’s “Cross of Gold” speech, but he was only one of many that decade to be talking and writing about the comparative merits of gold, silver, and paper. Americans got in barroom fights about it (at least one man died), sang songs about it, and composed poems on the subject. The economist President of Brown University, E. Benjamin Andrews, nearly lost his job because of his silverite views. Newspapers across the country reported when a Stanford professor asserted that faculty were forced to teach in favor of the gold standard; “Coercion in the colleges” ran the headline in the Morning World-Herald (Omaha).  

 

Today, as in the 1890s, the fundamental question is whether prosperity can be increased and inequality reduced by injecting more money into the economy. Orthodox economists—the vocabulary of “orthodoxy” has been part of economics since the first professorships were created—say it cannot: that growth (whether it be the manufacture of more stuff, or the greener production of better stuff) has to happen in the “real” economy and that money simply facilitates buying, selling, saving, and investing. As J. Laurence Laughlin (first chair of Economics at the University of Chicago) wrote in 1895: “Money… no matter how valuable, is not wanted for itself. It is only a means to an end, like a bridge over a river.” No one, Laughlin continued, could really believe that adding silver to the money in circulation would produce “bushels of wheat and bushels of corn and barrels of mess pork”—only mine owners and their investors would gain by its being minted. 

 

MMTers and silverites, in contrast, emphasize the work left undone—factories shut, children and the elderly not cared for, solar panels not made and installed, etc.—because there is too little money in circulation. MMT’s proposed mechanism for adding money to the economy is hardly that of the “Free Silver” movement, but the two fundamentally agree that money is a political phenomenon (a “creature of the state” in the words of Abba Lerner’s 1947 paper). Populists in the 1890s campaigned against the 1873 law that demonetized silver; MMTers today, against the rhetoric of “deficits” and mandates for pay-as-you-go budgeting that have been central to American politics since the Reagan Revolution. MMT crucially claims that a monetary sovereign cannot go broke in its own money—it can always issue more. We should therefore think of public deficits not as bills to be paid, but as indicators of how much we as a nation care about particular issues. Since money exists for wars and walls, they say, it can just as readily be found for high-speed trains and clean-power energy. 

 

If the sovereign uses its money-issuing power unwisely—if more exists in the system than there is work to be done or goods to be bought—then prices for everything could rise. Should there be high inflation, the government should spend less and tax the excess money back into its coffers. MMT, in other words, does recognize that deficit-spending could become problematic, but not for the reasons usually given. A country like the United States—a public entity that is sovereign, does not age or plan to retire, and is imagined as existing indefinitely into the future—is not a household that needs to balance its budget. Using examples from personal finance to explain public spending may give a homey touch to political campaigns, but such analogies are fundamentally misleading.

 

In the way it links monetary policy, fiscal policy, and social policy—the Jobs Guarantee and something like a Green New Deal are not things to be “paid for” via MMT, but are part of it—MMT rejoins the Enlightenment tradition of political/social economy. Adam Smith, remember, was not an economist (the word was barely used in the eighteenth century) but Professor of Moral Philosophy and an opponent of many of the developments—growth of corporations, laissez-faire capitalism, the exploitation of workers—for which he is now imagined to stand. As Gareth Stedman Jones and others have shown, the selective reading of Smith as “father of capitalism” was an interpretation formed in reaction to the social radicalism of the French Revolution. So, too, did political context play a significant role when economics became a distinct, and then increasingly model-based, social science some 120 years ago. With the strikes and labor unrest of the 1880s and the Populist Movement of the 1890s, economists who spoke in favor of unions or about the plight of workers under monopoly capitalism either found themselves out of a job or re-appointed to Social or Political Science departments. There is a long institutional history, then, to MMTers’ self-positioning as underdogs and voices in the wilderness. 

 

While MMT economists (Stephanie Kelton, Pavlina Tcherneva, Randall Wray, Warren Mosler, and Bill Mitchell are five big names to know) quarrel with their fellow post-Keynesians over models and implications, historians need to reclaim money as something to be studied in specific social and political contexts. Historians know what all financial advisors profess to recognize: “past returns are no guarantee of future results.” In fact, however, the entire field of economics—with its assumptions about trend lines, models, and transhistorical facts (such as Milton Friedman’s assertion that “inflation is always and everywhere a monetary phenomenon”)—has largely failed to internalize this salient and important truth. 

 

The historian Andrew Dickson White, first president both of Cornell University and of the American Historical Association, made himself part of the 1890s debate with his Fiat Money Inflation in France: How it came, what it brought, and how it went (1896). An earlier version, entitled Paper Money Inflation in France, had appeared in 1876 when the Greenback Party (named for the paper money issued by the United States to fund and win the Civil War) was at its peak. In both pamphlets, White used the example of the French Revolution’s paper money, the assignats, to argue against increases to the money supply and for “fighting a financial crisis in an honest and manly way.” By changing the first word in his title and adding material borrowed from Macaulay’s History of England about seventeenth-century coinage debasements, White expanded his target to include all “fiat” currencies—all money created by government order. Re-issued in 1914, 1933, and 1945 by various publishers and in 1980 by the Cato Institute, White’s pamphlet remains widely available today. This is a record of impact and influence few historians can match, but I do not suggest it is a model we should follow. Couched in a vocabulary of natural laws—at one point, White describes issuing paper money as equivalent to opening dikes in the Netherlands; elsewhere, he compares it to corrosive poison and cheap alcohol—Fiat Money Inflation in France appealed to partisans of the Gold Standard because it seemed to show fiat money’s inevitable outcomes. But nothing in history is inevitable (even if some things are far more likely than others) and the eventual failure of the assignats owed as much to the specific politics of the Revolution as to any timeless laws of economics.

 

MMT, along with the euro crisis and awareness of  austerity’s social effects, has done much to open monetary and fiscal debates to wider audiences. Simply recognizing that money is political and historical (central, as Harvard Law Professor Christine Desan likes to say, to how a polity constitutes itself) is a difficult breakthrough for most people. On the other hand, seeing money in this way doesn’t—in a fractured polity characterized by demagoguery and high levels of inequality—make policy any easier to write or implement. 

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171617 https://historynewsnetwork.org/article/171617 0
Elizabeth Blackwell, MD., Hero, Humanitarian, and Teacher

 

To celebrate Women’s History Month, the television quiz show Jeopardy recently featured a category related to female historical figures. The contestants, sharp, enthusiastic, and knowledgeable, answered all the questions in that category, except for one. When host Alex Trebek asked, “Who was the first female doctor in the United States,” all three contestants failed to press their buzzers. Trebek looked at them skeptically and simply said, “Elizabeth Blackwell.” While the contestants surely would have immediately recognized the names of Sojourner Truth, Susan B. Anthony, Harriet Tubman, Harriet Beecher Stowe, Florence Nightingale, Jane Addams, Coretta Scott King, Amelia Earhart, and Marie Curie, for example, I wondered why none of them even thought to take a guess.

 

The fact that Elizabeth Blackwell was the U.S.’s first female doctor is certainly worthy of recognition. What is far more important is what she did. She was a pioneer in fostering the role of women in medicine both in the United States and Great Britain. In the United States, she founded an infirmary for poor women and children in New York City; during the American Civil War she provided invaluable assistance combatting infectious diseases and treating the sick and wounded for the Union cause under the jurisdiction of the United States Sanitary Commission; and prior to returning to England she established a medical college for the training of female physicians. In Great Britain she duplicated these efforts, leading the way with the formation of the National Health Society as well as the London School of Medicine for Women, where she served as professor of gynecology from 1875 to 1907. Why she did not receive the recognition she deserved during her lifetime and afterwards, until the second half of the twentieth century, is due in large measure to the profession she chose, which until recent times believed that women were better suited to be nurses than physicians. But perhaps more importantly, when rejected for hospital positions, she used her skills as a teacher to become not only the nation’s first female physician but also its first female professor of medicine. Very, very few know about the latter.

 

Blackwell was born on February 3, 1821 in Bristol, England, the third of nine children. Her father, Samuel, was a devout Quaker and one of the founders of the Bristol Abolition Society. Her social activism as an adult, especially her anti-slavery views, was greatly influenced by her father’s beliefs.

 

During the Bristol Riots of 1831, Samuel’s small sugar business was destroyed by fire. Disillusioned and nearly destitute, he relocated his family to the United States, where he established a new refinery in New York City. However, the Panic of 1837 hit his business hard, and in 1838 he moved his family to Cincinnati in order to re-establish his business. Three months after arriving in the Queen City, Samuel died, leaving his family destitute. Determined to survive, Elizabeth, along with her mother and two older sisters, started a small private school. Later Elizabeth also taught in Kentucky and North Carolina.

 

While she was working as a school teacher, she was drawn to the field of medicine. In 1845, she began reading medical books under the direction of Dr. John Dickson of Asheville, North Carolina and his brother, Dr. Henry Dickson of Charleston, South Carolina. What drew her into the study of medicine was her friendship with another woman who was suffering from a terminal illness; her friend expressed to her how embarrassed she felt going to male doctors, and it was her wish that someday there would be female physicians better able to relate to her personal feelings as a woman. Determined to learn more about medical treatment for women in particular, in 1846 she applied to medical schools in New York City and Philadelphia, only to be rejected because of her gender. Finally, in 1847, Geneva Medical School, a small medical school in upstate New York, gave her a chance. She did not disappoint, finishing at the top of her otherwise all-male graduating class. While attending classes she was largely ostracized and made to feel unwanted. She received her medical degree in January 1849.

 

Perhaps because of her own family struggles, she chose to work briefly with patients at a Philadelphia almshouse, an experience that provided her with a considerable amount of knowledge in the study of epidemiology. Curious to learn more about this field, she moved back to England in April of that year, where she worked under Dr. James Paget in London. There, she developed a close relationship with Florence Nightingale and Elizabeth Garrett Anderson, pioneers in professional nursing and women’s health care in Great Britain. Paget became a leader in the study of women’s breast cancer (a form of the disease is named after him); Nightingale and Anderson were attracted to Blackwell because of her work with Paget and her interest in larger medical issues such as childbirth (she briefly went to Paris and studied at La Maternité) and infectious or communicable diseases.

 

Returning to America in the summer of 1851, she was denied positions in New York City’s hospitals. In part this was because, while studying midwifery in Europe, she had contracted a disease during a procedure on an infant that left her blind in one eye. Her career as a surgeon was over, but why she was not hired to teach at one of these hospitals is troubling. Nonetheless, by this time her sister Emily also had a medical degree, and the Blackwell sisters, together with Dr. Marie Zakrzewska, established the New York Infirmary for Indigent Women and Children. This infirmary took the lead in presenting important lectures on hygiene and preventive medicine, including the training and placement of sanitary workers in the city’s poor areas. As a former schoolteacher, Blackwell was well suited for the job.

 

Attempting to cast a wider net regarding health care for women, she also published her own account on such matters, one aimed specifically at young ladies, The Laws of Life, with Special Reference to the Physical Education of Girls (1859). This book called attention to the importance of healthy living and proper exercise for girls, who were now confronted by the growing complexities of a developing industrialized society. It was important for women to be both strong and healthy as contributors to this new way of life. “In practical life, in the education of children, in the construction of cities, and the arrangements of society,” she wrote in her introduction to this book, “we neglect the body, we treat it as an inferior dependent, subject to our caprices and depraved appetites, and quite ignore the fact, that it is a complex living being, full of our humanity[.]”

With the outbreak of the American Civil War in 1861, Blackwell rallied other female reformers to establish the Women’s Central Relief Association in New York City to train nurses for the Union Army. Her motivation and commitment to the Union cause grew out of her own anti-slavery beliefs. Providing medical aid and comfort was her way of upholding her Quaker beliefs while sustaining her support for the Union. The association quickly became part of the United States Sanitary Commission (USSC), a private relief agency to assist the sick and wounded. With Blackwell in the forefront, many women were trained and began serving on hospital ships and as army nurses and sanitary relief workers. Working hand-in-hand with the USSC, Blackwell orchestrated the building and running of hospitals and soldiers’ lodging houses and devised a communication system that delivered letters and telegrams to men in the field.

In 1868, she and her sister Emily established the Women’s Medical College of the New York Infirmary, where she served as a professor of hygiene. The next year she decided to return to England, where she would reside permanently. In large measure this was due to earlier conversations with Nightingale, who had expressed to her the need to establish a medical college for women in Britain as Blackwell had done in the United States. Given that England was now a mature urban-industrialized society, whereas the United States was just beginning to experience the transition from agrarian to industrial, England offered Blackwell more opportunities to explore national health issues on a grander scale. Upon her return she helped form the National Health Society, designed to educate citizens on the importance of health and hygiene, and helped found the London School of Medicine for Women.

During her remaining years—she died at her home in Hastings in Sussex on May 31, 1910—Blackwell extended her outreach to promoting municipal reform, co-op communities, prisoner rehabilitation, and the Garden City movement, a method of urban planning begun by Sir Ebenezer Howard that envisioned planned, self-contained communities surrounded by lush greenbelts providing areas for residences, industry, and agriculture. Her humanitarian reform efforts went beyond medical treatment and education, although it is fair to state that she considered these efforts part of her professional obligation.

Although Jeopardy may have acquainted millions of viewers with Blackwell’s occupation, the show falls far short of calling attention to her many achievements as a leading female figure. Simply remembering her as the country’s first female doctor shortchanges her numerous contributions to women’s history in the United States and Britain. She remains a heroine for her pioneering research into female health issues; as a teacher, for establishing medical schools for women in the United States and Great Britain; and for risking her own health and welfare when volunteering her services to assist sick and wounded Union soldiers. As a humanitarian she also deployed her medical expertise to help indigent women and children by building infirmaries and developing local and national health agencies associated with the growing complexities confronting nineteenth-century urban-industrialized societies. Her humanitarian contributions, moreover, led to her association with urban reform efforts in the twilight of her career.

Yes, there are reminders of her place in history: a statue of her on the lawn at Hobart and William Smith Colleges (formerly Geneva Medical College), an 18-cent postage stamp issued in her honor in 1974, a 2003 historical marker established by the Ohio Historical Society, scholarly works about her life as a physician, inclusion in the National Women’s Hall of Fame, and the Elizabeth Blackwell Medal presented by the American Medical Women’s Association, an award officially established in 1958. Furthermore, although she became a naturalized U.S. citizen, she was also the first woman admitted to the British Medical Register, permitting her to practice medicine in the United Kingdom. Yet it is still very puzzling why she is not generally known among the populace at large. Just witness the Jeopardy contestants.

What needs to be done is to give her greater exposure in our secondary social studies textbooks and teaching. Include a description of her many accomplishments along with a photo caption that reads:  “Elizabeth Blackwell, Physician, Heroine, Humanitarian, and Teacher.” What may very well capture the students’ attention is the “teacher” description. That is the one aspect of her life which has received the least attention. Yet it could be her most valuable contribution to the study of women’s history.  After all, she used that skill to inspire women against difficult odds to follow in her footsteps. Now it is up to textbook publishers and schoolteachers to give Blackwell her just due.

 

Wed, 24 Apr 2019 03:52:18 +0000 https://historynewsnetwork.org/article/171612 https://historynewsnetwork.org/article/171612 0
The Perils of Criminal Justice Reform

Living facilities in California State Prison (July 19, 2006)

 

 

I started working on Beyond These Walls: Rethinking Crime and Punishment in the United States during the Obama presidency. I wanted to understand and hopefully explain why no substantial reforms of the carceral state occurred in the second decade of the 21st century, despite militant street protests against police killings and widespread consensus among liberals and libertarians that something needed to be done about the country’s unprecedented rate of imprisonment. 

 

Reform is one of the most overused, misused, and Orwellian terms in the English language. “I am well convinced that it is kind, humane, and meant for reformation,” wrote Charles Dickens in 1842 after he witnessed a Pennsylvania prison’s system of silent solitary confinement. But its outcome, he observed, was to subject prisoners to “torturing anxieties and horrible despair” that left them “dead to everything.”

 

This combination of benevolent rhetoric and punitive measures is a persistent theme in American criminal justice history. During World War I, for example, the federal Commission on Training Camp Activities claimed to be acting in the interest of “delinquent women and girls” by rounding up and detaining without trial some 30,000 of them suspected of spreading venereal diseases and perversion, while the men received health care and wholesome entertainment.

 

When government officials and their allies call themselves reformers, it’s time to look out, and to look deeply and carefully at what is being proposed. Most government-sponsored reforms of criminal justice operations manage and rearrange existing institutions of power. 

 

Not all reforms are manipulative and repressive. There is also a tradition of progressive grassroots reforms that try to make a difference in and empower people’s everyday lives. But these efforts to accomplish structural reforms typically are undermined in practice. Why have there been more failures than successes, and what is needed to reverse this sorry record? 

 

 

Historically, the overwhelming majority of reforms are top-down, state-engineering initiatives that are never intended or designed to expand the rights or improve the well being of their recipients. One of the earliest examples was the Progressive Era’s child-saving movement that formally did away with due process for juvenile delinquents. It recruited social workers, public health personnel, police, and urban reformers to send thousands of European immigrant youth to punitive reformatories, and Native American youth to boarding schools where they were punished for “speaking Indian.” In the 1940s, the Preston School of Industry in California was “organized like the military,” a former prisoner recalled. “We marched everywhere, and were always on ‘Silence’.” 

 

The child-saving movement was a model for many other government reforms that, in the words of historian Lisa McGirr, came loaded with “strong doses of coercive moral absolutes,” such as forcing the children of Jehovah’s Witnesses to salute the flag during World War II in the name of spreading patriotism, and then criminalizing their parents when they refused. In the 1920s, the federal Prohibition Bureau, with five times more staff than the FBI, saved the drinking poor from the scourge of alcohol by arresting them, while the wealthy drank in private clubs or bribed their way out of arrest. Between the world wars, government agencies compelled the sterilization of some 60,000 working-class women in the name of purifying motherhood. Similarly, in the 1950s, “protecting the family” supposedly justified purging gay men from government jobs and subjecting them to the kind of systematic harassment by police that young African American men routinely experience.

 

We see the same kind of coercive benevolence at work today when local governments and professional functionaries invoke civility codes to tear down homeless encampments and in cities such as Irvine, California, run beggars out of town in order to “keep our streets safe.” 

 

The second type of reform has a democratic impetus and is intended to expand the rights of the disenfranchised and improve people’s everyday lives. Pursuing this kind of grassroots initiative requires the stamina of a marathon runner, for there is a long history of trying to substantially reform criminal injustice operations that typically does not end well. 

 

Take, for example, the 1963 U.S. Supreme Court decision in Gideon v. Wainwright, which required states to provide attorneys to defendants in criminal cases if they cannot afford counsel; and the bail reform movement that achieved passage of the Federal Bail Reform Act of 1966, which granted release on their own recognizance (OR) to federal defendants in noncapital cases.

 

The Gideon case represented a victory for activists who had struggled for decades to bring some balance to an adversary system of criminal justice that is heavily weighted in favor of the prosecution. “Thousands of innocent victims,” wrote W. E. B. Du Bois in 1951, “are in jail today because they had neither money, experience nor friends to help them.” The provision of government-funded defense lawyers was supposed to rectify this wrong.  

 

However, the underfunding and understaffing of public defenders, and pressures from criminal court bureaucracies to process cases expediently resulted not in more trials and more pleas of innocence, but in a decline of trials and increase in guilty pleas. How can clients get a “reasonably effective defense” in Louisiana, for example, if a single public defender is expected to carry a caseload of 194 felony cases? “No majesty here, no wood paneling, no carpeting or cushioned seats,” writes James Forman, Jr. about his experience as a public defender in Washington, D. C. It wasn’t unusual for him to want to cry in frustration at the railroading of his clients. “Sometimes the only thing that stopped the tears,” he says, “was another case or client who needed me right then.” 

 

The Federal Bail Reform Act met a similar fate. Many of the legislation’s provisions were destroyed by the Nixon and Reagan administrations as new legislation eliminated OR for dangerous defendants, a proviso that ultimately included people arrested for drug-related and other non-violent behavior, meaning just about everybody. Today, more than sixty percent of people confined in the misery of local jails are there because they are unable to make bail; they do dead time, a travesty of “presumed innocent.”

 

Too often when progressive reforms are passed, they stand alone as single issues and are generally ineffective because they lack sustained and wide support, or they are whittled away to the point of ineffectiveness. A similar process is at work with the recent First Step Act, Congress’ tame effort at federal prison reform. This legislation originated in the efforts of reformers during the Obama presidency to dramatically reduce mass incarceration nationwide. By the time of the Trump presidency, the libertarian Right dominated the politics of reform and put their stamp on the Act: no relief for people doing time for immigration or abortion or violence-related crimes; the privileging of religious over secular programs; and a boost for the electronic shackling industry. 

 

Too often, substantial reform proposals end up politically compromised and require us to make a Sophie’s choice: release some “non-violent offenders” and abandon the rest, including tens of thousands of men who used a gun during a robbery when they were in their 20s. Or give public welfare relief only to carefully screened “worthy recipients,” while subjecting millions of women and children to malign neglect. Or, potentially, provide the immigrant Dreamers with a path to citizenship while making their parents and relatives fair game for ICE.

 

 

It’s not for lack of trying that substantial reforms are so difficult to achieve. There are structural, multifaceted reasons that undermine our effectiveness.  “America is famously ahistorical,” a sardonic Barack Obama observed in 2015. “That’s one of our strengths – we forget things.” In the case of efforts to reform prisons and police, we remember the experiences of Malcolm X, George Jackson, Attica, and the Black Panther Party, but then amnesia sets in. 

 

We need to reconnect with the writers, poets, artists, activists, and visionaries who generations earlier took on the carceral state and forged deep connections between the free and un-free. Let’s remember Austin Reed, a young African-American incarcerated in ante-bellum New York, who told us what it was like to “pass through the iron gates of sorrow.” And the Socialist and labor leader Gene Debs, imprisoned many times for his activism, who made sure his comrades in the 1920s knew that his fellow nonpolitical prisoners were “not the irretrievably vicious and depraved element they are commonly believed to be, but upon average they are like ourselves.” And the young Native American women and men, forcibly removed to boarding schools, who reminded us of their resistance, as in the words of a Navaho boy: “Maybe you think I believe you/ But always my thoughts stay with me/ My own way.” 

 

Revisiting this long historical tradition is important, not out of nostalgia for what might have been or to search for a lost blueprint of radical change, but rather to learn from past reform efforts and help us to understand the immense challenges we face – to “bring this thing out into the light,” as the civil rights leader Fannie Lou Hamer used to say. 

 

In addition to a deep history, we also need a wide vision in order to see that state prisons and urban policing are components of a much larger and more complex private and public social control apparatus that plays a critical role in preserving and reproducing inequality, and in enforcing injustices. No wonder that structural reforms are so difficult to achieve and sustain when carceral institutions are sustained by private police, public housing and education, the political system, immigration enforcement, and a vast corporate security industry that stokes what Étienne Balibar calls the “insecurity syndrome.” 

 

Struggles for equality in the United States have usually been uneven and precarious, with improvements in rights and quality of life for one group often coming at the expense of others – not consciously, but in effect. Our challenge is to rebuild a social and political movement that bridges the divide between a panoply of activists in the same way that post-World War II civil rights and black liberation organizations incorporated prisoners and victims of brutal policing into the Movement. Important single-issue campaigns – to eliminate cash bail, to restore voting rights to millions of former prisoners, and to make American prisons comply with global human rights standards – will have a better chance of success if backed by a multi-issue, grassroots campaign. 

 

We should not give up on big ideas and structural reforms. We never know when a spark will light a fire and energize a movement. Let's remember that it was protests against a police killing in a place like Ferguson that led to the Black Lives Matter movement and compelled a meeting with the president; and it was high school students' protests for gun reform in Florida that prompted a former Supreme Court Justice to call for the abolition of the 2nd Amendment. 

 

The Right has been extraordinarily effective in promoting a dystopia that anchors and propels its law and order policies. We need a comparably compelling progressive vision. In this moment of resistance and defense, articulating an ideal of social justice might seem like pie-in-the-sky and a waste of energy. But progressive policies will only win widespread endorsement if we speak to people’s deeply held anxieties and aspirations. Without a movement and long-term vision that engages people, good policies wither. 

 

It will take nothing short of a broad-based movement, a revitalized imagination, and reckoning with a historical legacy that bleeds into the present to make the criminalized human again and end the tragedy of the carceral state. 

E-Carceration: Are Digital Prisons The Future?

 

We’ve all heard the unsettling stats regarding the U.S. mass incarceration crisis:

 

The United States holds 5 percent of the world’s total population, yet 25 percent of the world’s prison population.

 

On any given day, there are more than 2.3 million people locked up in jails and prisons across America—more than an estimated 540,000 of them incarcerated without ever being convicted or sentenced—and more than 4.5 million folks on parole or probation. 

 

Although millions are imprisoned behind the drab concrete walls and cold steel bars of your stereotypical detention facility, an ever-growing number—estimated as high as 200,000—are instead being clamped with the trendiest weapon of the U.S. prison industrial complex: electronic monitoring.

 

Commonly referred to as “ankle bracelets,” these GPS-equipped devices are praised as cheaper and more humane alternatives—a legitimate remedy, even—to locking up so many within the overpopulated local jails and monolithic fortress-like prisons that have come to visually define mass incarceration within the United States.

 

Though these digital “shackles,” as one detractor calls them, have been utilized by the U.S. criminal justice system for more than 30 years, their use has increased a whopping 140 percent over the past decade, with wary criminal justice reform advocates sounding the alarm about the technology’s deceptive promise and its ramifications. 

 

We investigated this issue in “E-Carceration: Are Digital Prisons The Future?,” the latest episode of our social justice podcast, News Beat. Criminal justice reform advocates view the ballooning prevalence of electronic monitoring as just another way for profit-hungry corporations and billionaires to camouflage the true agenda: maintaining mass incarceration’s sinister legacy of punishing the poor and devastating black and brown communities, while those pulling the strings get even richer. E-carceration’s rise, critics point out, comes at a time when officials nationwide are making conscientious efforts to decarcerate and seeking legitimate reforms—the reassessment of money bail, legalization of marijuana, and proliferation of diversion programs among them. 

 

“I felt like I was still under carceral control, which I was,” author, educator, social justice activist, and former fugitive and prisoner, James Kilgore, tells us. 

 

A research scholar at the University of Illinois at Urbana-Champaign, Kilgore spent six and a half years in prison for crimes committed during the 1970s, and more than two decades as a fugitive. He wore a monitor as a condition of his parole.

 

“And the image that always comes to me, is the fact that when I went to sleep at night, I felt as if my parole officer was laying across the bed looking up at me from under the covers,” he says, bristling at the suggestion that being harnessed with such tech is a more compassionate form of punishment.

 

“Most people just said, ‘Well, it’s better than jail,’” continues Kilgore. “And my response was always, ‘Well that’s true, but a minimum-security prison is better than a supermax, but it’s still a prison.’ And by the same token, at an individual level, I would never tell someone, ‘Well, you’re better off staying in prison or in jail than going out on an electronic monitor.’

 

“Just like I wouldn’t tell somebody, ‘Well, stay in that supermax where you’re in solitary 24 hours a day, and don’t go to this camp where you can be out free, you know, [for] 16 hours a day, moving around the yard and so forth,’” he adds.

 

Myaisha Hayes, national organizer on criminal justice and technology at the nonprofit Center for Media Justice, and also a News Beat guest, explains that now—as mass incarceration is an increasingly prominent issue—is the time to question these new extensions of prison and assess such strategies before they’re universally accepted. 

 

“As we’re in this moment of bail reform, parole justice, all of these different issues within the criminal justice space, we have a real opportunity to take a pause and think about, okay, if we’re going to end monetary bail, how do we actually address harm and maintain public safety in our communities?” she asks. “Or do we want to just find a technological solution to this issue? 

 

“I think that’s the issue that we’re having to deal with, and I do think…we have an opportunity now to course-correct,” adds Hayes.

 

Among her and other prison reform advocates’ list of concerns is the fact that the four largest providers of electronic monitors are private prison companies, including the most profitable, GEO Group, which has secured lucrative government contracts to operate federal prisons and monitor people in immigration proceedings.

 

The Center for Media Justice reports that BI Incorporated, a subsidiary of GEO Group, has government contracts with at least 11 state departments of correction, and earned nearly $84 million in revenue in 2017.

 

Boulder, Colo.-based BI has also reportedly earned more than half a billion dollars in U.S. Immigration and Customs Enforcement (ICE) contracts since 2004, according to local newspaper the Daily Camera.

 

Privacy and the possibility of even ensnaring other individuals within close proximity to those being surveilled should also be concerns, contends Stephanie Lacambra, criminal defense staff attorney at nonprofit Electronic Frontier Foundation, another guest on our News Beat “E-Carceration” episode.

 

“Locational privacy has been a concern in the deployment of a number of different law enforcement surveillance technologies, from automated license plate readers to the use of facial recognition,” she explains.

 

“I think we should be concerned about location tracking,” continues Lacambra. “Not just in the context of electronic monitoring, but in all of these other contexts as well, because I think they give law enforcement the potential to really aggregate very detailed profiles about all of us, regardless of whether you’re on probation or parole, or awaiting trial.”

 

Kilgore, the former inmate-turned-prison reform advocate, is also the project director for the initiative Challenging E-Carceration, and collaborated with Hayes and others at the Center for Media Justice on an analysis released last year titled “No More Shackles.”

 

It categorically rejects electronic monitoring (EM) as a viable alternative to incarceration; rather, it deems such tech an “extension” of the very system it’s purportedly helping to rectify.

 

“We view EM as an alternative form of incarceration, an example of what we call “e-carceration”—the deprivation of liberty by technological means,” it reads. “Hence, as part of the greater movement for transforming the criminal legal system, we call for the elimination of the use of monitoring for individuals on parole. When people have done their time, they should be cut loose, not made to jump through more hoops and be shackled with more devices, punitive rules and threats of reincarceration.”

 

News Beat podcast melds investigative journalism with independent music (primarily hip-hop) to shine a light on the most pressing social justice, civil liberties and human rights issues of our day. “E-Carceration: Are Digital Prisons The Future?” and all News Beat episodes can be listened to, downloaded, and subscribed to via Apple Podcasts, Stitcher, Spotify, and wherever else you get your favorite podcasts.   

 

 

Immigration Restriction by Remote Control

 

On March 12, the Trump administration announced it would close all international offices of US Citizenship and Immigration Services, an action that will choke off the largest channel for legal migration. While much of the coverage of immigration has focused on the Border Wall, we have forgotten that most immigration restriction happens beyond the borders of the United States through what political scientist Aristide Zolberg calls “remote control.”  

 

US Citizenship and Immigration Services and neighboring countries like Mexico now prevent far more prospective migrants from entering the US than border control does. It seems like common sense that most immigrants would be stopped from entering the US at the border, but this has not been true for almost a century.

 

When immigration restrictions were first established in the late nineteenth century to keep out Chinese laborers, convicts, people with diseases, and prostitutes, the system of passports and visas that we now take for granted did not exist. Immigration inspectors determined immigrants’ eligibility at ports of entry, screening passengers for admissibility and conducting medical and psychological exams. Long lines at Ellis Island illustrated the screening process at ports of entry.

 

However, even in the early twentieth century, consular officials conducted medical inspections abroad. At some US consulates, such as the one in Hong Kong, the rejection rate for Chinese migrants was more than fifty percent. Historian Amy Fairchild shows that by the 1920s, consular officials conducted rigorous medical exams and rejected about 5 percent of all applicants, which was 4 times the rate of rejections at US ports. 

 

In 1921, the Quota Act established national origins quotas for each country, but initially the slots were filled on a first-come, first-served basis. Ship captains raced to reach port before quotas were filled, creating chaos. In response, the 1924 Johnson-Reed Act mandated quotas would be filled not by counting immigrants but by counting immigration certificates issued at consular offices abroad. From this point forward, the biggest barrier to entering the US was obtaining a visa, not getting past a border patrol agent, a fence, or a wall.

 

The professionalization of the Foreign Service, the establishment of a universal requirement for passports, and the institutionalization of visas meant that would-be immigrants had to pass through the gauntlet of US restrictions in their home countries, long before they arrived on US soil. This system allowed prospective immigrants to know whether they were eligible to enter before they got aboard a ship. 

 

After 1924, the job of the immigrant inspector was mainly to inspect documents to make sure papers were in order and not fraudulent. The long lines for medical inspection at Ellis Island ceased to exist. This more efficient system meant that most exclusion of immigrants was not happening at ports of entry but in far-flung consulates around the globe. 

 

The importance of this extra-territorial inspection was not just that it was more efficient, but that it denied would-be immigrants any protections from the US Constitution. While prospective immigrants have few constitutional protections before being admitted, over time the US courts recognized that even undocumented immigrants within the US have rights to due process. By keeping migrants far from US soil, or claiming they have not technically entered even though they are on US soil, the government denied them the possibility of using the US court system to apply for asylum or challenge decisions by consular officials.  

 

Keeping potential asylum seekers off US soil is why Trump demands that Central American migrants remain in Mexico while they await their asylum hearings. It’s also why the US has been paying Mexico hundreds of millions of dollars to detain and deport Central Americans since the mid-2000s. Since 2015, Mexico has deported more Central Americans than the United States, reaching almost 100,000 in 2018.

 

The dramatic rise in visa denials in recent years prevents hundreds of thousands of immigrants and non-immigrant visitors from entering the US.  Consular officials have drastically cut the number of non-immigrant visas issued from almost 11 million in 2015 to just over 9 million in 2018. At the same time, new international students in the US dropped by almost 7 percent in the 2017-18 academic year, and the latest data show signs of a continuing decline. Shuttering overseas immigration services will make it even harder for immigrants to apply for legal entry to the US.

 

Comparing data from 2016 and 2018, analysis by the Cato Institute shows that denials of visas to potential immigrants have increased since Trump took office by more than 37 percent.  In 2018, 150,000 more immigrants were refused visas than in 2016. 

 

While the president wants to focus our attention on the dramatic rise in border apprehensions, reaching 467,000 people last year, more than 620,000 were denied immigrant visas. 

 

Today, the backlog in visa applications stands at 4.1 million worldwide, and for Mexicans it is 1.3 million.  The wait time for most Mexicans is thus well over 20 years. When people talk about immigrants getting in line and waiting their turn, they need to recognize that the line has become absurdly long.

 

Consular offices around the world are ground zero for immigration restriction.  No matter what your perspective, it’s time we focused on where most immigration restriction really happens.  

 

The Psychotherapy of Marcus Aurelius

 

Did one of Rome’s wisest and most revered emperors benefit from an ancient precursor of cognitive psychotherapy?

 

The Roman emperor Marcus Aurelius mentions undertaking Stoic “therapy” (therapeia) at the start of The Meditations, his famous journal of personal reflections on philosophy.  He writes, “From Rusticus, I gained the idea that I was in need of correction and therapy for my character.” Junius Rusticus was one of Marcus’ closest and most beloved advisors, a mentor to him in Stoic philosophy, perhaps even serving as a sort of therapist or life coach to the emperor.

 

Marcus mentions that he struggled at first to manage his own feelings of anger with certain individuals, including Rusticus.  There are numerous references to psychological strategies for anger management scattered throughout The Meditations.  It’s a topic to which he keeps returning, at one point listing ten different techniques for overcoming anger.  He describes these Stoic therapy strategies as gifts from Apollo, the god of healing, and his Muses.  For instance, he advises himself to pause when becoming angry with another person and first investigate whether or not he’s guilty, or at least capable, of similar wrongdoing himself.  Of course, bearing in mind our own imperfections can prevent us flying into a rage with others and help us move closer toward empathy, understanding, or even forgiveness in some cases. 

 

Marcus also frequently recounts the use of psychological techniques for coping with pain and illness. Research shows that’s another problem with which cognitive-behavioural therapy (CBT) can help. Marcus had a reputation for physical frailty and poor health in adulthood. He particularly suffered from chest and stomach pains, poor appetite, and problems sleeping. His Stoic mentors taught him to cope, though, by using mental strategies such as contemplating the temporary nature of painful sensations or their limited intensity and location in the body. Rather than allowing himself to think “I can’t bear it,” he’d focus on his ability to endure pain that was more intense or lasted longer. He learned to accept painful feelings and other unpleasant symptoms of illness, to adopt a philosophical attitude toward them, and find more constructive ways of coping. 

 

The concept of philosophy as a medicine for the soul, a talking cure, or a psychological therapy, goes back at least as far as Socrates.  However, the Stoics, who were greatly influenced by the practical nature of Socratic ethics, developed this therapeutic aspect of his philosophy even further.  For example, the Roman Stoic teacher Epictetus, whom Marcus greatly admired, taught that, “It is more necessary for the soul to be cured than the body, for it is better to die than to live badly.” He therefore states bluntly, “the philosopher’s school is a doctor’s clinic.” 

 

The Stoics wrote books specifically dedicated to the subject of psychological therapy, such as the Therapeutics of Chrysippus. Although these are now sadly lost, we can perhaps infer something about them from a surviving text by Marcus Aurelius’ famous court physician, Galen, titled On the Diagnosis and Cure of the Soul’s Passions, which outlines an eclectic approach to philosophical psychotherapy but cites earlier Stoic texts as its inspiration.  What we learn is that an aspiring philosopher should seek out an older and wiser mentor, someone he trusts to examine his character and actions, exposing flaws in his thinking through observation and questioning. 

 

It’s no coincidence, therefore, that the pioneers of CBT originally drew on the Stoics for their philosophical inspiration.  Modern cognitive approaches to psychotherapy are based on the premise that our emotions are largely (if not exclusively) determined by our underlying beliefs.  This “cognitive model of emotion” was ultimately derived from the ancient Stoics. Albert Ellis, who created Rational-Emotive Behaviour Therapy (REBT), the first form of CBT, in the 1950s, wrote:

 

This principle, which I have inducted from many psychotherapeutic sessions with scores of patients during the last several years, was originally discovered and stated by the ancient Stoic philosophers, especially Zeno of Citium (the founder of the school), Chrysippus, Panaetius of Rhodes (who introduced Stoicism into Rome), Cicero [sic., actually an Academic philosopher albeit greatly influenced by Stoicism], Seneca, Epictetus, and Marcus Aurelius.  The truths of Stoicism were perhaps best set forth by Epictetus, who in the first century A.D. wrote in the Enchiridion: “Men are disturbed not by things, but by the views which they take of them.” (Ellis, 1962, p. 54)

 

Aaron T. Beck, the founder of cognitive therapy, another form of CBT, repeated this claim with regard to his own approach.  In his first book on the subject he also quoted Marcus Aurelius’ version of the saying above in a more antiquated translation: “If thou are pained by any external thing, it is not the thing that disturbs thee, but thine own judgement about it.” This simple concept has been so influential, both in ancient philosophy and modern psychotherapy, because people find it of practical value.  From the 1950s onward, psychological research increasingly lent support to the techniques of cognitive therapy, and in the process of so doing we might say it indirectly validated the practices of ancient Stoicism.

 

There’s an important difference, though.  CBT is a therapy; Stoicism is a philosophy of life, albeit one containing many therapeutic concepts and techniques. CBT is normally remedial, outcome-oriented, and time-limited.  It treats problems that already exist.  The holy grail of mental health, however, is prevention because, as we all know, prevention is better than cure. Stoicism provided not only a psychological therapy, a remedy for existing problems like anger and depression, but also a set of prophylactic or preventative psychological skills, designed to build what psychologists today refer to as long-term emotional resilience.  

 

The historian Cassius Dio, for instance, praises Marcus Aurelius for the remarkable physical and psychological endurance that he showed in the face of great adversity as the result of his lifelong training in Stoicism.

 

[Marcus Aurelius] did not meet with the good fortune that he deserved, for he was not strong in body and was involved in a multitude of troubles throughout practically his entire reign. But for my part, I admire him all the more for this very reason, that amid unusual and extraordinary difficulties he both survived himself and preserved the empire.  (Cassius Dio)

 

Stoic philosophy therefore holds promise today as a means of expanding the findings of CBT beyond the consulting room and the limited duration of a course of psychotherapy.  It can provide a model for applying evidence-based psychological strategies to our daily lives on a permanent and ongoing basis in order to build emotional resilience.  For its modern-day followers, Stoic philosophy has once again become a way of life.

The Holocaust and the Christian World

 

Holocaust scholars were stunned last year by the results of the April 2018 survey of Americans and the Holocaust, according to which 31% of all Americans believe that two million or fewer Jews were killed during the Holocaust, while 41% of Americans cannot say what Auschwitz was. Additionally, 22% of millennials (ages 18-34) “haven’t heard” or “are not sure if they have heard of the Holocaust.” Other survey questions concerning the names of countries where the Holocaust took place, the names of ghettos and concentration camps, and the persistence of antisemitism also yielded low awareness rates. 

 

Simultaneously, a recent FBI report shows that hate crimes in the U.S. spiked 17% in 2017 alone—the third straight rise in as many years. The worst anti-Semitic attack in U.S. history, the murder of eleven people at the Tree of Life Synagogue in Pittsburgh, occurred in October 2018.

 

The arrival of the second edition of The Holocaust and the Christian World: Reflections on the Past, Challenge for the Future couldn’t be more timely. Its contributors are among the leading Holocaust scholars of their generation. The editors, emblematic of the ecumenical nature of this ethical undertaking, include Carol Rittner, a Catholic nun and distinguished professor emerita of Holocaust and Genocide Studies at Richard Stockton University; Stephen D. Smith, the Protestant co-founder of Beth Shalom, Britain’s first Holocaust Memorial, and current executive director of the USC Shoah Foundation; and Irena Steinfeldt, a Jewish educator and former director of The Righteous Among the Nations Department at Yad Vashem in Jerusalem.

 

The Holocaust and the Christian World is divided into nine sections: Confronting the Holocaust; Chronology, 1932-1998; Anti-Semitism; The Churches and Nazi Persecution; The Reaction of the Churches in Nazi-Occupied Europe; The Vatican, the Pope, and the Persecution of the Jews; The Challenge of the Exception (the Rescuers of Jews); After the Holocaust: How Have Christians Responded? (Activities and Issues); and, finally, the Afterword. Each section is made up of several short, stimulating articles that go directly to the issue at hand and offer suggestions for further reading as well as questions for reflection.  For example: 

 

--“What would have happened if the Churches—Protestant and Catholic alike—had defied Hitler during the Third Reich and stood in solidarity with the Jews?”

--“When does silence become an active form of collaboration?”

--“What should it mean—and not mean—to be a post-Holocaust Christian?”

--“How can we help people to develop faith without prejudice?”

--“What obligation do we have to stand up for people whose beliefs we do not share?” 

--“Who is part of your universe of obligation today?”

 

The Afterword  contains documents related to church matters during the Holocaust from Norway, Greece, France, and Denmark; post-Holocaust statements from the churches in Switzerland, Rome, United States, Hungary, Germany, Poland, France, and Austria; the text of the March 1998 Vatican Statement, “We Remember: A Reflection on the Shoah;” an updated videography and detailed list of on-line sources; and, finally, a select bibliography not contained in the first edition with appropriate entries through 2017.

 

Two impulses drive the text from beginning to end: the frank admission of the role played by Christianity in the Holocaust and the current project of completely ridding Christianity of all anti-Judaism. Carol Rittner and John Roth elucidate the history and Christian roots of anti-Semitism (“the longest hatred of human history”) found in the New Testament and such early Church Fathers as Saint Augustine and Saint John Chrysostom, as well as in later Christian preachers and theologians, such as Bernard of Clairvaux and Martin Luther. The authors describe the institutional anti-Judaism of Christian churches, the negative depiction of the Jewish people in Christian preaching and liturgy, and the process by which the Jew became “the other”—“marginalized, persecuted, blamed for every woe, from unemployment and slums, to military defeats and unsolved murders.” In addition, they present a chilling chart that lists Nazi measures on the one hand and prior Canonical Laws on the other, for example: Nazi Measure: Law for the Protection of German Blood and Honor, September 15, 1935 (Canonical Law: Prohibition of intermarriage and of sexual intercourse between Christians and Jews, Synod of Elvira, A.D. 306); Nazi Measure: Book Burnings in Nazi Germany (Canonical Law: Burning of the Talmud and other books, Twelfth Synod of Toledo, 681); Nazi Measure: Decree of September 1, 1941—the Yellow Star (Canonical Law: The marking of Jewish clothes with a badge, Fourth Lateran Council, 1215).

 

After centuries of being “cast outside the universe of moral obligation,” it is not surprising that most churches and most Christians were indifferent to the fate of Jews during the Nazi plague. In the words of Pope John Paul II, “their history had ‘lulled’ their consciences.” Although this volume clearly states that Christianity cannot be seen as the cause of the Holocaust, it does convince the reader that Christianity prepared the way and then allowed it to happen. As a result, the authors accept the Shoah as part of Christian history. The enormity of Christian responsibility means that the Holocaust can no longer be conceived of as solely what happened to the Jewish people but rather what also happened to Christians who claimed to be disciples of a Jew named Jesus. 

 

Having clearly established the anti-Jewish bias of traditional Christianity, the text then moves to the contemporary task of ridding Christianity of its anti-Judaism. It explains what has been done since 1945 and what still needs to be done. The book’s authors offer several strategies to strengthen the dialogue between Christians and Jews. For example, Stephen Smith wants Christians to take an active part in the remembrance of the Shoah, given that the perpetrators, collaborators, and bystanders were not Jewish. To foster Tikkun (Healing), Marcia Sachs Littell stresses the importance of developing a Christian liturgy on the Holocaust and the faithful observance of Yom HaShoah (the Day of Holocaust Remembrance) by the joint Christian-Jewish community.  Michael Phayer, Carol Rittner, and Isabel Wollaston call for the development of Holocaust education and courses in Hebrew Scripture and post-World War II Jewish-Christian relations in Christian seminaries, colleges and universities. They also suggest a moratorium on terms such as “Old Testament” which implicitly connote Jewish displacement. 

 

Although the contributors to this book do not hesitate to analyze factors that led to specific historical events, even identifying particular individuals in positions of responsibility, The Holocaust and the Christian World is never excessively accusatory. To learn retrospectively what should have been done by churches and individual Christians during the Holocaust is not tantamount to knowing what we would have done had we been in their place. This wisdom permeates the text whose authors recognize that their responsibility and ours lies in the present, in the creation of a world where another Auschwitz would be unthinkable. 

How Hollywood has been fooled by Robert F. Kennedy assassination conspiracy theorists

 

Renewed interest in the Robert Kennedy assassination flourished on its 50th anniversary in 2018 and the following year, with the publication of two books about the case, A Lie Too Big to Fail by Lisa Pease and The Assassination of Robert F. Kennedy by Tim Tate and Brad Johnson. Both books allege that RFK’s assassin was a hypnotised assassin manipulated by the CIA who had no real motive and was thus innocent of the crime. (1)

 

The falsehoods promoted by Pease, Tate and Johnson – a ‘girl in a polka dot dress’ who controlled Sirhan, misinterpretations of the ballistics evidence, the creation of suspicion around LAPD crime scene mistakes, multiple teams of CIA-controlled assassins skulking around the Los Angeles Ambassador Hotel, allegations that Sirhan was never close enough to RFK to fire the fatal shot, allegations of Sirhan firing ‘blanks’, accusations against an innocent security guard, Thane Eugene Cesar – have all been addressed and debunked over the years (most recently here: https://historynewsnetwork.org/article/169208 and here: https://www.moldea.com/RFKcase.html).

 

The authors of the books were given moral support of late by a group of Hollywood celebrities including Oliver Stone, Alec Baldwin, Martin Sheen, Rob Reiner, David Crosby, Mort Sahl, and two Kennedy family members – Robert F. Kennedy Jr. and Kathleen Kennedy Townsend. Following publicity about the books, the group called for a new investigation of the assassination. (The group also alleged that other political assassinations of the 1960s, those of JFK, MLK and Malcolm X, involved government malfeasance and cover-up, and wanted the government to re-investigate those crimes as well.) (2)

 

It is no mystery why some Hollywood celebrities support the notion of an innocent Palestinian refugee railroaded into a notorious murder case. Many in Hollywood have endorsed and embraced the Palestinian cause, mimicking the American left’s decades-old support. Unable to gauge Sirhan’s true character by reading the recent conspiracy books, they would naturally assume Sirhan mysteriously acted without a motive.

 

The recent conspiracy authors adopt the modus operandi of previous RFK conspiracy authors in the way they attempt to portray Sirhan as a young man who had no real political agenda or any fanaticism. Tate and Johnson inform their readers that, “Sirhan’s closest friend, Ivan Garcia, had explicitly told them that, ‘Sirhan did not appear to be particularly aware of any political party, was not interested in groups or being a leader and was not openly fanatical about politics.’” (3) Lisa Pease cites acquaintances of Sirhan who described him as polite and non-violent; “…. nearly everyone”, she writes, “described Sirhan as polite, respectful and friendly…Sirhan did not appear to be particularly aware of any political party, was not interested in groups or being a leader and was not openly fanatical about politics”. (4)

 

Although many falsehoods about the case have been debunked over the years, this crucial examination of Sirhan’s motives has been largely ignored or overlooked by the mainstream media – motives which convincingly and conclusively show that not only was Sirhan a political fanatic but that he also embraced the concept of violent solutions to political problems. 

 

Although some acquaintances of Sirhan said he was ‘pleasant and well-mannered’ and ‘non-political’, these are not the lasting impressions of those who knew him best. His brother Munir said Sirhan was ‘stubborn’ and had ‘tantrums’. (5) William A. Spaniard, a twenty-four-year-old Pasadena friend of Sirhan’s, said the young Palestinian was “a taciturn individual.” (6) Fellow students characterized Sirhan as not only ‘taciturn’ but also ‘surly’, ‘hard to get to know’, ‘withdrawn and alone’. (7) One of his professors saw Sirhan and another student have an argument that “almost became a fist fight”. He said Sirhan had “an almost uncontrollable temper”. (8) 

 

Sirhan also revealed the violent side of his character when he was employed as an exercise boy/trainee jockey. According to two exercise girls who worked at the Grande Vista Ranch, Sirhan treated the horses ‘cruelly’. Del Mar Race Track foreman Larry Peters saw Sirhan kick a horse in the belly, and after he remonstrated with him he was taken aback at the vitriol which emanated from the young employee. Peters said Sirhan's temper had been unusually violent when he was told he would never become a jockey. (9)

 

Additionally, a horse trainer at the Grande Vista Ranch saw Sirhan mistreat a horse, “…kicking and hitting it with his fists”. Sirhan, he said, “…was in a rage of temper”. By way of explanation, Sirhan told him the horse “provoked him”. (10) In fact, Sirhan used this excuse at his trial when he testified that Robert Kennedy, by his support of Israel, had ‘provoked’ him, which led to his decision to assassinate the senator. (11)

 

As a young adult, Sirhan sought meaning to his increasingly hopeless life by embracing anti-Semitism, anti-Americanism and Palestinian nationalism. Sirhan’s parents taught him the Jews were ‘evil’ and ‘stole their home’. They also taught him to hate, despise and fear Jews. As a part-time gardener Sirhan came to hate the Jews whose gardens he tended. (12)

 

Among those who knew Sirhan well and described him were his friends Walter Crowe, Lou Shelby and John and Patricia Strathmann, as well as his former boss John Weidner. They all agreed that Sirhan hated Jews and was intense and emotional whenever he discussed the Arab-Israeli conflict. They all agreed he was vehemently critical of American foreign policy regarding Israel. 

 

Walter Crowe had known Sirhan from the time they were young adults and also during a short period of time when Sirhan was a Pasadena College student. Crowe said Sirhan was virulently anti-Semitic and professed hatred for the Jews and the state of Israel. He believed Sirhan’s mother Mary propagated these views to Sirhan. (13)

 

Lou Shelby, the Lebanese-American owner of the Fez Supper Club in Hollywood, knew the Sirhan family intimately. He described Sirhan as, “intensely nationalistic with regard to his Arab identity”. According to Shelby, “We had a really big argument on Middle East politics...we switched back and forth between Arabic and English. Sirhan’s outlook was completely Arab nationalist - the Arabs were in the right and had made no mistakes”. (14)

 

John and Patricia Strathmann had been ‘good friends’ with Sirhan since High School. According to John, Sirhan was an admirer of Hitler, especially his treatment of the Jews, and was impressed with Hitler’s Mein Kampf. John also said Sirhan became ‘intense’ and ‘mad’ about the Arab/Israeli Six Day War. Patricia said Sirhan became, “burning mad . . . furious” about the war. (15) 

 

Sirhan discussed politics, religion, and philosophy with his boss, John Weidner, a committed Christian. Weidner was honoured by Israel for his heroism in saving more than 1,000 people from the Nazis. Sirhan worked for Weidner from September 1967 to March 1968. According to Weidner, Sirhan ‘hated Jews’. (16)

 

Sirhan was not only anti-Semitic in his political views but believed in violent action as a political tool. He admired the Black Panthers and even wanted to join their organisation. According to his brother Munir, Sirhan also became enamoured with the Black Muslims, who “were like him culturally”. Sirhan attended the Black Muslim Temple in Central Los Angeles until he was told he could not join the organization because he was not black. (17)

 

The notion that Sirhan never held any animus towards Robert Kennedy is also entirely without foundation as friends and Sirhan himself have revealed. Sirhan said he believed Robert Kennedy listened to the Jews, and he saw the senator as having sold out to them. (18) 

 

Sirhan also expressed hatred for Robert Kennedy to John Shear, an assistant to trainer Gordon Bowsher at the Santa Anita Racetrack. Shear recalled that the newly hired Sirhan heard a co-worker read aloud a newspaper account of Robert Kennedy recommending the allocation of arms to Israel. “Sol (Sirhan) just went crazy,” Shear said. “He was normally very quiet, but he just went into a rage when he heard the story.” (19)

 

Sirhan thought RFK would be, “like his brother,” the president, and help the Arabs but, “Hell, he f….. up. That’s all he did. . .. He asked for it. He should have been smarter than that. You know, the Arabs had emotions. He knew how they felt about it. But, hell, he didn’t have to come out right at the f…… time when the Arab-Israeli war erupted. Oh! I couldn’t take it! I couldn’t take it!” (20)

 

Despite protestations to the contrary Sirhan had clear and defined motives in wanting to murder Robert F. Kennedy and the hatred that spewed forth from his gun can ultimately be traced back to one cause - Palestinian nationalism. 

 

 

Notes

 

1. The Assassination of Robert F. Kennedy by Tim Tate and Brad Johnson, Thistle Publishing, 2018

 

A Lie Too Big To Fail by Lisa Pease, Feral House, 2018

 

2. Kennedy, King, Malcolm X relatives and scholars seek new assassination probes, by Tom Jackman, 25 January 2019, https://www.washingtonpost.com/history/2019/01/25/kennedy-king-malcolm-x-relatives-scholars-seek-new-assassination-probes/?utm_term=.3ff4a5a06d2a

 

3. Tate and Johnson, Kindle edition, 2018, location 1678

 

4. A Lie Too Big To Fail by Lisa Pease, Feral House, 2018, 126-127

 

5. Houghton, Robert A., Special Unit Senator: The Investigation of the Assassination of Senator Robert F. Kennedy, Random House, New York, 1970, 181

 

6. Francine Klagsbrun and David C. Whitney, eds., Assassination: Robert F. Kennedy, 1925–1968, New York: Cowles, 1968, 109

 

7. Godfrey H. Jansen, Why Robert Kennedy Was Killed: The Story of Two Victims, New York, Third Press, 1970, 121–123

 

8. FBI Airtel To LA From San Francisco, Kensalt, Interview with Assistant Professor Lowell J. Bean, 21 June 1968

 

9. Houghton, 191

 

10. FBI Kensalt Files, Interviews, 7 June 1968, Inglewood, Ca, LA-56-156, and 8 June 1968, Corona, Ca, LA-56-156

 

11. John Seigenthaler, Search For Justice, Aurora, 1971, 256

 

12. See: The Forgotten Terrorist, Chapter 3, Sirhan and Palestine; Klagsbrun and Whitney, Assassination, 110

 

13. Houghton, 165

 

14. Jansen, 138–139

 

15. Houghton, 231–232

 

16. Jansen, 135

 

17. Kaiser, 214

 

18. Seigenthaler, 295

 

19. Larry Bortstein, “Guard Has a Leg Up on Opening Day,” OC Register, December 24, 2006, http://www.ocregister.com/ocregister/sports/other/article_1397207.php

 

20. Robert Kaiser, RFK Must Die, E. P. Dutton, 1970, 270

 

The Weakness of Democracy

 

Donald Trump is the most dishonest and most ignorant president in living memory, perhaps in American history. With his disdain for fundamental elements of democratic practice, such as freedom of the press and separation of powers, he is a danger to our American democracy.

 

But his election and the continued support he receives from a significant minority of voters are themselves symptoms of weaknesses which seem to be inherent in modern democracy itself. When we extend our gaze beyond the US, we can more easily perceive that democracy often works badly. I am not talking about fake democracies, where there is voting but no choice, as in the Soviet Union and the states it controlled. Even in countries where there is real opposition and secret ballots, voting can produce terrible results.

 

Venezuela, currently suffering a constitutional and humanitarian crisis, appears to have a functioning democracy, but the system has been rigged in favor of Nicolás Maduro, the successor of Hugo Chavez. Physical attacks on and arrests of opposition leaders, banning of opposition parties, sudden changes in the date of the election, and vote buying helped produce a victory for Maduro in 2018.

 

Algeria is currently experiencing a popular revolt against the elected president Abdelaziz Bouteflika, who was first elected in 1999, when the five other candidates withdrew just before the vote. He has been re-elected in 2004, 2009, and 2014, and announced he would run again this year, until massive protests forced him to withdraw as a candidate. He is very ill and has not said a word in public since 2013. His power has been based on military control, corruption, voting manipulation, and extensive use of bribery to create supporters and discourage opposition. The rebels are calling for an overthrow of the whole system.

 

These two cases are exceptional: the illusion of democracy hid authoritarian reality where democracy had never achieved a foothold. Much more common over the past two decades has been a gradual decline of existing democracies across the world, a process which could be called autocratization. A recent study shows that gradual autocratization has weakened democracies, in places as diverse as Hungary, Turkey and India. By extending government control of media, restricting free association, and weakening official bodies which oversee elections, modern autocrats can undermine democracy without a sudden coup. The authors argue with extensive data that the world has been undergoing a third wave of autocratization covering 47 countries over the last 25 years, after the first two waves in the 1930s and in the 1960s and 1970s.

 

The efforts of would-be autocrats to maintain their power by restricting democracy discourage trust in democracy itself. Nearly three-quarters of voters in Latin America are dissatisfied with democracy, according to a survey in 18 countries by Latinobarómetro, the highest number since 1995.

 

This is the context for the current failures of democracy in the United States (Trump) and Great Britain (Brexit). What can explain these failures? Physical coercion of political opponents is nearly non-existent. Corruption and voter suppression certainly play a role, at least in the US, but probably not a decisive one. Voters were overwhelmingly free to choose. Why did so many make such bad choices? I believe that conservative politicians in both countries used carefully chosen political tactics to appeal to widespread voter dissatisfaction. Those tactics are fundamentally dishonest, in that they promised outcomes that were impossible (Brexit) or were not actually going to be pursued (better health care than Obamacare). White voters made uncomfortable by the increasingly equal treatment of women and minorities were persuaded that it was possible and desirable to return to white male supremacy.

 

Voters made poor choices, even by their own professed desires. There is a dangerous disconnect between the voting preferences of many Americans and their evaluations of American political realities. A survey by the Pew Research Center at the end of 2018 offers some insight into the fundamental weakness of American democracy. A wide bipartisan majority of 73% think the gap between rich and poor will grow over the next 30 years. Two-thirds think the partisan political divide will get wider and 59% believe the environment will be worse. Only 16% believe that Social Security will continue to provide benefits at current levels when they retire, and 42% think there will be no benefits at all. Nearly half say that the average family’s standard of living will decline, and only 20% believe it will improve. These are not just the views of liberals. 68% of Republicans say that no cuts should be made to Social Security in the future. 40% say that the government should be mostly responsible for paying for long-term health care for older Americans in the future.

 

Yet when asked about their top political priorities, Republicans offer ideas which don’t match their worries about the future. Their three top priorities for improving the quality of life for future generations are reducing the number of undocumented immigrants; reducing the national debt; and avoiding tax increases. The richer a Republican voter is, the less likely they are to want to spend any money to deal with America’s problems. Republicans with family incomes under $30,000 have a top priority of more spending on affordable health care for all (62%) and on Social Security, Medicare and Medicaid (50%), while those with family incomes over $75,000 give these a much lower priority. 39% of poorer Republicans say a top priority is reducing the income gap, but that is true for only 13% of richer Republicans. Republican politicians follow the preferences of the richest Republican voters, but that doesn’t seem to affect the voting patterns of the rest.

 

Nostalgia for the “whites only” society of the past also pushes Americans into the Republican Party. About three-quarters of those who think that having a non-white majority in 2050 will be “bad for the country” are Republicans.

 

A significant problem appears to be ignorance, not just of Trump, but also of his voters. Many are ignorant about the news which swirls around us every day. A poll taken last week by USA Today and Suffolk University shows that 8% of Americans don’t know who Robert Mueller is.

 

But much of the ignorance on the right is self-willed. Only 19% of self-identified Republicans say the news media will have a positive impact in solving America’s problems. Only 15% are “very worried” about climate change and 22% are not worried at all. Despite the multiple decisions that juries have made about the guilt of Trump’s closest advisors, one-third of Americans have little or no trust in Mueller’s investigation and half agree that the investigation is a “witch hunt”. Despite the avalanche of news about Trump’s lies, frauds, tax evasions, and more lies, 27% “strongly approve” of the job he is doing as President, and another 21% “approve”. 39% would vote for him again in 2020.

 

Peter Baker of the NY Times reports that “the sheer volume of allegations lodged against Mr. Trump and his circle defies historical parallel.” Yet the percentage of Americans who approve of Trump is nearly exactly the same as it was two years ago.

 

Ignorance and illogic afflict more than just conservatives. The patriotic halo around the military leads Americans of both parties to political illusions. 72% of adults think the military will have a positive impact on solving our biggest problems, and that rises to 80% of those over 50.

 

The British writer Sam Byers bemoans his fellow citizens’ retreat into national pride as their political system gives ample demonstration that pride is unwarranted. His words apply to our situation as well. He sees around him a “whitewash of poisonous nostalgia”, “a haunted dreamscape of collective dementia”. He believes that “nostalgia, exceptionalism and a xenophobic failure of the collective imagination have undone us”, leading to “a moment of deep and lasting national shame”.

 

One well-known definition of democracy involves a set of basic characteristics: universal suffrage, officials elected in free and fair elections, freedom of speech, access to sources of information outside of the government, and freedom of association.

 

We have seen some of these attributes violated recently in the United States. Republican state governments have tried to reverse electoral losses by reducing the powers of newly elected Democratic governors. Trump, following the lead of many others, has urged Americans to ignore the free press and to substitute information that comes from him. Many states have tried to restrict the suffrage through a variety of tactics.

 

Across the world, democracy is under attack from within. Winston Churchill wrote, “it has been said that democracy is the worst form of Government except for all those other forms that have been tried”. Unless we want to try one of those other forms, we need to fight against autocratization, at home and abroad.

Roundup Top 10!  

How New York’s new monument whitewashes the women’s rights movement

by Martha S. Jones

It offers a narrow vision of the activists who fought for equality.

 

Why Trump’s recognition of the Golan Heights as Israeli territory is significant

by Dina Badie

Given the dimensions of America’s global influence, U.S. recognition could lend some legitimacy to Israel’s controversial annexation policy

 

 

What Mueller's probe has already revealed about Trump

by Julian Zelizer

The price of this entire process has already been high.

 

 

Why Historians Are Like Tax Collectors

by Matthew Gabriele

Studying the past - and then, importantly, talking about it with an audience - is about revealing the mess behind the myth, the story behind what we think we know.

 

 

An Economist with a Heart

by Hedrick Smith

Alan Krueger, the Princeton professor and economic adviser to two presidents who died last weekend, was one of those rare economists who break out of the ivory tower and plunge restlessly into the real world.

 

 

The New Zealand Shooting and the eternal fear of “race suicide”

by Jonathon Zimmerman

Put simply, the fear of being flooded by foreign hordes is baked into our national DNA. And it all starts with the question of fertility.

 

 

The danger of denying black Americans political rights

by Kellie Carter Jackson

Without access to political rights, violence becomes a crucial tool in the fight for freedom.

 

 

Turning Our Backs on Nuremberg

by Rebecca Gordon

John Bolton and Mike Pompeo Defy the International Criminal Court

 

 

How Experts and Their Facts Created Immigration Restriction

by Katherine Benton-Cohen

Facts have a history, and we ought to admit it.

 

 

Democrats’ Voting-Rights Push Could Begin a Third Reconstruction

by Ed Kilgore

As in the Reconstruction era after the Civil War, one party is committed to the use of federal power to vindicate voting rights, and the other is opposed.

 

 

Democrats are holding their convention in Milwaukee. The city’s socialist past is an asset.

by Tula Connell

With the announcement of Milwaukee as the site of the 2020 Democratic National Convention, political opponents wasted no time in raising the specter of the city’s socialist past.

A Heartwarming Lost Chapter on Immigrants Emerges

 

From 1907 to 1911, the U.S. Government sponsored the Galveston Movement, a massive effort to direct arriving Jewish immigrants away from New York City and other East Coast ports, all overcrowded with immigrants and their families, and bring them to cities on the Gulf of Mexico, primarily Galveston, Texas. The government wanted to populate the Southern and Western states with immigrants as well as the Atlantic Seaboard. At first, it worked. Nearly 1,000 immigrants, mostly Jewish people fleeing from the pogroms of Russia at that time, moved to the Galveston area and were gradually assimilated into the city and its suburbs. The movement had problems, though. Jews refused to work on Saturday, annoying their employers. There was some anti-Semitism. Low paid Texas workers complained that the Jews took their jobs. There was not a big Jewish community in Galveston to embrace newly arrived Jews, as there was in cities like New York. The new arrivals encountered many of the same problems that confront Jewish, and other, immigrants today. The movement shut down in 1912.

Among those Jews who did move to Galveston was Russian Haskell Harelik, who spent his life there. His story is now being told by his grandson, playwright Mark Harelik, in The Immigrant. It is a heartwarming, engaging and thoroughly lovable story not just about the Jews, but about the Texans who befriended them and, like so many Americans, helped them to become Americans themselves. The play just opened at the George Street Playhouse, in New Brunswick, N.J.

The play starts with the impressive display of dozens of huge black and white photos of the Galveston immigrants and what their lives were like in those years. Black and white pictures appear from time to time in the play, at just the right times, too, to help tell the story. As the play proper begins, we meet Haskell Harelik, a charming, affable young man who arrived in Galveston by ship in 1909, leaving his parents behind in Russia. He had nothing. Milton and Ima Perry, a Galveston couple, take him in, renting him a room in their house, and help him start a banana sales business that he runs out of an old wooden cart.

Young Harelik, a hard worker, soon grows the banana trade into a produce store and then a larger dry goods store. His wife arrives from Russia to join him and they have three children. They become patriotic Americans, and all three of his sons fight for the U.S. in World War II.

Harelik has his struggles, though. Angry residents of a nearby town shoot at him when he visits with his banana cart. Others scorn him. Many ignore him. Eventually, though, he succeeds.

Playwright Harelik does not just tell his grandfather’s personal story in The Immigrant; he tells the story, in one way or another, of all immigrants. They all faced the same difficulties upon arrival in America and, in some way, overcame their problems and were assimilated. This is a story of triumph, not just for Harelik, but for all the immigrants who came to America over the years. It is a reminder, too, to those on both sides of today’s immigration wars that the entry of foreigners into America, however they got here, was always controversial.

There are wonderful scenes in the play, such as those when a thrilled Harelik carries his newborn babies out of his house and lays them on the ground so that they become part of America. Then, years later, he names his baby after his friend Milt, and Milt happily carries him out of the house and lays him on the ground.

There is the story of the first Shabbat, the Jewish Sabbath, when Haskell and his wife invite Milt and his adorable wife Ima to their home.

It is the story of Milt and Ima Perry, too. One of their two children died quite young and the other ran away from home and was rarely heard from. They battle each other, the Hareliks, and the townspeople from time to time, as we all do. Their story is the story of Texans, and Americans, embracing these new immigrants, problems and all.

The play succeeds, mostly, because of the mesmerizing acting of Benjamin Pelteson as Haskell. He is funny, he is sad, he is exuberant. You cheer for him and cry for him.  Director Jim Jack, who did superb work on the drama, also gets outstanding performances from R. Ward Duffy as Milt, Gretchen Hall as Milt’s wife Ima, and Lauriel Friedman as Haskell’s wife Leah.

There are some gaps in the play. We don’t know whether Harelik spoke English when he arrived in Galveston or learned it here. We know very little about his wife Leah’s story or the troubles his kids might have had in school. All of that, of course, would require a 450-hour play. The drama in this one is good enough.

Haskell and his family were assimilated into Galveston life, his business did succeed and they made friends. It was an American dream for them.

PRODUCTION: The play is produced by the George Street Playhouse. Scenic Design: Jason Simms, Costumes: Asta Bennie Hostetter, Lighting: Christopher J. Bailey, Sound: Christopher Peifer, Projection Design: Kate Hevner. The play is directed by Jim Jack. It runs through April 7. 

    

A Scorching Look at a World War II Jewish Ghetto in Poland

 

Ira Fuchs’ powerful new play about a Jewish ghetto in World War II, Vilna, starts off with a small crime, a Jewish woman doctor bribing a Nazi official with a bottle of liquor, and ends with one of the greatest crimes in human history, the extermination of 60,000 Jews in the Polish city of Vilna, part of the mass murder of six million Jews throughout Europe.

Fuchs’ stellar play opened last week at the Theatre at St. Clements on W. 46th Street, New York. It is a deep, rich and thoroughly frightening story of the Jews’ battle for survival in the city of Vilna, and of their faith in God and in each other, as truckload after truckload of them are taken into the Ponary forest outside the city, where they are lined up and shot to death by firing squads.

The story of the ghetto in this Polish city resembles many other stories told in plays, films and novels. There is nothing terribly new about it. The heroes, resistance leaders, and the villains, the Nazis and collaborating local government officials, are the same. The difference, and what makes Vilna so outstanding, is the graphic violence, cruel and heart-stopping, plus tremendous acting by an all-star cast whose work you will remember for quite a long time.

The play tells an essentially true story, although some of the scenes have been invented; they resemble similar scenes at other ghettos and execution sites. Vilna was home to more than 80,000 Jews within a larger population. The Jews were the backbone of a large cultural community that included theaters, symphonies and operas. It was an Emerald City for Jews in Europe. Starting in 1941, the Nazis, who occupied the city, began forcing the Jews into a ghetto, much as they did in Warsaw and other cities, denying them health care, food and sanitation. The Nazis established a Judenrat, or central committee of Jewish community leaders, to run the ghetto. Then, systematically, they began to remove hundreds of Jews each day and murder them at a forest concentration camp (a total of 100,000 people, Jews and others, were executed there).

Vilna is the story of the Judenrat and its members, and the members of their families. It is also the story of Motko Zeidel and Yuri Farber from their high school days in 1926 to the end of the war. Through their story, playwright Fuchs tells the tale of all the Jews in Vilna and tells it well.

Motko and Yuri wind up working for Jacob Gens, a Jewish hospital director and father of a teenage girl. They do a fine job of running the ghetto, curbing disease and preventing starvation and crime, until the Nazis tell them that it is their job to decide who shall be murdered and who shall survive (in the end, nobody survived). They make heartbreaking decisions and suffer for them.

The play, deftly directed by Joseph Discher, is carried by its actors.

Sean Hudock, who plays Motko, and Seamus Mulcahy, who plays Yuri, are marvelous in their roles, especially toward the end of the play. There is a scene in which Motko tells his father about his job of deciding who lives; his dad, badly shaken, rubs the side of his face with his hand as Motko starts crying and his body stiffens. It is a striking moment.

There is another moment when the Nazis discover that Yuri changed architectural plans for the execution grounds to save Jews. He shakes with fear because he thinks they are going to hang him for his transgressions. It is one of the best moments in theater I have ever seen. Nathan Kaufman is impressive as Judenrat leader Jacob Gens. There is a scene where he is having dinner with the Nazi chief and quivers time after time, his eyes frozen wide and his jaw trembling, as the German tells him of the awful things he has to do to his own people.

The violence is disturbing but helps make the play as powerful as it is. At one point, a man on a platform is shot and there is a huge BANG in the air. Everybody in the theater was shocked. That, though, is the least violent act in the play. You sit there and shudder when you watch the violence, and know that it happened in Vilna, and in other Jewish ghettos, again and again and again.

One of the wonders of the play is the superlative acting in the smaller roles. Zeidel’s mom, the Jewish doctor, played by Cary Van Driest, is brilliant as Jew, mother and physician. His father, the elderly Josef Zeidel, is played with both anger and tenderness by Mark Jacoby. The Nazi leader, Bruno Kittel, is one of the most vile, vicious stage characters I have ever seen. You can almost feel the floor of the theater wobble when he struts across the stage, and the air in the theater move when he waves his arms hysterically.

Their performances make the play. Others in this fine cast include James Michael Reilly as the director of an engineering firm, Brian Cade as Martin Weiss, the chief Nazi, and Paul Cooper as Kittel.

A few summers ago, I visited a concentration camp outside of Hamburg, Germany. It was set, like so many were, in a beautiful forest with nearby lakes, streams, clusters of swaying green trees and sweet-smelling flowers. You stand there and visualize the horror of what happened and ask yourself how human beings could do such things to other human beings. You think the same thing as you watch Vilna. How could this possibly have taken place?

…never again… 

 

Production: The play is produced by the Theatre at St. Clements. Scenic Design: Brittany Vasta, Lighting: Harry Feiner, Costumes: Devon Painter, Sound: Jane Shaw.

 The play is directed by Joseph Discher. It runs through April 11.
