Historians/History articles brought to you by History News Network. Wed, 16 Jan 2019 10:08:51 +0000 https://blog.hnn.us/article/category/2

Yesterday Was the 100th Anniversary of Theodore Roosevelt's Death. Here's How His Legacy Still Shapes the United States Today.

The beginning of 2019 marks the centennial of the death of the 26th President of the United States, Theodore Roosevelt, who died at age 60 on January 6, 1919. Roosevelt’s impact was massive, and it continues to shape America a century later. Here are five ways that Teddy Roosevelt’s legacy still shapes the United States today.


The first and most significant contribution of Theodore Roosevelt to his country was his commitment to and advocacy of conservation of the environment, including promotion of national parks and national monuments, protection of our natural resources for the long term, and emphasis on the need for government and the people to show respect and awe for the great natural wonders of the North American continent. Roosevelt is regarded as the premier figure who inspired the environmental movement, which fortunately was encouraged and accelerated by many of his White House successors, including Woodrow Wilson, Franklin D. Roosevelt, John F. Kennedy, Lyndon B. Johnson, Richard Nixon, Jimmy Carter, Bill Clinton and Barack Obama.


Second, Roosevelt emphasized the need for social justice and encouraged “progressivism” from the White House. He was committed to the cause of workers and consumers both in and out of office.  The need for responsible government regulation of corporations was a driving force in his life.  He sincerely believed that many problems in American society could not be resolved just on the state and local level, but needed a national voice for all of the American people—not just the wealthy and privileged.


Third, Roosevelt shaped the modern presidency, reviving the presidential office after its decline in power and influence following Abraham Lincoln’s assassination. In doing so, he became the model for many future presidents, including Wilson, FDR, Harry Truman, Kennedy, Johnson, Nixon, Carter, Clinton and Obama. Presidential scholars in history and political science regularly rate Theodore Roosevelt as a “Near Great” president, ranked only behind Lincoln, George Washington, and FDR. It is quite a feat to hold such scholarly admiration and public renown for an entire century.


Fourth, Roosevelt saw the absolute need to build up the defenses of the United States against any future foreign threat. In particular, he loved and was fascinated with the US Navy. He believed war was at times necessary to protect the great experiment in democracy and the constitutional framework set up by the Founding Fathers. As part of his perception of world affairs, Roosevelt saw the need for the building of the Panama Canal and for the assertion of American authority over the Western Hemisphere, going past the wording of the Monroe Doctrine of 1823 with his Roosevelt Corollary of 1904 and his Big Stick policy toward Latin America. Unfortunately, this created a long-term image of the United States as an imperialist power, not well regarded or appreciated by the independent nations of the hemisphere.


Finally, Roosevelt, while promoting military and naval buildup for protection of the nation, was also a great diplomat. His expansion of American diplomacy and relations with foreign nations helped expand American power in the early 20th century. He became very close to nations that would later become our allies—particularly Great Britain and France—and set a new standard for presidential engagement by negotiating the Treaty of Portsmouth, which ended the Russo-Japanese War of 1904-1905 and earned him the Nobel Peace Prize in 1906. He also took a moral stand against signs of aggression in the world, warning of the danger of German aggression at the time of the Morocco Crisis of 1905-1906. He spoke out against the pogroms in Czarist Russia during his presidency, and worked to promote peaceful coexistence between Japan and the United States in the Far East, owing to his concerns over our territories of Hawaii, Guam, and the Philippine Islands.


These five positive contributions of Theodore Roosevelt have lasted and will continue to have an impact on the American Presidency and the future of the American nation.


Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170867
What Can Historians Teach The Media In The Era of Trump? 4 Historians Weigh In “Truth itself is under attack and expertise is suspect,” stated Kenneth Osgood as he opened a panel at the American Historical Association’s 2019 meeting on Friday afternoon. Featuring historians Nicole R. Hemmer (University of Virginia), Jeffrey Engel (Southern Methodist University), Jeremi Suri (University of Texas at Austin), and Julian Zelizer (Princeton University), “Unfaking the News: Historians in the Media in the Era of Trump” offered helpful advice on how historians can engage with the media and public. Each historian suggested a key insight that historians can offer journalists as they cover political developments under the current presidential administration.


1. Ideological “balance” has limits – Nicole Hemmer

This “crisis of journalism” accompanying Trump’s election is not the first time the media has critically reflected on its reporting practices. Many journalists reexamined what they considered “objective” after their deference to political figures and government statistics led to misreporting on the Vietnam War in the late 1960s and early 1970s.

Nicole Hemmer argued that today “balance” has become the new standard by which many media outlets attempt to be objective. If a program has one voice from the right and one voice from the left, many consider it “objective.” As a result, journalists often prioritize ideological diversity over other forms of diversity, including racial, gender, and class diversity. Hemmer suggested that the media could benefit from understanding that good reporting requires more than ideological “balance.” Perspectives beyond just “right vs. left” can improve news coverage.


2. Bureaucracy matters – Jeremi Suri

“How did each of us get to the AHA?” asked Jeremi Suri. Most arrived because the employees of the Federal Aviation Administration and the Transportation Security Administration kept working despite the government shutdown. As consumers of news and as scholars, historians often focus on the flashy actors and pay little attention to what allows the nation to function. Why do our universities still function even though they often have bad leadership and come under attack? Why is society still functioning under Trump? Suri suggested that understanding the accumulated procedures and knowledge of bureaucracy as an essential part of democracy can improve both historians’ and the media’s understanding of the current political state.


3. Highlight long-term historical developments – Julian Zelizer

Princeton scholar and CNN contributor Julian Zelizer believes that historians’ analysis is best when it gets beyond moment-by-moment explanations for how the current political climate became so polarized. When speaking to the media, Zelizer always tries to connect issues from the past (especially the 1970s), and how things were reconfigured then, to events today. That perspective is often difficult to put forward because Trump’s eccentricities are so prominent. Nevertheless, by adopting a long-term timeframe and understanding how deeply rooted the dynamics and dysfunction are, historians can avoid an “Alice in Wonderland kind of moment where everything is happening the same way once again.” Historians need some way of evaluating what’s merely a little bizarre and what’s fundamentally dangerous. For Zelizer, accomplishing this sort of analysis requires him to leave partisanship behind and focus on his role as a historian.


4. History Adds Value, Not Just Context – Jeffrey Engel

Jeffrey Engel approaches his role differently than Julian Zelizer. Engel asked: How much do we need to be educators, how much do we need to be citizens, and how much do those two roles overlap? While Engel originally believed historians should just provide historical context to current events, after the past two years he believes that, as citizens, historians should also give an opinion on current events and reveal how those events affect them. Engel genuinely believes the republic is in danger, so historians have a responsibility to go a step further when engaging the public. They should speak to what history means for today and discuss the values behind that history. To Engel, this means he is always prepared to answer why something from history is important to the present.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170860
History in Crisis: 5 Challenges to Organizing Graduate Student Workers and 3 Ways to Still Succeed As the number of tenured positions at universities declines, the workload of teaching increasingly falls to adjunct professors and graduate students. In response, many academics have attempted to unionize to demand better pay, benefits, and treatment. At the American Historical Association’s 2019 Annual Meeting, Sarah Siegel (Washington University), Jody Noll (Georgia State University), Ruby Oram (Loyola University Chicago), and Jeff Schuhrke (University of Illinois Chicago) discussed the challenges to anticipate when organizing and methods to still be effective.


1. Anticipate that the university administration will claim graduate employees are students and not workers.

Sarah Siegel testified on behalf of graduate workers at the National Labor Relations Board’s hearing on Washington University in St. Louis’s organizing effort. “You teach your own course, but that looks good on your CV,” asked the school’s lawyers, “so why do you think you’re an employee?” Of course, Sarah responded, everything people do professionally looks good on their CV.

At Loyola University Chicago, a Catholic Jesuit university, the school initially claimed a religious exception for why students should not be able to unionize. According to Ruby Oram, Loyola claimed the graduate workers were “religious workers” and thus the university did not have to recognize their union. When this tactic failed, the university altered tactics and claimed they were students instead of workers.


2. Some fellow graduate students and department faculty will also not consider themselves “workers.”

Convincing many graduate workers and faculty to think of themselves as workers is a challenge, said Ruby Oram. Academics are trained to think of their labor not as work but as a lifestyle. This often hurts the ability to win faculty support: faculty don’t view graduate students as workers because they don’t view themselves as workers either. “There is an apathy that is the biggest obstacle among grad employees,” said Jeff Schuhrke. “We have to disabuse ourselves of the notion that grad school is a hazing ritual – it’s real work.”


3. It’s hard to organize across disciplines.

Coalition building is often best achieved by discussing grievances one-on-one. But grievances in history are often different from those in the chemistry department. How do we build bridges and come together as a workforce?


4. The political climate can affect unionizing campaigns.

Students at Loyola University Chicago successfully won their case with the NLRB. The university did not respond for eight months. Finally, eight days after President Trump appointed new officials to the NLRB, the university announced it would not bargain with the union.

The Trump appointments to the NLRB have made organizing at private universities harder. The recent Janus v. AFSCME Supreme Court decision has made it harder for workers at public universities to organize.


5. Even once unionized, the challenges persist.

Even when graduate workers are able to win unionization, the university often continues to resist and to resent the fact that there is a union on campus. In Jeff Schuhrke’s experience, his university tried to make the union contract as ineffective and unenforceable as possible. Often, the school seemingly only wanted to avoid liability over grievances rather than resolve them.

Maintaining a healthy union membership is also hard because the potential member pool changes so much with graduation and the arrival of incoming students. Union reps have to explain why the union matters because the new students often don’t know the earlier history that made the union necessary.


The organizers, however, still found ways to obtain gains for graduate workers on their respective campuses like higher stipends, improved health care, and dental insurance. The panelists offered a few key pieces of advice.


1. Find allies when you can.

Several panelists mentioned their efforts to combine forces with other campus workers, adjunct faculty, and even tenured faculty across their university to amplify their efforts. One popular campaign mentioned was the demand that university staff receive at least $15 an hour.

Jody Noll discussed a historical example of the power of alliances. Teachers in the 1968 Florida strike successfully obtained bargaining rights because many principals supported their efforts, sometimes even joining in the strike.


2. It’s OK to switch tactics.

After the effort for union recognition at Washington University in St. Louis stalled, the organization chose to switch to direct action campaigns. These campaigns were often effective at garnering specific benefits at the university.


3. Remind the university of its mission.

Universities are there to educate students. They want to appear as benevolent, diverse, welcoming, and beneficial places of learning. Demanding they live up to this promise – and publicly shaming them when they don’t – is often a successful tactic.


Be sure to visit historynewsnetwork.org for more coverage of the American Historical Association 2019 meeting!

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170859
Ok, We Returned the Stolen Bells to the Philippines

US soldiers in Manila, the Philippines, during the Philippine-American War.


The church bells have returned to Balangiga after 117 years abroad. They last tolled on Philippine soil on September 28, 1901, when the smallest of the three signaled the start of a surprise attack against U.S. troops by the town’s residents, angry at their mistreatment during the U.S. occupation. The assault, which killed 48 U.S. soldiers—many of them clubbed or hacked to death in their barracks—was the deadliest single attack on U.S. soldiers since Custer’s last stand at the Battle of the Little Bighorn decades earlier.

U.S. retaliation—Brig. Gen. Jacob W. Smith told his men to turn the province of Samar into a “howling wilderness”—would lead to the deaths of thousands of Filipino civilians and become the most infamous campaign of the Philippine-American War.

By bringing attention to this long-forgotten conflict, Balangiga’s bells challenge us to listen again for lessons from the past. But with attention spans shrinking and history itself becoming a battleground in the public sphere, can we accurately transcribe their messages?

American soldiers took the bells as war booty during their revenge-fueled counterattack in Samar. Two bells wound up on a military base in Wyoming; the third found its way to a U.S. infantry museum in South Korea. Filipinos, who gained full independence in 1946 after three centuries as a Spanish colony and decades as a U.S. territory, have long sought their return. 

Observers even thought President Rodrigo Duterte, who mentioned the bells in patriotic speeches, might bring up the issue during his first meeting with President Trump in 2017 (he didn’t). But on December 14, years of public pressure and diplomatic negotiation finally brought the bells to St. Lawrence the Martyr Catholic Church in Eastern Samar.

Americans don’t much recall the sounds of Balangiga. More patriotic strains ring in our collective memory. Our stories of these islands start later, in World War II, when the invading Japanese forced haggard U.S. troops on a deadly march in Bataan—or when General Douglas MacArthur promised and completed his dramatic return (complete with a staged, shore-wading photo op) a few years later. Yet even here, the leveling of Manila during its recapture—it became the most destroyed Allied city in World War II after Warsaw—largely escapes American recollection.

For Filipinos, by contrast, Balangiga’s bells seem to ring loud and clear—as a ballad to Filipino resistance and sovereignty. But despite Duterte’s use of Balangiga, Filipino popular memory still centers around Spanish colonization and mid-century, American-led liberation from Japanese occupation. The “tiny” war with America at the dawn of the 20th century, when more than 4,000 U.S. troops and up to 750,000 Filipinos lost their lives due to warfare, famine, and disease, is largely forgotten. And Filipino historians, like their American counterparts, still struggle to distinguish national myths and heroes from historical reality.

We Americans needn’t go far to discover reverberations from Uncle Sam’s late-nineteenth century encounter with Asia—we could start with words. “Boondocks” comes from the Tagalog bundok, or mountain. U.S. soldiers passed down in speech racist monikers, too—historians believe “gook,” used by soldiers decades later in Indochina to refer to the Vietnamese, may have begun as “gugu” or “goo-goo” in the Philippines, American slang for their wartime adversaries (white soldiers also called Filipinos the n-word). 

Other premonitions of modern U.S. wars in Asia or the Middle East haunt these early years of American overseas expansion. The black press, for example, was split over sending African American soldiers abroad. “A man who is not good enough to vote for a government,” the Richmond Planet editorialized in 1898, referring to black disenfranchisement, “is not good enough to fight for it.” Muhammad Ali’s refusal to submit to the draft during the Vietnam War echoed this longtime African American stance. Photos from the early 1900s show U.S. troops applying the “water cure” to captured Filipino fighters—a kind of waterboarding without the board. And many U.S. generals in the conflict had previously fought in the Indian wars of the western United States, demonstrating an even older provenance of American empire.

Filipino and American historians have done much to uncover our two nations’ shared pasts. But in our efforts to counter jingoistic narratives of American nobility and Filipino haplessness, we U.S. historians, writing from an academy whose top history departments employ very few conservatives, may get trapped within different, uncontested viewpoints. We write critically about the white man’s burden, describing a turn-of-the-century push for colonies that was powered by racist fantasies, but often fail to notice white racism’s other, more isolationist impulses (the specter of non-white “others” coming to the mainland from U.S. colonies abroad may have partly stayed American expansion.) And our penchant for writing “history from below” can, at its worst, cast victims of U.S. aggression as pure and noble, when in fact Philippine society has long held its own brutal hierarchies.

As my research into late-nineteenth century minority journalists in America led me, surprisingly, to the Philippines, it took some time before I encountered Balangiga in U.S. scholarship. Gen. Smith’s murderous Samar trek, by contrast, was widely cited with little or no mention of its direct antecedent. The Balangiga attack can never justify the Samar rampage. But its elision helps posit race hatred as the driving force for American aggression abroad, a common theme in much “race and empire” scholarship. 

Racism is integral to any discussion of imperialism. But listening to Balangiga might mean emphasizing the brutalizing effects of guerilla warfare upon any occupying army. Westerners are often shocked at the viciousness of Japanese troops during World War II compared to the relative restraint of the “greatest generation.” But we need only look at the more prolonged war in Vietnam—or at the brief turn-of-the-century war in the Philippines—to gain a sense of what young men from any culture are capable of, when sent to occupy lands where they’re unwanted.

That’s my limited and, hopefully, humble attempt to hear some strains of wisdom sounding from three recently repatriated bells on the island of Samar. I hope more Americans and Filipinos, historians or not, will tune in, too.


Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170775
What I’m Reading: An Interview With Historian Sharlene Sinegal-DeCuir


Sharlene Sinegal-DeCuir is an Associate Professor of History at Xavier University of Louisiana. She received her PhD from Louisiana State University in 2009 and her areas of concentration are in American, African American, and Latin American history.

What books are you reading now?

I am reading several books at the moment, including: The New Jim Crow: Mass Incarceration in the Age of Colorblindness by Michelle Alexander; Witness to Change: From Jim Crow to Political Empowerment by Sybil Morial; and White Rage: The Unspoken Truth of Our Racial Divide by Carol Anderson.

What is your favorite history book?

My favorite history books are W. E. B. Du Bois’s Black Reconstruction in America, 1860-1880 and Carter G. Woodson’s The Mis-Education of the Negro. There are many more history books that I enjoy and consider my favorites, but for me, those two are the classics.

Why did you choose history as your career?

I don’t actually think I chose a career in history, it chose me. I began my college career as a pre-med major at Xavier University of Louisiana, a small private liberal arts college that is known nationally as being number one for placing African-Americans into medical schools. After the second semester of my sophomore year, I found myself questioning if I wanted to be an MD or if it was the path my parents chose for me. I decided to change my major to history because I remembered how much I liked it in high school and I had decided that I was going to be a lawyer. Fast forward to senior year, I took the LSAT and began applying to law schools, all the while questioning if that was the right choice for me because I realized I LOVED history.

At the last minute, I decided I would go to graduate school instead of law school. I told myself I was only going to get a master’s degree, then go to law school. I had not taken the GRE, had not applied to any programs, and had no clue how I was going to get into any program. So, being a Louisiana girl, I took myself over to LSU, walked into the graduate admissions department, and told them I wanted to earn a master’s degree in history. I was allowed to take courses as a non-matriculating student for a semester while I prepared for the GRE and applied to the actual master’s program in history at LSU. I got accepted into the program and thought, okay, after this, law school. Well, literally right after defending my master’s thesis, the department chair asked me if I would like to stay at LSU and complete my PhD. Without even hesitating, I said yes, and the rest is history. Best decision I ever made!

What qualities do you need to be a historian?

You most definitely have to be open-minded and passionate. You must be willing to allow history to speak to you, no matter how difficult the subject. You also have to have a passion for history because I promise you, those long nights of research, preparing for conferences, writing articles, teaching, etc. can be daunting. If you have a passion for the subject, especially the subjects you teach/focus on, the little things don’t bother you and that passion will be felt by others. I have students every semester that enter my classroom telling me how much they hate history. I always tell them, “no, you don’t hate history; you hate the way it has been taught to you.” If you are passionate about what you teach, the students become passionate, active learners ready to soak up everything. Over the years of teaching, I have had several students change their majors to history after taking one of my classes. I think that’s the most fulfilling part of being a professor.

Who was your favorite history teacher?

I have had several amazing history teachers. Many of my undergraduate professors at Xavier University of Louisiana became my colleagues -- a few have since retired -- but Drs. Jonathan Rotondo-McCord, Gary Donaldson, and Shamsul Huda, and Sr. Barbara Hughes, have all contributed to my love for history.

My PhD advisor at Louisiana State University, Dr. Gaines Foster, is amazing. I remember thinking that if I could be half the researcher and professor he is, I would be okay. Dr. Foster just had that quality about him. He was extremely helpful but very stern. With him, I couldn’t cut corners and get away with it. I must admit I was a bit intimidated. I have since had the pleasure of serving on a panel with him. After the panel, he complimented me on my research and my ability to engage the audience. He said I had a presence. I thought to myself…wow! That compliment meant so much to me because I truly value his opinion.

Another graduate school professor that had an impact on me was Dr. Leonard Moore. Dr. Moore had a captivating teaching style that was both engaging and passionate. I served as his teaching assistant for a few years in graduate school and learned a lot about lecturing and the delivery of information.

What is your most memorable or rewarding teaching experience?

Receiving my first student evaluations is one of my most memorable experiences. As a new professor you often question yourself about content and teaching: am I teaching relevant information? Do the students have a clear understanding of the information? And how can I make the information relatable? I remember reading those evaluations and feeling a sense of accomplishment because the majority of the students spoke about how I changed their perceptions of history. I fondly remember one student mentioning that I made a “boring” subject interesting and relatable. Another student said and I quote, “Dr. Sinegal-DeCuir, knows her sh*t,” -- when I am having a hard day, I think of that comment. It makes me smile every time.

It is also very rewarding to get emails, cards, and letters from current and former students expressing the impact I have made on their lives as a professor.

What are your hopes for history as a discipline?

I hope that the discipline continues to embrace diversity. Diversity in interpretations of historic events and diversity in scholars and scholarship. The facts of history don’t change but as long as we are accepting of diverse interpretations, history will forever be relevant.

Do you own any rare history or collectible books? Do you collect artifacts related to history?

I don’t have any rare historic collectibles, but I do have a few sad irons and one antique coffee grinder. I also have a large number of books pertaining to history and a small collection of Christmas ornaments from every museum I’ve visited across the U.S. and overseas. I am trying to start a Clementine Hunter collection; right now I have several prints, and my plan is to replace the prints with the original paintings.

What have you found most rewarding and most frustrating about your career? 

The most rewarding thing about my career is becoming known as a scholar in my field. I have written a New York Times op-ed, appeared on MSNBC with Al Sharpton, and done a few local interviews about historic events. I enjoy putting myself out there. What is frustrating is that many people dismiss the field of history as not being important or as prestigious a field as medicine or law.

How has the study of history changed in the course of your career?

It has become more inclusive of different interpretations of events and of diverse fields. The field of history is no longer limited to a cookie-cutter view of the past. Historians are uncovering amazing new stories and revisiting old ones through the lens of gender studies and minority studies, to name a few.

What is your favorite history-related saying? Have you come up with your own?

I have not come up with my own saying but I really like this one, “History is not a burden on the memory but an illumination of the soul,” by Lord Acton. We should all be open to understanding the events of the past whether it is our own family history or the history of our country. Don’t hide the ugly truths and only embrace the good, happy times; we should learn from it all.

What are you doing next?

I am continuing to put myself out there through my scholarship. I am in the process of researching and writing a book proposal. 

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170774
When We Really Needed an Anti-Lynching Law, Congress Wouldn’t Pass It

Congressman George Henry White of North Carolina

Last week Congress unanimously passed the first federal anti-lynching bill. Perhaps Congressman George Henry White of North Carolina must finally be smiling. A Black Indian born to an enslaved woman, White found little to smile about during the grim days that marked the post-Civil War decades. Strong-willed, eloquent and determined, he devoted his life to educating his people. He became a teacher, school principal, and lawyer, and in 1896 was elected to the U.S. Congress.

During his two terms White spoke as "the sole representative on this floor of 9 million of the [Black] population of the United States." While Congressmen ignored him as they regaled each other with “darky stories” on the floor of the House and in its corridors, he fought on.

Racist cartoon from the period

On January 20, 1900, White introduced HR 6963, the first federal anti-lynching bill. By then an average of three African American men, women and children were lynched each week in the southern states. These festivals of horror and pain drew approving white crowds. Refreshments were served. Lawmen and sheriffs often assisted or led lynch mobs, and southern governors and senators offered nothing but praise. No one was ever arrested for this crime.

Congressman White compared lynching to treason and HR 6963 mandated the death penalty for those convicted. It died in the House Judiciary committee that year – and that year 105 Black people died at the hands of lynch mobs.

In 1898 White was re-elected. He now stood as the last Black survivor of slavery to hold this high office in the 19th century, and the first in the 20th century.

George Henry White spoke as “the sole representative” of millions of people. He spoke “in behalf of an outraged, heart-broken, bruised, and bleeding, but God-fearing people: faithful, industrious, loyal, rising people -- full of potential force.”



In 1898 a U.S. sea invasion under the banner of “Christianity and civilization” seized Spain’s colonies from Puerto Rico to the Philippines. As the U.S. imposed white supremacy on its new possessions, White again bravely stood to remind fellow Americans, “charity begins at home.”

During White's second Congressional term in 1900, North Carolina amended its Constitution to eliminate African American voters and office-holders through a poll tax, a grandfather clause and a literacy test enforceable in 1902. Black people throughout the southern states were being denied any right to run for Congress or other offices.

In his last Congressional speech White denounced the use of "constitutional amendment and legislation" and "cold-blooded fraud and intimidation" to deny the right to vote. 

During the Great Depression of the 1930s President Franklin D. Roosevelt was asked to sponsor an anti-lynching bill. “If I come out for the anti-lynching bill now,” he said, Southern Congressmen “will block every bill I ask Congress to pass to keep America from collapsing. I just can’t take that risk.” Congressman White took that risk over and over again. But his prediction that the South would again send Black representatives to Congress went unfulfilled until the Civil Rights revolution. 

White's courageous service and his retirement by fire are missing from our school texts and college courses. It is unfortunate so few Americans know this brave freedom fighter. Spreading his story today would make it harder to deny people of color their voting rights.


Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170761
Oh No! The Depressing Truth About the Willy Wonka Chocolate Factory Workers

Oompa-Loompa illustration by Joseph Schindelman, copyright © 1964 and renewed 1992 by Joseph Schindelman, from CHARLIE AND THE CHOCOLATE FACTORY by Roald Dahl. Used by permission of Alfred A. Knopf, an imprint of Random House Children’s Books, a division of Penguin Random House LLC. All rights reserved.


Countless Americans grew up with Roald Dahl’s captivating 1964 children’s book, Charlie and the Chocolate Factory, or with one of its two film versions—in 1971 and 2005. Who knew during all this time that this most beloved story is shot full of white supremacist messages worthy of the Ku Klux Klan?  But now we do, and now is the time to replace this destructive narrative with one that tells healing truths. Netflix has announced that it will create a series of animations from sixteen of Dahl’s children’s tales, most prominent among them Charlie and the Chocolate Factory. And this past June Newsweek reported that Hollywood is considering yet another film knockoff of Dahl’s story, only this time as “a prequel,” explaining how Willy Wonka “acquired his riches and his legendary chocolate factory.” 

People love an artist's proposal of a black Charlie for Netflix's animated remake of 'Willy Wonka." https://t.co/r05aurYxeu

— HuffPost BlackVoices (@blackvoices) December 10, 2018

Inverting history in a way that would have shocked Dahl, the new film may star the African American actor, comedian, and singer Donald Glover as Wonka. Newsweek quoted Dahl’s widow, Felicity d'Abreu Crosland Dahl, as saying that her late husband’s original scheme was to have Charlie, the boy who finds the golden ticket that gets him into Wonka’s legendary factory, be “a little Black boy.” In a BBC radio interview, she declared it a “great pity” that Dahl (who at the time was married to the American actress Patricia Neal) bowed to his publisher’s wishes and dropped the idea. 

This latest possible remake would go far beyond anything Dahl could have envisioned. But will it go far enough? Does Donald Glover, or anyone else associated with this new effort, fully understand what is at stake?

Despite what Felicity Dahl implied, Roald Dahl never considered any black roles for his famous story that were not right out of the Sambo tradition, British imperialism, or slavery. Indeed, the workers for his chocolate factory, the Oompa-Loompas, were slaves. When Charlie and the four other golden ticket holders and their parents first spied the Oompa-Loompas, Wonka explained that the workers were not made of chocolate, but they “are real people! They are some of my workers!” He had imported the tiny black people “direct from Africa!” They belonged to “a tribe of tiny miniature pygmies known as Oompa-Loompas. I discovered them myself,” Wonka exclaimed. “I brought them over from Africa myself—the whole tribe of them, three thousand in all. I found them in the very deepest and darkest part of the African jungle where no white man had ever been before.”



Wonka informed Charlie and his companions that the tribe had been starving, subsisting on green caterpillars but longed for cacao beans; “oh how they craved them,” he said. He bargained with the tribe and promised that if they agreed to “live in my factory” they could have all the cacao beans they wanted: “I’ll even pay your wages in cacao beans if you wish!”

So, the black pygmies traded their freedom for permanent enslavement and all the cacao beans they could eat. After the tribal leader agreed to stop eating green caterpillars and work for “beans,” Wonka “shipped them over here, every man, woman, and child in the Oompa-Loompa tribe. It was easy. I smuggled them over in large packing cases with holes in them, and they all got here safely.”

Because Britain had outlawed the slave trade in 1807, as Wonka alluded to, he smuggled the slaves into England in packing cases, in conditions that sounded almost as horrific as the Middle Passage. And so that no one would miss the point, Joseph Schindelman’s images of the Oompa-Loompas in the book showed them as animal-skin-clad jovial Sambos who just loved their labor.

Drawing by Hanson Booth, in Fremont P. Wirth, The Development of America (New York, 1936), 352.


A slave galley even made an appearance in the book, one powered by the pygmies who rowed on a river of chocolate. To further emphasize the slave analogy, Dahl introduced whips into the tale, “WHIPS—ALL SHAPES AND SIZES.” And why whips? Well, “For whipping cream, of course!”

Reinforcing the idea that these black pygmies were Wonka’s property, to do with as he pleased, the Oompa-Loompas were subject to hair-growing medical experiments and product testing that turned the little pygmies into blueberries. The entire Wonka enterprise relied on slavery and complete racial subordination.

Dahl, as he later confessed, grew up with an imperialist frame of mind. While in prep school, for instance, and dreaming about gold and an adventurous life that might be awaiting him in Africa, he remarked, “Sometimes there is a great advantage in traveling to hot countries where niggers dwell.” He carried that attitude to Africa just before World War II when he labored for the Asiatic Petroleum Company in the former German East Africa, then known as Tanganyika. His imperial enterprise and his experiences with the servants he commanded, and whom he labeled “boy,” and the native Africans he encountered, would shape his racial views and tincture almost all his future writings.

Even in 1982, long after the NAACP had condemned the way he presented Africans, and after he dropped the black Oompa-Loompas in the revision of the book, he continued in the same vein. In his first draft of The BFG, the friendly giant at the center of that book emerged as the very worst imitation of a Zip Coon figure: a black, flat-nosed giant with “thick rubbery lips . . . like two gigantic purple frankfurters lying on top of the other.” For once, an editor spoke up. Dahl’s new editor, Stephen Roxburgh at Farrar, Straus, and Giroux, properly denounced Dahl’s characterization as a “derisive stereotype.” Dahl conceded the point, responding: “the negro lips thing is taken care of.” 

If we are to surmount the ugly legacy of Dahl’s work, the kind of imagination that Lin-Manuel Miranda applied to his blockbuster musical Hamilton is necessary. But to create a counter-narrative of America’s origin story that attacks white supremacy and racial subordination, Miranda needed not just talent and imagination but awareness, facts, and accurate knowledge of this nation’s racial past. It’s a lofty goal, but it can’t be done right without the awareness that history and knowledge bring.

For a full analysis of Dahl’s attitudes and his writings, see my essay “Innocence Betrayed: Charlie and the Chocolate Factory and the Deep Roots of White Supremacy,” http://www.processhistory.org/yacovone-dahl-racism/



Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170755
So I Became a Historian—Now I’m Telling How It Worked Out

Although what has happened to history as a major is being disputed as I write, it is clear enough that “something is going on.” This essay is written entirely for an audience currently puzzled about “what to major in.” Because the author majored in history at three universities way back when (Emory, UGA, and Stanford), he has something to say on the subject. He not only survived as a history major, he flourished. Now 101, and obviously very active, he weighs here how it has been all these years to face life as a history major under seven important employers. Read away!

The idea here is for me to tell you, the Reader, about my long preparation for a life in history—and how it worked out. I intend to be candid, truthful even when it hurts, and now and then just a bit prideful. I expect you to emerge 10 percent better prepared to stick with (or abandon) your decision to become a history major.

My opinion is, you see, that a totally informal, conversational, recitation about the interaction between my history major and my life will be a good way for many a student to pass the time. I do know one thing for sure: History as subject matter has been far more than relevant to what happened to me as I have lived on to over 101 years of age. That’s right: born October 10, 1917.



I don’t believe I took any history courses in my two high schools in the vicinity of Philadelphia. In that jr/sr year I did read several books akin to history, by Roy Chapman Andrews, Lincoln Steffens, David Fairchild and others, but I didn’t know that at the time. 

Now we’re in college (Emory University, on a nice scholarship). Sigma Chi didn’t seem to care what I majored in, so OK. Right off the bat in the first quarter was a required history course, “Europe Since 1500,” I think. It was competing with a half course, Slide Rule (where I made 100), and Spanish (which I flunked, making three A’s at the same time). I have to say I was humiliated, so at year’s end I dropped my engineering major. Soon I decided to major in journalism where there were more A’s, and a C (barely) in accounting. Soon it was pre-law, only because in that major I was free to take just about anything I wanted—and I wanted to explore the curriculum. Surely a little of this sounds familiar to you. Right? What about history?

Well, there were all kinds of Southern history courses; all were entrancing and appealing to the mostly native Georgians dominating the class. English history was exotic. In fact, I liked those history courses and especially the term papers that were always required.

I had no idea at the time that the maybe eight term papers I wrote in 1936-39 were conditioning me for a life of research! Yes, it’s true. Footnotes, ibid., op. cit. and Bibliography were infiltrating my cosmos and I was evolving deep down inside, whether I knew it or not.

But history didn’t have a monopoly on my life at the time. I pitched baseball to five victories in a row in class competition. Though offered a tennis scholarship at Duke University, my father turned it down. I loved abnormal psychology. Three law courses: law of the press, constitutional law, and international law, using that Law Library, skewed me toward a legal career.

Philosophy, and a course in logic were exciting. Oh: I should mention elective Bible – its history, only. At the end came an Honors assignment to study every aspect of the New South and be examined. Lots of history (mostly Southern).

That summer after finishing Honors with six other graduates who stood with me in Glenn Memorial Church, a letter came from the history professors with the second of 12 scholarships I was going to get—with and without application. They were buying me! It was summer, 1939. I put aside those law books (secondhand from Gainesville, Florida), and returned to Emory for a Masters Degree, taking anything historical that I wanted, doing any thesis – so long as it was historical (Royal Government in Colonial Georgia with a sophisticated title, rooted in really original sources). I was entranced with 17th century England. And, for a time, Ancient Greece and Rome…. Antony and Cleopatra somehow caught my full attention. And why not?

Now, the powers that be maneuvered a full year history grant at the University of Georgia under a very productive senior faculty member. But he wasn’t there! Anyway, I continued to learn much too much about The South. Whoa. 

World War II was coming for the United States, we wise ones thought in spring 1941, so I took up weightlifting and signed up with a recruiter for something military. Given the chance, I walked out on the Marines, and on September 25 I was called to active duty in a secret Intelligence unit of the Navy that seemed delighted with my history major. (It’s hard to believe they insisted that their recruits be “a third generation American.”) I was first an enlisted yeoman; then, luckily, an Ensign. Trained, two months, in “stuff.” I was the best, of many hundred, in the obstacle race, at NITSI-Naval Training School, Indoctrination, Quonset Pt, R.I. (Richard Nixon graduated in the next class, August, 1942.) As I lived that first military experience I admit that I don’t recall anybody asking, “What was your major in college?” I thought everybody would care.

My war career lasted over 4 years and involved major leadership on my part; nobody from the Admiral’s staff paid any attention to me; I ran the huge barracks at NAS Alameda alone, but with lots of Masters at Arms and Compartment Cleaners working hard. I wrote a clean formal book on the subject of barracks administration, 165 pages, never published, when the Bomb ended the war unexpectedly. I wrote pamphlets spelling out things. 

This barracks officer was popular! My history major was a howling success. Why? I could do almost anything that was needed! It turned out that I had taken a “paperwork skills” concentration; it was adaptable; I was literate; I was used to getting things done. At war’s end they wouldn’t let me out for four months because I was “valuable to the demobilization.” They offered me instant Lieutenant Commander if I would stay in. I didn’t, but later I decided to stick with the Reserves and put in 23 years total.

Postwar, I did advanced personnel work at Mercer University, for the Veterans Administration. I could authorize all kinds of remedial services and classes for the disabled. Next, I was employed on a 12 month contract at the University of Miami, for 2 full years. I taught a heavy load of Western Civ and US Survey. It was time for Stanford, where I majored in history (with a political parties minor) and finished in June 1951. God bless the G.I. Bill.

I got three large grants after Stanford, doing tricky research and writing. My family was happy. Now (1953 to 1958) I pretty much founded the field of social welfare in American scholarship. A famous figure in San Francisco said I should fathom “The welfare needs of the people of California, and how they are being met” for the famed Commonwealth Club. They expected a big book. In three years they got one: California Social Welfare. Original, 108 tables, 100 pages of law, about 600 pages; some bullying of organizations public and private was part of “research.” Bodies I battled ranged from IRS to county and private units. 

It was one of three books I now wrote on social welfare, having never studied a word on the subject or heard of it. Here were philanthropy and foundations; adoptions and birth control; charity; government programs of aid; religious units financing things. Prentice Hall went all out to produce the 5,000 handsome copies of California Social Welfare and they disappeared. Next I drafted, over and over, on a full year grant, Welfare in America, a beautiful book including photographs and poems. Oklahoma issued it twice. 

Then after exciting New York City committee work with the American Heart Association, a weekend a month, I wrote for them the influential, The Heart Future. That newsworthy book made the news columns of the New York Times.

This yesterday history major was now to be interviewed in New York City repeatedly by organizations wanting me to work for them. I flew, from Santa Monica each time, but my collie said “no” to leaving the Pacific Ocean permanently. (There were offers, and quasi chances. One, possibly, was to direct the national FDR Infantile Paralysis unit. Rockefeller checked in with a research study. A mental health unit wanted me. All NYC.)

Groups stepped forward to help me along. I was the editor for things American at a great encyclopedia, but despised the working conditions. Then I was part of the scholarly Bureau of Medical Economic Research at the American Medical Association in Chicago. My financing during those years is of little interest but I do think it pertinent to mention that at one point when first enrolled for unemployment insurance in Chicago, I was referred to Midas Muffler, who wanted a head of “Research.” I wish I had ventured a visit, so I might say at this point something about “history and mufflers,” but I accepted a great alternative offer at that very moment.

We moved around, and we changed allegiances, but we survived. We were solvent. I was in a Marquis book already (and later another) but renown was in no way as important as contentment.

However, we must admit at this point that there had been for me a happy marriage in 1944 and the birth of two exceptional children. All three made their marks in life, in a very big way.

Now we turn to the tour-de-force: I became a general, final, editor for administration at The RAND Corporation, THE think tank, in Santa Monica. Enroute to the next step, I had aided three famous intellectuals prepare their books, for a year, half a year, a quarter of a year, fulltime in each case. Those books amounted to something! They were on FDR, Space, and thermonuclear war. I had to learn in a hurry and do a “perfect” job. Then I helped on a book about the cost of ulcers to society; planets in other systems; Laos; and more. That history major had built me for a think tank life.

I founded two oral history projects (RAND’s, Truman Library). Did maybe twenty long and really vital interviews at RAND. I also ghosted an important Harvey Mudd speech for the general manager. They used it to raise funds for some time, I heard.

Now came misfortune, as funding modifications unexpectedly demanded by the Air Force put me in a bind. How would I as a history major like to edit, henceforth, for engineering? I wouldn’t. The sciences? No, no. Sorry, children, I know you love the ocean and I love town hall and other things, but “it’s over.” Opportunities with gigantic corporations in “aerospace” got a no.

I had a few tough months. Interviewed by “history chairmen” I always heard “but you’re senior to many history faculty and certainly to me!” Clark Kerr in two pages said his California system didn’t hire interdisciplinary people. I should have waited a decade; then they sure did!

Now in the mid 1960s, I rejoined academia. That is, I came in at “the top” of a rather small place: Professor of History and Social Science and Chairman, Social Sciences Division, at up and coming Southern Oregon College in Ashland on a twelve-month contract. I did it for seventeen years, most of it anyway. Directing and living with seven departments I found I had developed all kinds of—what shall we say?—maybe “talents.” I could DO things and avoid many hazards.

Sudden illness (a heart infarction) in time brought early retirement. It was the same thing that hit LBJ in 1955. Retirement? Really? 

There isn’t much drama from 1980 to 2018. It’s been articles, and books all over the place, including four months (with my wife, Beth) working for Chapman College’s World Campus Afloat. She was secretary to the cigar smoking ship’s dean. Lord!

It may be of some interest that during my years as Quasi-Dean teaching regular sessions and summers, I taught twenty-three different courses. It was necessary to fill slots when abruptly necessary. 

There was membership for two decades on the United States Civil Rights Commission for Oregon. President of the Rogue Valley Symphony. Earlier, it was Sigma Chi; now it was fifty years and more in Ashland Rotary. Son was an Eagle Scout, daughter an ardent Girl Scout.

So: Let’s talk about “history” not quite in the erudite manner to be found now and then on the History News Network, but as, well, something I blundered or maneuvered into in the 1930s, ignored in the early 1940s, lived with solidly thereafter to 1951, and apparently got paid enough when affiliated with it to support a family—and to be happy—for several decades (actually, a lifetime).

I see nothing to be gained at this point in conversing about all that “who’s who” and “distinguished” stuff I picked up enroute, nor do I want to list books I wrote (eighteen) or helped others to write (maybe ten). Do take note of this plain fact: In my years as a historian I gave very little attention to the idea that I was “out of the ordinary” and said little about it—despite having endured and profited from nine (yes, nine!) years of university instruction, all told. 

I would like to say here that “any history major could have done it.” But I really don’t know. I had handled major morning paper routes in high school; worked in my dad’s engineering office on ordinary stuff in summers; gotten little or no advice or “tutoring.” You could have. Yes.

I guess what I want to say is that one nice thing about majoring in history is that you may keep getting abler. 

Back in the early 1930s I had no idea what I wanted to be. Taking history courses was “a way out.” It postponed decision. I kept getting more knowledgeable, yes, apparently smarter, but: I didn’t have to do anything about it. The world around me kept thinking I could DO whatever they had vaguely in mind. Sure enough, I mostly could. They didn’t seem concerned and neither did I.

Over a period of twenty-three years in the Naval Reserve, I was forced to take and teach all kinds of odd courses. Nothing to it. I always took them—yes, and over time, taught them, too. I can run things. Now, where did I learn that? I can DO things. How come? A 355 page book on SPACE requested by Congress was prepared by twenty geniuses in 1959. I was asked to precis it to a mere seventy-two pages. Yes!

Some are entranced at my editorship of American history, geography, and biography at the Encyclopaedia Britannica. I did do significant and important work there and liked the occupation of an editor.

I guess that those twenty-five books per “field” at Stanford on history, political parties, union labor, really seven fields in all, taught me something in addition to history, right? That is, “I can survive and sorta prevail in our world.” Why not? I did major in history! 

I am published in quite a variety of learned journals, because a history major refuses to be sequestered. The Bornet bibliography goes to maybe 30 pages of fine print, 1933 to date. 

So choose your major, you male or female student enrolled “somewhere.” There is a future family out there for you to create and support by being a teacher or professor—or lots of other things. Your father and mother are going to have to assume that you do know what you’re doing by majoring in history. Have a good life, next generation historian, if that’s what you decide you want—and life decides to let you become! Welcome to my academic fraternity, and good fortune attend ye, as you live from now to the very end of your highway. 

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170747
We Wanted to Publish an Illustrated History of the Holy Land. We Couldn’t Make Everybody Happy.

Even in these troubled times pilgrims flock by the thousand to the Holy Land. Many have a religious motivation: they want to walk “in the footsteps of Jesus” or to see the land of the patriarchs. Often, however, despite the best spin that the tour guides put on it, what they actually see is “the land of Herod” and the buildings of the Crusaders or the Ottomans. What is more, when they consult their digital maps or guidebooks, they will not find “the Holy Land” anywhere depicted. As a political definition of turf it simply does not exist. Indeed, it occurs as a name only once in the Hebrew Bible/Old Testament, never even once in the New Testament, and again it is extremely rare in the Qur’an.

In devising this Illustrated History we were therefore guided by other than purely political—or sometimes even historical!—concerns. Rather, we wanted to outline the history of that part of the Levant that has seen the birth of two of the three great monotheistic faiths and has been of central concern to the third. This has had three particular consequences.

First, in terms of geography the boundaries we have worked with have been varied, depending on where the action relevant to the faiths was based. While the region around Jerusalem is always most prominent, it was usually necessary to go further north as well, but only sometimes to include, for instance, the territory of present-day Jordan, Syria, and Lebanon, of ancient Assyria and Babylon (in modern Iraq), and even of Turkey, the heartland of the Ottoman empire.



Second, our chronology has also been determined by the goals of the book. Although some of the earliest human remains world-wide are found in the Carmel caves, and although Jericho has often been called the oldest city on earth, we in fact start with Abraham, whose name has in recent years come to be associated with the three Abrahamic faiths. And perhaps I might add here that we stop in 1918, not because that was the end of the Holy Land but because we wanted resolutely to avoid any danger of contaminating that notion with current political claims and counter-claims. We are historians, not specialists in international relations.

And thirdly, while of course the political history of the nearly three thousand years we cover is given full attention, the focus throughout is on how the land has provided the setting for the development of three great faiths. To help with this, we decided not only to include chapters on what I like to call the normal “hunks of history,” but also three on matters peculiar to religion but in one way or another shared by all of them: pilgrimage, holy places, and sacred texts. Such social/cultural matters are quite as much an element of history as kings and battles.

So at a minimum, this book should prepare people for a better understanding of what one actually sees on the ground, whether visiting in person or only as an armchair pilgrim. It is remarkable how an increased knowledge of history can enrich the appreciation of landscape and the built environment, turning what sometimes looks like a confused jumble of remains from widely differing periods into an intelligible whole. I have seen people’s appreciation of the Church of the Holy Sepulchre, for instance, completely transformed once it is explained to them why things are as they are and how they developed from very different-looking beginnings.

There is a deeper matter, however, which is of interest, if not challenge, for all whose knowledge of the history of this region is derived primarily from the sacred texts of one religion or another. Proper historical narratives are the result of painstaking research which combines information from a variety of sources, and in some cases these were not available to writers long ago or even, from time to time, quite recently. To mention just the most obvious of these, it is only really in the past century that we have discovered and deciphered many of the records of the nations of antiquity that impacted the Holy Land: Assyrians, Babylonians, Arameans, Egyptians, Persians, and so on. These obviously often fill out the picture presented in familiar sources (such as the Bible in antiquity) and help us to realize that both sides had their axes to grind in ways that historians have to take into account.

Again, archaeology has refined and developed its techniques in transformative ways so that excavation is no longer just a treasure hunt but a vital tool for filling in the longer term trends in agriculture, domestic as well as public architecture, and other such cultural stages in the way of life which determines much of the path along which history evolves. This is all obviously to be welcomed.

These newer data can also challenge deeply-held positions that are based only on an inherited knowledge of the story. Naturally each author in this volume has to present the data with all the clarity that she or he can muster, and we are aware that sometimes people will find this disturbing. The authors of each chapter were therefore specifically asked not to alienate the reader but rather to lead him or her gently through the evidence that cannot be ignored by anyone wanting to maintain their intellectual integrity. 

As editors we concluded our preface by saying that “we do not believe that the results of modern historical research are in any way incompatible with the continuing use of the Bible as scripture.” We also added that our hope is that “through such understanding appreciation of what each of the faiths had to offer may be deepened without the hostile fragmentation which has characterized much of the history we trace here and which still, sadly, is prevalent in the modern world.”

 As is often said in the present climate, the Holy Land sometimes sounds like a contradiction in terms. It is up to each reader to work through how their faith or their cynicism will take these points on board, but of one thing we are certain: it cannot be done by refusing to face up to the history which is here outlined by expert scholars on the basis of the best knowledge currently available to us.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170743
Is Monticello Monetizing Race at Jefferson’s Expense?

Thomas Jefferson's Monticello, By Martin Falbisoner - Own work, CC BY-SA 3.0


Consider this simple syllogism: Slavery is bad; Thomas Jefferson owned slaves; so, Thomas Jefferson was bad. Consider this simplistic precept: Racism is bad. Both are anything but profound and certainly not illuminating, but they typify, with due consideration for hyperbole, the quality and blinkered approach to Jeffersonian scholarship in the past several decades. The focal issue has been Jefferson’s racism, and the issue within the issue has been his assumed relationship with Sally Hemings. Jeffersonian scholarship has become an exercise in battology—a useless, fatuous repetition of the same claims but with a slightly different twist. “Jefferson was a racist but he really loved Sally Hemings” versus “Jefferson was a racist and he raped Sally Hemings,” and so on. Those twists are what merit publication. The collision of radically different, but historically reasonable, ideas, needed for advances in historical scholarship, has become anathema.

Yet there is a place for simple syllogisms and simplistic precepts. They were, for instance, a significant part of a youth’s education in antiquity. The ancient Greek and Roman Stoics commonly used such syllogisms and precepts to hammer home lessons concerning happiness. Simple syllogism: One always benefits by having more of a particular virtue; too much wealth can bring ruin; so, wealth is not a virtue. Precept: Virtue is the sole good. For the Stoics, such simple syllogisms and simplistic precepts were not meant to be profound or provocative. They were uttered to reinforce virtuous behavior with the aim of equanimity. They were especially useful for children, whose rational faculties were too undersized and inchoate to appreciate the richness and complexity of circumstances, which, for the Stoics, were the true determinants of virtuous behavior for persons of full rational maturity, not simple syllogisms or simplistic precepts. Uttering that wealth is irrelevant to happiness is itself not sufficient to mollify someone experiencing bankruptcy. Yet truly knowing that wealth is irrelevant to happiness requires, in the words of Swiss psychologist Jean Piaget, complete assimilation and accommodation of the principle, such that it becomes part of the fabric of a person. That takes decades of agonizing critical thinking—of thinking through some issue to understand how it applies in all circumstances, of thinking through what makes a life meaningful. Once assimilated and accommodated, living consistently with the principle is easy.



Issues of slavery and racism are equally complex, and cannot be understood by uttering simple syllogisms or simplistic precepts, as scholars today are wont to do. Slavery is known to have been practiced in ancient China as early as the 18th century B.C., and it continued to be practiced into the twentieth century. Slavery was also practiced in India, in parts of Asia, in the Middle East, and even in Africa, where Blacks enslaved other Blacks. So prevalent was the institution that it was taken for granted prior to the American Revolution. Said John Jay: “Prior to the great revolution, the great majority or rather the great body of our people had been so long accustomed to the practice and convenience of having slaves, that very few among them even doubted the propriety and rectitude of it.” It is mostly with the ascendancy of Enlightenment thinking, with its twin postulates of liberty and equality, articulated for instance by Jefferson in the Declaration of Independence, that slavery became a vital and vibrant subject of scientific, moral, and political discussion.

The driving force behind Monticello, the Thomas Jefferson Foundation, has for at least two decades been using the issue of race as an enticement for bringing people to Monticello. It is unclear whether that strategy has worked to bring visitors, but it has brought grant money. It began with the Foundation’s scholarly take on the 1998 DNA report on Jefferson’s paternity, misleadingly titled “Jefferson Fathered Slave’s Last Child.” The Foundation’s own report, published in 2000, stated that Jefferson was very likely the father of Eston Hemings and probably the father of all the other children of Sally Hemings. In June 2018, and by appeal to no new evidence, it pronounced, in a new exhibit on Sally Hemings, that the relationship was fact: “In the new exhibit exploring the life of Sally Hemings, her choices, and her connection to Thomas Jefferson, as well as in updates to our related online materials and print publications, the Foundation will henceforth assert what the evidence indicates and eliminate qualifying language related to the paternity of Eston Hemings as well as that related to Sally Hemings’s three other surviving children, whose descendants were not part of the 1998 DNA study.” The Foundation has also raised the question of whether Hemings was raped by Jefferson. What a tantalizing suggestion, and to my mind a sleazy one!

That they claim, concerning Jefferson’s paternity, to be merely asserting what the evidence indicates is to me astonishing. As one who has taught logic and critical thinking for some 30 years and has published four books in that area, I admit to being flummoxed by that assertion. Thorough analysis of all the relevant evidence points to the lack of a relationship, though I admit that I cannot make that claim with a high degree of probability, and I have never done so. We just do not know! Here is proof that something suspicious is happening: if the evidence is so compelling that we can safely state the relationship is factual, why could TJF not have seen that some 20 years ago? The evidence has not changed.

The situation at Monticello is toxic. They are unwilling even to try to settle the issues of Jefferson’s paternity and of his avowed racism by rational debate concerning the evidence, or even concerning what ought to count as evidence. Members of TJF—and many of them are, I suspect, too unfamiliar with Jefferson to be judges of the issue of paternity—have elected themselves the sole arbiters of Thomas Jefferson’s legacy, which is no longer open to debate. Their influence extends to the Robert H. Smith International Center for Jefferson Studies, run by the vice-president of TJF, as well as to the University of Virginia Press. They control who comes to the center and which books related to Jefferson get published. TJF’s depiction of Jefferson, jaded as it is, has won the day. It is no longer thought necessary to recognize those who disagree with TJF, to read their arguments, to assess those arguments critically, and to engage in debate with them.

What is the next step?

The next step, doubtless, will be to remove or disallow all the excellent books in Monticello’s library that argue for skepticism or against paternity: Dr. Robert Turner’s Scholars Commission Report, Cynthia Burton’s Jefferson Vindicated, and William Hyland’s In Defense of Thomas Jefferson. There is no need to remove my Framing a Legend: Exposing the Distorted History of Thomas Jefferson and Sally Hemings, or other books that paint Jefferson in a favorable light. It, and my numerous other books on Jefferson’s philosophical thinking (e.g., Thomas Jefferson: Moralist; Jefferson’s Political Philosophy and the Metaphysics of Utopia; and Thomas Jefferson’s Bible: With Introduction and Critical Commentary), which have nothing to do with Sally Hemings but take us deep into the mind of Jefferson, have never been in their library.

What is worse, as New York Times reporter Farah Stockman says, is that TJF “is phasing out the popular ‘house tour’ of the mansion, … [thereby] radically changing what is experienced by the more than 400,000 tourists who visit Monticello annually.” Why would TJF phase out a popular tour? Why is that significant? Tourists will no longer see the Great Clock; the Native American artifacts; the numerous paintings (e.g., Bacon, Locke, and Newton); the many busts (e.g., Jefferson and Hamilton, face to face); the library sorted according to Memory, Reason, and Imagination; the inventions and gewgaws (e.g., the dumbwaiter, the revolving bookstand, and the polygraph); among other things. Tourists to Monticello will be kept outside to see Sally Hemings’ room and the slaves’ quarters at Mulberry Row. Jefferson’s beloved Monticello might soon be a shrine to Sally Hemings, even though we do not know whether she and Jefferson had a relationship (see my article on HNN)!

To the objection that Monticello ought to be principally about Jefferson and not about slavery or Sally Hemings, Annette Gordon-Reed replies: “Some people come here and say, ‘I didn’t come here, to a slave plantation, to hear about slavery.’ There’s nothing to do but keep pushing back.” To her, it has become a shoving match. Monticello is not Jefferson’s home but a “slave plantation”—her agenda is plain—and visitors to it will hear about that whether or not it suits them. The comment plainly betrays the political posture of TJF. There is a good reason to keep on shoving: Monticello was awarded NEH grants in 2018 totaling nearly one million dollars! The focus on race is being handsomely rewarded, even if truth is deserted. Who cares if the number of visitors radically declines, so long as grant money keeps pouring in?

While it is laudable that members of the TJF wish to be viewed historically as paladins of human rights, they are doing so by constructing an image of Jefferson that is warped by political ideals. Their Jefferson is an opportunist, hypocrite, racist, and perhaps even rapist, and they do not give voice to scholars who disagree. The climate is authoritarian—certainly not in keeping with Jefferson’s republican thinking.

Historical truth and a pro-human-rights agenda are not inconsistent. But in pressing too hard, too fast, for the latter, we sacrifice the former, and the accounts of the past we leave behind to future generations become no more reliable than Homer’s Iliad—a story founded on historical truth, but overwhelmingly colored by fancy.

Yet I am, like Jefferson and perhaps naively, convinced of the good judgment of people over time. As the issue of race is a hot potato, Jefferson will continue to be a fall guy, and perhaps the quick monetary rewards of a focus on Jefferson’s racism will become a template for short-term success at Poplar Forest as well. One can only hope that will not be the case—that the Corporation for Thomas Jefferson’s Poplar Forest will not sell its soul for money.

In the meantime, we must do what we can and also exercise patience. As Jefferson writes to Dr. Thomas Cooper (7 Oct. 1814): “We cannot always do what is absolutely best. Those with whom we act, entertaining different views, have the power and the right of carrying them into practice. Truth advances, and error recedes step by step only; and to do to our fellow men the most good in our power we must lead where we can, follow where we cannot and still go with them, watching always the favorable moment for helping them to another step.”

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170713 https://historynewsnetwork.org/article/170713 0
75 Years Ago Ernie Pyle Wrote a Tribute to a Dead World War II Soldier

Captain Henry T. Waskow, United States Army - By Source (WP:NFCC#4), Fair use


In this war I have known a lot of officers who were loved and respected by the soldiers under them. But never have I crossed the trail of any man as beloved as Captain Henry T. Waskow of Belton, Tex.

Journalist Ernie Pyle was a household name during the Second World War. Millions at home consumed his dispatches from various theatres of battle, carried in more than 300 Scripps-Howard newspapers across the country, with the same devotion they reserved for the V-mail they received from loved ones in harm's way overseas.

Pyle's most famous column eulogized Henry Waskow of the 36th Infantry Division, who died December 14, 1943—75 years ago—when he was struck in the chest by German mortar fire in the fighting that raged in the mountains south of Rome. Had it not caught Pyle's attention, his death would have been lost amid the casualty numbers coming out of that exceptionally bloody front.

For three days Pyle waited at the bottom of the trail where mule teams were bringing down bodies. “This one is Captain Waskow,” he remembered hearing that cold, moonlit night as he eavesdropped while the men of Waskow's unit approached, one by one, to offer their sad and awkward goodbyes.

Pyle knew when to let a scene speak for itself, and the starkness of the words he chose to describe this one only added to their weight and power.  “I sure am sorry, sir,” said one soldier, his voice trailing off.  Another knelt down and, for several minutes, held his friend's hand, lost in thought. Before joining the others as they moved up the road to the next assignment, he paused for an extra moment, to “gently straighten...the points of the captain's shirt collar, and then he sort of rearranged the tattered edges of his uniform around the wound.” 

The intimate, closely observed details of this column struck such a chord with Pyle's audience that Arthur Godfrey read it aloud on his syndicated radio program, and it would be used in war bond drives in much the same way as Joe Rosenthal's photograph of the flag raising at Iwo Jima a year later.



And so we should pause, to remember the work of Ernie Pyle, and the life of the young army captain he so nobly memorialized—one of eight children, born to a family of Dust Bowl sharecroppers, the class president at his tiny high school, an aspiring teacher, a humble and caring man who went the extra mile for his men, and who, like so many others, sacrificed his future, for ours.     

Today Henry Waskow lies buried in a military park in Italy with thousands of his fallen comrades, a long way from home. “I would like to have lived,” he wrote in a heartfelt letter to his family, to be opened in the event of his death.   

But since God has willed otherwise, do not grieve too much, dear ones, for life in that other world must be beautiful, and I have lived a life with that in mind all along.

I will have done my share to make this world a better place in which to live. Maybe when the lights go on again all over the world, free people can be happy and gay again.

If I failed as a leader, and I pray God I didn't, it was not because I did not try.

You can read Pyle’s tribute to Captain Henry Waskow here.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170707 https://historynewsnetwork.org/article/170707 0
In Memory of the Man Who Was Identified with the “God Is Dead” Movement


God still spoke to his prophets during the 17th century English civil wars. Oftentimes God thundered in paradoxical aphorisms that sounded heretical or blasphemous. Sometimes, two centuries before Friedrich Nietzsche, God would declare his own death. For the members of the dissenting religious sects that functioned as the pamphleteers for these ideas, theology was a radical practice. Faith didn’t exist to bolster things-of-this-world, but rather God’s decomposition allowed for a fecundity where new meanings could grow. Subsequently, groups with colorful designations like the Muggletonians, the Levellers, the Diggers, the Grindletonians, the Philadelphians, the Behmists, the Familists, and the evocatively (and appropriately) named Ranters produced some of the strangest and most beautiful theological work in Christian history.

Theirs was a fervency before the sublime altar of Nothingness, a wisdom prostrating itself before an empty throne, in keeping with the 18th century visionary poet and prophet (and possible enthusiast for Muggletonianism) William Blake, who wrote that “men forgot that all deities reside in the human breast.” This dictum reverberates in the prophetic work of Blake’s great theological reader, our contemporary Thomas J.J. Altizer, who died at the age of 91 on Wednesday, November 28. His former student Alina Feld, an adjunct professor of philosophy at Hofstra University, remembered Altizer as “an indomitable gadfly awakening all slumbering, lukewarm hearts and minds,” a voice more at home with the radicals of the 17th century than with the bromides of contemporary conservative Christianity.

Altizer is associated in the popular imagination with the best-selling 1966 Time article about his and others’ contributions to the “Death of God” movement which tried to reconcile Christianity and Judaism to Nietzsche’s infamous declaration about God’s mortality in The Gay Science. Altizer, who was then a professor in the religion department of Emory University, was joined by Fr. Paul van Buren at Temple University, Rev. Gabriel Vahanian, Rabbi Richard Rubenstein and William Hamilton. These scholars drew upon the philosophy of Nietzsche, as well as more recent theologians like Dietrich Bonhoeffer and Paul Tillich, asking what it meant to have faith in a faithless world. Altizer wrote that “modern man has known a moral chaos, a vacuous nihilism dissolving every ground of moral judgment, which is unequaled in history,” where God’s silence is louder than any Hosanna.



Consequently, a radically honest theology must be crafted in response. Time religion editor John T. Elson explained that this diverse assortment of thinkers “believes that God is indeed absolutely dead, but proposes to carry on and write a theology without God.” Radical theologians understood that to confront the horrors of Auschwitz and Hiroshima, one couldn’t return to the platitudes of mainstream Christianity as uttered by someone like Billy Graham (who denounced Altizer from his pulpit). In opposition, Altizer declared in Toward a New Christianity that “If Protestant theology has reached the point where it is closed to the challenge of atheism, then it has ceased to be the intellectual vanguard of Christianity.”

Altizer was arguably the most radical of the thinkers featured in Elson’s essay, though what endured in the wider culture was not its author’s quick tour but rather the stark red-and-black cover which asked “Is God Dead?” Altizer attracted instant notoriety, which dogged him throughout his life. Even more problematic, however, was the reduction of such a subtle and multifaceted theology to a single magazine cover. Lissa McCullough, an adjunct professor of philosophy at California State University Dominguez Hills and editor of The Call to Radical Theology, explained that the controversy has threatened to obscure Altizer’s significance. In her opinion, “Altizer began to work out and publish his most original theological work… well after the so-called death of God debate faded,” with that infamous cover threatening to overshadow the “most important theologian of the second half of the twentieth century.”

Born to wealthy parents in 1927, Altizer was raised among Charleston, West Virginia’s high society. A direct descendant of Confederate general Thomas “Stonewall” Jackson, Altizer would reject the reactionary politics of his upbringing, as well as the conservative Christianity which acts as its handmaiden. Writing in 1966’s The Gospel of Christian Atheism, Altizer argued that “the radical Christian maintains that it is the Church’s regressive religious belief… which impels it to betray the present… reality of Christ.” Altizer received his bachelor’s, master’s, and doctoral degrees all from the University of Chicago’s famed divinity school. During this period, he attempted to become an Episcopal priest, but was rejected after a psychological evaluation in which he revealed both his personal experiences of Satan and his revelation of the death of God. After a short stint as a professor at Wabash College, he moved to Emory University, where he would first develop his infamous theology.

His appointment at the school lasted from 1956 to 1968, and while the university was a defender of his academic freedom, Atlanta was not necessarily the most congenial place for so heretical a thinker – the Methodist Church which governed Emory having officially denounced him. He spent the final three decades of his academic career at SUNY Stony Brook, retiring to the Poconos in 1996, where he continued to write; the last of his more than 20 books was published just a few months ago. Too often ignored by literary scholars, Altizer’s engagement with Dante, Milton, Blake and James Joyce revealed a tradition of interconnected Christian epic, whereby such poetry can be understood as the unfolding revelation of a single work. Within those writings, Altizer presents a “comprehensive and systematic accounting of Christian faith,” as McCullough told me, but unlike more recent systematic theologians, Altizer’s has the “unorthodox twist that it seeks to recapture the transformative apocalyptic energies and potentials of primitive Christianity.”

Altizer can be an esoteric thinker, conversant with poetic metaphor as much as logical syllogism, the better to synthesize Nietzsche and Blake with Paul and Augustine. McCullough explained that “Altizer’s conviction is that the historical church reversed the real emphasis of Jesus’s teachings,” and in that regard he stood with both the ancient Gnostics and the early modern dissenters in excavating a radical Christianity opposed to profane reality. Altizer wrote that “the original heresy was the identification of the Church as the body of Christ,” a species of idolatry that inevitably corrupted faith. Identification of such corruption is a mainstay of political theology. It was the diagnosis of Lollards denouncing the medieval Church, of the Protestant reformers denouncing Rome, and of Anabaptists denouncing everyone. What makes Altizer fascinating is that he stands against any organized Christianity, seeing the very idea as contradictory. His theology holds that the “exalted and transcendent Lord is a sufficient sign to the radical Christian that Christianity has reversed the movement of the Incarnation.” To paraphrase the eastern faiths which fascinated Altizer, the spoken Christ is not the real Christ, and should you meet Jesus on the road you must crucify him.

His is a systematized theology of all of the subversive, radical, counter-currents implicit within Christianity. I see him as the most full-throated modern manifestation of those groups mentioned earlier. He is of the tradition of John Reeve, founder of the Muggletonians, who demanded that you should “look into thy own body, there thou shalt see the Kingdom of Heaven and the Kingdom of Hell.” Altizer draws from the same well-spring as the preacher Theaurau John Tany who sang that the “Soul is of the essence of God/There is neither hell nor damnation,” or of the Ranter Abiezer Coppe’s belief that radical faith existed to “overturn, overturn, overturn.”

Altizer and his predecessors sang in tongues of fire; they were theological astronauts. McCullough told me that Altizer’s thought was “committed to the ultimate sacredness of life in this world, here and now, rather than a heavenly realm after death,” and in that I hear Reeve’s thunder again. Far from a purely atheistic negation, this perspective was intoxicated with God, for by erasing the distinctions between the almighty and everything else it held out the promise of making the world a Paradise.

Anyone who met Altizer couldn’t help but be struck by a similar God-intoxication, even with his reputation for “Christian atheism” (or perhaps because of it). Seventeenth-century non-conformists understood that theology and rhetoric are equivalent, and spectacle was central to their teachings, be it the Digger Gerrard Winstanley destroying the hedges which separated private estates, or the Quaker James Nayler riding into Bristol on a donkey in imitation of Palm Sunday. Having corresponded with Altizer through email and met him several times, I can attest that he had a similar flair. Meeting him at the first conference of the International Society for Heresy Studies, held at New York University in 2014, my impression was of having conversed with an actual prophet. Imagine Blake in a red blazer over a Hawaiian shirt, standing at Broadway and W. 4th.

Altizer’s keynote was a fire-and-brimstone sermon for God’s funeral, delivered in an Appalachian twang. Gregory Erickson, an NYU English professor and an organizer of that conference, recalls that Altizer signed off an email with “For I am a Preacher at heart and a Preacher of that Death of God which is Resurrection and Apocalypse at once.” There was something arresting about these “statements [that] would come thundering out of him, in his writing and in person,” as Erickson recalls. Jordan Miller, co-editor of The Palgrave Handbook of Radical Theology, which contains some of Altizer’s last work, remembered his early encounters. After a lecture to an undergraduate class in which Miller was enrolled, Altizer gave him a copy of The Gospel of Christian Atheism. Miller explained that “Like a good, southern preacher” Altizer kept copies of his books to give to “any independent bookstore he passed on his travels.”

For Altizer, faith and doubt were unified in their own strange ecstasy, and his paradoxical divine atheism gestures to a God bigger than the one circumscribed by scripture. Miller told me that Altizer’s rejection of a Christianity in “consort with powerful and oppressive institutions” allows for a new and “meaningful theology in a tragic world.” Altizer’s is a difficult path, the theologian writing that “passage through the death of God must issue in either an abolition of man or in the birth of a new and transfigured humanity.” Yet as Miller explains, this theology is “not escapist. It embraces the world in its love of the world.” From Miller’s perspective, it is precisely this sort of radical faith which is a “true antidote to toxic evangelicalism and the milquetoast theology of the privileged.”

Altizer was the last of the dissenters, the final nonconformist, who promised intimations of salvation in a universe without redemption. He knew the score, writing that for “the Christian who bets that God is dead” there are risks of “both moral chaos and his own damnation.” Yet Altizer was a betting man, and what he gained was a kind of liberty. Such was the Blakean imperative, what he describes in Godhead and the Nothing as “theological language [that] is a truly universal language,” which is the “absolute No… a darkness which is finally the darkness of God.” Blake wrote with thundering truth that “I must Create a System, or be enslav’d by another Man’s.” By such standards Altizer, who died on Blake’s birthday, was one of the freest of people.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170687 https://historynewsnetwork.org/article/170687 0
Now that the Bush Funeral Is Over, We Need to Talk Honestly About the End of the Cold War

Germans stand on top of the Wall in the days before it was torn down - By Lear 21 at English Wikipedia, CC BY-SA 3.0


Related Link How People Are Remembering George HW Bush

Imagine a superpower founded on a revolution inspired by Enlightenment values (often honored in the breach), great violence liberally applied and mostly badly remembered, and a myth of exceptionalism and superior progress that even its critics find hard to fully escape. 

Our superpower’s global reach and bite commands if not the respect then at least the fear of the world in a manner second to no other state on the planet. While its adventures abroad are often less than successful and leave corpses and ruins in their wake, it is or feels so strong that that doesn’t teach it much. 

Its domestic politics are close to stagnation, as visitors can quickly guess from its almost comically decrepit infrastructure so clearly not in sync with the ability of its scientists, engineers, and workers or its claims of indispensable leadership. 



Its ruling elite is irrationally and obstinately wedded to hoary old dogmas – originally imported from dour European ideologists – about the relationship between the economy and politics. Increasing numbers of its ordinary citizens, meanwhile, are not only unhappy about specific policies but – much more worryingly – about the system as a whole, its principles, institutions, and representatives. 

There is a widespread and plausible sense that the elites – in politics, the economy, and the media – have built themselves a privileged world of careerist cynicism, lying as a way of life, corruption without limits or regrets, and, last but not least, gross impunity. 

This loss of faith in the political and social order is reflected in the rise of a desperately dark sense of humor. Especially the young wonder with increasing trepidation what fate awaits them in the world their elders have unmade. Even some former insiders and dissident elite members speak of the deep perversion of the system they know so well. Some citizens are even doubting the literally fundamental ideas and heroes of the original revolution. 

This is, of course, a description of the late Soviet Union, the other superpower of the Cold War and the only state that, for now at least, could ever claim to have – very unwisely – challenged and stood its ground against post-World War Two America, for a while and at crippling cost. 

Despite all the well-known differences, it is also, equally obviously, a description of the USA about one generation after the Cold War ended and the collapse of the Soviet Union. Where doctrinaire anti-capitalism mightily helped the Soviets dig their own grave, doctrinaire pro-capitalism might still do the job for their old nemesis, especially under conditions of expensive militarism, another similarity. 

And where the simple-minded veneration of Lenin, the Soviet founding father, could not survive the deeply unsatisfying reality of the state he created, perhaps the flagrant flaws of American politics may finally end up toppling slave-holding founding fathers from their pedestals as well. 

Certainly, the American Dream has already lived longer than its Soviet rival. But must it live forever? The Soviet one did not, and – most disturbingly – it died with a suddenness that took many observers by surprise. As the title of an influential post-mortem of the Soviet Union has it, “everything was forever, until it was no more.”  

Yet despite this bleak picture – or because of it? – we have just seen a collective outpouring of nostalgic triumphalism. Intriguingly, it focuses on the American leader during whose reign the Soviet Union breathed its last, George Bush I. After his passing, most of the US media have exploited the traditional convention of the eulogy – to speak nothing but good of the dead – to engage in a collective fit of national self-adoration that Pravda might have been proud of. Almost everything about this exhibitionist wave of nostalgia is wrong. Bush I was not a kind king, but a ruthless wielder of power, at home and abroad.

Let’s focus, however, on just one element of this love fest for the powers that be – or at least were – namely the bizarre yet popular claim that Bush I managed the end of the Cold War well. This is factually misleading and chock-full of bad politics as well. Here is why: First of all, the Cold War did not end under Bush I; the Soviet Union did. The Cold War was over by the time Bush I came into power. If we want to be nice to an arch-conservative American president for making a contribution to ending it – by taking Soviet initiatives seriously – that would be Ronald Reagan.

Why does that matter? Certainly not because Reagan must have his share of the glory. Like almost all presidents, he will always be served well enough with adulation, deserved or, mostly, not. What we lose by misdating the end of the Cold War is a sense of how unlikely it was – not, at that point, because of the Soviets, but because of the American establishment. Reagan’s one positive contribution to world history – after the war scares of 1983 which he helped bring about – was to go against the blob. Ironically, he did so precisely because he had that “vision thing” that Bush I would later mock. It was not the WASPs and their vaunted get-things-done sobriety that helped end the Cold War, but the wild-eyed if oddly placed utopianism of a former Hollywood actor. Thus no, this is not a lesson about trusting traditional elites to manage the world well – sorry, Ross Douthat.

We also gain something by conflating the end of the Cold War and that of the Soviet Union – and that is even worse, namely a blinding bias: If we pretend that the Cold War only ended when and because the Soviet Union disappeared, we imply that this was a war that could only end with the total defeat, even the annihilation – if, in this case, mostly peacefully – of the opponent. That is, of course, a favorite illusion of the American right and, alas, center. 

Here, the end of the Cold War morphs into the greatest case of successful regime change yet – and an eternal reminder that there are no alternatives. Thus, the lesson implied is to never seek compromise with irritatingly, obstinately, unbearably other Others but, instead, insist that they become like us, whether they want to or not. Yet, in reality, compromise – if much in favor of the USA – is exactly how the Cold War really ended. 

Put differently, it is a fallacy to believe that the USA won the Cold War because the Soviet Union lost it. Yes, the Soviets did lose it, but America, fortunately, initially only took advantage without insisting on winning. That was the key to its end.

Which brings us to what happened afterwards, namely Bush I beginning to mess up the ensuing peace (an endeavor then continued by his successor Bill Clinton), in at least two ways: Just after the Soviet Union had ceased to exist, in his State of the Union Address of January 1992, he could not resist crowing about America having won the Cold War, quite blasphemously invoking higher powers as well. Indeed, he rubbed it in, making a point of insisting that the Cold War had not “ended” (his scare quotes) but been won.

And he presided over a war against Iraq that demonstrated that the post-Cold War “New World Order” would be one of America having its way even more than before. In both instances, he did not create but helped along the Russian bitterness and self-pity that has since grown strident. He also promoted the American elite arrogance that has since grown self-defeating.

It’s not as if America could not learn anything from looking at Russia, but it has a habit of getting the lessons wrong. Watching Putin, it fails to see that his attempts to influence its politics are much less important than the deep capitalist-oligarchic convergence between the two countries. 

Looking at the end of the Soviet Union, America fails to see that what lost the Cold War for its old best enemy was the hubris of super-powering-while-declining. What then killed it was its own failure to address its glaring flaws at home quickly and effectively enough and, of course, a ruthlessly self-interested elite that put its own careers, power, and profit above everything else.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170676 https://historynewsnetwork.org/article/170676 0


An interview with HNN publisher and founder Rick Shenkman. He is retiring at the end of 2018, when the website will be taken over by George Washington University under Kyla Sommers.

MAH: Rick, you have a vita of which any professor of history would be proud: best-selling author; presidential historian; contributor to the NBC Today Show; producer, writer, and host of a program on The Learning Channel; journalistic lecturer; and founder and editor of History News Network; among other things I’ve failed to mention—all in the interest of advancing our appreciation of history. What got you interested in history?

RS: I always liked history in school, a love I inherited from my mother, I’m sure. She loved history and politics.

I was better in history than other subjects in school, so that had something to do with my liking history. You like a subject in which you do well.

But until I was in high school I didn’t really excel in any subject. Then one day, in the summer of 1970, between my sophomore and junior years, I happened to walk into a bookstore in Ridgewood, NJ, which was next to the town I grew up in—a town with a funny name, Ho-Ho-Kus—and happened upon the paperback edition of Richard Hofstadter’s The Age of Reform. I think I was drawn to the striking red, white and blue cover. It cost just $1.65 and I bought it. It was a snap decision and it changed my life.

I devoured the book in four days. From that moment on I was hooked. Hofstadter was the match that lighted a fire that burns to this day. I think of it as an intellectual awakening. It made me realize there are complicated forces that drive human behavior and by studying documents you can figure out what makes humans behave the way they do.

I have had only one other intellectual awakening like this. That came when I read Daniel Kahneman’s Thinking, Fast and Slow (2011). That set me on a path I am still on today, which is to understand how our neural engineering drives the decisions we make.

After finishing The Age of Reform I went on to all of his other books, reading almost all of them by the end of high school. (A school administrator, hearing of my achievements in history, called me into his office. He was skeptical that I had read all these books, so he grilled me about them. Apparently, I passed muster. In June 1972 I was selected as Junior Rotarian of the Month and given the high school history award. For somebody who'd done middling work for most of my school career these were heady times. Junior Rotarian of the Month!)

Another influence was Thomas Bailey, the Stanford historian who wrote The American Pageant. That was the school textbook we’d be reading the following year in U.S. History I. I ordered a copy from the publisher, DC Heath, and basically memorized it by the time school began in the fall. By mistake they sent me the teacher’s Quiz Book along with the textbook so I was able to test myself as I went along. I have always been grateful for this blunder.

When I showed up for class in September I told my teacher I had read the textbook cover to cover and wanted to do independent study. He quizzed me to make sure I really knew the material and I passed. I spent the year writing an 85-page term paper on the agrarian movement. The next year they let me do another independent study project, this time on the liberal and conservative tradition in American history. This term paper ran 163 pages. I even got it bound.

Because I had never been a good student until this point I came to regard history and my identity as one. This feeling has persisted to this day.

I should also mention three of my high school teachers. They loved my enthusiasm for history and nurtured it. I owe them a great deal. Two of the teachers were historians.  I audited their classes when I was given an exemption from History I and II.  Milo Okkema was a conservative who tutored students in the Great Books.  Harry Ahearn was a classic New Dealer who lionized FDR.  Then there was Frank Asher, who taught sociology and Afro-American History (yes, that's what it was called back then).  He was an old leftist who was reputed to have once been a member of the Communist Party. Going from class to class forced me to confront the fact that history is an argument without easy answers, which helped me avoid the impression left by Bailey’s textbook that history consists of a set of facts one commits to memory.

I came across a wonderful quote from Henry Steele Commager, which I used at the front of one of my two big term papers: "You become a historian not so much because you're interested in history, but because you admire people who are interested in history." This captures the human element in my love of history. I loved those teachers of mine, and because they thought history was important, I thought it was too.

MAH: Hmm, I can relate. I had a similar experience while reading Aristotle’s Politics very early in the morning while working as a security guard at a movie complex, under construction. Yet I never met Aristotle. Okay, a right turn! Having over the years read so many historians with so many different approaches to history, you certainly have developed thoughts on historiography. What is the Shenkmanian philosophy of history? Is imagination important?

RS: To begin with, I’m not much of a philosopher. One reason I decided to become a historian was that historians generally resist philosophizing. We prefer the concrete. And so do I. We are wary of philosophy. Down that rabbit hole is Spenglerism, Toynbeeism, and other isms. I am downright allergic to that sort of thing. History isn’t a science and there aren’t any laws to history. You study the facts as best as you can accumulate them and look for patterns. You don’t start with a pattern and work backwards.

But having said that, I think it’s rather foolish to think that historians lack a philosophy. As R.G. Collingwood (to whom my Vassar professors exposed me some four decades ago) argued, there is a set of assumptions historians make about the world that amounts to a philosophy even if we don’t strictly call it that.

First and foremost is the belief in contingency, the idea that consequential things often happen in pell-mell fashion. We don’t believe that the order of the universe is decided at the direction of god or man. Shit happens, as the old sixties saying has it.

Second, the world is complicated. As a result of studying history I try to never let the words “it’s simple, really” cross my lips. Nothing of consequence is ever simple. Studying history cures one of the curse of oversimplifying. Okay, got that off my chest!

You mention imagination. It’s vital to the study of history. Anybody who has ever faced a blank piece of paper when writing a story knows this. The facts don’t add up to a story by themselves. One has to arrange them in a certain order for the story to emerge. That takes imagination.

Now having said all this, I have got to admit that in the last decade, as I delved into the research for Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books, 2016), I came away with a new respect for the insights of social scientists. They start with a hunch, run experiments, and write up their findings in a report. I have derived a great deal from studying their work. It’s given me a new way to look at history. I think that historians make a mistake in writing off the work of social scientists, and many do.

I’ll give you a quick example. The Theory of Affective Intelligence posits that when people become anxious owing to a mismatch between their perceptions of the world and the way things actually work they pause to try to resolve the dissonance they feel. In this moment they become ready to reconsider their commitments. Example. Say you voted for Richard Nixon in 1972 and then followed the news about Watergate. At some point the disclosures no doubt made you feel queasy enough to wonder if perhaps Nixon was the snake his enemies always insisted he was. That queasy feeling can be scientifically measured in your brain. Knowing this has changed how I think about turning points in history when people decide to change their minds. It makes me look for signs of the moment when, as a social scientist friend of mine puts it, "the burden of hanging on to a belief becomes greater than the burden of changing it."

Another thing I’ve learned from the social scientists is that it’s wrong to think of humans as blank slates. When I was in school in the 1970s this was bien pensant. I think too many historians remain wedded to this belief. But it’s bunk. I am convinced by Evolutionary Psychology that a grammar of human behavior is embedded in our brains. It doesn’t govern all of our reactions and it certainly doesn’t help us predict the course history takes. But we make a grievous mistake if we think we just pop into this world willy-nilly to be shaped wholly by our culture. This is nonsense.

My simplest example is storytelling. All humans tell stories about themselves and their societies. We aren’t computers. Scientists have even found specific places in the brain where we invent stories to explain what we’re experiencing. Experiments with split-brain patients (people whose left and right hemispheres were literally separated to prevent epileptic seizures) show that when the left hand literally doesn’t know what the right hand is doing, the brain will make up a story out of whole cloth to make what the right hand is doing comprehensible. (I write about this in Political Animals.)

So I guess I do have a philosophy of history of sorts if any of what I’ve been saying makes sense to you.

MAH: You know of my profound respect for Thomas Jefferson. Jefferson asked to have inscribed on his tombstone the following: “Here was buried Thomas Jefferson, Author of the Declaration of Independence, Of the Statute of Virginia for religious freedom & Father of the University of Virginia.” What three things would you want inscribed on your tombstone once you pass? I’m not saying you’re old, or anything like that, but I’m just asking something like this: Of what three accomplishments are you most proud?

RS: I understand entirely your use of Jefferson as a frame of reference but it’s practically struck me dumb. Jefferson had so many achievements from which to pick that it was meaningful to ask him to make a selection. Me—not so much! I have written one book I am really proud of—Political Animals—and established a website—this one—that I hope will endure for a while. As a reporter in Utah I stopped an out-of-touch billionaire from becoming governor and exposed a power company's corruption (which resulted in a rebate to ratepayers of more than 60 million dollars). We all play our part. This is mine.

MAH: Okay, the website—History News Network—and that’s the reason why I asked you if you’d let me interview you for HNN for your final page. It’s my way—and I’m sure all others reading this will agree—it’s my way of thanking you for your selfless dedication to history through the website. Why did you decide to take on such a large project? What were your aims? Have they been met? Exceeded?

RS: Like the historian I am, I have a story to tell that explains my answer.  And hey, thanks for asking!

HNN began with a grievance. During the impeachment of Bill Clinton, you may recall, there were cries that Congress censure him rather than impeach him. In their reporting the media kept citing the censure of Andrew Jackson and sometimes John Tyler. I was doing research at the time for my book, Presidential Ambition, and knew that James Buchanan had been censured too. I tried to contact various media outlets like ABC News and the New York Times to let them know about this forgotten moment in our history but got nowhere. I fumed about this. It seemed crazy that journalists would ignore a historian who had valuable information to add to an important debate. (Here is the article I wound up writing about censure.)

This was the genesis of HNN.  It seemed obvious to me that historians should have a national platform to help journalists and the public make sense of the news.  I set out to create one in 2000.  (We went online in 2001.) 

I hope we’ve fulfilled the mission.  We certainly have carved out a nice niche in the political firmament.  Scores of journalists regularly turn to HNN’s vast archive of around 10,000 original articles. Each year the website draws more than four million page views, so somebody’s reading us!

Along the way I have made innumerable friends among historians across the world. I treasure these human relationships. What a wonderful job I have had. Each day I have gotten the chance to commune with intellectuals of the highest caliber and broadest sympathies. Hanging out with people who are your superior in so many ways is inspiring. As my father, a great tennis player, used to say: if you want to improve your game, play with people who are better than you. At HNN that’s what I’ve been doing.

MAH: A final question. As a man of large aims and large accomplishments, I don’t see you reclining on a La-Z-Boy, sipping Kentucky bourbon, and watching Oprah reruns. What do you plan on doing in retirement?

RS: Well put! I plan on continuing to do lectures based on Political Animals. I plan on reading fiction for the first time in 40 years, going back to Hemingway and Fitzgerald to start. I also plan on reading more social science and science books. My reading list, as I just discovered when I did a tally, includes about 200 books. Egads! That’s several years’ work right there! And for fun I’ll keep on reading history books and biographies. Throw in travel with my husband and biking and swimming and you have a full life.

Thank you for asking all these questions. I am grateful to you for doing so.  I wasn’t sure how to end my long tenure at HNN (about 18 years).  You solved that problem!

Au revoir!

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170668 https://historynewsnetwork.org/article/170668 0
History Is Likely to Credit George H. W. Bush with These Two Foreign Policy Accomplishments

George H. W. Bush had a hard time winning over the American public. Republicans rejected him in favor of Ronald Reagan in 1980, and he received only 37% of the vote in his 1992 reelection bid. After serving as Reagan’s vice president for eight years and winning the presidency in his own right in 1988, he labored to get out from under the shadow of his more popular predecessor. Early assessments of Bush’s presidency were none too flattering, many citing his lack of legislative accomplishments, his lukewarm advocacy of Reagan’s populist conservatism, his “no new taxes” volte-face, and his inability to reach Middle American voters. Perhaps most important of all, his reelection loss gave his presidency what James Fallows has called the “one-term loser” asterisk that denies its bearer the appellation of greatness. 

Yet a close look at the foreign policy record of the 41st president, who has passed away at the age of 94, reveals a careful strategist whose broadly managerial view of world affairs fostered many successes. Bush came to office with a respectable foreign policy résumé. His experiences as a decorated Navy pilot in the Second World War, congressman, UN ambassador, CIA director, and China liaison educated him in the fragility of the social order and the importance of hard-won compromises. Indeed, if we could characterize his global outlook in one word, it might very well be “caution” – a posture captured in Saturday Night Live performer Dana Carvey’s dead-on impression of his favorite Bush tagline: “It wouldn’t be prudent.” In policy terms, Bush was a staunch free-market advocate and political moderate who combined the pragmatism of Dwight Eisenhower with the guarded realism of George Kennan. 

We should perhaps feel fortunate, then, that it was Bush who presided over the tectonic shift in the global order that took place in 1989-91 and who faced a daunting set of challenges that included the end of communism in Eastern Europe, the dissolution of the Soviet Union, violent ethnic struggles in the Balkans and the Caucasus, Iraq’s invasion of Kuwait, and social unrest in China. As the eminent political scientist Joseph S. Nye, Jr., has noted, “If any of the balls that Bush was juggling had been dropped, the consequences could have been disastrous for the world and for the consolidation of American primacy.”

Future historians will likely point to two crowning achievements. The first was Bush’s steady hand in helping guide the Cold War to a peaceful conclusion. Although popular memory of the era gives Reagan an inordinate amount of credit for communism’s demise, the dramatic events of 1989-91 happened on Bush’s watch. And while Bush did not create the conditions that weakened the governments of Eastern Europe, he was left with the arduous task of managing their transitions and containing the potentially explosive direction of European events. 



Given the luxury of hindsight, it is easy to underestimate Eastern Europe’s fragility in that era and the possibility that the Soviet Union would devolve into civil war. Bush and his advisors well understood that the paths of history are littered with the corpses of unsuccessful reform movements. (As national security advisor Brent Scowcroft noted, “Dying empires rarely go out peacefully.”) Bush’s team wisely refused to stoke the separatist nationalism of the Warsaw Pact states and the Soviet republics, and except for some notable incidents in the Baltic republics and the Caucasus, they and their European counterparts avoided widespread violence in the U.S.S.R.

As the scholar Serhii Plokhy has shown, Bush cast his lot with Soviet leader Mikhail Gorbachev as a means of ensuring a peaceful, stable regional order. In working closely with the Soviet leadership, Bush hoped to manage the fast-moving changes of this region and steer them in a productive direction. His tough lobbying alongside his European partners convinced the Soviets to accept a unified Germany as a NATO member, an achievement that the historian Timothy Naftali has called “one of the greatest U.S. foreign policy accomplishments of the second half of the twentieth century.” The peace dividend was clear. As Jon Meacham pointed out in his political biography of Bush, “Before his White House years, a nuclear Armageddon between America and the Soviet Union was always a possibility; afterward it was unthinkable.”

The decline and fall of European communism allowed for breakthroughs elsewhere. In Central America, his administration capitalized on weakening Cuban and Soviet commitments and pressed for a democratic transition in Nicaragua and an end to El Salvador’s brutal civil war. And although some criticized Bush’s 1991 decision to lift sanctions against South Africa (his advisers argued that Pretoria had met the terms of the sanctions), it was clear that the South African people had embarked on a path toward ending apartheid.

Bush’s second great achievement was his decision not to invade and occupy Saddam Hussein’s Iraq during the 1991 Gulf War. As the U.S. assembled an international coalition and geared up for its first ground war since Vietnam, the public debate showed that Americans were still sensitive about the use of military force. Some saw the fight as unnecessary or doomed to fail, but the war was brief, and it met its objective of ejecting the Iraqis without a costly, long-term occupation.

When Bush halted American troops at the Iraqi border, hawkish voices questioned his strategic acumen. He offered a post-bellum justification in his 1998 book, A World Transformed: “Trying to eliminate Saddam, extending the ground war into an occupation of Iraq, would have violated our guideline about not changing objectives in midstream, engaging in ‘mission creep,’ and would have incurred incalculable human and political costs. . . . We would have been forced to occupy Baghdad and, in effect, rule Iraq.” Given what Americans and the world learned during the later Iraq War (2003-2011), Bush’s earlier prudence seems painfully prescient.

On some occasions, Bush’s caution prevented bold maneuvers and wider breakthroughs. His administration encouraged Shiite and Kurdish uprisings in Iraq, but his limited war strategy did not envisage American support for these rebellions, which were subsequently crushed. In response to the outbreak of war in Yugoslavia, he ceded the initiative to European leaders, who were unable to negotiate a peace. Moreover, Bush’s managerial ethos fostered something of a creativity deficit, and he was not a great communicator. His keen insights were often overshadowed by his awkward delivery – a shortcoming made even more glaring by his tenure between two gifted orators: Reagan and Bill Clinton. Although his vision of a “new world order” was a sincere attempt to articulate a post-Cold War order of collective security, its formulation was vague.

Elsewhere, his desire to appear decisive prompted bold strokes disproportionate to the threats involved. In December 1989, he sent 27,000 U.S. troops into Panama, ostensibly to restore democratic rule and to apprehend the increasingly dictatorial Manuel Noriega. The operation clearly violated Panamanian sovereignty, while it also severely damaged the El Chorrillo section of Panama City and led to the deaths of several hundred Panamanians and twenty-three Americans. Most Americans supported the action, and there is evidence that certain sectors of Panama did as well. But for many observers, the invasion was far out of proportion to the strategic importance of both Noriega and Panama.

On a more constructive note, Bush was a passionate advocate for free-market economics and parliamentary democracy. In May 1989, shortly before the crackdown at Tiananmen Square and the free elections in Hungary and Poland, he laid out his vision for the new world and America’s place in it: “What is it that we want to see? It is a growing community of democracies anchoring international peace and stability, and a dynamic free-market system generating prosperity and progress on a global scale. The economic foundation of this new era is the proven success of the free market, and nurturing that foundation are the values rooted in freedom and democracy. Our country, America, was founded on these values, and they gave us the confidence that flows from strength.”

Unfortunately for Bush, his foreign policy successes did not translate into long-term political capital. He lost his 1992 reelection bid to Bill Clinton due in large measure to the economic recession and the third-party candidacy of Ross Perot. Another factor was what his former chief of staff John H. Sununu called the Churchill Effect: that is, once a great foreign policy burden is lifted from a nation’s shoulders, the national agenda shifts to domestic priorities. Although Bush had the bad luck to experience his lowest approval ratings in the run-up to the election, his popularity later rebounded, and the polls showed that he was never unpopular as an ex-president. George H. W. Bush may have served only one term, but he is likely to be remembered as one of America’s most quietly accomplished foreign policy presidents.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170645 https://historynewsnetwork.org/article/170645 0
Josephine Baker’s Secret Life as a World War II Spy


In 1939, the year before the Nazis occupied Paris, the most famous woman in Europe received a visitor who would change her life’s course forever.

Jacques Abtey, a captain in the Deuxième Bureau, the French intelligence agency, entered Josephine Baker’s castle on the Dordogne River begrudgingly, skeptical of her offer to work as a spy against Nazi Germany. Would she be another Mata Hari, the femme fatale charged with acting as a double agent and betraying France during the Great War?

My novel, Josephine Baker’s Last Dance, describes her response:

“France made me what I am today.” She sat erect, babbling now, but who cared? She would talk all night if that was what it took. “And did I not become the cherished child of the Parisians? They gave me everything, especially their hearts.” She thumped her chest with her fist and pressed it against her own wild, twisting heart. “I am ready, Captain, to give my life to France. You may dispose of me as you wish.”


The 20th-century icon Josephine Baker was so much more than a sex symbol who danced in a skirt made of bananas. Yes, she took Paris by storm in 1925 with her “Savage Dance”—performed in little more than a strategically placed feather—and went on to increase her fame with the infamous banana skirt which, legend has it, she designed as a joke for her first revue at the Folies-Bergère.

She also became, over the next twenty years: a chanteuse, or stage singer, and international star; the first black woman to star in a feature film and to headline in New York’s Ziegfeld Follies; a recording artist; an opera diva; and – the detail that most surprises and fascinates people – a spy for the French Resistance during World War II.



Risking her life for freedom

According to some accounts, Baker joined the Maquis, a group of guerilla freedom fighters who reportedly trained her to shoot in the sewers under Paris. (She could snuff a candle at twenty yards, it is said.) But her primary roles were those of a seductress who enticed diplomats and generals to confide in her, and an envoy who carried concealed notes to Gen. Charles de Gaulle’s agents in Lisbon.

She’d write the information on her sheet music in invisible ink, or on pieces of paper pinned to her underwear, or along the insides of her arms, and carry it across borders under the auspices of touring—with Capt. Abtey by her side, posing as her theatrical agent.

Some who knew of her work warned her that she risked her life with these activities, but Josephine only laughed. “Who would dare strip-search Josephine Baker?” she scoffed. She was right: the border patrols fawned over her, asking only for her autograph.

She lived in constant danger. She was nearly arrested several times, including when Nazis came to her castle for an impromptu search. She charmed them with her flirtatious chatter, making them forget all about the basement where several members of the Resistance were hiding.

Had she been caught, the penalty would certainly have been imprisonment in a concentration camp, or worse. But as a black woman and a Jew (she’d converted to Judaism when she married her third husband), Josephine knew that she’d be in greater danger if she remained in Paris.

Fighting for equality 

Although she held dual French and American citizenship, returning to the racist United States was not an option. She’d run from injustice before: at 19, she’d experienced racial equality for the first time in Paris and opted to stay. Now she aimed to fight—and for these labors of love, she never earned a dime.

In addition to her work as a spy, Baker volunteered for the Red Cross as a nurse and as a pilot, delivering supplies in her private plane. She entertained French and Allied troops on the Maginot Line, in Morocco (while still recovering from a deathly illness), and throughout Europe.

For her heroism, the French military awarded her the Croix de Chevalier de la Légion d’Honneur and the Croix de Guerre. De Gaulle presented her with the gold Croix de Lorraine, her most prized possession—which she later sold at auction to raise money for the Resistance.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170603 https://historynewsnetwork.org/article/170603 0
The Air We Breathe Is Cleaner Because of George H.W. Bush

Related Link How George H. W. Bush Is Being Remembered

With President George H.W. Bush’s death at the age of 94, commentators will surely draw a number of contrasts between the 41st president and the current occupant of the White House. There’s Bush’s famous gentlemanly demeanor and President Trump’s pugnacious sensibility. When the Cold War ended and the Soviet Union collapsed early in his presidency, Bush was quick to champion the sort of globalization that Trump has bluntly rejected. To be sure, the vast differences between the two men reflect just some of the ways that the Republican Party has changed since the end of the twentieth century. No doubt some will write that the party of Trump could learn a lot by revisiting Bush 41’s time in office. But if there’s just one lesson the 21st-century GOP takes away, it should be Bush’s pragmatic handling of a key piece of environmental legislation.

Today, of course, climate change represents an existential threat to all of us. Indeed, the recent National Climate Assessment paints a bleak picture of the planet’s future. Yet even if the severity of the problem seems greater than ever before, it is not the first time a Republican White House has had to deal with grave environmental problems. In fact, extreme pollution and ecological devastation were also on Americans’ minds when George H.W. Bush took office at the end of the 1980s. 

Back then, both acid rain and global warming were growing concerns for Americans, and help from Washington seemed unlikely. Ronald Reagan had been deeply antagonistic toward the environmental movement throughout the 1980s. The 1990s, though, promised to usher in an era of environmental action. The Exxon Valdez spill in 1989 and the twentieth anniversary of Earth Day in 1990 proved to be pivotal events that helped galvanize American attitudes towards the environment. 

However, it was not a given that the new president would be a friend to the environment. Before entering politics, Bush had enjoyed success with the Zapata Corporation – a petroleum business that he founded. Despite his New England roots, George H.W. Bush had once lived the life of a Texas oilman. Yet unlike Trump – who has rejected the recent report’s conclusions – President Bush chose not to ignore the growing tide of environmentalism. 

On the contrary, some of the oral histories of the Bush White House at the University of Virginia’s Miller Center suggest an engaged president determined to move beyond his traditional base of support and address a big problem head on. Even before he took office, the president-elect began reaching out to environmental organizations such as the Sierra Club and the Environmental Defense Fund. From his early days in office, Bush’s administration tried to foster a friendlier relationship with such environmental NGOs. As one staffer remembered, many of these groups were surprised by the outreach, but they welcomed it nonetheless.

When it came to environmental legislation, Michael Boskin (Chair of the Council of Economic Advisors) remembered that it was “clear that the President wanted to do something that was nontrivial.” Such dedication was obvious to those working closely with the president. While putting together the Clean Air Act, Bush surprised many with his command of the details and his willingness to consider competing points of view. As one staffer remembered, during discussions, the president “knew more about CO2 emissions” and other pollutants (like sulfur dioxide) than the head of General Motors (who was also in the room).

According to Bobbie Kilberg, who worked in the Office of Public Liaison, the president wanted to hear from all sides when considering environmental legislation. Bush “didn’t only talk to the auto companies.” He also “talked to the environmentalist groups, he talked to the union groups, he talked to the consumer groups.” 

Ultimately, Bush would preside over some of the most important environmental legislation of the late twentieth century – the 1990 amendments to the Clean Air Act. Still, this did not mean that Bush had become a liberal when it came to the environment. In fact, the 1990 update of the original 1970 act introduced a novel – and ultimately conservative – approach to the environment.

The most consequential aspect of the Clean Air Act was emissions trading. Title IV of the act, the Acid Rain Program, set limits for how much sulfur dioxide and nitrogen oxide could be released into the air by factories and other major polluters. However, the Acid Rain Program also built in a mechanism that allowed these participants to trade their allotted pollution allowances. In essence, Title IV created a market incentive to pollute less. A company could even stand to profit by selling off some of its pollution rights.

 Economists had discussed such ideas in the past, but it was Bush’s signature that made “cap and trade” a reality. The idea of allowing corporations to trade the right to pollute was a market-based approach to a problem that had formerly been dealt with through stricter regulatory controls. Indeed, both conservative politicians and business executives had long been vexed by what they saw as a heavy-handed approach to environmental protection. 

By contrast, the “cap and trade” measures in the revised Clean Air Act would, as Boskin recalled, be a new form of “flexible regulation” that would “set a standard and then allow people to comply with it in the least costly way possible by trading their rights.” The politics of getting emissions trading enacted were tricky, with the Bush administration using the threat of a veto and even cutting a side deal with the Environmental Protection Agency over how to interpret specific parts of the law. 

The end result, an attempt to create a market that encouraged participants to behave in an environmentally friendly way, was not something a Democratic administration was likely to have come up with in 1990. Though the Clinton administration embraced the idea later in the decade, and even tried to expand the practice with the ill-fated Kyoto Protocol, the Acid Rain Program was rooted in a Republican economic philosophy. In effect, the Clean Air Act introduced a fundamentally new approach to environmental protection.

Bush followed this landmark piece of legislation in 1992 by signing the Energy Policy Act into law, which included language encouraging the use of renewable energy. Achievements like the 1990 and 1992 acts helped set the stage for businesses to start addressing environmental issues more directly. Even Bush’s old Houston friend, Ken Lay, used his position as the head of the Enron Corporation to explore clean energy projects.

This is not to say that Bush’s record on the environment was perfect – it wasn’t. Only under intense political pressure did he reluctantly attend 1992’s UN Earth Summit in Brazil. Once there, the U.S. declined to sign onto several treaties, and perhaps the most important agreement on global warming was, in the end, nonbinding. With Al Gore on the Democratic ticket that year, environmentalists had far more progressive options when heading to the voting booth.

Still, President Bush’s work on the Clean Air Act is emblematic of a Republican administration that was not afraid to address big environmental problems. Sadly, this is not the case today. While incoming Democratic lawmakers like Alexandria Ocasio-Cortez are promoting the idea of a Green New Deal, Trump’s White House seems incapable of even acknowledging a looming planetary catastrophe. As Republican politicians begin the process of publicly extolling the 41st president’s virtues and legacy, they would do well to revisit the lessons offered by Bush’s work on some of the late twentieth century’s environmental problems.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170598
What I’m Reading: An Interview With Historian Carla Pestana


Carla Gardina Pestana is Professor of History at the University of California, Los Angeles and the Joyce Appleby Endowed Chair of America in the World. 

What books are you reading now?

I have a number of different books going at the moment. Disaffection and Everyday Life in Interregnum England by Caroline Boswell, a book I agreed to review, came to me because I listed myself as a military historian (among other categories) on the website Women Also Know History, which makes information about women historians’ expertise available in order to promote their work. Since it was the first request I had received that mentioned having found me there, I felt compelled to agree. Otherwise I seldom review books these days. 

For recreational reading, I just finished Tigerbelle: The Wyomia Tyus Story, an autobiography of an Olympian. I don’t generally read either auto/biographies or modern history, but Tyus is a family friend so I made an exception. It’s an interesting account, especially for what it shows about her world, which became dramatically wider (she was raised in the rural South but traveled extensively as a result of her athletic career) as well as for the gender dynamics prevailing in the era when she was coming up as an Olympian. 

More work-related but equally enjoyable has been reading Elena Schneider’s The British Occupation of Havana. I’ve been awaiting this study of the 1762 occupation of the supposedly impregnable Cuban port city. Through Schneider’s treatment we can see clearly that imperial boundaries were frequently crossed (in wartime as well as during periods of peace), and regional residents routinely failed to cooperate in efforts to close off one empire from another. That and her treatment of the role of slaves and free people of color in the defense and subsequent occupation of the city are profoundly illuminating. Hers is one of a spate of excellent new books on the early Caribbean. 

All this is not to mention the doctoral dissertation on Haitian independence and land that I just finished or the many books and articles I am rereading in order to decide if I should assign them for my winter quarter class. There’s always too much reading to do. 

What is your favorite history book? 

This seems like an impossible question, because there are so many amazing books. I have recommended certain books to many people, so I guess that is one measure. Mr. Bligh’s Bad Language is a great book, and Greg Dening was a scholar I always admired as well as a lovely person. I used to teach Natalie Zemon Davis’s The Return of Martin Guerre, along with the related debates. I liked that book for how it shows the way the historian does her work. When I was a graduate student I read (in the same week in my first term) Christopher Hill’s The World Turned Upside Down and Perry Miller’s The New England Mind: From Colony to Province. This conjunction set me to thinking: how could both these realities have coexisted? My M.A. thesis (which was published in the New England Quarterly in 1983) represented my first attempt to answer that question; and my dissertation—on religious radicalism in early New England—followed and extended my effort to understand how England produced Quakerism and other forms of radicalism even as newly-founded New England embraced orthodoxy and policed its borders with violent results. 

Recently, I have read a number of wonderful books about Caribbean history: Elena Schneider’s book on Havana, but also David Wheat’s Atlantic Africa and the Spanish Caribbean, 1570-1640 and Molly Warsh’s American Baroque: Pearls and the Nature of Empire, 1492-1700. I’m preparing to teach a new course on Atlantic history, so I have an enormous stack of such books to go through.   

Why did you choose history as your career?

The simple answer: my undergraduate teachers suggested I try graduate school. Growing up I knew no professors, indeed nobody with a Ph.D. But I loved reading and history, so I was amenable to the suggestion. I have never looked back: I went directly into graduate school from undergrad, and carried right on through M.A., Ph.D., and—somewhat miraculously—to a first position that was a tenure-track job at a good school. It all worked out amazingly well, although it seemed a crazy path, an unimaginable future, at the time when I took it up. If I had known I would become so passionate about being a historian that I would go live in another state for decades, away from family and my beloved Los Angeles, I wonder what I would have done. But, after a long haul, I managed to come home, and take a position at my graduate institution. I now work 15 miles from where I was born, and not too many academics can say that. 

The more complex answer: I find entering an alternate world an interesting way to use my intellectual abilities. Immersing oneself in a particular time and place in order to come to know it well is a fascinating process. At times in my career I have become so thoroughly immersed in my research that I have gotten mixed up about the number properly assigned to the current month; during the era I study, the first of the year was in March, which made December (quite sensibly) the tenth month. On the rare occasion that I can stay in the seventeenth century for an extended time, I have to remind myself that September is not in fact (any longer) the seventh month. 

I once read an essay by Edmund Morgan, who suggested that we focus our research questions on what doesn’t make sense. That instruction strikes me as apt in that it’s the disjunctions, the perplexities, which draw the eye and demand to be explored. When we complete that exploration, we so often find something unexpected and revealing. I would take his observation one step further, to say that as we come to know a time and place well, we become more sensitive to the unexpected. Some projects, of course, dig into unknown topics and archives but most re-consider (through deeper research or new questions) already studied topics. I find my work always shifts back and forth between verities (often contained in the historiography) that need to be challenged, and archival sources that open up the possibility of answers. That tension keeps the intellectual life of the historian interesting. 

What qualities do you need to be a historian?

Well, I don’t know about all historians, but I am tenacious, organized, and detail-oriented. I have trouble taking no for an answer, so I just keep digging and trying to figure out what I want to know. While I have never minded (indeed I cherish) time spent alone at my desk or in an archive, struggling with writing and with research, I also thoroughly enjoy the opportunities that my work gives me for thinking with others. I enjoy talking to people—students, colleagues, or the public—about ideas and about the past. I am equally pleased by the solitary and the communal aspects of this work. Ideally you can do both, work alone and with others. I feel that being able to support oneself through work as a historian is a great privilege. 

Who was your favorite history teacher? 

Another difficult question, since I have benefited from the teaching and guidance of so many great history teachers. Besides my father, who was not a history teacher but was prodigiously intelligent and would answer all my childhood questions about history, a high school teacher leaps to mind. Milton (Mickey) Sirkus was a great teacher. When I think back now about his pedagogical approach, I have to laugh. In an honors history class I took with him in high school, he used teasing each of his students about her or his heritage as a way to engage us in U.S. history. It is hard to imagine a teacher today who would use such a hook, and even at the time it struck me as rather edgy. As a result, we defended our immigrant ancestors or relatives, and their contributions to U.S. history, more energetically than we might have otherwise, since he engaged us on a personal level. Until that class, I had never thought much about what my ancestors and older relatives, as children of immigrants, had faced. He used sarcasm and teasing in a way I would not feel comfortable doing. As one example of that, when I wrote to him many years later to explain that I had gone on from his history class through college and graduate school to become a historian, he wrote back to welcome me to the ranks of the unemployed. It was in fact a terrible time for finding a history position in a university, but luckily he was eventually proved wrong on that score.  

Since my high school history class, I had excellent teachers at my undergraduate alma mater (people who pointed me toward graduate school) and in my graduate program too. I was fortunate to go to UCLA to pursue a graduate degree in early American history in the 1980s. I started working with Gary Nash, who was an amazing lecturer, galvanizing the undergraduates in his big classes, and an excellent editor, giving the best readings of my written work that I have encountered anywhere. My second year at UCLA, Joyce Appleby joined the faculty, and she was stunningly accomplished in all aspects of the work we do. She served as a role model for so many of us—so smart and no nonsense. I am so pleased to have a chair at UCLA now named in her honor. 

What is your most memorable or rewarding teaching experience?

I used to employ a first-day exercise in smaller classes that both the students and I enjoyed. I’d ask them to write down and pass up their earliest historical memory, and then I would write all their answers on the board. Then we’d discuss the list from numerous angles, starting with what criteria they had used to decide an event was historical. We discussed what guides us in making that sort of a call, and what examples of formal historical writing might align with their choices. The exercise always resulted in a great first-day discussion, one that often ranged widely. I have to say, doing it also brought home to me the ages of my students, as I watched their earliest events move forward in time. I haven’t done that in a while, but I do remember it fondly. 

These days I am enjoying the work I do at UCLA with transfer students. Some of the best undergrads I have taught here have been from the local community colleges, transferring in as juniors. They undergo a bit of culture shock, but at the same time they are eager, smart and enthusiastic. I have thoroughly enjoyed overseeing undergraduate honors theses with a handful of them.  

What are your hopes for history as a discipline? 

I work with so many smart, engaged young people that I am able to remain hopeful. It is easy to bemoan these anti-intellectual times and to worry about what will happen to the American university system and to our ability as a society and a culture to engage intellectually. Yet many people care deeply about learning, including learning about the past, and they work at the thinking and writing that we—whether as producers or consumers—need to keep history going as a discipline and as a form of knowledge. So in spite of the gloomy prognostications, I remain hopeful. History is a foundational component of a humanist education, and it is something that many people beyond the academy know to be valuable. I’m toying with writing a book for a popular audience in part to try to make some of the work we do in the academy more accessible and interesting to those outside it.  

Do you own any rare history or collectible books? Do you collect artifacts related to history?  

I don’t own any particularly rare or collectible books, although I do still have my beloved print copy of the OED—The Oxford English Dictionary—in two volumes, with its magnifying glass in the little drawer that allows me to read the many pages printed on each sheet.  

As for artifacts, I have received some fun items as gifts from former students. One gave me an old nautical sextant—appropriate to my work on maritime history and privateers—while another gave me a framed sheet out of an early edition of John Foxe’s The Actes and Monuments (better known as Foxe’s Book of Martyrs)—a gift relevant to my work on religion. I also have a counted cross stitch sampler that replicates one from the late seventeenth century. The original maker was a New England girl who grew up to join the Quaker meeting in Lynn, Massachusetts, a meeting and a community that I wrote about in my first book. My mother stitched it for me as a gift while I was writing a dissertation that included this girl, Hannah Breed, and other people from her community. 

What have you found most rewarding and most frustrating about your career?

When you ask about my career, I assume you mean my own personal triumphs and trials. If that’s the intention, I have to admit that I have been extremely privileged and lucky, so both the high points and the low occurred in that context. 

One of the most rewarding aspects of my privileged position has been being able to take all the time I wanted and needed to write a second book. I got tenure based on the first book, so despite the pressure to publish again quickly (and the harsh strictures from one department chair in particular about “frozen associate professors” who didn’t finish a second book promptly), I produced a second book (The English Atlantic in an Age of Revolution, 1640-1661) that differed drastically from my first. It took me forever to learn all that I needed in order to be confident about that book and to send it out into the world, but it was a better book for it. I am glad I did not bend to the pressure (whether self-inflicted, institutional, or otherwise) to be fast, and the tenure system allowed me that opportunity. That book might still be my personal favorite of those I have so far written, because of how far I had to stretch to write it. It didn’t help matters that I had two children over the course of researching and writing it, either. 

That is the perfect lead-in to the frustrations. Like many women in my cohort, I did experience the challenges of having babies at an institution with no pregnancy leave policy. My female colleagues thought I should go ask what arrangements would be possible, but the chair of the department looked at me blankly. It was aggravating, but because I didn’t have tenure the first time around, I just thanked him and left. I didn’t become better at advocating for myself the second time, either, even though by then I did have tenure. My children are in their early to mid-20s, so this was not all that long ago. Most women academics then of my acquaintance who were older than me did not have children, and if they did they often had them before they joined a department. If you found yourself in my situation, you were supposed to hope your baby arrived in the summer, best of all in early summer, so you could spend a little time at home; if the baby was born at a different time of the year, you might be allowed to teach an overload, bank some courses, and get a little time off that way. Some colleagues seemed to think that one should not try to be an academic and a mother. I managed, as did others, but the lack of support or even awareness was a source of frustration. 

How has the study of history changed in the course of your career? 

I have been at this a while, so it has changed in various ways. In my own original field of early American history, when I was in graduate school my fellow students were doing the “New Social History,” studying various groups in society often using quantitative methods. The cultural turn had already overtaken literature departments but was just coming into historians’ awareness. Soon that became the dominant approach, but at the same time areas such as Native American history were blossoming too. Today it seems that some of those early seeds of the social history scholarship—especially its engagement with race, class and (eventually) gender—have paid big dividends, reshaping the ways we think about so many topics.  

In my own historical scholarship, I have been most conscious of the shift in geographical frames. Today Atlantic history seems a bit passé, but the shift out from British North America felt startlingly true and profound at the time. When I was a graduate student, colonial America meant the thirteen colonies that became the United States, and the only external links that mattered were back to Britain. Most projects were framed within a single colony, and the bent toward social history meant detailed archival work within a relatively narrow geographical framework. Looking up from that narrow landscape to perceive the connectedness of various places not in North America and indeed not within the English imperial boundaries felt like a revelation.  

What is your favorite history-related saying? Have you come up with your own?

I do not like the usual history sayings, because they often assume some simple connection between the past and the present that I perceive to be wrong. For instance, I don’t agree that history repeats itself, or, in George Santayana’s formulation, that “those who cannot remember the past are condemned to repeat it.” Even Karl Marx’s version, “History repeats itself, first as tragedy and then as farce,” doesn’t strike me as entirely accurate. The factors that shape our present are so complex and multifaceted that attempts to achieve or avoid a particular outcome usually set into motion numerous unintended consequences—that (more than the repeat nature of history or our ability to remember it and thereby keep it from repeating) is what strikes me most often as I study the intentions of historical actors. 

I am rather more enamored of the L.P. Hartley observation, which points in the opposite direction: that “the past is a foreign country; they do things differently there.” The opening line of his novel could actually be read as a caution to those who look for repetition or simply lessons, since that often involves ignoring the differences. 

For the sheer pleasure of following its twisted history in our popular culture, I do rather like Laurel Thatcher Ulrich’s “Well-behaved women rarely make history.” I read it in its original context, before it developed a life of its own, in an article about what was considered proper behavior for women. Laurel meant it as a straightforward description of the cultural ideal: women were not to draw attention to themselves but to remain quietly in their prescribed roles. She was not issuing a call to revolution or advocating that women should misbehave and make history. But the quotation got picked up and flipped from its original meaning to its opposite. That reversal is fascinating, and I particularly love how people attribute it to various women (such as Eleanor Roosevelt) who purportedly said it to advocate that women make trouble and call attention to the need for change. Laurel has written a book on the whole phenomenon, in part to get all those who know her to stop sending her pictures of it misattributed on t-shirts, coffee mugs, and protest signs. 

The strange history of that history quotation makes it fun. It remains ubiquitous, and I bet people still email Laurel about its odder appearances. I’ve long since quit doing so, although I continue to see it around. 

What are you doing next?

Well, I am chair of my department, so I am doing a great deal of university service. I care deeply about my department and my university, so I don’t mind giving some of my time over to this work. But that obligation does mean that I will produce less scholarship in the short term. I do have a book manuscript on Plymouth Plantation that I am trying to finish for the 400th anniversary of the Mayflower landing. It differs from anything I have done before, in that it is aimed at a popular audience. I was inspired to write it by an extended visit to the living history museum that reenacts Plymouth, having been brought in along with others to help the staff there to update their historical coverage. That experience got me thinking about Plymouth and how we Americans envision it. My impulse to create this work owes something to the fact that I wrote for a few years for the Huffington Post. Writing for a popular audience about the intersections between the past that I study and current events proved a challenging discipline; 800 words are very few (at least for the historian who writes 200-page books), and the need to respond quickly and in a focused fashion I found invigorating. I am trying to bring what I learned doing that to this new project. 

As usual—as has been the case since the start of my career—I also have a little Quaker piece I am mulling over. My very first research project as a graduate student was on the Quakers, and I keep coming back to them with various questions and ideas. And finally, I mean to get back into the Jamaican archives, to follow up some of what I was doing with my previous book. So, lots to do, but not enough time to do it all. Isn’t that always the case?

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170597
HNN Doyen: Gordon S. Wood

What They're Famous For

Gordon S. Wood is Alva O. Way University Professor and Professor of History at Brown University. He is one of the foremost scholars on the American Revolution in the country. His book, The Radicalism of the American Revolution, won the Pulitzer Prize in 1993. It is considered among the definitive works on the social, political and economic consequences of the Revolutionary War. 

Edmund S. Morgan, Professor Emeritus at Yale University, in his review of this book for the New York Review of Books called it "a tour de force. This is a book that could redirect historical thinking about the Revolution and its place in the national consciousness." In the book, Professor Wood gives readers a revolution that transformed an almost feudal society into a democratic one, whose emerging realities sometimes baffled and disappointed its founding fathers. Professor Wood has written numerous other books, including The Creation of the American Republic, 1776-1787, which was nominated for the National Book Award and received the Bancroft and John H. Dunning prizes in 1970. He was involved in Ken Burns's PBS production on Thomas Jefferson, is contributing his expertise to the National Constitution Center being built in Philadelphia, and regularly devotes a portion of his time to teaching history to high school students around the country. Wood was mentioned in the 1997 film Good Will Hunting, which Wood in a 2004 Washington Post interview called "my two seconds of fame."

Personal Anecdote

I was always interested in history, even in high school with a history teacher who taught American history by having the students, up and down the rows, read aloud from the textbook. I majored in history in college but thought that I would enter the foreign service when I completed my military service in the Air Force. But being treated rather arbitrarily by the military (after eight months of training in Texas to become a photo-intelligence officer, I was promptly made a personnel officer when I was assigned to a squadron) made me leery of working for the government. So I applied to graduate school to study history instead. I have never regretted that decision.

I have come to realize that history is not merely an accumulation of information about the past. More important, it is a mode of understanding reality, not just the reality of the past but the reality of the present. Without a deep sense of history a person or a culture lacks perspective and wisdom. Despite the enormous number of history books that are published each year in the United States, most Americans do not seem to have a very deep sense of history. It might get in the way of our enthusiastic ebullience that we Americans can do anything.

Despite the constant repetition of George Santayana's phrase that "those who cannot remember the past are doomed to repeat it," I don't believe that history teaches any lessons. Or perhaps better: it teaches only one lesson, that nothing ever quite works out the way the historical participants intended or expected. In other words, if history teaches anything, it teaches humility, something we all need a little more of.

Looking for all sorts of lessons from the past is to misuse history for the sake of the present.  The search for lessons in fact expresses the sort of present-centered, instrumentalist history that we have usually found in the work of most American historians. Many historians today view history exclusively through the categories and values of the present and seek to use it directly to solve our present problems or to criticize our present culture. Rather than trying to understand the past on its own terms, many historians want the past to be immediately relevant and useful; they want to use history to empower people in the present, to help them develop self-identity, or to enable them to break free of that past. These ought not to be the functions of this greatest of the humanistic disciplines. 

Of my books, my favorite is my first, The Creation of the American Republic, 1776-1787, largely I suppose because it was the first and because it seems to have been the most influential, even though it has not sold the most copies. Of course, I had no idea at the outset that it would become part of a so-called "republican synthesis." That development only reinforces my view that history is largely a series of unintended consequences in which the best laid plans of people go awry. 

Quotes By Gordon S. Wood

Gordon Wood in "Creation of the American Republic"

By using the most popular and democratic rhetoric available to explain and justify their aristocratic system, the Federalists helped to foreclose the development of an American intellectual tradition in which differing ideas of politics would be ultimately and genuinely related to differing social interests.  In other words, the Federalists in 1787 hastened the destruction of whatever chance there was in America for the growth of an avowedly aristocratic conception of politics and thereby contributed to the creation of the encompassing liberal tradition which mitigated and often obscured the real social antagonisms of American politics. By attempting to confront and retard the thrust of the Revolution with the rhetoric of the Revolution, the Federalists fixed the terms for the future discussion of American politics. They thus brought the ideology of the Revolution to consummation and created a distinctly American political theory but only at the cost of eventually impoverishing later American political thought.

Gordon Wood in "The Americanization of Benjamin Franklin"

It is the image of the hardworking self-made businessman that has most endured. Franklin was one of the greatest of the Founders; indeed, his crucial diplomacy in the Revolution makes him second only to Washington in importance. But that importance is not what we most remember about Franklin. It is instead the symbolic Franklin of the bumptious capitalism of the early republic – the man who personifies the American dream – who stays with us. And as long as America is seen as the land of opportunity, where you can get ahead if you work hard, this image of Franklin will likely be the one that continues to dominate American culture.

About Gordon S. Wood

"One of the half dozen most important books ever written about the American Revolution." -- New York Times Book Review reviewing "The Creation of the American Republic, 1776-1787"

"During the nearly two decades since its publication, this book has set the pace, furnished benchmarks, and afforded targets for many subsequent studies. If ever a work of history merited the appellation 'modern classic,' this is surely one." -- William and Mary Quarterly reviewing "The Creation of the American Republic, 1776-1787"

"[A] brilliant and sweeping interpretation of political culture in the Revolutionary generation." -- New England Quarterly reviewing "The Creation of the American Republic, 1776-1787"

"This is an admirable, thoughtful, and penetrating study of one of the most important chapters in American history." -- Wesley Frank Craven reviewing "The Creation of the American Republic, 1776-1787"

"The most important study of the American Revolution to appear in over twenty years... a landmark book." -- Pauline Maier in The New York Times Book Review reviewing "Radicalism of the American Revolution"

"A breathtaking social, political, and ideological analysis. This book will set the agenda for discussion for some time to come." -- Richard L. Bushman reviewing Radicalism of the American Revolution

"An elegant synthesis done by the leading scholar in the field, which nicely integrates the work on the American Revolution over the last three decades but never loses contact with the older, classic questions that we have been arguing about for over two hundred years." -- Joseph J. Ellis, author of Founding Brothers reviewing "The American Revolution"

"In this absorbing narrative, one of our premier American historians has captured the extraordinary interaction of a rising American people and the man who rose with them, shaping their aspirations as they shaped his." -- Edmund S. Morgan, Yale University reviewing "The Americanization of Benjamin Franklin"

"[Wood] possesses as profound a grasp of the early days of the Republic as anyone now working..." -- The New York Times Book Review reviewing "The Americanization of Benjamin Franklin"

"I cannot remember ever reading a work of history and biography that is quite so fluent, so perfectly composed and balanced..." -- The New York Sun reviewing "The Americanization of Benjamin Franklin"

"Wood relies heavily -- though never heavy-handedly -- on psychology. Wood alludes frequently to Franklin's 'genius'... giving the patient reader an exceptionally rich perspective on one of the most accomplished, complex and unpredictable Americans of his own time or any other." -- Jonathan Yardley of the Washington Post reviewing "The Americanization of Benjamin Franklin"

Bancroft and Pulitzer Prize-winner Wood suggests that behind America's current romance with the founding fathers is a critique of our own leaders, a desire for such capable and disinterested leadership as was offered by George Washington and Thomas Jefferson. Provocatively, Wood argues that the very egalitarian democracy Washington and Co. created all but guarantees that we will "never again replicate the extraordinary generation of the founders." In 10 essays, most culled from the New York Review of Books and the New Republic, Wood offers miniature portraits of James Madison, Aaron Burr, Alexander Hamilton and Thomas Paine. The most stimulating chapter is devoted to John Adams, who died thinking he would never get his due in historians' accounts of the Revolution; for the most part, he was right. This piece is an important corrective; Adams, says Wood, was not only pessimistic about the greed and scrambling he saw in his fellow Americans, he was downright prophetic, and his countrymen, then and now, have never wanted to reckon with his critiques. Wood is an elegant writer who has devoted decades to the men about whom he is writing, and taken together, these pieces add perspective to the founding fathers cottage industry. -- Publishers Weekly advance praise for "Revolutionary Characters: What Made the Founders Different"

"He's a very distinguished name, and he's increased the public profile of the University. It's very sad to lose someone of Gordon's stature. He's the sort of person who puts Brown on the map... "I'm a big fan of Gordon's, he has been tremendous for the University." -- Timothy Harris, Munro-Goodwin-Wilkinson Professor in European History, Brown University in "The Brown Daily Herald"

"You can see he's so knowledgeable and he just has this clear expertise on the Revolution," he said. "I wanted to take a class with a professor who's basically the authority on a subject, and I know that Gordon Wood is the man... "I took it just because I'm interested in the American Revolution and the beginning of our nation, and because I know we're at a time that we're making a lot of decisions. It's interesting to look back and see where our nation began." -- Evan Brown '06, Brown University in "The Brown Daily Herald"

Beth Hoffman became interested in [Wood's] course when her high-school U.S. history teacher told her that Wood is "the Ben Affleck of the history world." The teacher told Hoffman that "to pass up the opportunity to take a history class with Gordon Wood would be like passing up the opportunity to meet Ben Affleck." -- Beth Hoffman '07, Brown University in "The Brown Daily Herald"

"Wood is an excellent lecturer and his command of the information is unparalleled."... "I know he is famous, but talent is there. We are lucky to have a living legend who is a great teacher and not just resting on his rep. If he is not the next president of our university we should be thrown out of the Ivy League."... "A smart, well-spoken guy who clearly has come up with an innovative and intelligent interpretation in his field. Even occasionally funny at 9am."... "His command of US history is astounding and scintillating." -- Anonymous Students at Brown University

Basic Facts

Teaching Positions: Harvard University, Teaching Fellow, 1960-64. College of William and Mary, Assistant Professor, 1964-66. Harvard University, Assistant Professor, 1966-67. University of Michigan, Associate Professor, 1967-69. Brown University, Associate Professor, 1969-71. Brown University, Professor of History, 1971-. Pitt Professor, Cambridge University, 1982-83. Brown University, Chairman, Department of History, 1983-86. Brown University, University Professor, 1990-. Brown University, Alva O. Way University Professor, 1997-. Northwestern University School of Law, Pritzker Visiting Professor, 2001. Northwestern University, Board of Trustees Professor of Law and History, 2003.

Area of Research: American Revolution, Founding Fathers

Education: A.B., Tufts University (Summa cum laude, Phi Beta Kappa), 1955. A.M., Harvard University, 1959. Ph.D., Harvard University, 1964. 

Major Publications:

● The Creation of the American Republic, (University of North Carolina Press, 1969).


● The Rising Glory of America, 1760-1820, (Braziller, 1971).

● Revolution and the Political Integration of the Enslaved and Disenfranchised, (American Enterprise Institute for Public Policy Research, 1974).

● Making of the Constitution, (Baylor University Press, 1987).

● Radicalism of the American Revolution, (A.A. Knopf, 1992).

● Creation of the American Republic, 1776-1787, (University of North Carolina Press, 1998).

● American Revolution: A History, (Modern Library, 2002).

● The Americanization of Benjamin Franklin, (Penguin Press, 2004).

● Revolutionary Characters : What Made the Founders Different, (Penguin Press, May 18, 2006).


Pulitzer Prize in History (1993), Ralph Waldo Emerson Award of Phi Beta Kappa (1992), and Fraunces Tavern Museum Book Award (1992), all for Radicalism of the American Revolution.

Bancroft Prize, Columbia University, John H. Dunning Prize, American Historical Association, and Nominee for National Book Award in History and Biography, all in 1970 for The Creation of the American Republic. 

Julia Ward Howe Prize from the Boston Authors Club, 2005 for The Americanization of Benjamin Franklin.

 John Adams Fellowship, Institute of United States Studies, 2002.

 Doctor of Letters, LaTrobe University, Australia, 2001.

Rhode Island Heritage Hall of Fame, 2000.

Fletcher Jones Distinguished Fellowship, Huntington Library, 1997-98. 

Guest-Scholarship, Woodrow Wilson Center, 1993-94. 

Visiting Fellowship, All Souls College, Oxford, 1991. 

Sunderland Fellowship, University of Michigan Law School, 1990.

Center for Advanced Study in the Behavioral Sciences, 1987-88.

Douglass Adair Award, 1984.

Daughters of Colonial Wars award for the outstanding article in the William and Mary Quarterly, 1983.

Kerr Prize for best article in New York History, awarded by New York Historical Society, 1981.

Guggenheim Fellowship, 1980-81. 

National Humanities Institute, 1975-76. 

National Endowment for the Humanities Grant, 1972-73.

Distinguished Visitor Award of the Australian-American Education Foundation, 1976.

National Endowment for the Humanities Summer Fellowship, 1967.

Toppan Prize, Harvard University, 1964.

Institute of Early American History and Culture, 1964-66.

De Lancey K. Jay Prize, Harvard University, 1963-64.

Additional Info

Wood gave a distinguished lecture on "George Washington" for the Presidential Lecture Series on the Presidency, The White House, 1991. Wood was the president of the Society for Historians of the Early Republic, 1986-87, and Chairman, Board of Advisors, National Historical Society, 1973-. Wood is on the Advisory Committee for the Papers of John Adams, 1990-; Advisory Committee for the Papers of Thomas Jefferson, 1990-; Advisory Board for the Papers of James Madison, 1994-; and Administrative Board for the Papers of Benjamin Franklin, 1995-. Wood is on the Advisory Board for Northeastern University Press, 1989-; Board of Editors, Oxford History of the Enlightenment; Board of Trustees, National Council of History Education, 1996-; Advisory Board, Gilder-Lehrman Institute of American History, 1996-; and Board of Scholars, National Center for the American Revolution, 2002.

Wood also served as a Lieutenant in the U.S. Air Force, 1955-58. 

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170593 https://historynewsnetwork.org/article/170593 0
“The First Black Feminist-Abolitionist in America”


Nearly two centuries before Rep.-elect Ayanna Pressley became Massachusetts' first black woman elected to the U.S. Congress in November, Maria Stewart took the stage of Franklin Hall in Boston in 1833. The mixed crowd shifted in disapproval; the older men had already expressed their scorn at her reproach. Yet the main objection might have come from the elite circle of other women, who found that this extraordinary African American woman stretched too far beyond the bounds of accepted true womanhood.

"What if I am a woman?” Stewart declared.

The last two years had generated intense scrutiny of the African American speaker—a compelling writer of resistance in Boston. Self-educated and financially stable, Stewart was the widow of a shipping agent who had thrived on the entrepreneurial genius of black sailors on the seas. Everyone knew Paul Cuffe, the famous colonial tax resister and one of the richest black men in Massachusetts's ports, had built a fortune as a whaler before his departure for Sierra Leone's colonial experiment. He would even inspire the Ahab character in Melville's Moby Dick.

But Stewart was no fictional character. Abolitionist readers of the Liberator had turned to the “Ladies Department” for the past year for Stewart’s essays. It had become the most radical page. The ease with which she had entered the hall, as if black women could determine their entrance and exit in a world controlled by white men, had brought her words beyond the page to the public in a way few other black women had dared.

“Is not the God of ancient times the God of these modern days?” Stewart continued. “Did he not raise up Deborah, to be a mother, and a judge in Israel? Did not Queen Esther save the lives of the Jews? And Mary Magdalene first declare the resurrection of Christ from the dead?”

Inspired by fellow Bostonian transplant David Walker, the son of enslaved parents from the Carolinas, Stewart shared the call for immediate abolition—at any cost. Walker’s Appeal, a pamphlet in the tradition of Paine with its own preamble, had riveted the black community with its harbinger of insurrection; it blasted the rhetorical deceit of Thomas Jefferson, only three years dead; it mocked Jefferson’s hypocritical paeans to Roman slaves—as damning the progress of African Americans for centuries, forever “removed beyond the reach of mixture.”

Circulated among underground networks, including black sailors who subversively stacked it among the packages of contraband along the seaboard ports, the Appeal was “read and re-read until their words were stamped in letters of fire upon our soul,” according to one black abolitionist leader in New England.

And while the Appeal thrust the Liberator into a more radical direction, shaming its white abolitionist editor’s privilege of passive resistance, it cast the armed resistance into the hands of men only. Stewart, on the other hand, according to historian Christina Henderson, did not just envision women at the head of the anti-slavery vanguard—she stood there herself.



“Methinks I heard a spiritual interrogation,” Stewart countered. “Who shall go forward, and take off the reproach that is cast upon the people of color? Shall it be a woman? And my heart made this reply—If it is thy will, be even so, Lord Jesus.”

Deeply Christian, and a deeply Christian moralist, Stewart leaned into black self-improvement in a way that cast aspersions as much as inspiration on her own free community; she shared Walker’s disillusion with the “disunited, as the colored people are now,” blaming internal conflicts and infighting as obstacles to any liberation. “Had experience more plainly shown me that it was the nature of man to crush his fellow, I should not have thought it so hard,” she lamented. “Wherefore, my respected friends, let us no longer talk of prejudice, till prejudice becomes extinct at home. Let us no longer talk of opposition, till we cease to oppose our own.”

In an attack on the elite black Masons, she had called out the obstacles hindering social progress as self-inflicted. Her words could be blunt, sharpening the edges of her detractors, especially among men. “Is it blindness of mind, or stupidity of soul, or the want of education, that has caused our men who are 60 to 70 years of age, never to let their voice be heard, nor their hands be raised in behalf of their color?”

At the same time, she held her white liberal women friends accountable for their low ceiling of aspirations for women of color. Why did their businesses not hire African American girls, beyond calls of domestic servitude?

Let our girls possess what amiable qualities of soul they may; let their characters be fair and spotless as innocence itself; let their natural taste and ingenuity be what they may; it is impossible for scarce an individual of them to rise above the condition of servants.


Stewart called out this back side of white supremacy in the resistance: “Like King Solomon, who put neither nail nor hammer to the temple, yet received the praise; so also have the white Americans gained themselves a name, like the names of the great men that are in the earth, while in reality we have been their principal foundation and support. We have pursued the shadow, they have obtained the substance; we have performed the labor, they have received the profits; we have planted the vines, they have eaten the fruits of them.”

A decade before nationally known Black nationalist Martin Delany or famous abolitionist Frederick Douglass would command the same stage, Stewart methodically embraced her role as a writer of the resistance, publishing essays—not sermons—and performing them in counterspaces that had been reserved for white versions of abolition. “The first Black feminist-abolitionist in America,” historian William Andrews has hailed her.

“O woman, woman! upon you I call,” Stewart appealed, “for upon your exertions almost depends whether the rising generation shall be anything more than we have or not.”

Still negotiating the limits of “true womanhood” of the period, she exhorted women first to “possess the spirit of independence . . . the spirit of men, bold, enterprising, fearless and undaunted.” In her footsteps, women had to take the next move as resisters of an unwarranted prejudice: “Sue for your rights and privileges. Know the reason that you cannot attain them. Weary them with your importunities.”

Violence was inevitable, as part of an unrelenting weariness, as if she couldn’t share the privilege of nonviolence. Stewart declared she would be “a willing martyr” for the African American cause: “I can but die for expressing my sentiments: and I am as willing to die by the sword as the pestilence.”

Not as apocalyptic as Walker, Stewart reframed the resistance against slavery as part of a historical tradition of triumph:

Look at the suffering Greeks! Their proud souls revolted at the idea of serving a tyrannical nation, who were no better than themselves, and perhaps not so good. They made a mighty effort and arose: their souls were knit together in the holy bonds of love and union: they were united and came off victorious. Look at the French in the late revolution! no traitors amongst them, to expose their plans to the crowned heads of Europe! “Liberty or Death!” was their cry. And the Haytians, though they have not been acknowledged yet as a nation, yet their firmness of character and independence of spirit have been greatly admired, and highly applauded. Look at the Poles, a feeble people! They rose against three hundred thousand mighty men of Russia; and though they did not gain the conquest, yet they obtained the name of gallant Poles.


After 1833, Stewart never took the stage again; her essays were sparse. She moved to New York City, and then Washington, DC, taking her resistance into schools as a teacher.

The resistance movement against slavery never looked back.

Wed, 16 Jan 2019 10:08:51 +0000 https://historynewsnetwork.org/article/170568 https://historynewsnetwork.org/article/170568 0