id | url | title | contents |
|---|---|---|---|
c1312f138b23bd1df7498907378b3995 | https://historynewsnetwork.org/article/169711 | Doris Kearns Goodwin's new book? | Doris Kearns Goodwin's new book?
From Amazon:
In this culmination of five decades of acclaimed studies in presidential history, Pulitzer Prize-winning author Doris Kearns Goodwin offers an illuminating exploration of the early development, growth, and exercise of leadership.
Are leaders born or made? Where does ambition come from? How does adversity affect the growth of leadership? Does the leader make the times or do the times make the leader?
In Leadership, Goodwin draws upon the four presidents she has studied most closely—Abraham Lincoln, Theodore Roosevelt, Franklin D. Roosevelt, and Lyndon B. Johnson (in civil rights)—to show how they recognized leadership qualities within themselves and were recognized as leaders by others. By looking back to their first entries into public life, we encounter them at a time when their paths were filled with confusion, fear, and hope.
Leadership tells the story of how they all collided with dramatic reversals that disrupted their lives and threatened to shatter forever their ambitions. Nonetheless, they all emerged fitted to confront the contours and dilemmas of their times.
No common pattern describes the trajectory of leadership. Although set apart in background, abilities, and temperament, these men shared a fierce ambition and a deep-seated resilience that enabled them to surmount uncommon hardships. At their best, all four were guided by a sense of moral purpose. At moments of great challenge, they were able to summon their talents to enlarge the opportunities and lives of others.
This seminal work provides an accessible and essential road map for aspiring and established leaders in every field. In today’s polarized world, these stories of authentic leadership in times of apprehension and fracture take on a singular urgency.
|
58caf91a101bd8b1d09191f1d037972a | https://historynewsnetwork.org/article/169904 | Review of Omarosa Manigault Newman’s “Unhinged: An Insider’s Account of the Trump White House” | Review of Omarosa Manigault Newman’s “Unhinged: An Insider’s Account of the Trump White House”
The following provocative quotation appears by itself on the back jacket for everybody to read. You might as well: “He rambled. He spoke gibberish. He contradicted himself from one sentence to the next…. While watching that interview, I realized that something real and serious was going on in Donald’s brain. His mental decline could not be denied. Many didn’t notice it as keenly as I did because I knew him way back when. They thought Trump was being Trump, off the cuff. But I knew something wasn’t right.”
There’s an awful lot to quote in this book. The author is clearly a talented African-American woman and we are always aware of it. She is seldom relaxed, it seems; on the other hand, despite mood swings, she is capable of great joy now and then. We are reminded that two of her family members were shot dead early on; and she worries—and has cause, as mobs misbehave here and there in her pages—but not quite cover to cover.
The author is a beautiful woman of good posture, a veteran of “The Apprentice” for years, seldom unemployed if she wishes to have a job between various intermissions. After being fired by the White House in chapter one, UNBELIEVABLY, in a day or two she is offered a huge salary by a Trump daughter to work on the 2020 campaign. For that day and time, at least, she says “no.”
The subtitle is “An Insider’s Account of the Trump White House,” but she is all over the place on content as we turn the pages: California, Ohio, Florida, NYC, downtown D.C.
The start is dramatic: she’s abruptly FIRED, and not the way we might be. For Omarosa, it was off to the Situation Room with that Kelly soldier where there were threats, loudness, recrimination, replies, locked doors, hints of violence. After that beginning, we have no doubt that this experienced lady is important and her memory feared.
My book is marked up with big circles and “quote,” check marks, and “discuss.” This is a somewhat dramatic narrative, one to be taken seriously. One wishes the writer well, but there seems an aura of strain throughout, I thought.
This White House staff member, Omarosa, expected her life to improve Donald J. Trump’s standing on this planet. She comes to hate him at the end, and she worries about the destiny of our good nation with him still in control. Let’s listen to her misgivings:
“… I knew without Keith [Schiller], the president would probably become unhinged.” (Page 303) Again, “…Due to his lack of empathy and his narcissism…” And, “…I realized that something real and serious was going on in Donald’s brain. His mental decline could not be denied. Many in the White House didn’t notice it as keenly as I did because I knew him way back when.… I knew something wasn’t right.” What might be done? She came up with: “Declare a state of medical emergency?” (Page 246)
This was startling: “During one of my visits he asked, ‘Hey, Omarosa, what do you think about Comey? I had to let him go, right? He couldn’t be trusted; he was not loyal.’” She judges, “No one—and I mean not a single person—agreed with his decision.” (Page 244)
Do read about Trump and his awareness of guns on pages 240-41. She doesn’t think much of White House doctors (nor do I, from my LBJ book research). (Page 242) The medical personnel, she says, gave out pills to anybody. “All we had to do was ask.”
From reading newspapers, we believe our leader disregards Briefings. Here, we read: “In our briefings, Trump’s attention was scattered. He was distracted, irritable, and short. Normally, when DJT got into one of these moods, you knew to give him time and space. But in this case I could not.” (Page 217)
Is his mind sharp and clear? She is sure about his mental deterioration and writes clearly about both that and “his racism.” (Pages 292-93) At one point, this reviewer came to think: NO. Don’t be quoting this to our public! He’s the President of the United States. Limit those quotations.
We never forget this book is by a black woman. It’s clear: “… white men who surrounded me….” “A white participant is given the benefit of the doubt; a black woman in the workplace never is, regardless of the circumstances.” (Page 265) This book is about something called “the cult of Trumpworld.” One word.
Here is one message conveyed: “I was miserable at the White House. Morale was at an all-time low, and the environment was toxic. I realized that Donald Trump was the biggest distraction to his own presidency. Donald Trump, the individual, the person, because of who he is and what he stands for and how he operates, would always be the biggest hindrance for us. Donald Trump, who would attack civil rights icons and professional athletes, who would go after grieving black widows, who would say there were good people on both sides, who endorsed an accused child molester; Donald Trump, and his decisions and his behavior, was harming the country. I could no longer be a part of this madness.” (Pages 318-19)
How, you may ask, does a “typical historian-reviewer” feel on emerging from this candid, observant, critical, worrisome, concerned, notable book? As it happens, I have written about race and presidents. About Trump I am in despair. I have come close to despising our incumbent President, while hoping for the best. (That would be dumping Trump by the wayside at the very soonest.)
On the other hand, Donald has been for years, after being handed immense startup money, unquestionably an entrepreneur, creator of useful hotels and golf courses, and a jovial entertainer of huge audiences. If I enjoyed all or any of those “Miss Bosom” programs he provided on TV, I really should thank him.
Still and all, I’m frightened by what I read in this book about DJT’s CASUAL, UNINFORMED, and IGNORANT “presidency.” Omarosa’s book sure didn’t help my sense of well-being. As I read along, I felt, well, awful. Examples of an unfit, almost uncaring, certainly inadequate, President kept showing up, over and over. I didn’t like the really staggering contrast with earlier subjects of my books – Herbert Hoover, FDR and LBJ – surrounded as they were in office by brilliant, dedicated public servants who seldom resigned, listened obediently and unfailingly to briefings with consequences, and spoke rationally and regularly to the man who chose them. Yes, they campaigned while in office, more or less, but they G O V E R N E D and mostly set aside their earlier occupations while in our employ.
Here we have a bright woman of 44 years. She filled a job called (unbelievably) Assistant to the President and Director of Communications for the Office of Public Liaison, all in Trump’s White House. (Earlier, she served in Clinton’s White House!) Back when, she strove beyond the MA at Howard University; was a military Reservist chaplain (sic), and is apparently an occasional minister with her active minister 2nd husband in Jacksonville, that is, when “between political jobs.”
She handles much work with one hand behind her. Her favorite activity, it seems, is striving to change the political mindset of a giant mass of black voters out there, changing it to firm support of her current politician, whoever that may be, and doing that whatever the cost or time demand. One never-ending goal has been to advance her own personal popularity and reputation because it will surely help Donald J. Trump! Maintaining or enlarging his probable vote-count in the coming election was, for a time, her hope.
Candidate Trump after the end of the 2016 campaign was disturbing to Omarosa. “It was very concerning to listen to him go on and on about the election in private. He would get all worked up and get crazed about the ‘fake news’ reports. I was worried that in his first week [in the White House] he was already cracking under the pressure.” (Page 210)
Our book author casually mentions those 4,000 White House jobs to be filled way back when. It’s scary. Members of the staff were to “back up whatever the President said or tweeted, regardless of its accuracy.” (Page 211) What level of employee would allow that? On page 228 she almost casually speaks of “paranoia” setting in. Wow. Later, she wonders as to a tweet, “Does he even realize he sent it?” Italics hers.
Omarosa ruminates happily, after Trump’s victory speech, “That moment was one of the highlights of my life.” It had proved “how wonderful and great this country had been to me.” She was living the American dream, she proclaims. Those days for her long ago being on public assistance were over! You bet! Back after the inauguration, the most powerful man on the planet was next to her, and she deeply reflects as to that: it was “the two of us on that stage together!”
It was many months—almost a year—before the room nicknamed the “WH Osama bin Laden death planning room” would be used to house a General employing it to FIRE an employee far below Cabinet level. It was December 12, 2017 when our Omarosa would be escorted there, threatened, with the door locked against her in that scary basement, maybe facing the spectacular glare of a square foot of long-ago-awarded gold braid on the chest of her nasty critic. She had been blurting: “I’m being railroaded!”
It all made me think of President Andrew Johnson’s turbulent administration, back when the Senate failed to convict by only one vote, despite impeachment by the House. Maybe I’ll read up on all that—to Be Prepared! This time, maybe seriously consider finishing what we start!
|
22efb3822716b242bb6a4941351a21a3 | https://historynewsnetwork.org/article/169991 | France’s Macron admits to military’s systematic use of torture in Algeria war | France’s Macron admits to military’s systematic use of torture in Algeria war
Related Link Is France Finally Reckoning With Its Brutal Past? (The Nation)
France will formally acknowledge the French military’s systemic use of torture in the Algerian War in the 1950s and 1960s, an unprecedented step forward in grappling with its long-suppressed legacy of colonial crimes.
President Emmanuel Macron announced his watershed decision in the context of a call for clarity on the fate of Maurice Audin, a Communist mathematician and anti-colonial activist who was tortured by the French army and forcibly disappeared in 1957, during Algeria’s bloody struggle for independence from France.
Audin’s death is a specific case, but it represents a cruel system put in place at the state level, the Élysée Palace said. “It was nonetheless made possible by a legally instituted system: the ‘arrest-detention’ system, set up under the special powers that [had] been entrusted by law to the armed forces at that time,” reads a statement that was to be released by Macron’s office Thursday, seen by Le Monde newspaper.
|
715f81bdccbda517d9c8eb7f84a79e1d | https://historynewsnetwork.org/article/170018 | A president’s secret letters to another woman that he never wanted public | A president’s secret letters to another woman that he never wanted public
Family and friends had known about the president’s intimate relationship with another woman for years, but whispers about their involvement were growing.
Woodrow Wilson was so worried that he asked his close adviser, Colonel Edward M. House, to meet him after dinner in his White House study on Sept. 22, 1915. In the meeting, Wilson talked about his longtime friendship with Mary Peck, a divorced woman he had met in Bermuda eight years earlier. He told House that the friendship was platonic but that he had been “indiscreet in writing her letters rather more warmly than was prudent.”
Besides personal embarrassment, the release of the letters would complicate Wilson’s hopes to marry the younger, richer Edith Bolling Galt and cast a shadow over his bid for reelection just as the war in Europe was expanding.
|
4d4a956a2fe29e9b3126ca33ecb0e7d5 | https://historynewsnetwork.org/article/170044 | Should Holocaust education be mandatory? This 9th-grader tries to make it so. | Should Holocaust education be mandatory? This 9th-grader tries to make it so.
Just 10 states have laws requiring grade school Holocaust education, but a Lake Oswego, Oregon, high school student is hoping to make her state the 11th.
On Sept. 25, Lakeridge High School student Claire Sarnowski and concentration camp survivor Alter Wiener will testify before the Oregon State Senate Education Committee in favor of a law requiring that K-12 students receive education on genocide in general and the Holocaust — which claimed millions of victims, including Jews, Roma, prisoners of war, people with disabilities and gay men — in particular.
There are 10 states where Holocaust education is mandatory: California, Connecticut, Florida, Illinois, Kentucky, Michigan, New Jersey, New York, Pennsylvania and Rhode Island, according to Rhonda Fink-Whitman, who helped spearhead the law in Pennsylvania and now coordinates a nationwide effort through her Facebook group Campaign to Make Holocaust Education Mandatory in All 50 States.
|
c33f9e22d41f5b4811f464945fd0b1b3 | https://historynewsnetwork.org/article/170104 | The Netherlands’ Surprise New Best Seller: Hitler's ‘Mein Kampf’ | The Netherlands’ Surprise New Best Seller: Hitler's ‘Mein Kampf’
After being banned in the Netherlands for over 70 years, a new Dutch translation of “Mein Kampf” hit bookstores across the country in August.
It didn’t take long for Hitler’s political manifesto, written in 1924, to become a best seller and stir controversy in the country. Even though booksellers are reluctant to even display it, the translation, “Mijn Strijd,” has been on the country’s best-seller list since its release – peaking in third place in mid-September.
With an introduction to each chapter by historian Dr. Willem Melching, the new Dutch edition of “Mein Kampf” is more than just a translation of Hitler’s work.
|
f608f60f07ec8e854fe36d0c753cacd2 | https://historynewsnetwork.org/article/170132 | DNA evidence links Muhammad Ali to heroic slave, family says | DNA evidence links Muhammad Ali to heroic slave, family says
When Cassius Clay joined the Nation of Islam in 1964 and changed his name to Muhammad Ali, he had a straightforward explanation. “Why should I keep my white slavemaster’s name visible and my black ancestors invisible, unknown, unhonored?” Ali asked.
Then, it was more of an abstract concept, a statement against white oppression; Ali did not know much, if anything, about his ancestors or his own family tree. Decades later, though, Ali’s family has made a discovery that appears to shed new light on the boxer’s lineage — where he came from, and also his place in American history. Ali, according to his family’s research, is the great-great-great grandson of Archer Alexander, a slave who heroically fought both for his own freedom and against slavery.
Alexander escaped from bondage and surreptitiously fed information to the Union Army during the Civil War. He was later the model for the slave depicted in the Emancipation Memorial, a statue in Lincoln Park, about a mile east of the U.S. Capitol.
|
350ca4c2bf118bf6446c775e8b7d4af3 | https://historynewsnetwork.org/article/170258 | With a House Takeover, Democrats Could Get Trump’s Tax Returns | With a House Takeover, Democrats Could Get Trump’s Tax Returns
Democrats, eyeing control of a powerful House tax-writing committee next year, are studying a century-old provision in the federal tax code that could give them access to President Trump’s long-sought tax returns and eventually the ability to make them public.
The powers laid out in an obscure 1920s addition to the tax code are clear: The leaders of Congress’s tax-writing committees, including the Ways and Means Committee in the House, are empowered to request from the Treasury Department tax returns or related information on any tax filer. Democrats could use that information to finally determine if Mr. Trump, who built a global business empire before entering politics, has problematic financial entanglements with Russia or other undisclosed conflicts of interest….
The provision — Section 6103 in the tax code — dates to the early 1920s, when Congress was mired in investigations of bribes supposedly paid to officials in the Harding administration by citizens seeking leases of federal lands containing rich oil reserves. Lawmakers found themselves reliant on the president to release tax information on officials accused of wrongdoing and argued that to uphold the proper separation of powers and serve as a check on the executive branch, they needed authority to obtain tax returns on their own.
“The whole tenor of the debate was not that different from today’s debate,” said George K. Yin, a University of Virginia tax law professor.
In the decades since, the authority has scarcely been used. In 1974, one committee used it to supplement tax information provided by President Richard M. Nixon to investigate whether he had taken improper tax positions and later to release a nearly 1,000-page bipartisan report on its findings.
In 2014, in a more partisan push, Republicans on the Ways and Means Committee used it to obtain and publicly release tax information as part of an investigation into whether the I.R.S. discriminated against conservative entities seeking tax-exempt status. Democrats denounced it as a political smear that abused the committee’s authority.
|
21be4064082d483a177c1b6089b656c5 | https://historynewsnetwork.org/article/170287 | Trump Is Hitting the Midterm Campaign Trail Hard | Trump Is Hitting the Midterm Campaign Trail Hard
President Trump may not be running for reelection yet himself, but the days and weeks before the 2018 midterms have been busy for him nonetheless.
On Thursday night, Trump appeared in Missoula, Mont., to stump for Senate candidate Matt Rosendale and incumbent Rep. Greg Gianforte running for re-election; on Friday he went to Mesa, Ariz., for Senate candidate Martha McSally; and on Monday he’ll be in Houston for Sen. Ted Cruz. There’s no question that the rallies are drawing crowds — the Houston event had to be moved to a bigger venue — but whether they actually make a difference will have to wait till Election Day to be seen.
That said, enough people think it works that it’s considered normal for a sitting President to campaign for congressional candidates during midterms season. But that wasn’t always the case — and when Presidents do get involved, it doesn’t always help.
|
c57d3d942ad3de8bfcafccb439323ace | https://historynewsnetwork.org/article/170299 | It’s time to make Election Day a holiday — in law and spirit | It’s time to make Election Day a holiday — in law and spirit
A century and a half before the Fourth of July had any particular significance, before Christmas was widely celebrated and long before Thanksgiving was a national holiday, there was Election Day.
From the earliest New England settlements, colonial elections were public feast days, when people put on their best clothes and paraded into town with neighbors and friends. These Puritans would sit for the Election Day sermon — among the most important religious events of the year, with prayers for elected leaders and warnings to the community to stay true to their values — before heading off for discussion, snacks, rounds at the tavern and a communal dinner.
Through the Revolutionary period, the Election Day tradition evolved into more lavish public affairs, at which voters could expect to be treated with barbecue, cake and rum punch. George Washington provided 158 gallons of alcohol to voters during one Virginia election.
Although Americans today are unaccustomed to debating political issues around open barrels of booze, other aspects of our long Election Day tradition should be revived, along with the passionate electoral engagement that accompanied it. American voter participation is abysmal compared with other established democracies, trailing behind countries such as France and Mexico that observe federal holidays for general elections — and also compared with Americans of the 19th century.
The far more robust voter turnouts of this earlier period, in which elections occasioned boisterous public festivities, reveal a civic culture that we’ve lost. That culture was not perfect; Election Day was not only a day of celebration but also of exclusion and, importantly, of resistance. We should seize on this history — and learn from its blind spots — to imagine a public, social democracy in the United States today.
|
227d2a0b5e3355c27e57c7946be23848 | https://historynewsnetwork.org/article/170303 | Georgia election fight shows that black voter suppression, a southern tradition, still flourishes | Georgia election fight shows that black voter suppression, a southern tradition, still flourishes
Georgia’s Republican Secretary of State Brian Kemp has been sued for suppressing minority votes after an Associated Press investigation revealed a month before November’s midterm election that his office has not approved 53,000 voter registrations – most of them filed by African-Americans.
Kemp, who is running for governor against Democrat Stacey Abrams, says his actions comply with a 2017 state law that requires voter registration information to match exactly with data from the Department of Motor Vehicles or Social Security Administration.
The law disproportionately affects black and Latino voters, say the civil rights groups who brought the lawsuit.
As a scholar of African-American history, I recognize an old story in this new electoral controversy.
Georgia, like many southern states, has suppressed black voters ever since the 15th Amendment gave African-American men the right to vote in 1870.
The tactics have simply changed over time.
Democrats’ southern strategy
With black populations ranging from 25 percent to nearly 60 percent of southern state populations, black voting power upended politics as usual after the Civil War.
During Reconstruction, well over 1,400 African-Americans were elected to local, state and federal office, 16 of whom served in Congress.
Loyal to President Abraham Lincoln, whose Emancipation Proclamation sounded the death knell for slavery, black Americans flocked to the Republican Party. Back then, it was the more liberal of the United States’ two mainstream political parties.
Southern Democrats fought back, using both violence and legislation.
White paramilitary groups like the Ku Klux Klan and White Leagues threatened black candidates, attacked African-American voters, pushed black leaders out of office and toppled Republican governments.
After establishing single-party control over the South, white Democrats in the late 1800s instituted a poll tax, making voting too expensive for former slaves and their descendants.
“White primaries” excluded blacks from choosing candidates in primary elections.
These attacks proved effective. Between 1896 and 1904, the number of black men who voted in Louisiana plummeted from 130,000 to 1,342.
After North Carolina U.S. Rep. George White retired, in 1901, the South would send no African-Americans to Congress until the 1972 election.
Voter suppression in Jim Crow Mississippi
In the early 20th century, many black Americans voted with their feet, migrating north and west.
Around the same time, President Franklin Delano Roosevelt’s New Deal – which instituted racial quotas in hiring for federal public work projects and included policies aimed at reducing inequality – was shifting northern black voters’ allegiance to the Democratic Party.
Black voters in northern cities began putting African-American Democrats into congressional office.
An 1879 cartoon in Harper’s Magazine satirizes the requirement that African-Americans pass a literacy test to vote. U.S. Library of Congress
But they did not give up on the South, pressing the Supreme Court to reaffirm voting rights in the 1944 case Smith v. Allwright, which prohibited white-only primaries.
But black voter suppression remained deeply entrenched in the South. Several states required new voters to complete literacy tests before they could cast a ballot. In the 1880s, 76 percent of southern blacks were illiterate, versus 21 percent of whites.
Strategies for excluding black voters evolved along with federal law.
In reaction to Brown v. Board of Education, which in 1954 overturned “separate but equal” segregation laws, Mississippi in the same year modified its poll test. It asked voters to interpret a section of the state’s constitution, authorizing county registrars to determine whether the applicant’s answer was “reasonable.”
Virtually all African-Americans, regardless of education or performance, failed.
Within a year, the number of blacks registered to vote in Mississippi dropped from 22,000 to 12,000 – a mere 2 percent of eligible black voters.
Political violence – including the 1955 attempted assassination of voting rights activist Gus Courts and murder of George W. Lee – accompanied the legal restrictions, showing the cost of black political independence.
Fighting for the vote
Activists were not deterred. The Student Nonviolent Coordinating Committee and the Congress of Racial Equality continued to wage grassroots voter registration campaigns and fight for official representation in the Democratic Party.
In 1964, a new political party, the Mississippi Freedom Democratic Party, was founded to welcome “sharecroppers, farmers and ordinary working people.”
The Freedom Democratic Party elected 68 delegates to attend the 1964 Democratic National Convention in Atlantic City, New Jersey, hoping to transform the all-white Mississippi delegation.
Trying to broker a deal, national Democratic leaders extended Mississippi’s Freedom Democrats two nonvoting at-large seats at the convention – a minor concession that led most white Mississippi party members to walk out in protest.
Freedom Democrats rejected the two seats as tokenism, holding a sit-in on the convention floor in Atlantic City to highlight the lack of black political representation.
Aaron Henry, chair of the Mississippi Freedom Democratic Party delegation, speaks at the Democratic National Convention in 1964. Library of Congress/Warren K. Leffler
Black voters make gains
Over time, the civil rights movement sparked a political shift that dramatically changed the U.S. electorate.
The 24th Amendment outlawed poll taxes in 1964, abolishing a major barrier to black enfranchisement in the South. Literacy tests, too, were restricted, under the 1965 Voting Rights Act.
The Voting Rights Act also established federal oversight of voting laws to ensure equal access to elections, particularly in the South.
By the early 21st century, African-Americans constituted a majority of the registered Democrats in Deep South states from South Carolina to Louisiana. They turn out in high numbers and have been key voters for getting Democrats into office in the conservative-dominated South.
Voter suppression today
Over the past decade, Republican lawmakers have chipped away at the last century’s advances, enacting voter ID laws that make it harder to vote.
Claiming they seek to deter election fraud, some 20 states have restricted early voting or passed laws requiring people to show government ID before voting.
Voter identification laws have hidden costs, research shows.
Getting a government ID means traveling to state agencies, acquiring birth certificates and taking time off work. That puts it out of reach for many, a kind of 21st-century poll tax.
Federal and state courts have overturned such laws in some states, including Georgia, North Carolina and North Dakota, citing their harmful effect on African-American and Native American voters.
But the Supreme Court in 2008 deemed Indiana’s voter ID law a valid deterrent to voter fraud.
Perhaps most damaging to black voters was a 2013 Supreme Court decision that weakened the Voting Rights Act.
The Voting Rights Act of 1965 stopped southern districts from changing laws to exclude black voters – but only temporarily. Lyndon B. Johnson Library
Shelby County v. Holder ended 48 years of federal oversight of southern voting laws, concluding that the requirement relied on “40-year-old facts that have no logical relation to the present day.”
Current events show that voter suppression is hardly a thing of the past.
From Georgia’s voter registration scandal to gerrymandered districts that dilute minority voting power, millions may be shut out of November’s midterms.
Frederick Knight, Associate Professor of History, Morehouse College
This article is republished from The Conversation under a Creative Commons license. Read the original article.
|
ce7f2e69b87d2f9c617c1b9a74be9e0c | https://historynewsnetwork.org/article/170308 | Should you worry about American democracy? Here’s what our new poll finds. | Should you worry about American democracy? Here’s what our new poll finds.
In just two weeks, Americans will vote in a midterm election that may well decide what party controls Congress and the legislative agenda for the next two years.
But as Americans head into this democratic exercise, some observers wonder whether the country has become dissatisfied with democracy. Young Americans seem more willing to embrace nondemocratic modes of government, and traditionally disadvantaged minority groups less likely to find living in a democracy of absolute importance. Trust in political institutions and adherence to traditional democratic norms look to be in decline.
Our 2018 American Institutional Confidence Poll, sponsored by the John S. and James L. Knight Foundation and Georgetown University’s Baker Center for Leadership & Governance, explores these issues. Conducted in June and July, our research finds a mixed picture — including attitudes toward democratic institutions that vary strongly by partisan identity. What might that say about our democratic future?
|
43591448830e7821a074049d7acce12e | https://historynewsnetwork.org/article/170310 | ‘Voter fraud’ is a myth that helps Republicans win, even when their policies aren’t popular | ‘Voter fraud’ is a myth that helps Republicans win, even when their policies aren’t popular
Related Link Tweet thread on voter fraud by Heather Cox Richardson
Last weekend, President Trump tweeted a warning: “All levels of government and Law Enforcement are watching carefully for VOTER FRAUD, including during EARLY VOTING. Cheat at your own peril. Violators will be subject to maximum penalties, both civil and criminal!” But study after study has concluded that, in a nation of more than 300 million people, voter fraud is vanishingly rare. The myth of voter fraud persists only because it is ingrained in modern politics, after a decades-long effort by Republicans to portray Democratic votes, and especially African-American votes, as fundamentally suspect.
The modern myth of voter fraud began in 1986, President Ronald Reagan’s sixth year in office, when Republicans recognized that their policies could not attract a majority of voters. Their budget cuts had hit black Americans particularly hard. An attempt to weaken Social Security just before the election boded ill for the midterms, especially since the Republicans had to defend vulnerable senators elected in 1980 on Reagan’s coattails. How could Republicans address the gap between the unpopularity of the programs and their determination to win? Not by changing their policies, but by changing the composition of the electorate.
The GOP launched a “ballot integrity” program to prevent “voter fraud,” claiming that dead or fictional people were casting ballots. Republican officials sent mail to registered voters in heavily Democratic areas in Louisiana, Indiana, and Missouri, and if the mail came back as undeliverable, Republicans would challenge those individuals’ right to vote. Democrats sued, and the suit turned up a memo between Republican National Committee officials explaining the purpose behind the program: “I would guess that this program will eliminate at least 60-80,000 folks from the rolls,” one GOP operative wrote. “If it’s a close race, which I’m assuming it is, this could keep the black vote down considerably.”
Of course, the idea of keeping African-Americans away from the polls was not just a tactical choice, and it has a much longer history. After the Civil War, white racists insisted the country’s newly enfranchised black voters simply wanted government handouts and cared little about the good of the nation. Purging such undeserving ingrates from the body politic, the argument went, was a public service, leaving the nation’s government in the hands of white men who understood what was best for everyone. The civil rights movement resurrected this old linkage of racism, economics, and politics, as opponents of integration insisted that black Americans demanding equal rights were, in fact, trying to redistribute the wealth of hardworking white men to their own pockets through taxation.
Ronald Reagan played to this theme in his political rise, warning that taxes were sucking hardworking white men dry to support folks like the “welfare queen,” who, he said, “has 80 names, 30 addresses, 12 Social Security cards and is collecting veterans’ benefits on four non-existing deceased husbands. And she is collecting Social Security on her cards. She’s got Medicaid, getting food stamps, and she is collecting welfare under each of her names.” But it was not simply greedy black Americans destroying the country. Any American who believed that the government should regulate business or provide a basic social safety net — in short, anyone, Democratic or Republican, who accepted the premises of the New Deal state and was willing to levy taxes to pay for them — was dangerous. By 1990, Representative Newt Gingrich of Georgia was attacking even Republican president George H. W. Bush as a “RINO” — Republican in Name Only — because he agreed to a tax increase to combat a rising deficit. ...
|
831958328e87567cd6b0d4beec1d9023 | https://historynewsnetwork.org/article/170319 | Review of Doris Kearns Goodwin’s “Leadership: In Turbulent Times” | Review of Doris Kearns Goodwin’s “Leadership: In Turbulent Times”
There was, once upon a time, an extremely popular genre of American biographical literature, going back at least to Parson Weems’s hagiography of George Washington, written with the explicit purpose of inspiring young boys (and only boys) to emulate the example of great leaders and accomplish great things. That genre is now regarded with amused condescension if not contempt, and state US history curriculum standards, often echoing the ideology of Howard Zinn, identify few if any praiseworthy individuals in our history—in or out of the presidency.
Doris Kearns Goodwin is far too nuanced a historian to be taken in by either of these extremes. The four presidents explored in Leadership: In Turbulent Times have already been the subjects of Goodwin’s lengthy and extensively documented biographies: Lyndon Johnson and the American Dream (1976), No Ordinary Time: Franklin and Eleanor Roosevelt: the Home Front in World War II (1994—which won the Pulitzer Prize), Team of Rivals: the Political Genius of Abraham Lincoln (2005) and The Bully Pulpit: Theodore Roosevelt, William Howard Taft and the Golden Age of Journalism (2013). Clearly Goodwin’s latest work is different. One prominent historian expressed concern about Goodwin’s use of italicized “self-help bromides,” “bullet-point banalities” and “conference-room poster slogans” typical of the “leadership studies” curriculum in schools of business and public administration—in contrast to the rich narrative histories typical of her earlier, widely-acclaimed biographical work. (There is a 3-page bibliography on “Business Books on Leadership Skills” included after the 13-page historical bibliography.) Likewise, sundry responses on Internet opinion sites have also expressed reservations because Goodwin never explicitly mentions the current political situation and the Trump presidency.
This reviewer has known Goodwin since the early 1980s when she was researching The Fitzgeralds and the Kennedys: An American Saga (1987) at the JFK Library in Boston (where I was Historian from 1977 to 2000). She often made herself available to speak to visiting groups about her research and particularly enjoyed dramatically recounting bringing to light previously unknown archival evidence that required completely rethinking a particular event. I was, at that time, listening to the White House missile crisis tape recordings and discovering that Robert Kennedy’s book, Thirteen Days: A Memoir of the Cuban Missile Crisis, published posthumously in 1969, was an extremely misleading and inaccurate account of those historic meetings. One morning, as Goodwin was about to speak to an Elderhostel group, I chatted with her, in confidence, about my finding; she was fascinated, as a historian naturally would be, and urged me to follow the evidence wherever it led.
This personal experience, as well as my own historical instinct, suggest that she chose this time to write a very different type of book—one which steers clear of an explicit critique of the Trump presidency; instead, she opted to analyze several didactic and compelling historical episodes that contrast vividly in both style and substance with Trump’s leadership and which, by comparison, reveal the threat to democratic norms posed by his presidency.
Admittedly, there is not a great deal that is new or original in Leadership; the great majority of the historical material is available in countless other secondary works. However, Goodwin’s goal this time is to extract and identify the common chords of leadership that unite the four subjects of her previous work. In doing so, Goodwin, in fact, does not ignore Trump; he is referenced and exposed, though unnamed, by contrasting example on virtually every page of her dissection of democratic leadership.
I am convinced that she approached this book as a public historian and educator rather than as a scholar writing primarily for other scholars and even the more discerning history-reading public. In that spirit, she cites Theodore Roosevelt’s conviction that democracy can succeed only when there is “the fellow feeling, mutual respect, the sense of common duties and common interests, which arise when men take the trouble to understand one another, and to associate for a common purpose.” Likewise, Goodwin emphasizes Lincoln’s “emotional intelligence, his empathy, humility, consistency, self-awareness, self-discipline, and generosity of spirit” as well as his “sensitivity, patience, prudence...tenderness and kindness.” “Is leadership possible,” she asks, “without a purpose larger than personal ambition…[and] guided by a sense of moral purpose”? Lincoln’s leadership, especially “his patient resolve and freedom from vindictiveness,” she insists, is pertinent today, allowing us “to gain a better perspective on the discord of our times.” Goodwin’s historical judgment is all but inescapable despite her decision to only indirectly confront the divisive politics of the Trump era: in temperament and democratic vision and values, Trump is the anti-Lincoln.
The book follows a simple formula; it is divided into four sections with chapters on each of the four presidents: the first examines each individual from childhood through entry into public life; the second deals with the personal early-to-mid adult crises that each man had to face and overcome; the third section, the substantive core of the book, analyzes their successful mastery of democratic political leadership in the White House: for Lincoln, the subject is his careful preparation and judicious implementation of the Emancipation Proclamation in 1862; for TR, the topic is his politically and constitutionally unprecedented intervention in the economy to resolve the 1902 coal strike; for FDR, the emphasis is on his bold but pragmatic response to the ravages of the Great Depression during the first hundred days in 1933; for LBJ, the focus is on the traumatic transition of power after the JFK assassination in November 1963 and later on his brilliant guidance of the 1964 Civil Rights bill through Congress. (The fourth section, on the legacy of these four leaders, is rather cursory—except for the moving personal memories of her experience in Texas helping the former president to write his memoirs.)
But there is likely a more subtle formula also at work in this book. Goodwin may be evoking the so-called “improvement” (or “improving”) literature of the 19th century; in fact, Leadership unashamedly aims to inspire, not unlike the hagiographic biographies mentioned above, by examining specific lessons on how these four men exemplified the character and values suited to governing effectively in a democracy. In our nation’s current situation, this book may be far more necessary and valuable for its potential use in our schools and colleges than just one more scholarly presidential study. “It is my hope,” Goodwin asserts, “that these stories of leadership in times of fracture and fear will prove instructive,” especially, from the perspective of this reviewer, for teaching young Americans about the tumultuous story of our democracy and, by counterpoint, exposing the dangers we face today.
In 1990, I initiated the American History Project for High School Students at the JFK Library. The program consisted of three in-the-school classroom discussions of the study of history and historical methodology and culminated in a two-hour session at the JFK Library in which the students and teachers analyzed a set of documents about the June 1963 confrontation between the Kennedy administration and Governor George Wallace over the desegregation of the University of Alabama. If Leadership had been available at that time, it would have provided an invaluable addition to the core educational purpose of the project. Of course, before Trump, it is very unlikely that Goodwin would have even considered writing what can be viewed as a 21st century version of a 19th century “improvement” book.
Goodwin’s use of the subheads selected below, especially those in italics, would indeed have been out of place in her earlier narrative biographies. But this is clearly “no ordinary time,” and Goodwin, in the judgment of this reviewer, has chosen to write an out of the ordinary book:
Abraham Lincoln in 1862: Transformational Leadership
*Acknowledge when failed policies demand a change in direction
*Exhaust all possibility of compromise before imposing unilateral executive power
*Assume full responsibility for a pivotal decision
*Refuse to let past resentments fester
*Set a standard of mutual respect and dignity; control anger
*Keep your word
Theodore Roosevelt in 1902: Crisis Management
*Secure a reliable understanding of the facts, causes, and conditions of the situation
*Use history to provide perspective
*Be visible. Cultivate public support among those most directly affected
*Keep temper in check
*Share credit for the successful resolution
Franklin Roosevelt in 1933: Turnaround Leadership
*Restore confidence to the spirit and morale of the people
*Infuse a sense of shared purpose and direction
*Tell people what they can expect and what is expected of them
*Lead by example
*Tell the story simply, directly to the people
Lyndon Johnson in 1963-64: Visionary Leadership
*Honor commitments
*Master the power of narrative
*Know for what and when to risk it all
*Identify the key to success
*Put ego aside
*Set forth a compelling picture of the future
Our democracy is currently in uncharted political and constitutional waters. Leadership’s didactic topics and subheads could, in what is admittedly a best-case scenario, provide imaginative teachers with a classroom source capable of awakening and nurturing an interest in history in their students. That transformation will require, as Goodwin has done, exposing the chasm between the democratic political leadership of Lincoln, TR, FDR, and LBJ and that of the current President of the United States—with or without mentioning his name.
|
ed87ccd64637ac76bc0a531db5ecc29b | https://historynewsnetwork.org/article/170326 | 200 Years On, U.K. Hunts for Grave of Man Called World’s 1st Black Sports Star | 200 Years On, U.K. Hunts for Grave of Man Called World’s 1st Black Sports Star
Born a slave on Staten Island in 1763, Bill Richmond left America in 1777, never to return, and spent most of his life in Britain. But it was not until he was 40 years old that he began bare-knuckle boxing — a brutal sport that brought him fame, prestige and an invitation to the coronation of King George IV.
Yet even in his adopted country, where he has been called the world’s first black sporting superstar — or stereotype, some would say — Mr. Richmond’s remarkable life story is largely forgotten.
Now, almost two centuries after his death, in 1829, he is back in the limelight as a search begins in earnest for Mr. Richmond’s remains.
As part of a rail upgrade, one of London’s main stations is being redeveloped, prompting the excavation of a burial ground containing the remains of an estimated 45,000 Londoners, including Mr. Richmond.
|
3985ee0fe2d4228a7f7dfaef2ed2d6bf | https://historynewsnetwork.org/article/170407 | The first midterm ‘wave’ election that ended Republican control of government | The first midterm ‘wave’ election that ended Republican control of government
Days before congressional elections in 1874, the unthinkable suddenly seemed likely.
Republican dominance in Washington had been a fact of life since 1861, when Abraham Lincoln became president and the party held sway on Capitol Hill. With the election of Ulysses S. Grant in 1868, the party remained firmly in control — and when Grant was reelected four years later in a landslide, the Republican juggernaut seemed unassailable.
But by the autumn of 1874, the political climate had changed dramatically. An influence-peddling scandal, a depressed economy and lingering Southern white resistance to Reconstruction clouded Republican chances as never before. As Americans prepared to vote in elections that would determine control of the House of Representatives (senators were elected by state legislatures), the end of Republican supremacy in Washington suddenly seemed possible.
“The complexion of the next House of Representatives,” the Chicago Tribune predicted on Oct. 26, “is likely to be Democratic.”
That proved to be a colossal understatement.
|
aec2b170d4774f1f388293b475d949f3 | https://historynewsnetwork.org/article/170413 | Wonder How Trump Will Handle Defeat? Don’t Bother with History. | Wonder How Trump Will Handle Defeat? Don’t Bother with History.
Okay, assume that $5 billion later, the midterms play out pretty much as everyone has expected for the past two years: Democrats take the House (and a clutch of governorships) and Republicans keep the Senate. What happens next? What happens when someone who is biologically incapable of acknowledging error faces the result of an election in which—if the probable becomes real—the character and conduct of the president was in fact the overwhelming reason for Republicans losing their stranglehold on Congress?
If history were a guide, there would first be some humility. When the GOP lost the House and Senate in 2006, President George W. Bush called it a “thumpin’.” When Democrats lost more than 60 seats in the House in 2010, President Barack Obama called it a “shellacking.” After that would likely come some roll-up-the-sleeves cooperation between a chastened but realistic White House and a newly ascendant opposition. After all, that’s what happened with Harry Truman, Ronald Reagan and Bill Clinton. Truman, notwithstanding his denunciation of the “do-nothing 80th Congress!”, worked with Republicans to shape a bipartisan postwar foreign policy. Reagan worked with Tip O’Neill and Dick Gephardt in the House and Bill Bradley in the Senate to hammer out deals on Social Security and tax reform. (Note to Republicans: that tax reform deal raised the capital gains tax to match the tax on ordinary income and closed or slashed a passel of loopholes that benefited high-income earners.) Clinton, even as his reelection campaign assailed the “Dole-Gingrich” Congress, joined his adversaries on matters from welfare reform to budgets.
But this is where present-day reality forces us to throw all that history out the window. Can anyone who has watched Trump over the past three-plus years reasonably expect a man who considers himself the greatest president ever to reconsider his standing? The far greater likelihood is that he will take credit for saving the Senate and insist that any Republicans who did win their House races got there because he carried them over the line, while the ones who lost would have been beaten worse if not for his intervention.
|
b41e8cfbf1b31e8d5407234880426050 | https://historynewsnetwork.org/article/170417 | Lyndon B. Johnson warned us about this | Lyndon B. Johnson warned us about this
Fifty-two years ago, President Lyndon Johnson warned the nation not to be seduced by proponents of a white backlash. Less than 48 hours before the 1966 midterms, just like today, LBJ saw in the electorate a noxious mix of white anger, hatred and resentment.
Back then, the white backlash had taken form in response to the riots in the Watts neighborhood of Los Angeles one year earlier as well as to an open housing bill that his administration was trying to push through Congress aimed at eliminating racism in the sale or rental of property.
Speaking to reporters at a televised news conference in Fredericksburg, Texas, on November 6, Johnson read from a prepared statement in which he explained, "I can think of nothing more dangerous, more divisive, or more self-destructive than the effort to prey on what is called 'white backlash.' I thought it was a mistake to pump this issue up in the 1964 campaign, and I do not think it served the purpose of those who did. I think it is dangerous because it threatens to vest power in the hands of second-rate men whose only qualification is their ability to pander to other men's fears. I think it divides this nation at a very critical time -- and therefore it weakens us as a united country."
Though LBJ had been a product of the South and had opposed civil rights legislation earlier in his career, he had come to fully embrace the cause of civil rights with a religious zeal, pushing the Civil Rights Act of 1957 while serving as Senate majority leader and then as President moving the Civil Rights Act of 1964 and the Voting Rights Act of 1965 through the Congress.
The President, who was frustrated that the accomplishments from his Great Society were at risk in 1966, didn't hold back when speaking to the reporters. "I think that the so-called 'white backlash' is destructive, not only of the interests of Negro Americans, but of all those who stand to gain from humane and farsighted government. And those that stand to gain from humane and farsighted government is everybody. Nevertheless, there are those who try to stimulate suspicion into hatred, and to make fear and frustration their springboard into public office. Many of them do it openly. Some let their henchmen do it for them. Their responsibility is the same." ...
|
a192ce4719f3a501524cc5cd6511980d | https://historynewsnetwork.org/article/170422 | Why and How Donald Trump Flunks the Presidential Leadership Test | Why and How Donald Trump Flunks the Presidential Leadership Test
Now that the midterm elections are over there will be some reassessments of President Donald Trump. But whatever the verdict on his appeal to supporters or as a campaigner, he remains a failure as the leader of our country.
After he had completed almost a full year as president, 155 presidential scholars concluded that overall he had earned an “F” grade on their Presidential Greatness Survey. In addition to assigning him a general grade, the scholars also graded him on his legislative accomplishments, communicating with the public, foreign policy leadership, and embodying institutional norms. For the first two areas they gave him a “D”; for the last two, an “F.” A slightly larger group of scholars listed him as the worst and most polarizing of our 44 presidents.
Shortly before the above survey appeared, one of the most prominent historians of our presidents, Robert Dallek, stated that “it is clear Trump is unfit to serve,” and lawmakers should “invoke the 25th Amendment” to remove him from office. Dallek has written books about Franklin Roosevelt, Truman, Nixon, Kennedy, Lyndon Johnson, and Reagan. Another leading historian and author of a new book on U.S. presidents at war, Michael Beschloss, recently declared that because Trump lacks historical knowledge, empathy, and self-restraint, and “will grab for as much power as is available,” he is a more dangerous man to have in the White House than the previous presidents Beschloss has studied.
Of course, many Trump supporters might reply, “What can you expect from a group of eggheads?” Actually, the survey group mentioned above was the “Presidents & Executive Politics Section” of the American Political Science Association, which is listed as “the foremost organization of social science experts in presidential politics.” But the anti-intellectualism of Trump and many of his followers is well known, and is often stoked by Fox News and its ilk. And railing against academics or scientists does not invalidate their expertise. “Maybe so,” a Trump supporter might respond, “but how about their biases?”
True, historians, political and other social scientists, and natural scientists (including many who emphasize that human-caused climate change is a major danger) all have their biases. But central to their disciplines is the goal of truth-seeking and objectivity—which cannot be said for ideologues or political extremists—and academics and other experts, especially those in the natural sciences, are not uniformly liberal or progressive. For example, among the 155 scholars who assigned Trump an “F,” 41 percent described themselves as moderate or conservative. The moderates gave Trump an “F,” while the conservatives were more generous, assigning him a “D.” In addition, the scholars who ranked the presidents also listed the conservative Ronald Reagan as our ninth best one.
The Presidential Greatness Survey was completed in January 2018. Has President Trump done anything since then that would lead historians to raise his grade? After all, in September he claimed in a United Nations speech, “In less than two years, my administration has accomplished more than almost any administration in the history of our country. . . . so true.”
At about the same time, the White House posted a list of 52 Trump Administration accomplishments, many of them coming since January 2018. They dealt mainly with economic gains, including a tax cut; eliminating regulations, many relating to the environment; appointing more conservative judges; new foreign policy initiatives, such as renegotiating NAFTA, imposing new tariffs, and withdrawing from what Trump considered a “horrible, one-sided Iran Deal”; and toughening immigration policies, which includes the dubious claim of beginning a wall on the U.S.-Mexican border.
The main problem with the Trumpian list is that most of the “accomplishments” come with a heavy cost that the White House ignores.
Economic gains and tax cut? True, the unemployment rate has dropped from 4.7 to 3.7 percent since Trump took office, but he inherited an economy from President Obama that had racked up 75 straight months of job growth and reduced unemployment by more than 3 percentage points (from 7.8 to 4.7 percent). Regarding the tax cut, it is skewed toward the wealthy and increases the federal deficit, which jumped almost 17 percent in fiscal 2018 compared with a year earlier. The great harm of this escalating deficit, like Trump's climate policies, may now be ignored by many, but will wreak its woe in the future.
Eliminating regulations? The cutback on government regulations, especially regarding environmental protections, is already having negative consequences and is likely to have even more dire ones in days to come.
Appointing more conservative judges? An “accomplishment” only for conservatives, not for a majority of our citizens who are not on that side of the partisan divide.
New foreign policy initiatives? Many of these Trumpian actions, like withdrawing from the Iran nuclear deal and imposing new tariffs on foreign products, are considered backward steps by various experts. (The same can be said for the more recent announcement of the U.S. intention to withdraw from the Reagan-Gorbachev 1987 Intermediate-Range Nuclear Forces Treaty.)
Toughening immigration policies, including initiatives to build a border wall? Trump's latest assertion that he intends to end the policy of automatic citizenship for children of undocumented immigrants, his comments about a caravan of migrants moving closer to the U.S.-Mexican border – "it almost looks like an invasion. . . . It really does look like an invasion" – and his plan to send U.S. troops to the border all continue his fearmongering rhetoric about our immigration policies allowing in too many terrorists, rapists, and killers. As retired General and former Secretary of State Colin Powell said about the dispatch of troops, there was "no threat requiring this kind of deployment."
Nevertheless, as Andrew Sullivan noted in October, “Trump’s record as a force of destruction is profound, whether it be the sabotage of Obamacare, the devastation of democratic norms, or the rattling of NATO,” and “he has also shifted the entire polity more decisively toward the authoritarian style of government. In this respect, yes, the Trump administration has indeed accomplished much more than many of us want to believe.”
Despite, or because of, what Trump has so far accomplished in 2018, we are still left with the question: does his leadership (or lack thereof) still deserve an "F"? One presidential historian who seems to suggest the answer is still "yes" is Doris Kearns Goodwin. She has written extensively about presidents such as Abraham Lincoln, Theodore Roosevelt, Franklin Roosevelt, and Lyndon Johnson. Her most recent book, Leadership: In Turbulent Times, also focuses on these four presidents.
In an October 2018 interview she suggested that Trump fails as a leader because, unlike him, a “leader normally takes blame when things are wrong,” “shares credit when things are right,” “controls his negative emotions,” “communicates honestly and with truth,” and surrounds himself with “a team that’s built with people who are strong-minded and can argue with him.” Rather than rejoicing in working with others to solve difficult political problems, Trump mainly seems most animated “when he’s arguing with other people and calling them names.”
In her leadership book, she writes of the four presidents she focuses on: "at their formidable best, when guided by a sense of moral purpose, they were able to channel their ambitions and summon their talents to enlarge the opportunities and lives of others."
Regarding "moral purpose," one review of her book cites her quotation from Theodore Roosevelt that democracy can succeed only when there is "the fellow feeling, mutual respect, the sense of common duties and common interests, which arise when men take the trouble to understand one another, and to associate for a common purpose."
The review also notes that the book emphasizes Lincoln's "emotional intelligence, his empathy, humility, consistency, self-awareness, self-discipline, and generosity of spirit" as well as his "sensitivity, patience, prudence. . . tenderness and kindness." "Is leadership possible," Goodwin asks, "without a purpose larger than personal ambition . . . [and] guided by a sense of moral purpose?" The reviewer believes that even though Goodwin never mentions Trump in her book, she strongly suggests that "in temperament and democratic vision and values, Trump is the anti-Lincoln."
Decades before Goodwin's book on leadership appeared, another presidential historian, FDR biographer James MacGregor Burns, had written on the topic. For presidential leadership, he also emphasized the importance of values and moral purpose: "Hierarchies of values . . . undergird the dynamics of leadership." In discussing FDR, he wrote, "It was because Roosevelt's fundamental values were deeply humane and democratic that he was able, despite his earlier compromises and evasions, to act [against Hitler] when action was imperative." Such leadership reflects "considerations of purpose or value that may lie beyond calculations of personal advancement."
Still another historian, Ronald Feinman, who in 2018 wrote about 14 presidents who have demonstrated moral courage, has stated that “the most significant factor” in rating presidents’ greatness “is when they demonstrate moral courage on major issues that affect the long term future.” He believes that President Trump “has no ability within his own narcissistic, self-serving personality, to be anything like these Presidents.”
In an essay earlier this year, I also emphasized the importance of values and moral purpose and contended that Trump has no moral compass. Many of Trump's evangelical supporters may think that his stance on questions like abortion indicates his moral sympathies, but given his history on the issue, it is difficult to conclude that such sympathies mattered more than political opportunism.
Given the insights of Goodwin and others on presidential leadership, especially regarding "moral purpose," Trump's "F" for 2017 should not now be upgraded. In fact, as the midterm elections got closer, Trump's moral failures worsened. For example, he ramped up his lying and his fearmongering regarding illegal immigrants. (Washington Post headline, October 22, 2018: "Trump said 83 untrue things in a single day.")
As to the future, change is always possible. (Feinman mentions Ronald Reagan, "who after strong anti-Communist rhetoric, moved to end the Cold War by negotiating with Soviet leader Mikhail Gorbachev in the late 1980s.") Since Trump's egotism is much more anchored than any solid moral beliefs he may have, any change that he thinks might redound to his glory cannot be ruled out. (Remember that when asked by a reporter if he thought he should win the Nobel Peace Prize, he replied, "Everyone thinks so, but I would never say it.")
With the Democrats now in charge of the House of Representatives, Trump may be inclined to demonstrate that his July 2018 words—"I'm different than other [presidents]. I'm a dealmaker. I've made deals all my life. I do really well. I make great deals"—are not just hollow rhetoric. But compromising with Democrats and advancing legislation for the common good perhaps implies more humility than he can muster. Plus, other roadblocks to bipartisanship exist. Regarding the chances of Trump becoming a wiser and less polarizing president, the words that most come to mind are those of the poet W. H. Auden, who in a different context once wrote: "Is it likely? No."
|
a9f0afbccbf5a18dee58f07490483b09 | https://historynewsnetwork.org/article/170426 | Former Belgian King Ordered to Give DNA for Paternity Test | Former Belgian King Ordered to Give DNA for Paternity Test
Delphine Boël grew up certain that she should be a princess. Now a court ruling and a DNA test could make it true.
Ms. Boël, 50, is a Belgian visual artist who claims that she was conceived during an extramarital affair between King Albert II of Belgium and Ms. Boël’s mother, Baroness Sybille de Selys Longchamps.
The court ruling, published Monday, gave credence to her claim and ordered Albert to submit DNA evidence to determine whether he is Ms. Boël’s biological father.
|
b6a7342c7931c21b15b56a372560da9a | https://historynewsnetwork.org/article/170435 | The long, racist history of Florida’s now-repealed ban on felons voting | The long, racist history of Florida’s now-repealed ban on felons voting
In 1868, Florida’s white elites faced a threat every bit as grave as the Civil War that had ended in Confederate defeat three years earlier. Congress had just forced Florida to rewrite its constitution to allow every man the right to vote. But adding thousands of newly eligible black residents to the rolls would abruptly make whites a voting minority.
The old guard’s only hope was to somehow ban black voters without violating Reconstruction acts passed by Congress after the Civil War. Huddled in Tallahassee backrooms throughout that cool January, they found just the ticket: a lifetime voting ban on anyone with a felony conviction. Combined with postwar laws that made it easy to saddle black residents with criminal records, legislators knew they could suppress black votes indefinitely.
Or at least for a century and a half. On Tuesday, Florida voted to end that 150-year-old ban by backing Amendment 4, which will return voting rights to more than 1 million Floridians who have already served out their sentences. The amendment garnered 64 percent of the vote.
|
9e88530a28580e2ea451e2af3592c373 | https://historynewsnetwork.org/article/170441 | Crossing From Asia, the First Americans Rushed Into the Unknown | Crossing From Asia, the First Americans Rushed Into the Unknown
Nearly 11,000 years ago, a man died in what is now Nevada. Wrapped in a rabbit-skin blanket and reed mats, he was buried in a place called Spirit Cave.
Now scientists have recovered and analyzed his DNA, along with that of 70 other ancient people whose remains were discovered throughout the Americas. The findings lend astonishing detail to a story once lost to prehistory: how and when humans spread across the Western Hemisphere.
The earliest known arrivals from Asia were already splitting into recognizably distinct groups, the research suggests. Some of these populations thrived, becoming the ancestors of indigenous peoples throughout the hemisphere.
|
7f7356d02417058a55631a2c860cb6b8 | https://historynewsnetwork.org/article/170442 | The president's party just suffered a historically big wipeout during a strong economy | The president's party just suffered a historically big wipeout during a strong economy
President Trump argued in his post-election news conference that the Republican Party’s performance in the midterms defied history. An analysis by a JPMorgan economist suggests Trump is right — just not in the way he intends.
Measured against the strength of the economy, the GOP’s losses in the House mark the worst midterm results for a president's own party in at least a century, per Michael Cembalest, JPMorgan Asset Management’s chairman of market and investment strategy.
|
14cea837a7e55c861dcf6196621510ec | https://historynewsnetwork.org/article/170460 | Michael Beschloss says it’s time to stop celebrating Woodrow Wilson | Michael Beschloss says it’s time to stop celebrating Woodrow Wilson
On Nov. 11, the 100th anniversary of the Allied victory in World War I, I'm celebrating the heroism of American warriors in Europe. Perhaps 116,000 of them died in that struggle. Their commander in chief, Woodrow Wilson, did not match the quality of their service. During the conflict, Wilson made serious mistakes as a political leader that should never be forgotten.
Wilson’s missteps in wartime were hardly his only defects. His most disgraceful flaw was his racism. Given his high-flown rhetoric as a professor about elevating humankind, Wilson especially stood out in his white supremacy. He was not a man of his time but a throwback. His two predecessors, Theodore Roosevelt and William Howard Taft, had looked far kindlier on African Americans and their rights.
In 1916, Wilson, a Democrat, narrowly won reelection, campaigning under false pretenses with the slogan “He Kept Us Out of War.” Privately, however, he knew it was quite possible that he would take the nation into the European struggle soon after starting his second term.
As an academic, Wilson had emphasized the need for presidents to explain military setbacks and other complex or mystifying events to Americans. Yet he spent much of 1917, the first year of U.S. engagement in the war, in kingly isolation, rarely using his luminous oratorical gifts to explain to his countrymen why they needed to make severe sacrifices for a conflict that wasn’t an obvious, direct threat to America’s national security.
Wilson, who preened as a civil libertarian, persuaded Congress to pass the Espionage Act, giving him extraordinary power to retaliate against Americans who opposed him and his wartime behavior. That same law today enables presidents to harass their political adversaries. Wilson's Justice Department also convicted almost a thousand people for using "disloyal, profane, scurrilous or abusive language" against the government, the military or the flag. Wilson is an excellent example of how presidents can exploit wars to increase authoritarian power and restrict freedom, with some arguing that criticizing the commander in chief amounts to criticizing soldiers in the field. ...
|
bbd80c82eff94c958517d5c6762acb33 | https://historynewsnetwork.org/article/170474 | Democrats Aren’t Moving Left. They’re Returning to Their Roots. | Democrats Aren’t Moving Left. They’re Returning to Their Roots.
Be advised: “Democrats are in danger of going too far left in 2018.” So warn Republicans like Mitt Romney and ex-Democrats like Joe Lieberman and public personae as diverse as James Comey and Howard Schultz. In recent months, the pundit class has determined that the party’s leftward lurch heralds the rise of a “liberal tea party”—a movement that could very well unmoor Democrats from their longstanding center-left traditions, in close imitation of the spiral of events that caused the Republican Party to turn sharply to the right in recent years.
What’s fueling this argument? For one, more Democrats have rallied, either noisily or cautiously, around such policy innovations as “Medicare for all,” universal college and a universal basic income. That a smattering of Democratic candidates have elected to call themselves “democratic socialists” has only fueled the claim that such programs are “socialist.” “The center is Harry Truman and Daniel Patrick Moynihan, not Eugene Debs and Michael Harrington,” warned New York Times opinion columnist Bret Stephens recently. (Debs and Harrington were self-identified socialists.)
But there’s something wrong with this historical interpretation: Truman strongly supported single-payer health care. Moynihan supported a universal basic income in the 1960s. Dating back to World War II, Democrats sought to make a government-paid education available to as many Americans as possible. If Democrats are marching to the left, that road leads directly back to platforms and politicians who, in their day, commanded wide support and existed firmly in the mainstream of political thought.
What’s more, to label these programs “socialist”—which is to say, far outside the center of the political spectrum—reveals a constrained worldview. For over six decades, center-right parties in Europe—in Britain and France, Germany and Austria, and almost everywhere between—have either participated in or acceded to the very same policies.
In a remarkable way, today's debate strongly resembles a broader discussion that occurred in the United States and Europe in the 1940s, amid wartime mobilization and economic reconstruction, and in the decades following. ...
|
a0f3a0c83a3db828e1af066d86a22e27 | https://historynewsnetwork.org/article/170479 | Midterms and Troops: The Bid to Save a Party that Led to the Wounded Knee Massacre | Midterms and Troops: The Bid to Save a Party that Led to the Wounded Knee Massacre
On November 13, 1890, troops moved into South Dakota, a military movement that would result six weeks later in the Wounded Knee Massacre. The president sent soldiers to South Dakota, the largest movement of troops since the Civil War, in the midst of a midterm election campaign that looked bad for his party.
In 1890, Republican president Benjamin Harrison was facing a revolt in the midterms. The Republicans had risen before the Civil War as the party of ordinary farmers and workers, and had fought the Civil War to take control of America out of the hands of the nation’s wealthy slave owners. But after the war, Republicans had gradually swung behind the nation’s rising industrialists—men like Andrew Carnegie and J. D. Rockefeller—and propped up their industries with tariff walls that enabled them to keep consumer prices high. Voters, who hated the tariffs, increasingly backed the Democrats, who promised to lower them. Democrats had won the House of Representatives in 1874, and in 1884, Grover Cleveland became the first Democrat elected to the White House since the 1850s. Horrified Republicans had pulled out all the stops in 1888 to reclaim the government for their party.
In the 1888 election, they tapped large donors to fill the Republican war chest, then used the money to flood newspapers with pro-tariff arguments, warning that the Democrats were radicals who would destroy the economy, and promising that Republicans themselves would “reform” the tariff. But while Republicans’ strategy won the House of Representatives, it didn’t work for the presidency: Cleveland garnered about 100,000 more votes than the Republican candidate, Benjamin Harrison. So Republican operatives swung the election in the Electoral College, striking a backroom deal with the New York delegation to win an electoral victory. When the pious Harrison mused that Providence had given him the win, one of his operatives grumbled, “Providence hadn’t a damn thing to do with it. A number of men were compelled to approach the penitentiary to make him President.”
Harrison’s men recognized that they could not continue to hold power under the current system. So they rigged it. They admitted six new western states to the Union, the largest bulk admission of states since the original thirteen. In 1889, they split the huge Territory of Dakota into two parts—North Dakota and South Dakota—and added both to the Union, along with Washington and Montana. They fully expected the new states to vote Republican: when Montana went Democratic, they claimed the vote was fraudulent and replaced the Democrats with Republicans. In 1890, they added Wyoming and Idaho, moving so fast in the latter case that they had to call for volunteers to write a constitution that voters approved only months later. It was unclear that any of these western states even had enough people in them to justify statehood, but Republicans insisted the forthcoming 1890 census would prove that admitting them had been warranted. Administration men boasted that the admission of the new states would guarantee Republican control of the Senate and the Electoral College for the foreseeable future.
With this security in place, party leaders actually raised, rather than lowered, tariff rates just before the November election. They insisted that stronger protections for business would help workers by making the economy boom. ...
|
bcb971a794aa31f959f752d8a23b913e | https://historynewsnetwork.org/article/170485 | Who Gets the Last Word in a Disputed Senate Race? The Senate. | Who Gets the Last Word in a Disputed Senate Race? The Senate.
Amid the blur of lawsuits and noise around the recount in Florida’s Senate race lies a possible outcome that could be the ultimate twist: Senator Bill Nelson may not win the election, even if he wins the election.
That is because the Constitution says that the ultimate arbiter of who gets a Senate seat is the Senate itself — not election officials or the courts.
That would not seem to bode well for the Democrats or their incumbent, Mr. Nelson. Though he trails his Republican opponent, Gov. Rick Scott, the margin has dwindled to about 13,000 votes; Mr. Nelson hopes to make that up in the recount and take the lead.
If that were to happen, though, it is not unthinkable that Republicans would consider using their majority power in the Senate to refuse to seat Mr. Nelson and to give the seat to Mr. Scott instead — especially considering how he and his party have repeatedly insisted, without offering evidence, that the ballot review process has been riddled with fraud and misconduct.
|
0afee6fec304411d6a61a372b5bb7bbd | https://historynewsnetwork.org/article/170559 | Neanderthals and Humans Were No One Night Stand | Neanderthals and Humans Were No One Night Stand
Many people have a little bit of Neanderthal DNA. In recent years, this discovery has led scientists to conclude that early humans mated with Neanderthals over a single period of time. However, new research suggests that these groups mated with each other over multiple encounters—in other words, this was no one night stand.
As early humans migrated out of Africa, they interacted and mated with Neanderthals who lived in Europe and parts of Asia. People whose ancestors stayed in Africa don’t have any Neanderthal DNA because these two groups never got a chance to meet up. In contrast, everyone else in the world has about two percent Neanderthal DNA from when their ancestors frolicked with these ancient hominins during their migrations.
The reason researchers think Neanderthals and early humans had multiple encounters is that the percentage of Neanderthal DNA a person carries depends on where their ancestors came from. Compared with people of European ancestry, people of East Asian ancestry carry 12 to 20 percent more Neanderthal DNA. This research, published in Nature on November 26, 2018, suggests that early humans and Neanderthals didn't come together during just one historical episode.
|
6a2fd8ec2d881c8f98ee7b1adf3e2a67 | https://historynewsnetwork.org/article/170561 | Yale's David Blight explains why he was drawn to Frederick Douglass | Yale's David Blight explains why he was drawn to Frederick Douglass
A Conversation with David Blight, author of FREDERICK DOUGLASS
Q. You worked on this book for ten years, but first began researching Douglass as a PhD student. What initially inspired you to study Douglass? Did you always plan to write a biography of Douglass?
After that first book in the late 1980s, and over time, I edited new editions of Douglass's first and second autobiographies, a new edition of the Columbian Orator (the book that changed Douglass's life as a slave), and wrote a number of essays on the former slave. But I did not really intend to write a full biography until I encountered the Evans collection in Savannah. Only then did I decide to attempt a full life of Douglass. I was initially inspired to study Douglass in graduate school in the early 1980s because I wanted to research and write about abolitionists and the coming of the Civil War. I especially wanted to probe the stories of black abolitionists. Douglass was perhaps the greatest of all abolitionists, and I was especially drawn to the famous orator and writer's masterful use of words. The research for this biography took me many places and to many archives in the US and the UK. The places in Douglass's life are very important to his biography.
Q. Describe the private collection of Douglass papers and material that you had access to during your research. How did you come upon this collection?
In 2008 I first encountered the private collection of Douglass manuscripts owned by Walter O. Evans while on a lecture trip to Savannah, GA. The material was less discovered than it was purchased over time from other collectors. I was the first Douglass scholar to extensively consult this collection and my book is the first full biography ever to use the collection.
The collection consists of ten or so Douglass family scrapbooks assembled largely by Douglass's sons, Lewis, Frederick Jr., and Charles; many family letters and other documents; photographs; as well as handwritten and typescript versions of many speeches. The collection especially contains thousands of newspaper clippings from the final third of Douglass's life, from the 1860s to the 1890s. I spent many weeks over the past nine years doing research on the Evanses' dining room table as their guest. It was my great luck to encounter Walter and his collection when I did. It is one of those stories we historians dream about, and a story about the relationships that are possible between astute and deeply knowledgeable collectors and the scholars who depend upon them.
Q. What insights did you glean from your research using the Evans collection that you had not previously known about Douglass?
The Evans collection allowed me unprecedented access to the Douglass of the last thirty years of his life, the 1860s to 1890s. It opened worlds we had not seen before: his family relationships, his back-breaking and nearly endless lecture tours into old age, his place and role as the leader of black America in Washington DC, his finances revealed in the account books, his roles as Marshal of DC and Recorder of Deeds of DC, his role and place as a symbolic leader, his image out across the land where he travelled, his lecture tours in the deep South about which we knew almost nothing before, and his hugely controversial marriage to his second wife, Helen Pitts. The Evans collection above all gave me new insights into the extraordinary trajectory of Douglass's life—the former slave born in a backwater of the South who rises to be the most famous abolitionist in America, who lives to see his cause triumphant in the Civil War, but then also lives long enough (another 30 years) to see much of that victory betrayed and eclipsed. Above all the Evans collection makes possible the fullest critical treatment of the older and aging Douglass ever attempted, and I have made this story a primary thread of the book.
Q. Can you discuss the tension the aging Douglass experienced as he made the transition from radical outsider to political insider and symbolic figure of great fame?
Few radical reformers in history live to see their causes triumph, and then also live long enough to become a political insider within the government or a system they had fought to overthrow, destroy, and reinvent. Nelson Mandela comes to mind. Vaclav Havel and many other Eastern European leaders after the fall of the Soviet Union and the Berlin Wall also come to mind. Some of America's major Civil Rights leaders who later became major office holders also are good examples. Douglass is the greatest example of this phenomenon in the 19th century. In his case this meant becoming a loyal advocate of the Republican Party for thirty years as it decisively changed from the party of emancipation and the 13th, 14th, and 15th Amendments to the party of big business and the retreat from the egalitarian transformations of Reconstruction. Douglass became an office holder (by appointment, not election), and he often became a symbol as much as an actual political leader. Douglass always had to live up to expectations of performing as the black leader, the voice of the freedpeople, the former slave who had to prove the capacities of black people. But above all Douglass also became in the final thirty years of his life, 1865-1895, a patriarch of a huge extended family of three sons, one daughter, and twenty-one grandchildren. Along with his two wives over time, these kinfolk all became to one degree or another financially dependent on Douglass. Living in Washington, his family emerged as a kind of black first family in the District of Columbia press. Douglass, therefore, lived with an acute problem of "fame," in all its positive and negative aspects.
Q. Douglass was a man of contradictions, and in the book you address how Douglass has become a figure adopted by all elements of the political spectrum—even current Republicans, who have claimed Douglass as one of their own. Why do you think that is?
Like all great writers and leaders who live a long time, Douglass was a person embodying contradictions. He possessed a long love/hate relationship with America. He became a radical patriot who also levied some of the most withering attacks on American racism, hypocrisy, and craven defense of slavery and states’ rights. He could be an enormously compassionate man toward all people undergoing oppression and indignity, but he also during the war practiced a vicious brand of war propaganda aimed at the destruction not only of the Confederacy but of white Southerners themselves. Douglass was a vehement advocate of the natural rights tradition, of human equality, and of aggressive use of interventionist, activist government. But he was at the same time a persistent voice of black self-reliance, of his people’s efforts to make their own way in the world, before and after emancipation. Douglass often argued that the federal government should let black folk alone, but always give them at the same time fair play and a fair chance. Unfortunately, modern-day conservatives have appropriated Douglass’s advocacy of self-help, claimed him as a voice of limited government (nothing could be more wrong), and therefore all but erased his radical abolitionism, his fierce fight for equality, as they stress how Douglass was a “Republican.” Douglass was a member of a Republican Party decidedly different, with a very different history, especially about race, from the Republican Party that exists today and has frankly existed now for several generations. Douglass is now one of the major figures of American history that various parts of our political spectrum try to claim for their own causes and uses. This usually says much more about the people or groups appropriating Douglass than about Douglass himself.
Q. How would you characterize Douglass’s legacy today? What lessons could we all—political leaders, cultural leaders, and active citizens—take from his life and work?
Douglass delivers many legacies to us today in the 21st century, both from the trajectory of his life and from his ideas and writings. He is, first, one of the best examples ever of a person who led by language, a genius with words whose oratory and writing provide the primary reasons we know him. Second, Douglass delivered over and over a critique of America as a slave society that had to be dismembered and destroyed before being re-created around the idea of human equality. Third, Douglass's writings, especially in the autobiographies, constitute the most compelling descriptions and analysis of the nature and meaning of American slavery crafted by any American. Fourth, on a personal level, for anyone who has ever experienced despair, captivity, oppression in many forms, displacement, isolation of the soul, or legal and political denial, Douglass's story, and his writings, offer a deep well of hope and inspiration. Fifth, Douglass might have given up on the cause of abolition, of emancipation, of US victory in the Civil War, or of the endurance of the triumphs for black rights in Reconstruction. But he never truly gave up. That alone gives his life and thought lasting use and significance. Sixth, Douglass was a great editor, writer, speaker. He was an organizer, a creator of and believer in social protest movements. All who seek social and political change or transformation do well to examine Douglass's example. Seventh, Douglass remains a classic model of political pragmatism grown out of radicalism. His story shows us over and again that all revolutions will lead to counter-revolutions. A true reformer has to keep a long view of history, and try to fashion the most effective and not always the most radical method of change. Eighth, Douglass not only lived a heroic life in his escape from slavery and the remaking of himself in freedom; he became a major thinker – about the nature of history, about the natural rights tradition, about political and constitutional philosophy, about the elements of morality in human nature. Ninth, Douglass has a great deal to tell us eternally about what it means to be an American, and about how the issue and history of race stands at the center of that question. And tenth and finally, but not least, Douglass's world view, sense of history, and his gripping talent for storytelling rested deeply in his reading and use of the Bible, especially the Old Testament. Just why Americans in the nineteenth century were so steeped in Biblical story and metaphor is beautifully and powerfully on display in Douglass's life and work.
Q. You have spent over thirty years of your life studying Douglass. If you had the chance to sit down with Douglass, what is the one thing you would want to know?
There are a thousand questions I would want to ask Douglass if I had him in a room. But if I only get one it would be something like: How, sir, would you characterize the meaning and lasting significance of the Civil War in your life and that of your entire family? In this question I would hope to keep him talking about the many possible meanings he might raise.
|
65aa4fa01bc11a1383cc3d8d526ab3ab | https://historynewsnetwork.org/article/170577 | The DNA Industry and the Disappearing Indian | The DNA Industry and the Disappearing Indian
Amid the barrage of racist, anti-immigrant, and other attacks launched by President Trump and his administration in recent months, a series of little noted steps have threatened Native American land rights and sovereignty. Such attacks have focused on tribal sovereignty, the Indian Child Welfare Act (ICWA), and the voting rights of Native Americans, and they have come from Washington, the courts, and a state legislature. What they share is a single conceptual framework: the idea that the long history that has shaped U.S.-Native American relations has no relevance to today’s realities.
Meanwhile, in an apparently unrelated event, Senator Elizabeth Warren, egged on by Donald Trump’s “Pocahontas” taunts and his mocking of her claims to native ancestry, triumphantly touted her DNA results to “prove” her Native American heritage. In turning to the burgeoning, for-profit DNA industry, however, she implicitly lent her progressive weight to claims about race and identity that go hand in hand with moves to undermine Native sovereignty.
The DNA industry has, in fact, found a way to profit from reviving and modernizing antiquated ideas about the biological origins of race and repackaging them in a cheerful, Disneyfied wrapping. While it’s true that the it’s-a-small-world-after-all multiculturalism of the new racial science rejects nineteenth-century scientific racism and Social Darwinism, it is offering a twenty-first-century version of pseudoscience that once again reduces race to a matter of genetics and origins. In the process, the corporate-promoted ancestry fad conveniently manages to erase the histories of conquest, colonization, and exploitation that created not just racial inequality but race itself as a crucial category in the modern world.
Today’s policy attacks on Native rights reproduce the same misunderstandings of race that the DNA industry is now so assiduously promoting. If Native Americans are reduced to little more than another genetic variation, there is no need for laws that acknowledge their land rights, treaty rights, and sovereignty. Nor must any thought be given to how to compensate for past harms, not to speak of the present ones that still structure their realities. A genetic understanding of race distorts such policies into unfair “privileges” offered to a racially defined group and so “discrimination” against non-Natives. This is precisely the logic behind recent rulings that have denied Mashpee tribal land rights in Massachusetts, dismantled the Indian Child Welfare Act (a law aimed at preventing the removal of Native American children from their families or communities), and attempted to suppress Native voting rights in North Dakota.
Profiting by Recreating Race
Let’s start by looking at how the ancestry industry contributes to, and profits from, a twenty-first-century reformulation of race. Companies like Ancestry.com and 23andMe lure customers into donating their DNA and a hefty sum of money in exchange for detailed reports claiming to reveal the exact geographical origins of their ancestors going back multiple generations. “Who do you think you are?” asks Ancestry.com, typically enough. The answer, the company promises, lies in your genes.
Such businesses eschew the actual term "race" in their literature. They claim instead that DNA reveals "ancestry composition" and "ethnicity." In the process, however, they turn ethnicity, a term once explicitly meant to describe culture and identity, into something that can be measured in the genes. They conflate ethnicity with geography, and geography with genetic markers. Perhaps you won't be surprised to learn that the "ethnicities" they identify bear an eerie resemblance to the "races" identified by European scientific racist thinking a century ago. They then produce scientific-looking "reports" that contain purportedly exact percentages linking consumers to places as specific as "Sardinia" or as broad as "East Asia."
At their most benign, these reports have become the equivalent of a contemporary parlor game, especially for white Americans who make up the vast majority of the participants. But there is a sinister undertone to it all, reviving as it does a long-discredited pseudoscientific basis for racism: the notion that race, ethnicity, and ancestry are revealed in the genes and the blood, and passed down inexorably, even if invisibly, from generation to generation. Behind this lies the assumption that those genes (or variations) originate within clearly defined national or geographic borders and that they reveal something meaningful about who we are -- something otherwise invisible. In this way, race and ethnicity are separated from and elevated above experience, culture, and history.
Is There Any Science Behind It?
Although all humans share 99.9% of our DNA, there are some markers that exhibit variations. It’s these markers that the testers study, relying on the fact that certain variations are more (or less) common in different geographical areas. As law and sociology professor Dorothy Roberts puts it, “No sooner had the Human Genome Project determined that human beings are 99.9% alike than many scientists shifted their focus from human genetic commonality to the 0.1% of human genetic difference. This difference is increasingly seen as encompassing race.”
Ancestry tests rely on a fundamental -- and racialized -- misunderstanding of how ancestry works. The popular assumption is that each of us contains discrete and measurable percentages of the “blood” and DNA of our two biological parents, four grandparents, eight great-grandparents, sixteen great-great-grandparents, and so on, and that this ancestral line can be traced back hundreds of years in a meaningful way. It can’t. As science journalist Carl Zimmer explains, “DNA is not a liquid that can be broken down into microscopic drops... We inherit about a quarter of our DNA from each grandparent -- but only on average... If you pick one of your ancestors from 10 generations back, the odds are around 50% that you carry any DNA from him or her. The odds get even worse beyond that.”
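To see why the odds fall off so quickly, consider a back-of-the-envelope sketch (mine, not Zimmer's, and resting on an assumed figure): treat the genome as a few hundred independently inherited blocks (roughly 700 is used below as an illustrative effective count), each of which descends from one specific ancestor g generations back with probability (1/2)^g. The chance of carrying any DNA from that ancestor is then 1 − (1 − 2^−g)^blocks:

```python
# Toy model of DNA inheritance across generations (an illustration,
# not any testing company's method). The genome is approximated as
# `blocks` independently inherited segments; the block count of 700
# is an assumption chosen for illustration.

def chance_of_any_dna(generations: int, blocks: int = 700) -> float:
    p_block = 0.5 ** generations               # chance a given block came from that ancestor
    return 1.0 - (1.0 - p_block) ** blocks     # chance at least one block survived

for g in (5, 10, 15):
    print(f"{g} generations back: {chance_of_any_dna(g):.0%} chance of carrying any DNA")
```

Under these assumptions, an ancestor five generations back is virtually certain to be represented somewhere in your genome, one ten generations back comes out near the 50 percent Zimmer cites, and only a few generations beyond that the probability collapses to a few percent.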
In reality, such testing does not tell us much about our ancestors. That’s partly because of the way DNA is passed down through the generations and partly because there exists no database of ancestral DNA. Instead, the companies compare your DNA to that of other contemporary humans who have paid them to take the test. Then they compare your particular variations to patterns of geographical and ethnic distribution of such variations in today’s world -- and use secret algorithms to assign purportedly precise ancestral percentages to them.
So is there really a Sardinian or East Asian gene or genetic variation? Of course not. If there is one fact that we know about human history, it’s that ours is a history of migrations. We all originated in East Africa and populated the planet through ongoing migrations and interactions. None of this has ended (and, in fact, thanks to climate change, it will only increase). Cultures, ethnicities, and settlements can’t be frozen in time. The only thing that is constant is change. The peoples who reside in today’s Sardinia or East Asia are a snapshot that captures only a moment in a history of motion. The DNA industry’s claims about ancestry award that moment a false sense of permanence.
While whites of European ancestry seem enthralled with the implications of this new racial science, few Native Americans have chosen to donate to such databases. Centuries of abuse at the hands of colonial researchers who made their careers on Native ancestral remains, cultural artifacts, and languages have generated a widespread skepticism toward the notion of offering genetic material for the good of “science.” In fact, when it comes to one DNA testing outfit, 23andMe, all of the countries included in its lists of the geographical origins of those who have contributed to its “Native American” database are in Latin America and the Caribbean. “In North America,” the company blandly explains, “Native American ancestry tends to be five or more generations back, so that little DNA evidence of this heritage remains.” In other words, 23andMe claims DNA as conclusive proof of Native American identity, then uses it to write Native North Americans off the map altogether.
The Ancestry Industry and the Disappearing Indian
The ancestry industry, even while celebrating diverse origins and multiculturalism, has revived long-held ideas about purity and authenticity. For much of U.S. history, white colonizers argued that Native Americans would “vanish,” at least in part through biological dilution. New England’s native peoples were, for instance, systematically denied land rights and tribal status in the nineteenth century on the grounds that they were too racially mixed to be “authentic” Indians.
As historian Jean O’Brien has explained, “Insistence on ‘blood purity’ as a central criterion of ‘authentic’ Indianness reflected the scientific racism that prevailed in the nineteenth century. New England Indians had intermarried, including with African Americans, for many decades, and their failure to comply with non-Indian ideas about Indian phenotype strained the credence for their Indianness in New England minds.” The supposed “disappearance” of such Indians then justified the elimination of any rights that they might have had to land or sovereignty, the elimination of which, in a form of circular reasoning, only confirmed their nonexistence as a people.
However, it was never phenotype or distant ancestry but, as O’Brien points out, “complex regional kinship networks that remained at the core of Indian identity in New England, despite the nearly complete Indian dispossession that English colonists accomplished... Even as Indians continued to reckon membership in their communities through the time-honored system of kinship, New Englanders invoked the myth of blood purity as identity in denying Indian persistence.”
Such antiquated understandings of race as a biological or scientific category allowed whites to deny Indian existence -- and now allow them to make biological claims about “Indian” identity. Until recently, such claims, as in Senator Warren’s case, rested on the murkiness of family tales. Today, the supposed ability of DNA companies to find genetic “proof” of such a background reinforces the idea that Indian identity is something measurable in the blood and sidesteps the historical basis for the legal recognition or protection of Indian rights.
The ancestry industry assumes that there is something meaningful about the supposed racial identity of one of hundreds or even thousands of an individual’s ancestors. It’s an idea that plays directly into the hands of right-wingers who are intent on attacking what they call “identity politics” -- and the notion that “minorities” are becoming unduly privileged.
Indeed, white resentment flared at the suggestion that Senator Warren might have received some professional benefit from her claim to Native status. Despite an exhaustive investigation by the Boston Globe showing conclusively that she did not, the myth persists and has become an implicit part of Donald Trump's mockery of her. In fact, any quick scan of statistics will confirm the ludicrousness of such a position. It should be obvious that being Native American (or Black, or Latino) in the United States confers far more risks than benefits. Native Americans suffer from higher rates of poverty, unemployment, infant mortality, and low birth weight, as well as lower educational levels and shorter life spans than do whites. These statistics are the result of hundreds of years of genocide, exclusion, and discrimination -- not the presence or absence of specific genetic variations.
Reviving Race to Undermine Native Rights
Native rights, from sovereignty to acknowledgment of the conditions created by 500 years of colonial misrule, rest on an acceptance that race and identity are, in fact, the products of history. “Native Americans” came into being not through genes but through the historical processes of conquest and colonial rule, along with grudging and fragile acknowledgement of Native sovereignty. Native American nations are political and cultural entities, the products of history, not genes, and white people’s assertions about Native American ancestry and the DNA industry’s claim to be able to reveal such ancestry tend to run roughshod over this history.
Let’s look at three developments that have, over the past year, undermined the rights of Native Americans: the reversal of reservation status for Mashpee tribal lands in Massachusetts, the striking down of the Indian Child Welfare Act, and Republican attempts to suppress Native American votes in North Dakota. Each of these acts came from a different part of the government: the Bureau of Indian Affairs in the Department of the Interior, the courts, and North Dakota’s Republican-dominated state legislature. But all three rely on notions of identity that place race firmly in our genes rather than in our history. In the process, they deny the histories that turned the sovereign and autonomous peoples of North America before European colonists arrived in “the New World” into “Native Americans,” and imply that Native American historical rights are meaningless.
The Mashpee of Massachusetts finally achieved federal recognition and a grant of reservation land only in 2007, based on the fact that they “had existed as a distinct community since the 1620s.” In other words, federal recognition was based on a historical, not a racialized, understanding of ethnicity and identity. However, the tribe’s drive to build a casino on its newly acquired reservation in Taunton, Massachusetts, would promptly be challenged by local property-owners. Their lawsuit relied on a technicality: that, as they argued in court, reservation land could only be granted to tribes that had been federally recognized as of 1934. In fact, the Mashpee struggle for recognition had been repeatedly stymied by long-held notions that the Indians of Massachusetts were not “real” or “authentic” because of centuries of racial mixing. There was nothing new in this. The state’s nineteenth-century legislature prefigured just such a twenty-first-century backlash against recognition when it boasted that real Indians no longer existed in Massachusetts and that the state was poised to wipe out all such “distinctions of race and caste.”
In September 2018, the Department of the Interior (to which the court assigned the ultimate decision) ruled against the Mashpees. Recently appointed Assistant Director of Indian Affairs Tara Sweeney, the first Native American to hold that position, “paved the way for a reservation to be taken out of trust for the first time since the termination era,” a 20-year period from the 1940s to the 1960s when the federal government attempted to “terminate” Native sovereignty entirely by dismantling reservations and removing Indians to urban areas to “assimilate” them. The new ruling could affect far more than the Mashpees. Some fear that, in the Trump years, the decision portends “a new termination era,” or even a possible “extermination era,” for the country’s Native Americans.
Meanwhile, on October 4th, a U.S. District Court struck down the Indian Child Welfare Act, or ICWA. This is a potentially devastating development as Congress passed that Act in 1978 to end the then-still-common practice of breaking up Native families by removing Indian children for adoption into white families. Such acts of removal date back to the earliest days of white settlement and over the centuries included various kinds of servitude and the founding of residential boarding schools for Indian children that were aimed at eliminating Native languages, cultures, and identities, while promoting “assimilation.” Indian child removal continued into the late twentieth century through a federally sponsored “Indian Adoption Project,” as well as the sending of a remarkable number of such children into the foster care system.
The ICWA found that "an alarmingly high percentage of Indian families are broken up by the removal, often unwarranted, of their children from them by nontribal public and private agencies and that an alarmingly high percentage of such children are placed in non-Indian foster and adoptive homes and institutions." States, it added, "have often failed to recognize the essential tribal relations of Indian people and the cultural and social standards prevailing in Indian communities and families." The Act gave tribes primary jurisdiction over all child custody issues including foster placements and the termination of parental rights, requiring for the first time that priority be placed on keeping Native children with their parents, kin, or at least within the tribe.
The ICWA said nothing about race or ancestry. Instead, it recognized “Indian” as a political status, while acknowledging semi-sovereign collective rights. It was based on the Constitution’s implicit acknowledgement of Indian sovereignty and land rights and the assignment to the Federal government of relations with Indian tribes. The District Court’s ICWA decision trampled on the collective political rights of Indian tribes by maintaining that the act discriminated against non-Native families in limiting their right to foster or adopt Native children. That rationale, like the rationale behind the Mashpee decision, directly attacks the cultural and historical acknowledgement of Native sovereignty.
Superficially, the assault on Native voting rights may appear conceptually unrelated to the Mashpee and ICWA decisions. North Dakota is one of many primarily Republican-controlled states to take advantage of a 2013 Supreme Court ruling eliminating key protections of the Voting Rights Act to make registration and voting more difficult, especially for likely Democratic voters including the poor and people of color. After numerous challenges, a North Dakota law requiring prospective voters to provide a street address was finally upheld by a Supreme Court ruling in October 2018. The problem is this: thousands of rural Native Americans, on or off that state’s reservations, lack street addresses because their streets have no names, their homes no numbers. Native Americans are also disproportionately homeless.
In the North Dakota case, Native Americans are fighting for a right of American citizens -- the right to vote -- whereas the Mashpee and ICWA cases involve fights to defend Native sovereignty. The new voting law invoked equality and individual rights, even as it actually focused on restricting the rights of Native Americans. Underpinning such restrictions was a convenient denial by those Republicans that the country’s history had, in fact, created conditions that were decidedly unequal. (Thanks to a massive and expensive local effort to defend their right to vote, however, North Dakota’s Native Americans showed up in record numbers in the 2018 midterm election.)
These three political developments downplay Native American identity, sovereignty, and rights, while denying, implicitly or explicitly, that history created today’s realities of racial inequality. The use of DNA tests to claim “Native American” genes or blood trivializes this same history.
The recognition of tribal sovereignty at least acknowledges that the existence of the United States is predicated on its imposition of an unwanted, foreign political entity on Native lands. The concept of tribal sovereignty has given Native Americans a legal and collective basis for fighting for a different way of thinking about history, rights, and nationhood. Attempts to reduce Native American identity to a race that can be identified by a gene (or a genetic variation) do violence to our history and justify ongoing violations of Native rights.
Senator Elizabeth Warren had every right to set the record straight regarding false accusations about her employment history. She should, however, rethink the implications of letting either Donald Trump or the ancestry industry define what it means to be Native American.
|
b0c6a67556b819a08ddc7c37be72f84e | https://historynewsnetwork.org/article/170651 | Ancient strain of plague found in Sweden – and it could rewrite history | Ancient strain of plague found in Sweden – and it could rewrite history
A group of researchers from Sweden, Denmark and France analyzed skeletons from a tomb outside Falköping and found DNA traces of Yersinia pestis, the bacterium that causes plague.
The remains belonged to farmers who lived in the area around 4,900 years ago.
All strains of the plague discovered in later periods – the most well-known being the Black Death, which killed more than 50 million people in the 14th century – are variants of this one, the researchers said.
|
d61e8941be9f6c1623086c7a673b73e1 | https://historynewsnetwork.org/article/170676 | Now that the Bush Funeral Is Over with, We Need to Talk Honestly About the End of the Cold War | Now that the Bush Funeral Is Over with, We Need to Talk Honestly About the End of the Cold War
Germans stand on top of the Wall in the days before it was torn down. Photo by Lear 21 at English Wikipedia, CC BY-SA 3.0
Related Link: How People Are Remembering George HW Bush
Imagine a superpower founded on a revolution inspired by Enlightenment values (often honored in the breach), great violence liberally applied and mostly badly remembered, and a myth of exceptionalism and superior progress that even its critics find hard to fully escape.
Our superpower's global reach and bite commands if not the respect then at least the fear of the world in a manner second to no other state on the planet. While its adventures abroad are often less than successful and leave corpses and ruins in their wake, it is, or feels, so strong that those failures teach it little.
Its domestic politics are close to stagnation, as visitors can quickly guess from its almost comically decrepit infrastructure so clearly not in sync with the ability of its scientists, engineers, and workers or its claims of indispensable leadership.
Its ruling elite is irrationally and obstinately wedded to hoary old dogmas – originally imported from dour European ideologists – about the relationship between the economy and politics. Increasing numbers of its ordinary citizens, meanwhile, are not only unhappy about specific policies but – much more worryingly – about the system as a whole, its principles, institutions, and representatives.
There is a widespread and plausible sense that the elites – in politics, the economy, and the media – have built themselves a privileged world of careerist cynicism, lying as a way of life, corruption without limits or regrets, and, last but not least, gross impunity.
This loss of faith in the political and social order is reflected in the rise of a desperately dark sense of humor. Especially the young wonder with increasing trepidation what fate awaits them in the world their elders have unmade. Even some former insiders and dissident elite members speak of the deep perversion of the system they know so well. Some citizens are even doubting the literally fundamental ideas and heroes of the original revolution.
This is, of course, a description of the late Soviet Union, the other superpower of the Cold War and the only state that, for now at least, could ever claim to have – very unwisely – challenged and stood its ground against post-World War Two America, for a while and at crippling cost.
Despite all the well-known differences, it is also, equally obviously, a description of the USA about one generation after the end of the Cold War and the collapse of the Soviet Union. Where doctrinaire anti-capitalism mightily helped the Soviets dig their own grave, doctrinaire pro-capitalism might still do the job for their old nemesis, especially under conditions of expensive militarism, another similarity.
And where the simple-minded veneration of Lenin, the Soviet founding father, could not survive the deeply unsatisfying reality of the state he created, perhaps the flagrant flaws of American politics may finally end up toppling slave-holding founding fathers from their pedestals as well.
Certainly, the American Dream has already lived longer than its Soviet rival. But must it live forever? The Soviet one did not, and – most disturbingly – it died with a suddenness that took many observers by surprise. As the title of an influential post-mortem of the Soviet Union has it, “everything was forever, until it was no more.”
Yet despite – or because of? – this bleak picture, we have just seen a collective outpouring of nostalgic triumphalism. Intriguingly, it focuses on the American leader during whose reign the Soviet Union breathed its last, George Bush I. After his death, most of the US media have exploited the traditional convention of the eulogy – to speak nothing but good of the dead – to engage in a collective fit of national self-adoration that Pravda might have been proud of. Almost everything about this exhibitionist wave of nostalgia is wrong. Bush I was not a kind king, but a ruthless wielder of power, at home and abroad.
Let’s focus, however, on just one element of this love fest for the powers that be – or at least were – namely the bizarre yet popular claim that Bush I managed the end of the Cold War well. This is factually misleading and chock-full of bad politics as well. Here is why: First of all, the Cold War did not end under Bush I; the Soviet Union did. The Cold War was over by the time Bush I came into power. If we want to be nice to an arch-conservative American president for making a contribution to ending it – by taking Soviet initiatives seriously – that would be Ronald Reagan.
Why does that matter? Certainly not because Reagan must have his share of the glory. Like almost all presidents, he will always be served well enough with adulation, deserved or, mostly, not. What we lose by misdating the end of the Cold War is a sense of how unlikely it was – not, at that point, because of the Soviets, but because of the American establishment. Reagan’s one positive contribution to world history – after the war scares of 1983, which he helped bring about – was to go against the blob. Ironically, he did so precisely because he had that “vision thing” that Bush I would later mock. It was not the WASPs and their vaunted get-things-done sobriety that helped end the Cold War, but the wild-eyed if oddly placed utopianism of a former Hollywood actor. Thus no, this is not a lesson about trusting traditional elites to manage the world well – sorry, Ross Douthat.
We also gain something by conflating the end of the Cold War with that of the Soviet Union – and that is even worse, namely a blinding bias: If we pretend that the Cold War only ended when and because the Soviet Union disappeared, we imply that this was a war that could only end with the total defeat, even the annihilation – if, in this case, mostly peaceful – of the opponent. That is, of course, a favorite illusion of the American right and, alas, center.
Here, the end of the Cold War morphs into the greatest case of successful regime change yet – and an eternal reminder that there are no alternatives. Thus, the lesson implied is to never seek compromise with irritatingly, obstinately, unbearably other Others but, instead, insist that they become like us, whether they want to or not. Yet, in reality, compromise – if much in favor of the USA – is exactly how the Cold War really ended.
Put differently, it is a fallacy to believe that the USA won the Cold War because the Soviet Union lost it. Yes, the Soviets did lose it, but America, fortunately, initially only took advantage without insisting on winning. That was the key to its end.
Which brings us to what happened afterwards, namely Bush I beginning to mess up the ensuing peace (an endeavor then continued by his successor Bill Clinton), in two ways at least: Just after the Soviet Union had ceased to exist, in his State of the Union Address of January 1992, he could not resist crowing about America having won the Cold War, quite blasphemously invoking higher powers as well. Indeed, he rubbed it in, making a point of insisting that the Cold War had not “ended” (his scare quotes) but been won.
And he presided over a war against Iraq that demonstrated that the post-Cold War “New World Order” would be one of America having it its way even more than before. In both instances, he did not create but helped along the Russian bitterness and self-pity that has since grown strident. He also promoted the American elite arrogance that has since grown self-defeating.
It’s not as if America could not learn anything from looking at Russia, but it has a habit of getting the lessons wrong. Watching Putin, it fails to see that his attempts to influence its politics are much less important than the deep capitalist-oligarchic convergence between the two countries.
Looking at the end of the Soviet Union, America fails to see that what lost the Cold War for its old best enemy was the hubris of super-powering-while-declining. What then killed it was its own failure to address its glaring flaws at home quickly and effectively enough and, of course, a ruthlessly self-interested elite that put its own careers, power, and profit above everything else.
|
d381df04f2f023963c74f7e35f0ca3db | https://historynewsnetwork.org/article/170747 | So I Became a Historian—Now I’m Telling How It Worked Out | So I Became a Historian—Now I’m Telling How It Worked Out
Although what has happened to history as a major is being disputed as I write, it is clear enough that “something is going on.” This essay is written entirely for an audience currently puzzled about “what to major in.” Because the author majored in history at three universities way back when (Emory, UGA, and Stanford), he has something to say on the subject. He not only survived as a history major, he flourished. Now 101, and obviously very active, he weighs here how it has been all these years to face life as a history major under seven important employers. Read away!
The idea here is for me to tell you, the Reader, about my long preparation for a life in history—and how it worked out. I intend to be candid, truthful even when it hurts, and now and then just a bit prideful. I expect you to emerge 10 percent better prepared to stick with (or abandon) your decision to become a history major.
My opinion is, you see, that a totally informal, conversational recitation about the interaction between my history major and my life will be a good way for many a student to pass the time. I do know one thing for sure: History as subject matter has been far more than relevant to what happened to me as I have lived on to over 101 years of age. That’s right: born October 10, 1917.
I don’t believe I took any history courses in my two high schools in the vicinity of Philadelphia. In that jr/sr year I did read several books akin to history, by Roy Chapman Andrews, Lincoln Steffens, David Fairchild and others, but I didn’t know that at the time.
Now we’re in college (Emory University, on a nice scholarship). Sigma Chi didn’t seem to care what I majored in, so OK. Right off the bat in the first quarter was a required history course, “Europe Since 1500,” I think. It was competing with a half course, Slide Rule (where I made 100), and Spanish (which I flunked while making three A’s at the same time). I have to say I was humiliated, so at year’s end I dropped my engineering major. Soon I decided to major in journalism, where there were more A’s, and a C (barely) in accounting. Soon it was pre-law, only because in that major I was free to take just about anything I wanted—and I wanted to explore the curriculum. Surely a little of this sounds familiar to you. Right? What about history?
Well, there were all kinds of Southern history courses; all were entrancing and appealing to the mostly native Georgians dominating the class. English history was exotic. In fact, I liked those history courses and especially the term papers that were always required.
I had no idea at the time that the maybe eight term papers I wrote in 1936-39 were conditioning me for a life of research! Yes, it’s true. Footnotes, ibid., op. cit. and Bibliography were infiltrating my cosmos and I was evolving deep down inside, whether I knew it or not.
But history didn’t have a monopoly on my life at the time. I pitched baseball to five victories in a row in class competition. Though I was offered a tennis scholarship at Duke University, my father turned it down. I loved abnormal psychology. Three law courses – law of the press, constitutional law, and international law, using that Law Library – skewed me toward a legal career.
Philosophy, and a course in logic were exciting. Oh: I should mention elective Bible – its history, only. At the end came an Honors assignment to study every aspect of the New South and be examined. Lots of history (mostly Southern).
That summer after finishing Honors with six other graduates who stood with me in Glenn Memorial Church, a letter came from the history professors with the second of 12 scholarships I was going to get—with and without application. They were buying me! It was summer, 1939. I put aside those law books (secondhand from Gainesville, Florida), and returned to Emory for a Master’s Degree, taking anything historical that I wanted, doing any thesis – so long as it was historical (Royal Government in Colonial Georgia with a sophisticated title, rooted in really original sources). I was entranced with 17th century England. And, for a time, Ancient Greece and Rome…. Antony and Cleopatra somehow caught my full attention. And why not?
Now, the powers that be maneuvered a full year history grant at the University of Georgia under a very productive senior faculty member. But he wasn’t there! Anyway, I continued to learn much too much about The South. Whoa.
World War II was coming for the United States, we wise ones thought in spring 1941, so I took up weightlifting and signed up with a recruiter for something military. Given the chance, I walked out on the Marines, and on September 25 I was called to active duty in a secret Intelligence unit of the Navy that seemed delighted with my history major. (It’s hard to believe they insisted that their recruits be “a third generation American.”) I was first an enlisted yeoman; then, luckily, an Ensign. Trained, two months, in “stuff.” I was the best, of many hundred, in the obstacle race, at NITSI-Naval Training School, Indoctrination, Quonset Pt, R.I. (Richard Nixon graduated in the next class, August, 1942.) As I lived that first military experience I admit that I don’t recall anybody asking, “What was your major in college?” I thought everybody would care.
My war career lasted over 4 years and involved major leadership on my part; nobody from the Admiral’s staff paid any attention to me; I ran the huge barracks at NAS Alameda alone, but with lots of Masters at Arms and Compartment Cleaners working hard. I wrote a clean, formal, 165-page book on the subject of barracks administration, never published, when the Bomb ended the war unexpectedly. I wrote pamphlets spelling out things.
This barracks officer was popular! My history major was a howling success. Why? I could do almost anything that was needed! It turned out that I had taken a “paperwork skills” concentration; it was adaptable; I was literate; I was used to getting things done. At war’s end they wouldn’t let me out for four months because I was “valuable to the demobilization.” They offered me instant Lieutenant Commander if I would stay in. I didn’t, but later I decided to stick with the Reserves and put in 23 years total.
Postwar, I did advanced personnel work at Mercer University for the Veterans Administration. I could authorize all kinds of remedial services and classes for the disabled. Next, I was employed on a 12-month contract at the University of Miami, for 2 full years. I taught a heavy load of Western Civ and US Survey. It was time for Stanford, where I majored in history (with a political parties minor) and finished in June 1951. God bless the G.I. Bill.
I got three large grants after Stanford, doing tricky research and writing. My family was happy. Now (1953 to 1958) I pretty much founded the field of social welfare in American scholarship. A famous figure in San Francisco said I should fathom “The welfare needs of the people of California, and how they are being met” for the famed Commonwealth Club. They expected a big book. In three years they got one: California Social Welfare. Original, 108 tables, 100 pages of law, about 600 pages; some bullying of organizations public and private was part of “research.” Bodies I battled ranged from IRS to county and private units.
It was one of three books I now wrote on social welfare, having never studied a word on the subject or heard of it. Here were philanthropy and foundations; adoptions and birth control; charity; government programs of aid; religious units financing things. Prentice Hall went all out to produce the 5,000 handsome copies of California Social Welfare and they disappeared. Next I drafted, over and over, on a full year grant, Welfare in America, a beautiful book including photographs and poems. Oklahoma issued it twice.
Then, after exciting New York City committee work with the American Heart Association, a weekend a month, I wrote for them the influential The Heart Future. That newsworthy book made the news columns of the New York Times.
This yesterday history major was now to be interviewed in New York City repeatedly by organizations wanting me to work for them. I flew, from Santa Monica each time, but my collie said “no” to leaving the Pacific Ocean permanently. (There were offers, and quasi chances. One, possibly, was to direct the national FDR Infantile Paralysis unit. Rockefeller checked in with a research study. A mental health unit wanted me. All NYC.)
Groups stepped forward to help me along. I was the editor for things American at a great encyclopedia, but despised the working conditions. Then I was part of the scholarly Bureau of Medical Economic Research at the American Medical Association in Chicago. My financing during those years is of little interest but I do think it pertinent to mention that at one point when first enrolled for unemployment insurance in Chicago, I was referred to Midas Muffler, who wanted a head of “Research.” I wish I had ventured a visit, so I might say at this point something about “history and mufflers,” but I accepted a great alternative offer at that very moment.
We moved around, and we changed allegiances, but we survived. We were solvent. I was in a Marquis book already (and later another) but renown was in no way as important as contentment.
However, we must admit at this point that there had been for me a happy marriage in 1944 and the birth of two exceptional children. All three made their marks in life, in a very big way.
Now we turn to the tour-de-force: I became a general, final, editor for administration at The RAND Corporation, THE think tank, in Santa Monica. En route to the next step, I had aided three famous intellectuals in preparing their books, for a year, half a year, a quarter of a year, full time in each case. Those books amounted to something! They were on FDR, Space, and thermonuclear war. I had to learn in a hurry and do a “perfect” job. Then I helped on a book about the cost of ulcers to society; planets in other systems; Laos; and more. That history major had built me for a think tank life.
I founded two oral history projects (RAND’s, Truman Library). Did maybe twenty long and really vital interviews at RAND. I also ghosted an important Harvey Mudd speech for the general manager. They used it to raise funds for some time, I heard.
Now came misfortune, as funding modifications unexpectedly demanded by the Air Force put me in a bind. How would I as a history major like to edit, henceforth, for engineering? I wouldn’t. The sciences? No, no. Sorry, children, I know you love the ocean and I love town hall and other things, but “it’s over.” Opportunities with gigantic corporations in “aerospace” got a no.
I had a few tough months. Interviewed by “history chairmen” I always heard “but you’re senior to many history faculty and certainly to me!” Clark Kerr in two pages said his California system didn’t hire interdisciplinary people. I should have waited a decade; then they sure did!
Now in the mid 1960s, I rejoined academia. That is, I came in at “the top” of a rather small place: Professor of History and Social Science and Chairman, Social Sciences Division, at up and coming Southern Oregon College in Ashland on a twelve-month contract. I did it for seventeen years, most of it anyway. Directing and living with seven departments I found I had developed all kinds of—what shall we say?—maybe “talents.” I could DO things and avoid many hazards.
Sudden illness (a heart infarction) in time brought early retirement. It was the same thing that hit LBJ in 1955. Retirement? Really?
There isn’t much drama from 1980 to 2018. It’s been articles, and books all over the place, including four months (with my wife, Beth) working for Chapman College’s World Campus Afloat. She was secretary to the cigar-smoking ship’s dean. Lord!
It may be of some interest that during my years as Quasi-Dean, teaching regular sessions and summers, I taught twenty-three different courses. It was necessary to fill slots when vacancies abruptly appeared.
There was membership for two decades on the United States Civil Rights Commission for Oregon. President of the Rogue Valley Symphony. Earlier, it was Sigma Chi; now it was fifty years and more in Ashland Rotary. Son was an Eagle Scout, daughter an ardent Girl Scout.
So: Let’s talk about “history” not quite in the erudite manner to be found now and then on the History News Network, but as, well, something I blundered or maneuvered into in the 1930s, ignored in the early 1940s, lived with solidly thereafter to 1951, and apparently got paid enough when affiliated with it to support a family—and to be happy—for several decades (actually, a lifetime).
I see nothing to be gained at this point in conversing about all that “who’s who” and “distinguished” stuff I picked up enroute, nor do I want to list books I wrote (eighteen) or helped others to write (maybe ten). Do take note of this plain fact: In my years as a historian I gave very little attention to the idea that I was “out of the ordinary” and said little about it—despite having endured and profited from nine (yes, nine!) years of university instruction, all told.
I would like to say here that “any history major could have done it.” But I really don’t know. I had handled major morning paper routes in high school; worked in my dad’s engineering office on ordinary stuff in summers; gotten little or no advice or “tutoring.” You could have. Yes.
I guess what I want to say is that one nice thing about majoring in history is that you may keep getting abler.
Back in the early 1930s I had no idea what I wanted to be. Taking history courses was “a way out.” It postponed decision. I kept getting more knowledgeable, yes, apparently smarter, but: I didn’t have to do anything about it. The world around me kept thinking I could DO whatever they had vaguely in mind. Sure enough, I mostly could. They didn’t seem concerned and neither did I.
Over a period of twenty-three years in the Naval Reserve, I was forced to take and teach all kinds of odd courses. Nothing to it. I always took them—yes, and over time, taught them, too. I can run things. Now, where did I learn that? I can DO things. How come? A 355-page book on SPACE requested by Congress was prepared by twenty geniuses in 1959. I was asked to précis it to a mere seventy-two pages. Yes!
Some are entranced at my editorship of American history, geography, and biography at the Encyclopaedia Britannica. I did do significant and important work there and liked the occupation of an editor.
I guess that those twenty-five books per “field” at Stanford on history, political parties, union labor, really seven fields in all, taught me something in addition to history, right? That is, “I can survive and sorta prevail in our world.” Why not? I did major in history!
I am published in quite a variety of learned journals, because a history major refuses to be sequestered. The Bornet bibliography goes to maybe 30 pages of fine print, 1933 to date.
So choose your major, you male or female student enrolled “somewhere.” There is a future family out there for you to create and support by being a teacher or professor—or lots of other things. Your father and mother are going to have to assume that you do know what you’re doing by majoring in history. Have a good life, next generation historian, if that’s what you decide you want—and life decides to let you become! Welcome to my academic fraternity, and good fortune attend ye, as you live from now to the very end of your highway.
|
05b87569c4491125d76808b60d5a5161 | https://historynewsnetwork.org/article/170791 | UPDATED Highlights of the 2019 AHA Annual Meeting | UPDATED Highlights of the 2019 AHA Annual Meeting
Dear #Twitterstorians; my kid in attendance at #AHA19 announced she wants to be a historian and activist. I think my work here is done.
— Naomi Rendina (@NaomiRendina) January 6, 2019
Key Links
AHA Annual Meeting (homepage)
Twitter: AHA Annual Convention 2019 (Use #AHA19. If you want to live tweet a session use the session number. Example: #AHA19 #s25.)
Program Guide
AHA on Facebook
Twitter: Main AHA Account
Interesting choice Macmillan... not sure I’d have brought those to the #AHA19 but sure... pic.twitter.com/FBqbGRAqv6
— Adam Domby (@AdamHDomby) January 5, 2019
PSA: Women with PhDs in history are not "little girls." #AHA19 #twitterstorians #womenalsoknowhistory https://t.co/xIcbjjIYpX
— Coordinating Council for Women in History (@TheCCWH) January 6, 2019
News
What Can Historians Teach The Media In The Era of Trump? 4 Historians Weigh In By Kyla Sommers (See also this post from John Fea's blog.)
History in Crisis: 5 Challenges to Organizing Graduate Student Workers and 3 Ways to Still Succeed By Kyla Sommers
Historians discuss efforts to evaluate student learning far beyond a grade.
.@JLWeisenfeld opens her paper by noting that she presented her first scholarly paper as a grad student at ASCH 29 yrs ago. At that conference, she was 1 of 2 scholars of color on the program + the only one presenting on AfAmerican religion. #ASCH19 pic.twitter.com/n0v2bS8xm2
— Christopher Jones (@ccjones13) January 4, 2019
What Historians Are Tweeting (most recent entries at top)
Statistical analysis of #AHA19's social media footprint
Silent Sam controversy: What's at stake
The issues facing historians who use social media -- Live tweeted here
What historians of race and gender think of Elizabeth Warren's DNA disclosures
On academic blogging
On Sam Wineburg's critique of history education
A high school teacher reflects on her day at the AHA By Megan Jones, history teacher
On politics and race in the 1990s
AHA leaders should make a commitment to bar job interviews that don't pay candidates to attend the annual meeting
Mid-career malaise
Loyalists in the Revolution
Linda Kerber on the women's movement and gendered history
Pedagogical Skills to Promote Historical Thinking
Women’s Global Activism in the Twentieth Century
Historians are still assigning Time on the Cross?
Beware sexual harassment at conferences
Collective list of men historians who are known sexual harassers & predators
Claire Potter's advice for job candidates
Problems candidates for jobs face
Polarization and Partisanship in Contemporary America (John Fea's blog)
Taking Stock of Gender History at the American Historical Association Annual Meeting 2019
Re-imagining the way historians do the whole conference thing
It is Time for a President of the AHA Who Does Not Work at a Research University By John Fea
The Relevance of the Enlightenment By Megan Jones, history teacher
"The AHA, like all of our professional organizations, is shaped by the people who populate its paid, elected, & volunteer leadership."
New Narratives of Revolutionary and (Post) Revolutionary Haiti
Q from audience on the progress of women’s history: LKK says that women’s progress in historical profession is linked symbiotically to the existence of women’s history—there is no either/or. #AHA
— Dr. Ann M. Little (@Historiann) January 5, 2019
Linda Kerber speaking RN at the #AHA19 Cmmee on Gender Equity breakfast talking ‘bout her generation of scholars graduating in 1969. pic.twitter.com/IC0Mr0VFaB
— Dr. Ann M. Little (@Historiann) January 5, 2019
Sessions Relevant to Topics in the News (Links to tweets may not be active until after sessions have begun)
Thursday, January 3, 2019
Divided Loyalties in the United States: Polarization and Partisanship in Contemporary America 3:30 PM-5:00 PM (Click here for tweets: #AHA19 #s25) Here's a summary.
A Church in Crisis: Catholic Sex Abuse in Historical Context 3:30 PM-5:00 PM
Displaced Persons: The Present Crisis and Its Histories (Plenary) 8:00 PM-9:30 PM
Friday, January 4, 2019
Why Can't We All Just Get Along? The Debate over Free Speech on Campus 8:30 AM-10:00 AM (Click here for tweets: #AHA #s68)
"Nunca Mais?": Reflections on the 2018 Brazilian Presidential Election 8:30 AM-10:00 AM (Click here for tweets: #AHA19 #s71a)
Two More Years of Trump: What Is to Be Done? (Historians for Peace and Democracy Roundtable) 10:30 AM-12:00 PM
New Perspectives on Black Women's Internationalism 1:30 PM-3:00 PM
Archives Burning: The Fire at the National Museum in Rio de Janeiro and Beyond 1:30 PM-3:00 PM (Click here for tweets: #AHA19 #s119a)
Saturday, January 5, 2019
Federal Agency Records: Who Decides What Is Kept? 10:30 AM-12:00 PM (Click here for tweets: #AHA19 #s190a)
Rapid Response History: Native American Identities, Racial Slurs, and Elizabeth Warren 1:30 PM-3:00 PM
Genealogy, Genetics, and History (Plenary) 8:30 PM-9:30 PM
Sunday, January 6, 2019
Removing Silent Sam: History, Memory, and Activism at the University of North Carolina at Chapel Hill 9:00 AM-10:30 AM (Click here for tweets: #AHA19 #s262a)
Social Media for Historians 9:00 AM-10:30 AM
“I was told there would be no math.” -@KevinMKruse
this quote is how you know you are at a conference of Historians #aha19 #s25
— Thomas Harvell-DeGolier (@DeGolierThomas) January 3, 2019
Do historians miss the ideals of assessment, as some have suggested? https://t.co/VXkK9PnLGb
— History News Network (@myHNN) January 4, 2019
|
0c3b1fdedbfeac8faac011865da40af8 | https://historynewsnetwork.org/article/170916 | A Tyrant's Temper Tantrum | A Tyrant's Temper Tantrum
King Charles I of England, frustrated at the limitations of his otherwise powerful position, decided to dissolve Parliament in March of 1629 and to clap several of the opposition’s leaders in irons. The monarch had come to an impasse over issues as lofty as religious conformity and as mundane as the regulations concerning tonnage, eventually finding it easier to simply dissolve the gathering than to negotiate with it. Historian Michael Braddick explains that the “King was not willing to assent to necessary measures” in governance, and that Charles was intent on “upholding his right to withhold consent” as he saw it, believing that “without that power he was no longer a king in a meaningful sense.” Charles was a committed partisan of the divine right of kings, truly believing himself to be ennobled to rule by fiat, and regarding legislators as at best an annoyance, and at worst as actively contravening the rule of God.
Though its assent was legally required to levy taxes, Parliament was at this point in history still an occasional institution; indeed, this was the fourth that Charles had dissolved. Yet separation of powers still made it impossible for the king to directly collect taxes of his own accord, and so he adopted byzantine means of shuffling numbers around to draw income into the treasury. Such was the “Period of Personal Rule,” and to critics the “Eleven Years Tyranny,” in which Charles’ power became ever more absolute. Royalists may have seen the dissolution as a political victory, yet the ultimate loss would be Charles’, to spectacular effect. Historian Diane Purkiss explains that the “events that were ultimately to lead to the Civil War were set in motion by a royal tantrum.”
Royal tantrums are very much on all of our minds this new year, as we approach the fourth week of the longest government shutdown in U.S. history. As the state coffers of England were depleted after Parliament was dissolved and continued solvency required creative means of reorganization, redefinition, and shifting of funds, so too do we find government agencies forced by extremity to demand work of essential employees without pay. Garbage piles up in federal parks and at the National Mall, TSA agents and air-traffic controllers work for free, yet the president, under the influence of right-wing pundits, refuses to end the current shutdown. With shades of Charles’ tantrum, Speaker Nancy Pelosi explains Donald Trump’s current obstinacy as a “manhood thing for him.”
Meanwhile, Trump claims that his proposed border wall with Mexico is a national security issue, and after two years of inaction on his unpopular signature campaign promise he has decided, not coincidentally following the election of a Democratic House, that he’ll invoke sweeping emergency powers to construct said wall, which last month Jennifer De Pinto and Anthony Salvanto of CBS News reported 59% of Americans oppose. At the time of this writing it’s unclear whether Trump will invoke those broad executive powers, in an audacious power-grab not dissimilar to Charles’ petulant dissolution of Parliament.
Trump’s proposal calls to mind the pamphleteer and poet John Milton’s appraisal of Charles in his 1649 Eikonoklastes that the monarch did “rule us forcibly by Laws to which we ourselves did not consent.” Milton denounced the royalists whom he saw as an “inconstant, irrational, and Image-doting rabble,” this crowd who wished to make the kingdom great again and who are “like a credulous and haples herd, begott’n to servility, and inchanted with these popular institutes of Tyranny.”
Yet as much fun as it is to draw parallels between the events of the 17th century and our current predicament, we must avoid the overly extended metaphor. Trump is not Charles I; Pelosi is not the anti-Royalist John Pym; the Republicans are not Cavaliers and the Democrats are not Parliamentarians. Treating history as a mirror can obscure as much as illuminate, and yet I’d argue that the past does have something to say to the present, especially as we understand the ways in which American governance is indebted to understandings of those earlier disputes.
Political pollster and amateur historian Kevin Phillips argued that the English civil wars set a template for perennial political conflict in his 1998 book The Cousins’ Wars: Religion, Politics, & the Triumph of Anglo-America. With much controversy, Phillips argued that a series of conflicts between the 17th and 19th centuries should best be understood as connected to one another, analyzing how “three great internal wars seeded each other,” with the “English Civil War… [laying] the groundwork for the American Revolution” which “in turn, laid the groundwork for a new independent republic split by slavery” that would be torn asunder during the American Civil War. For Phillips, modern Anglo-American history should be interpreted as a Manichean battle between two broad ideologies, which manifested themselves differently in each conflict while preserving intellectual continuities with their forebears. Basing his analysis on geography and demography, Phillips sees in Charles’ claims of Stuart absolutism and religious conformity the arguments of King George III in the American Revolution, or the aristocratic defenses of inequity offered by the Southern planter class in the American Civil War. As a corollary, in the Parliamentarians he sees the language of “ancient liberties” as embraced by the American revolutionaries, or the rhetoric of New England abolitionists in the antebellum era. The first position historically emphasizes order, hierarchy, and tradition, while the second emphasizes individualism, justice, equality, and liberty.
There’s much that is convincing in Phillips’ claims. The American revolutionaries certainly looked back to thinkers like Milton; the Puritanism of the Parliamentarians was crucial in both revolutionary and antebellum New England in terms of crafting a language of rebellion. The Southern aristocrats and apologists of slavery during the American Civil War consciously compared themselves to Charles’ Cavaliers, and rejected the creed as spelled out in the United States’ founding documents as evidence of heretical non-conformism. Thus applying Phillips’ framework to the current divisions in the United States has a logic to it. If the American Revolution continued the same debates from the 17th century English civil wars (and it in part did), and the American Civil War was born from the contradictions of the Revolution (which is undeniably true), then it might follow that the current divisions in our country are a continuation of the American Civil War by other means. In this perspective, Trump is a kind of Copperhead President, a northern Confederate sympathizer, as argued convincingly by Rebecca Solnit in The Guardian.
While acknowledging that there is much that’s valuable in Phillips’ interpretation, I prefer to draw a different lesson entirely. Without comment as to the causal relationships between those conflicts, I note instead a particular structure by which each one of them unfolds, an ur-narrative that progressives would do well to understand, as we may soon be facing a period of unrivaled opportunity for enacting profound change.
Returning to the 17th century, parallels to today can be seen in the Parliamentarian view that Charles was both an incompetent monarch and an aspiring tyrant, an illegitimate ruler enraptured by foreign influence. Had it not been for his own petulant intransigence Charles might have been able to weather those political storms, but it was precisely his own sense of inviolate authority which made his downfall inevitable. Charles’ fall from power, in turn, heralded a period of incredible potential for radical change in English history. Historian David Horspool writes that this discourse was “of a kind never before witnessed in England: an open debate” on how the new Republic should be governed. Occasions like the Putney Debates, held by the New Model Army, put front and center issues of republican liberties that had been marginal before, as when the participant Thomas Rainsborough declared that “Every person in England hath as clear a right to elect his Representative as the greatest person in England.” Meanwhile, religious radicals like the Levellers and the Diggers, the former of which had sizable support in both the army and Parliament, suggested communitarian economic arrangements, whereby the commons would be restituted as the property of all Englishmen, views that would still be radical today.
Such is the primordial narrative: an ineffectual and reactionary leader grasps at ever more power, which triggers a crisis that leads to his downfall while presenting the opportunity for unprecedented, radical political change from the opposition. Had Charles been less vainglorious, perhaps the civil wars could have been avoided, but he was, and as a result what ultimately presented itself was the possibility of something far more daring than mere incremental change. The same template is in evidence during the American Revolution. Had moderate voices like Prime Minister William Pitt been heeded, had George III been less intemperate regarding the imposition of the Intolerable Acts, then perhaps America would still simply be part of the British Empire. As it was, the hardening of George’s position allowed for the introduction into the world of the radical democratic philosophy which defined the American Revolution, and which flowered during the Articles of Confederation era, when many states adopted shockingly egalitarian constitutions. Similarly, on the eve of the American Civil War, most northerners were not abolitionists, yet increasing belligerence from the Southern slave-owning class, in the form of the Missouri Compromise and especially the Fugitive Slave Act, rapidly radicalized the northern population. In the years following the Civil War there was radical possibility in Reconstruction, when true democratic principles were installed in southern states for the first time.
We’ve already seen the arrival of new radical possibilities in opposition to the reactionary leader. Does anyone credibly think that we’d have elected several Democratic Socialists were it not for Trump? Does anyone believe that we’d finally be able to consider policy proposals like Representative Alexandria Ocasio-Cortez’s Green New Deal, and the restoration of a proper marginal tax rate, had it not been for the rightful frustration and anger at the reactionary Republican agenda? Suddenly the Democrats are suggesting actual ideas and not just the furtherance of the collapsing neo-liberal consensus; suddenly it seems as if actual change might be possible. In this sense, Trump has ironically accomplished something that the Democrats themselves haven’t been able to do – he’s pushed them to the left.
But I must present a warning as well, for there is another part to those narratives. Writing of the English civil wars, historian Frank McLynn explains that those years “undoubtedly constituted a revolutionary moment, a time when, in principle, momentous changes were possible.” Yet the English civil wars’ radical promise was never realized, betrayed by the reactionary Lord Protector Oliver Cromwell, and the radical participants in that revolution were done in by “the besetting sin of the Left through the ages – internal factionalism and squabbling instead of concentrating on the common enemy.” The result would be the demagoguery of the Interregnum and finally the Restoration of the monarchy. Similar preclusion of democratic possibility occurred in the 18th-century United States, when the radical politics of the Revolution would be tempered at the Constitutional Convention of 1787, with the drafting of a document that abolitionist William Lloyd Garrison famously described as “an agreement with Hell.” Post-Civil War Reconstruction, often cynically and incorrectly presented as a manifestation of draconian Yankee opportunism, was a hopeful interlude drawn to a close by Congress’ 1877 betrayal, the ramifications of which define our politics today.
Consequently, there is a central question which the left must ask itself. It is no longer whether Trump will fall, but what opportunities will be taken by progressives once he does. Trump’s gross incompetence and unpopularity have done more to discredit right-wing ideas than decades of liberal punditry. Clearly, we cannot afford to retreat to bland centrist moderation when the tide of history seems to call for more radical proposals. But the historical template provides a warning, especially about how quickly hopeful moments can be squandered and reversed. A king’s greatest weakness is that he too often actually believes in his divine right. To be effective we can never be as stupidly arrogant. Now, what will we do with this moment?
|
cac04b8298224c4df18e9119cd0b6845 | https://historynewsnetwork.org/article/170959 | William Barr Needs a History Lesson | William Barr Needs a History Lesson
As the Senate Judiciary Committee holds its confirmation hearings for William Barr, the current nominee for Attorney General of the United States, it is clear Barr needs to brush up on his constitutional law, as well as U.S. history.
During yesterday’s hearing, Senator Mazie Hirono (D-HI) asked Barr whether or not he believed birthright citizenship was guaranteed by the 14th Amendment. The question is important as the idea of birthright citizenship has come under increasing attack from the right in recent years. From the Republican primaries onward, Donald Trump has repeatedly asserted that birthright citizenship is unconstitutional, should be eliminated, and can be ended by executive order. While some on the right have balked at the last claim, Trump has tapped into an ever-present disdain among conservatives for birthright citizenship.
For his part, Barr seemingly tried to sidestep the politically divisive issue. However, his answer to Senator Hirono’s question was not only vague, it also suggested that the soon-to-be Attorney General doesn’t know basic constitutional law or history.
“I haven’t looked at that issue legally. That’s the kind of issue I would ask OLC [Office of Legal Counsel] to advise me on, as to whether it’s something that appropriate for legislation. I don’t even know the answer to that,” Barr answered.
There are a couple of worrying signs in this response. First, birthright citizenship is a part of the 14th Amendment, meaning any action to change that would have to be a constitutional amendment, not legislation. This is a basic tenet of constitutional law. The fact that Barr, who previously served as Attorney General under George H.W. Bush, thinks any action can be taken against birthright citizenship through simple legislation shows one of two things: (1) he isn’t competent enough to understand basic constitutional processes in the United States or (2) he was rather insidiously actually answering Senator Hirono’s question.
The latter point warrants a bit of explanation. Barr quite visibly looked like he was attempting to simply move past the question and not answer Senator Hirono. However, if Barr does in fact think that birthright citizenship can be dealt with through congressional legislation, then the only logical explanation for this, barring the above first option, is that he doesn’t believe the 14th Amendment guarantees this status. Whereas the first possibility of incompetence warrants a refresher in constitutional law, this second one demands a lesson in history.
History is quite clear on the intent of the 14th Amendment: it was meant to create birthright citizenship in the wake of emancipation. The 14th Amendment was created to guarantee that freed slaves, free blacks, and their posterity would forever be considered American citizens. Before its adoption, citizenship was a murky, ill-defined status. The Constitution only mentions citizenship a few times, and does not provide a concrete definition of what a citizen is or who can be a citizen. To this day there is actually no legal definition of what citizenship actually is.
From the Constitution’s ratification to the adoption of the 14th Amendment, black Americans had repeatedly claimed they were citizens because of their birth on American soil. Scholars such as Elizabeth Stordeur Pryor and Martha S. Jones have shown the myriad ways in which black Americans made claims on this status, only to be rebuffed in many cases. Citizenship could provide black Americans with a recognized spot in the nation’s political community. It represented hope for a formal claim to certain rights, such as suing in federal court.
This leads to the infamous 1857 Supreme Court decision Dred Scott v. Sandford, when Chief Justice Roger Taney crafted an opinion that quite consciously attacked the very possibility of black citizenship. Taney concluded that Dred Scott, an enslaved man, could not sue in federal court because he was not a citizen. He was not a citizen, in Taney’s words, because black people “are not included, and were not intended to be included, under the word ‘citizens’ in the Constitution… On the contrary, they were at that time considered as a subordinate and inferior class of beings who had been subjugated by the dominant race, and, whether emancipated or not, yet remained subject to their authority, and had no rights or privileges but such as those who held the power and the Government might choose to grant them.”
Taney went out of his way to create a Supreme Court decision that attempted to put the legal nail in the coffin of black citizenship. The 14th Amendment was, quite consciously, crafted to upend Dred Scott, which was still the law of the land after the Civil War. Thus when conservatives rail against birthright citizenship and claim that it is not, in fact, a part of the Constitution, they are ignoring America’s long history of slavery, discrimination, and segregation.
When the soon-to-be Attorney General William Barr states that he thinks legislation can be used to make changes to birthright citizenship, it is because he does not believe the 14th Amendment guarantees it. And when he and other conservatives espouse such an opinion, it is because they are once again willfully ignoring American slavery's legacy of racism. This is, admittedly, not surprising. Barr also expressed the opinion during his confirmation hearing that the justice system “overall” treats black and white Americans equally, despite mountains of information proving otherwise.
While the attack on birthright citizenship from the right deserves attention and should be fought at every turn, the underlying historical erasure of slavery and discrimination also requires our attention. This willful amnesia is why the potential next Attorney General of the United States can, in one day, ignore so many aspects of America’s fraught history with race. And it is why we all must be on guard.
|
e49055cc0ad0d463a50cbc70fbf34424 | https://historynewsnetwork.org/article/171031 | Kamala Harris is among the few black women to run for president. Here is the amazing story of the first. | Kamala Harris is among the few black women to run for president. Here is the amazing story of the first.
The sitting Republican president was unpopular and divisive. The country was a pressure cooker of partisan rage. Big names in the Democratic Party were mulling whether to jump into the presidential race: past candidates; high-powered senators; known personalities.
But then in January 1972, a political outsider announced a surprise run for the White House — upsetting the party’s power brokers and making history.
Forty-seven years ago this week, Rep. Shirley Chisholm (D-N.Y.) announced she was seeking the 1972 Democratic nomination, becoming the first woman and the first African American to seek a major political party’s presidential nomination.
“I am not the candidate of black America, although I am black and proud,” Chisholm said in her announcement as supporters cheered. “I am not the candidate of the women’s movement of this country, although I am a woman, and I’m equally proud of that. I am not the candidate of any political bosses or fat cats or special interests. . . . I am the candidate of the people of America.”
|
97715716615148274c2535993cfc0319 | https://historynewsnetwork.org/article/171034 | What I’m Reading: An Interview With Russianist Historian Katherine Antonova | What I’m Reading: An Interview With Russianist Historian Katherine Antonova
Katherine Pickering Antonova is Associate Professor of History at the City University of New York, Queens College. She is the author of An Ordinary Marriage: The World of a Gentry Family in Provincial Russia (Oxford University Press, 2013) and the forthcoming Essential Guide to Writing History for Students (Oxford University Press, 2019) as well as A Consumer's Guide to Information: How to Avoid Losing Your Mind on the Internet (Amazon, 2016).
What books are you reading now?
Imagining Russian Regions: Subnational Identity and Civil Society in Nineteenth-Century Russia by Susan Smith-Peter, which I’m reviewing, and I’m also enjoying Naomi Novik’s new novel Spinning Silver.
What is your favorite history book?
Books that had the biggest impact on me were Barbara Alpern Engel’s Mothers and Daughters: Women of the Intelligentsia in Nineteenth-Century Russia and Laurel Thatcher Ulrich’s A Midwife’s Tale: The Life of Martha Ballard, Based on Her Diary, 1785-1812. There are so many others I’ve loved for other reasons: some that may not be familiar include Loren Graham’s Ghost of an Executed Engineer, Sheila Fitzpatrick’s Stalin’s Peasants, and the new biography of Rasputin by Douglas Smith. Everyone should read the gutting new book by my colleague, Deirdre Cooper Owens, Medical Bondage: Race, Gender, and the Origins of American Gynecology, and a very different but equally great new book by another colleague, Julia Sneeringer’s A Social History of Early Rock ‘n’ Roll in Germany: Hamburg from Burlesque to the Beatles, 1956-69.
Why did you choose history as your career?
I usually say it’s because I like to read other people’s diaries, and it’s true: I love reading primary sources, preferably the originals, in an archive where you can feel the texture of the paper and spot the occasional hundred-year-old dead bug still stuck to the page. As a kid I always read history and historical fiction for fun. About a year into college I realized that if I majored in history, my fun reading could also be my required reading. It was a couple of years after that before I really learned what a historian actually is. When I was growing up, people who liked history became k-12 teachers, and the only other kinds of professionals people associated with history were archeologists, genealogists, or writers (I didn’t like science or family trees, and writers starve or live off trust funds, as I was told at the time). I didn’t encounter any clear sense of what academic historical research looks like until late in college, when the secondary sources we were reading were connected to stories of the archives told by my profs – most notably Sheila Fitzpatrick, who was one of the early western researchers in Soviet archives and has some very good stories to tell.
What qualities do you need to be a historian?
This is such a great question because I think there’s a popular perception that all a historian really needs is a great memory for names and dates, which is of course not remotely true. Some might go a bit farther and wish that historians were also “good writers.” But scholarship that can be vetted and built on by other scholars can’t be written the same way as historical fiction or popular history intended for pleasure reading, though of course it should be well-written for its purpose. It’s difficult to articulate a historian’s qualities because I think we rarely try. A historian looks at the world as continuously changing, not as a separate “past” that to many people can feel as remote as fiction. A historian sees what happened in the past not as a set story, but as disparate bits of evidence that might or might not cohere enough to answer our questions. A historian sees each event or action or phenomenon as contingent: no outcome was inevitable, every factor depended on other related factors. Our important questions are not “what happened” – that’s just a means to an end – but always “why” and “how”: why did things go this way and not that way, how do systems and processes and ideas work, and do they vary depending on different contexts? Being a good historian is about being meticulous with details, like names and dates, sure, but also much more important details like the nuances of meaning in a text, the shifting perspectives of multiple narratives, the interactions of multiple causal forces, and the infinite ways that context affects people’s behavior and views. A good historian doesn’t just find, track, organize, and weigh all these many factors in an infinitely complex system with imperfect evidence. She also synthesizes all of it to figure out what it might mean: what questions it can answer, with what implications. And we have to do all that while vetting and citing every source and constantly checking ourselves for errors of logic or bias, so that our work can serve its purpose as the foundation of further research, as a reliable teaching tool, and as a reliable basis for all the other kinds of history that rest wholly or partly on scholarship: fiction, popular history, and public history.
Who was your favorite history teacher?
I was incredibly lucky to study with Sheila Fitzpatrick, one of the great historians of the Soviet Union, as an undergraduate at the University of Chicago in the mid-90s. I actually had no idea how important her work was until I was about to finish and a grad student clued me in. I only found out in retrospect that a lot of the readings she assigned were by people who virulently disagreed with her. She led her own discussion sections, where she walked us through making our own interpretations of primary sources. She modeled what historians do and helped us try it out, and that ultimately is what all the best history teachers do.
What is your most memorable or rewarding teaching experience?
My university introduced a new general education program that added to the typical “freshman comp” introductory writing course a second semester of writing instruction that was explicitly disciplinary. I developed the version of the course for history at my college, taught many sections of it over several years, and ultimately brought this together with my grad school training in Composition Studies to write a manual for students on how to write history essays – not just the typical research essay (which is now often assigned only toward the end of an undergraduate program) but the other common kinds of history writing, from primary source close-readings to exam essays and imaginative projects like role-playing games and historical fiction. In many ways it’s the culmination of a journey I started as a grad student with a fellowship in the University Writing program -- I found ways to answer questions that have bothered me from the first baby steps of my career. Students led me to those answers because this course gave us the time to explore the meta questions like “why are we here?” “what is this for?” and “can this be better?”
What are your hopes for history as a discipline?
I’m very glad to see, and be a part of, efforts to get much better at articulating what historians do and why it matters. The American Historical Association has taken the lead with this through their Tuning Project, but we’re also seeing a new generation of historians who came through grad school at a time when training in pedagogy and composition studies was finally beginning to be recognized in places like history Ph.D. programs. I was trained in composition studies as part of a fellowship to teach a year of freshman comp, and it completely changed the way I approach teaching. At the same time my university instituted its first formal teaching training programs for grad students, which I took part in. Those experiences opened the door for me to a whole world of evidence-based problem-solving and purpose-driven teaching. Earlier generations were often left to either continue what was “traditional” or reinvent the wheel on their own. Now there are a lot more people who got at least some teaching training, and there’s much greater access to conversations about teaching via blogs and Twitter and so on, and all of that is gradually having a very positive impact on how history is taught, which in turn is helping historians be more articulate in general about what we do. I hope it is also the beginning of a substantive course correction in how the public understands what history is all about. But we’re doing all this in an environment of crippling austerity and short attention spans, so my hope is qualified by quite a bit of anxiety that things may just keep getting worse despite everyone’s efforts.
Do you own any rare history or collectible books? Do you collect artifacts related to history?
Ha! I don’t get paid enough for that. I have some trinkets from having spent a lot of time in Russia on research trips, but nothing valuable. My favorite souvenir other than the deck of cards with the Romanovs on them is a pair of thick, felted mittens with tiny holes in the pad of the forefinger and thumb to make it just possible to hold a pen. I made them for working in the archive in Ivanovo where I spent 10 months researching my dissertation. Archivists keep the building chilly partly for preservation, partly from lack of funding, but after sitting still for hours day after day in the reading room the cold seeps into your bones. It’s worth it to read other people’s diaries, though.
What have you found most rewarding and most frustrating about your career?
As one of the incredibly lucky few who got a tenure-track job (just before the 2008 crash obliterated the market), I’m very aware of the incredible privilege of being able to teach with the speech protections afforded by tenure, not to mention the steady salary, even though it’s low and the workload is ridiculous – at least I don’t have to teach this many or more courses at several different campuses for a fraction of the money! This means I am able to mostly focus on my teaching, research, and professional service, which is what I’m good at and what I worked so hard for, for so many years. This is fulfilling work, and though I work all the time, I can do it with a pretty extraordinary degree of flexibility and autonomy, and I know how rare and valuable that is.
The most frustrating thing is how few qualified, brilliant historians share my luck, and the unspeakable loss to society of so much knowledge and talent being thrown away by the adjunctification process that exploits people as long as possible until they leave the system. The stupidity and waste of it horrifies me. Similarly, the pervasive myths about what higher ed and the humanities are, how they work, and the value we bring to society are really frustrating, not least because these myths are deliberately perpetuated as a way of continuing this process of taking all the money out of public education and putting it in the pockets of private companies and their executives. It’s a looting process that’s happening throughout our society, though – not specific to public education, though we’re a relatively easy target because everyone has at least one teacher they’re still angry about, and it’s easy to exploit those feelings. At the same time, we’re a valuable target because we actually do have such a big impact on society that knocking us down a few pegs really disrupts the whole system.
How has the study of history changed in the course of your career?
The whole climate of education has changed so much in my lifetime – the loss of funding and public support for education, particularly in the humanities, the rise of testing culture in k-12 schools, the adjunctification and commodification of higher ed, the resulting crises in tuition/indebtedness and textbook costs, faculty security, workload, and pay, and the impact of all those crises on what anyone can do in a classroom – that sometimes it’s hard to remember to look at the relatively smaller changes within my discipline or field. Historical research has been greatly enriched in the past few decades by increased diversity in who can do history, how we do it, and the kinds of questions we ask. But all that progress is now at risk because of these outside pressures, and much of the great recent work historians have done never really reaches the public, thanks to the defunding and privatization of academic publishing alongside the limitations on everyone’s attention.
In my own subfield of imperial Russia, I’ve seen tremendous new insights thanks to a generation of archive-based work in the wake of the Cold War. Most people are aware that the collapse of the Soviet Union opened archives and enriched the study of that period, but it had a huge impact on the study of Russia before 1917 as well, and on our whole conceptualization of the Russian empire as a continuous entity before and after that date. For example, our understanding of regional diversity and the importance of developments outside the capital cities is only just beginning to inform the broader narrative – that’s one of the contributions of the book by Susan Smith-Peter that I mentioned above.
What is your favorite history-related saying? Have you come up with your own?
I love the Lamartine quote, “History teaches everything, including the future.”
What are you doing next?
This fall I’m hoping to finish the research for my second monograph, which centers on Russian police investigations of women mystics and sectarians in the 1820s and 30s. I’m also developing another book project about writing, with a different focus and working with a co-author.
|
6bc76d9ccdeee79c5c117e573c9df327 | https://historynewsnetwork.org/article/171382 | Genocide Denial In Bosnia: A Danger to Europe and to the World | Genocide Denial In Bosnia: A Danger to Europe and to the World
The Srebrenica Genocide Memorial in Potočari
On July 11, 1995, Ratko Mladic and his Serb (VRS) paramilitary units arrived at the sleepy eastern Bosnian city of Srebrenica. Designated a “safe zone” by the United Nations, the area drew civilians from neighboring cities and municipalities in hopes of salvation and safety. That day, over 8,000 young Bosniak men and boys were brutally executed by Mladic’s troops in the biggest single massacre of the Bosnian genocide. It was an event unseen in Europe since the murderous Holocaust campaigns carried out by Hitler’s Nazi regime. Today, this event, the details around it, and the nature of the killings have become political fuel for nationalist politics in Bosnia and Herzegovina. Despite the annual exhumation of new mass graves, genocide denial has once again raised its ugly head, just as it did in the aftermath of World War II.
Despite thousands of testimonies, photographs, video evidence, and overwhelming physical evidence in the form of mass graves, Bosnian Serb and Serbian politicians such as Milorad Dodik (currently a member of the Presidency of Bosnia and Herzegovina) continue to question and deny that a genocide took place in Srebrenica and the wider Bosnia and Herzegovina. These are by no means passive remarks but rather a targeted campaign of denial. The latest iteration of this heinous and destabilizing action is Republika Srpska’s (one of the political entities created under the Dayton Agreement) so-called “truth investigation” into the Srebrenica genocide. The implications could not be clearer: a rise in nationalist fervor and fascistic political ideologies (the same ones that fueled the last wars in the Balkans), historical revisionism, political instability, and, perhaps most worrying, a return to the denial of human rights, the truth, and reconciliation in the country and this precarious part of Europe.
Misinformation campaigns are nothing new. Nazi authorities and their co-conspirators denied the killing of over 6,000,000 Jews during the war, and many who were sympathetic to their cause continued to do so afterwards. This did not stop at passively dismissing or denying the Holocaust, but ramped up through targeted campaigns of misinformation. Nazi propaganda dehumanized Jews and cultivated support for their mass murder before, during, and after the war. Sympathizers such as the historian Harry Elmer Barnes promoted reports denying the existence of Nazi death camps and even published literature on the topic. Neo-Nazi “think tanks” (akin to RS’s investigative body) opened old wounds by downplaying the death count or actively denying the existence of a well-planned, full-fledged campaign of extermination.
Dodik and authorities in the Republika Srpska seem to have taken a page out of this playbook. During the war, mass graves were routinely covered up and concealed, and the bodies of victims moved. Today, this makes identifying the victims very difficult since there is significant mingling of remains. For example, one victim’s remains were found at two separate locations, over 30 km away from each other. The disinformation and deceit did not stop with the halting of hostilities. Serb nationalist politicians and their supporters routinely downplay the genocide or dismiss it outright, refusing to accept blame or to begin a process of reconciliation. They are aggressively pursuing a policy of genocide denial and introducing unsubstantiated doubt in an effort to destabilize the country and, further, deny the humanity of the victims of the genocide. In 2004, the Appeals Chamber of the International Criminal Tribunal for the former Yugoslavia (ICTY), located in The Hague, ruled that the massacre in Srebrenica constituted genocide, a crime under international law. This ruling was upheld by the International Court of Justice (ICJ) in 2007. These rulings matter little to nationalist leaders such as Dodik and those of his ilk. Ultimately, they have very little respect for international bodies, considering them nothing more than attack dogs set against the Bosnian Serb people. Their tools of the trade have been misinformation campaigns, propaganda, and political investigations. What they fail to understand is that genocide denial has further societal implications. The distrust and feelings of enmity in Bosnia cannot subside without the truth being taken seriously, and without authorities formally apologizing and undertaking actions to prevent similar atrocities from ever happening again.
Ultimately, why is this so important? The same dehumanizing philosophy which fed into ethnic and religious tropes leading to genocide is back, perhaps stronger than ever. The denial of history and the truth has become normalized in many parts of the world, sometimes through masked efforts at legitimacy. In this moment it is especially important for scholars, journalists, and other professionals to stand up for the truth and demand a platform that overshadows lies and misinformation. Historical revisionism threatens not just the sense of justice for families in Bosnia, but the democratic process in the region. If Europe is indeed serious about protecting democracy and individual rights, it needs to respond to attacks on the truth first.
|
59775997cb6c674d629017dbeb0a4a78 | https://historynewsnetwork.org/article/171457 | What Historians Are Tweeting: The Women Historians Who Inspire on International Women's Day | What Historians Are Tweeting: The Women Historians Who Inspire on International Women's Day
Dr. Sarah Bond (@SarahEBond) asked "How did your female mentor make a difference in your career?" Here are some of the responses.
|
a5c8b61e9627a82cd9d690d3cf523b77 | https://historynewsnetwork.org/article/171478 | How the Daughters and Granddaughters of Former Slaves Secured Voting Rights for All | How the Daughters and Granddaughters of Former Slaves Secured Voting Rights for All
In the fall of 1916, four years before the 19th Amendment would make it unconstitutional to deny voting rights on the basis of sex, African-American women in Chicago were readying to cast their first ballots ever for President. The scenes in that year of black women, many of them the daughters and granddaughters of former slaves, exercising the franchise were as ordinary as they were unexpected.
Theirs was a unique brand of politics crafted at the crossroads of racism and sexism. African-American women had always made their own way. In Chicago, they secured a place at the polls by way of newly enacted state laws that, over 25 years, extended the vote to the women of Illinois, gradually, unevenly and without regard to color. The real story, however, is an older one that stretches across generations of black women’s ambition and activism. It only sometimes intersects with better-known tales of how white women campaigned for their political rights. And yes, sometimes black and white women clashed. Still, the history of black women and the vote is one about figures who, though subjected to nearly crushing political disabilities, emerged as unparalleled advocates of universal suffrage in its truest sense.
Their story begins in an unexpected place—the church. For black women, church communities were central sites for developing their sense of rights and how then to organize for them. No one understood this better than Julia Foote, born in 1823, who at the age of 18 felt herself called to preach in the African Methodist Episcopal (A.M.E.) Church. By the 1840s, Foote was a leader in a churchwomen’s movement which demanded that they, like men, should be entitled to occupy pulpits and interpret the scriptures.
|
339e491b14085a872ab617d220f00772 | https://historynewsnetwork.org/article/171508 | Parents, College, Money, and the American Dream | Parents, College, Money, and the American Dream
The University of Southern California, one of the schools mixed up in the college admissions scandal.
The front-page news about the college admissions bribery arrests has people talking about social class, fairness, status anxiety, helicopter parenting, and whether an expensive education can translate into a lifetime of wealth and happiness. None of this is new. In writing about the history of babies in the 20th-century United States, I discovered early 20th-century baby books distributed by banks and insurance companies prompting parents to save for college. At a time when less than 20 percent of Americans completed high school and far fewer went on to higher education, financial institutions were already telling parents to start saving.
Insurance companies, banks, and savings and loan firms enticed customers by encouraging parental hopes and dreams. Just as manufacturers of disinfectants, patent remedies, and infant foods turned to baby books to advertise the products parents could buy to keep babies healthy, financial firms sold their services as ways for making babies wealthy and wise--in the future. For all kinds of companies, playing to parental anxieties and aspirations became the means of expanding their clientele.
Consider this example. In 1915 an Equitable Life Assurance baby book advertisement in the Book of Baby Mine began, "Say, Dad, what about my college education?" At the time, high school graduation rates hovered below 13 percent and college attendance and graduation rates were much lower. Nevertheless, parents looked to the future with great hopes for their offspring. In 1919 the United States Children's Bureau conducted an investigation of infant mortality in the city of Brockton, Massachusetts. An immigrant Italian mother interviewed for the study reported she was saving to send all four of her young children to college. Clearly, in reaching out with a save-for-college message, financial firms were capitalizing on a common but mostly unrealized dream and helping to reinforce the message that college was a pathway to success.
Banks promoted thrift by reaching out to customers via motion pictures, newspaper advertisements, and programs in schools collecting small deposits from children. Competition for savers grew as the number of banks doubled between 1910 and 1913. Accounts for babies soon became part of banks' advertising strategy. Savings and loans and banks gave away baby books with perforated deposit slips, slots for coins, or simply included pages for listing deposits into the baby's bank account. The 1926 Baby's Bank and Record Book even included a section on college savings estimating a future scholar would need $1000--a figure it derived, the advertisement explained, from the University of Pennsylvania catalog. In addition to citing this source, the ad included a helpful chart showing that saving $1 a week would, with compounding interest, yield $1065.72 in fifteen years.
The Great Depression wiped out many of the banks and small insurance companies holding the savings of infants, children, and adults, thus erasing the hopes of many who had dreamed their child would obtain a college education. However, as children withdrew from the workforce because of the lack of job opportunities and New Deal laws limited their employment, high school completion rates grew to 40 percent by 1935. As scholars have pointed out, G.I. benefits after World War II (the Servicemen's Readjustment Act) and the National Defense Education Act of 1958 both led to big increases in college attendance thanks to the financial support they provided.
What changed in the wake of enhanced federal financial support was not the desire for one's children to acquire more education, but the numbers of young people able to go to college. A quick look through baby books from the first half of the twentieth century shows the "go to college" message being sent and received well before government dollars came into the picture. Banks and insurance companies knew what customers dreamed of for their offspring and they made it the centerpiece of some of their advertising. Today, the vast majority of students and families still save up and borrow to afford higher education. And, of course, financial firms still promote themselves as critical resources for fulfilling this dream. What just might surprise us is how, for over a century, banks and insurance companies have been delivering this message, aware of what parents dreamed of when they gazed at their new babies and imagined their futures.
|
9d675de83689f8b529f0dc568b64a2b6 | https://historynewsnetwork.org/article/171817 | White supremacists dragged James Byrd to his death in 1998. One of them was just executed. | White supremacists dragged James Byrd to his death in 1998. One of them was just executed.
James Byrd Jr.’s body was found in pieces along a country road in Texas in June 1998.
Forensic investigators later determined the injuries that killed Byrd — cuts and scrapes around his ankles and other abrasions on his body — seemed to indicate his ankles had been wrapped together with a chain and that he had been dragged by a car.
The gruesome killing of Byrd, a 49-year-old black man, seemed to hark back to an era of lynchings and racially motivated slayings across the South. The trials of the three white men charged with the crime drew wide attention to Jasper, a town of about 7,500 in East Texas, just a short drive from the state’s boundary with Louisiana.
Texas officials announced this week that one of Byrd’s killers, John William King, 44, would be executed Wednesday night, two decades after being convicted. It would make him the fourth inmate executed this year in the United States, and it would be one of the final legal steps in a case that has prompted a national discussion about hate crime legislation. But will it provide closure in a case that remains painful 20 years later?
|
20d196fd99f95d1dbba3a43e4b523f69 | https://historynewsnetwork.org/article/172000 | Presidential Moral Character and Teddy Roosevelt | Presidential Moral Character and Teddy Roosevelt
It was Saturday morning, September 14, 1901, and President William McKinley was dead, eight days after being shot by a crazed assassin. Americans were aghast—this was the third murder of a president in thirty-six years. Everything had been going so well. The economy had rebounded from the 1893 Panic, our industry was the most productive in the world, technological innovation had made life easier, and America had just won a war, gaining a new global empire with unlimited commercial possibilities.
Suddenly, the historically do-nothing office of vice-president was in the spotlight as its occupant was sworn in as the 26th president of the United States. In the cigar- and whiskey-reeking backrooms of big city political bosses, the august boardrooms of Wall Street moguls, and the genteel verandahs of Newport aristocrats, the nation’s elite was anxious about what kind of president Theodore Roosevelt would turn out to be. Many already had an idea and they didn’t like it. This was because by 1901, Roosevelt was anything but an enigma to America. Though he was only forty-two when he became president that Saturday, his moral character and intellectual ability were already widely known.
His moral character was illuminated from his first political office at age twenty-three in 1881, when he became the youngest person ever elected to the New York State Assembly. Roosevelt stood out from other politicians because of his fearless quest for honesty and efficiency in government. Over his next three terms in the statehouse, he took on powerful foes like financier Jay Gould, who had attempted to corrupt officials, and Judge Theodore Westbrook, who had a shady relationship with Gould. He not only believed, but showed, that honest government transcended party politics, working with Democratic Governor Grover Cleveland to pass a civil service reform bill. Within two years of his arrival, Roosevelt was chosen to be the Republican minority leader in the state assembly. These years in the gritty mechanics of the legislative process provided him with solid experience in how government works, and how to craft arguments to advance his agenda. This knowledge helped greatly once he was president. It also gave him the impetus to fully use presidential power, as shown by the fact that he issued more executive orders (over 1,000) than any of his 25 predecessors.
His intellect showed early on, as well. During his life, Roosevelt was one of the most prodigious readers and writers in America. By age twenty-four he had written The Naval War of 1812, a book which was soon required reading for naval officers around the world. To this day, it is considered the definitive history of that naval war. Over his lifetime, Roosevelt wrote (no ghostwriters for him!) dozens of magazine articles, essays, thousands of letters, and no fewer than forty-five books. His topics were diverse: hunting, social responsibility, travel, history, biography, politics, living the strenuous life. Naturally, his literary work spread his name and ideas, but it had another benefit. It introduced him to many of the famous journalists and authors of the day, several of whom became lifelong friends who promoted his political programs. It was not by coincidence that Colonel Roosevelt had his own press entourage while trudging through the jungles of wartime Cuba in 1898.
His impressive mental capacity was manifested in another way—he was an outstanding orator. He could converse in French and German, though with a pronounced American accent. Roosevelt’s experiences out west with cowboys, in the tenement slums of New York, and with soldiers in the army, people who were completely outside his social norm, taught him their language styles. It gave him confidence in speaking before audiences of different cultures. It also gave him the ability to size up an audience’s pride, hopes, and fears, allowing him to personalize his message to them. Sometimes he even turned adversaries into advocates with his candid sincerity, as he did with German immigrants irate over his decision to enforce the no-alcohol-on-Sunday laws as New York Police Commissioner in 1896. After speaking to them in German, he had them laughing with him. As recordings of his speeches show, Roosevelt’s voice was high pitched and not what we would consider stentorian, but his passion for the topic and audience emerged loud and clear. In short, he knew how to bond with the audience.
An obvious sign of Roosevelt’s remarkable self-discipline was his physical fitness, which greatly influenced his character. A sickly boy with severe asthma, as a teenager he transformed his frail body into that of an athlete. He became a devotee of daily practice in martial arts (judo, boxing, and single-stick fighting). Roosevelt always seemed in motion. He never strolled—he strode. He didn’t walk up steps, he leaped two at a time. The tragic deaths of his father, and later his first wife and mother, and after that his brother, taught him perseverance through plunging into hard work, both physical and mental. His time with rough men in the Dakota Badlands, facing enemy fire in Cuba, and in rugged sports, gave him a determination which no one doubted. His clenched jaw and narrowed eyes could give pause to the fiercest opponent, either physical or political. And, of course, there was the other manifestation of his personality, a sense of gentle humor displayed in that famous ear-to-ear enameled grin, accompanied by a true belly-slapping laugh that was impossible not to join in. Often, it was directed at himself.
We mustn’t think Roosevelt perfect, however. His moral strength sometimes failed. One of the most prominent political causes which withered was his initial presidential support of black civil rights in the South in the face of increasingly oppressive Jim Crow laws and KKK violence. That faded as he dealt with the considerable southern political power in Congress. An example is his early friendship with Booker T. Washington, whom he invited to be the first African-American to dine at the White House only three weeks after being sworn in as president. The backlash was immediate and vicious, and Mr. Washington never got another such invitation. During his second term in 1906, Roosevelt made the decision to rely on and support racist army officers’ evaluations and adjudication of a black regiment’s alleged rioting in Brownsville, Texas. A total of 167 soldiers were dishonorably discharged and humiliated, though later they were shown to be innocent. Many in the nation were disappointed by Roosevelt.
But when viewed overall, Roosevelt’s life was an extraordinary preparation for the presidency. For over twenty years, he devoted much of his life not to personal gain but to public service. By the time he became president, he’d worked in the legislative and/or executive branches of municipal, state, and federal governments. He’d been in appointed positions like U.S. Civil Service Commissioner (under Republican and Democratic administrations) and New York Police Commissioner, and elected positions like assemblyman, governor, and vice-president. He’d won elections and lost them. He’d served the military as assistant secretary of the U.S. Navy and as a volunteer colonel in the U.S. Army who endured combat. It was a remarkable resumé of service beyond oneself which has seldom been equaled by other presidents.
Among the upper class in the Gilded Age, Theodore Roosevelt was an anomaly. Though he came from their class, he didn’t act like them. He didn’t want to change their lives. He wanted to change life for the rest of America, making citizens’ lives safer, fairer, and more hopeful. Roosevelt’s “Square Deal for Every Man” centered around consumer protection, corporate regulation, and conservation of America’s natural wonders.
His life experiences and intellectual ability helped frame Roosevelt’s moral character and thus, his political goals. His considerable stamina and skills were used to achieve those goals. Battling the political bosses, corporate moguls, and social elitists, he made progress in surprisingly diverse areas: The Pure Food and Drug Act, Meat Inspection Act, and food safety programs. Protection of labor rights. Promotion of American commerce. Veterans benefits. Rural free postal delivery. Breaking up of commerce, finance, and utility monopolies. From 1902 to 1905 alone, 190 indictments against corrupt government officials. Regulation of railroad rates to ensure access for all. Stressing personal physical fitness and literacy. Support of child labor laws. Construction of the Panama Canal with affordable transit rates for all nations. Modernizing the U.S. Navy and Army. Protecting Latin America against European military attacks. Founding the U.S. Forest Service. Creating 4 game preserves, 5 national parks, 18 national monuments, 51 bird preserves, and 150 national forests.
A century later, America seems to be searching for a leader with moral character, intellectual ability, proven sacrifice for the nation, personal bravery, and genuine sincerity—a new version of Theodore Roosevelt. Someone with whom you might disagree on policy, but still personally admire and trust. Someone who can laugh at themselves, and even get you to join in. Someone the world will respect for speaking softly while carrying that big stick.
I know that person is out there, because even though so much has changed over the last 110 years, this is still the America of Theodore Roosevelt.
|
a220a741cd4884e1180632d2998ea282 | https://historynewsnetwork.org/article/172005 | What Trump Could Learn About Immigration from Teddy Roosevelt | What Trump Could Learn About Immigration from Teddy Roosevelt
An example of anti-Japanese sentiment.
Recently, Donald Trump virtually gutted the Department of Homeland Security with the forced resignation of DHS Secretary Kirstjen Nielsen and key deputies because they were ‘too soft’ on immigration. He then made a bad situation far worse by the de facto appointment of the xenophobic Stephen Miller to take over immigration policy.
Anti-immigrant sentiment has been at the forefront of Trump’s politics since he announced his run for president. At his campaign announcement in 2015, he labeled Mexicans as rapists. Once in power he has labeled immigration a national crisis and has demanded a wall between the U.S. and Mexico. While he initially asserted Mexico would pay for such a wall, when Congress denied him funding for it he shut down the government for a record-breaking 35 days. Most recently, he threatened to shut down the U.S.-Mexico border but ultimately did not because of the loss in trade.
President Trump could learn a valuable lesson from Theodore Roosevelt on immigration. At the turn of the 20th century, Asian immigrants were demonized by Americans. The Chinese laborers brought to the west to work on the construction of railroads fueled the hatred of Asians and the “Yellow Peril” that some thought threatened to take over America and destroy Western civilization.
After the Chinese Exclusion Act of 1882 the focus turned to the relatively small number of Japanese coming to America. It was particularly virulent in San Francisco where Mayor Eugene E. Schmitz formed the Japanese and Korean Exclusion League in 1905. Schmitz demanded the segregation of the tiny fraction of Japanese children in the public schools in order "to save white children from being affected by association with pupils of the Mongolian race." The Board of Education agreed and the children were forced to attend a segregated school.
In much the same way Americans protested the treatment of Amanda Knox in the travesty of a murder trial, Japan interpreted the discrimination as an insult to its national pride. Japan had recently been fortified by military victory over Russia and the acceptance of Japan by Western nations as an emerging world power. A series of diplomatic notes passed between Japan and the United States, and tensions mounted.
In order to defuse and resolve the problem, Roosevelt brought the mayor and the school board to the White House and cajoled them to reverse the decision. He secured a promise that the segregation would be lifted if Japan restricted emigration. The Japanese government agreed and stopped issuing passports to the United States, although some emigrants were allowed to go to the Hawaii Territory. With that guarantee the school board relented.
The resolution became known as the Gentlemen’s Agreement of 1907 or in Japanese Nichibei Shinshi Kyoyaku. Without rancor or inflammatory rhetoric Roosevelt solved an immigration crisis. The agreement was not perfect. Some Japanese people granted entry into Hawaii could and often did make their way to the mainland. An exception for family members, a practice now denounced by the current president as ‘chain migration’, still allowed some Japanese to come to the United States. Japanese nationals escaped the feared exclusion acts until the Immigration Act of 1924 which cut Asian immigration to near zero.
Even if the current occupant knew this part of history he would not be able to learn from it. Intelligent foreign policy often requires the delicate touch of a scalpel rather than the pounding of a sledgehammer. But the rest of us can know from this history that it is possible to handle immigration policies sensibly and even compassionately. And there is hope that a better president will soon be repairing the damage.
|
40d99da271cc89c401e2b250b1aaf423 | https://historynewsnetwork.org/article/172011 | Trump says he’s an antiabortion champion like Reagan. History says: Not quite. | Trump says he’s an antiabortion champion like Reagan. History says: Not quite.
In a Saturday tweetstorm, President Trump bent the reality of Ronald Reagan to align himself with the former president’s view on abortion.
“As most people know, and for those who would like to know, I am strongly Pro-Life, with the three exceptions — Rape, Incest and protecting the Life of the mother — the same position taken by Ronald Reagan,” Trump said, as he appeared to distance himself from Alabama’s restrictive abortion law.
But Reagan’s legacy on abortion is far more complicated, and antiabortion advocates have long considered his actions a disappointment.
In 1967, nearly six years before Roe v. Wade went to the Supreme Court, newly minted California Gov. Ronald Reagan signed one of the most liberal abortion laws in the country. The Therapeutic Abortion Act allowed for pregnancy terminations if the pregnancy placed the mother in physical or mental distress, or if the pregnancy was a product of rape or incest.
At the time, abortion was considered so taboo that newspapers referred to the procedure as an “illegal operation.” But the burgeoning feminist movement, coupled with horror stories of butchery that desperate women had suffered at the hands of practitioners who performed it outside the law, had compelled policy to address the issue nationwide.
|
260063e0fd5876bf7e05f19bd6337634 | https://historynewsnetwork.org/article/172035 | Why nuclear diplomacy needs more women | Why nuclear diplomacy needs more women
After months of “fire and fury,” a summit, “beautiful” love letters and a promise the North Korean threat was resolved, the latest nuclear weapons negotiation between the United States and North Korea failed. This isn’t surprising.
We haven’t seen a U.S.-North Korea agreement because nuclear policy is hard. Negotiating takes time, innovation, expertise and having the best minds involved in the process — including women.
But women were notably not represented at the last Trump-Kim summit. At the negotiating table were nine men; the only woman was an American translator. Translating has long been considered a “feminine” job in international delegations. This work powered the United States’ rise over the 20th century, yet it reflects how women have been directed to support male-designed policy rather than contribute to its creation. As our research into women’s history in this field since the 1970s shows, for decades, women have been sidelined in nuclear policymaking. That makes our world less safe.
Exclusion in the national security world means ignoring those who may have more experience and knowledge about a region or population, the ability to foresee obstacles others are blind to and more potentially effective and innovative solutions. In the nuclear field, where the stakes are life and death on a massive scale, this exclusion is costly.
Although they may not have been prominent at the Trump-Kim negotiation, women have been integral to the nuclear diplomacy field from the beginning. Female scientists in the Manhattan Project held a range of positions, working as physicists, chemists, and phlebotomists alongside male researchers to study plutonium and troubleshoot nuclear reactors. As the nuclear arsenal expanded, so did the number of women working as translators, counsels and senior policy and national security advisers. Beginning in the 1970s, women served as under, assistant and deputy secretaries, and starting in 1993, as Cabinet-level secretaries in the State, Defense and Energy departments.
|
dc9f9090abf518f4166e974b968a6923 | https://historynewsnetwork.org/article/172057 | A Day to Remember: Memorial Day 2019 | A Day to Remember: Memorial Day 2019
On May 30, 1963, I urged citizens to remember 42 years earlier, when locals dedicated a granite monument in Ashland, Oregon as “a permanent memorial, reminding those that come after us of the glory of the republic, the bravery of its defenders and the holiness of its ideals.” This monument, dedicated shortly after World War I in 1921, remembered those who had not been brought back alive on our troop ships. They had died in trenches, of poison gas, or in tank warfare, maybe side by side with the British in the fields of bloody France.
When preparing to speak to more than a hundred locals, I read up on war and peace, suffering and victory, and the joy found in winning. Often I reflected on that emotional World War I against the Kaiser and the sacrifices in trenches and sunken ships.
It had been a War to Make the World Safe for Democracy. Woodrow Wilson, with his Ph.D. from Johns Hopkins and an academic life lived at Princeton, had chosen Herbert Hoover to be “Food Czar” with the mandate to unite the farmers of America behind the mission of making sure Europe (the part in Allied hands, at least) did not starve.
At home, homeland-happy Germans and agitating Socialists had a minimal audience for their protests. In the Navy Department, the Assistant Secretary, hale and hearty Franklin D. Roosevelt, was charged with creating a mine field to keep Germany out of the North Sea. He dealt in the capital with my engineer father, Vaughn Taylor Bornet of the Budd Company, to make a success of it.
A few decades earlier, 1898, I pointed out, we had fought Spain to free Cuba “in the cause of humanity and to put an end to the barbarities, bloodshed, starvation, and horrible miseries” that colony was felt to be suffering.
In half a century it would be time to invoke the memory of Midway and Okinawa, of D-Day. We ensured the survival of Britain and France and occupied Japan! Plenty there to memorialize! It was indeed true that World Wars I and II had been victorious after the Yankees had come to the rescue of democratic regimes fighting the Kaiser, Nazis, and Fascists….
On February 2, 1966, I raised the question—as Vietnam was still being actively fought over—whether there was “an ethical question” in that war we were waging so seriously, yet so unsuccessfully. I didn’t do very well, I thought in retrospect, so in 1976 I revised my remarks. Looking back, I wrote this emotionally trying paragraph:
“We can now look back upon Vietnam, that tortured spot on the planet, and we look hopefully for signs that Good and Evil were clearly defined and readily identifiable to those who undertook the long crusade by force of arms.”
A world of jungles surrounded us back then.
Today we look back full decades. We visit and walk pleasantly about in today’s Vietnam. We regret we didn’t “win.” We still deplore Communism—that is, after departing by plane or ship. And, especially, we regret all those deaths—on BOTH sides. As we take pleasure in the happiness now visible on the sidewalks, we know that while the economy thrives, freedom is in short supply. We also know full well that the war waged from Kennedy to Nixon, yes!, should have been curtailed long before it was!
We do have a right to ask bluntly, “Did we have to wage it with such extraordinary vigor (just because we weren’t winning)?” Did we find Honor in not stopping? We sought, it must be said, a victory of the kind we had won earlier, in the 19th and 20th centuries. It was unaccountably being denied us in jungles way off somewhere. It was humiliating!
In my book on celebrating our patriotic holidays I pointed out that “The literature that attempts to evaluate the Vietnam War is thoughtful and immense.” Competing with it here is out of the question—although I must admit to having been, as a working historian, very much a part of it as I defended “patriotism” back when. I devoted maybe 200 pages to President Johnson’s turmoil when deciding what in Hell to do in Southeast Asia.
He could see that the Communists were not going to prevail in the rest of Southeast Asia: in Indonesia, Thailand, Malaysia, Singapore, the Philippines, the Republic of China, or Sri Lanka. Whatever North Vietnam and China might want, South Vietnam was to be their limited prize. We had been content with what had “worked” in South Korea, but South Vietnam, it turned out, was a different ballgame.
The Vietnam disaster had an effect on the kind of patriotism that prevailed earlier; no doubt about it. This time, we had Lost! For a while, we just wouldn’t think about it too much or too often. Find something else to consider when reflecting on our body politic.
I will dare, as I conclude this troubled essay, to quote from my book’s page 149: “The anti-patriotic among us sometimes descend to portraying the United States in the role of an ‘empire’ engaged routinely in ‘imperialist’ invasions and dedicated to ‘conquest’ for only ‘economic gain.’”
For some among us, Patriotism sometimes seems just “old hat.” Not for everybody. One thinks back on what can easily be termed “Great Causes” supported by us in the Past. Some are still part of our active heritage. There is a free Europe.
Partly from what we did in our past emerged a new Commonwealth, an independent British Empire. Bad as it is sometimes, Africa could be worse. We have helped, overall—not wisely, always, but aided by philanthropy centered in the U.S., by Gates, Rotary, and others, by sometimes doing the right thing. Maybe we’re a little better than we sometimes think!
Yet our Nation’s prestige has suffered severely in the past two years. Leadership has lost us the affection of far too many countries who were once so close as to show pride routinely. Beginning with that inexcusable withdrawal from the Paris accords on climate, we have from our top office displayed misunderstanding, even contempt, for other Lands.
This must stop; the end of “going it alone” cannot come too soon. Surely this mostly verbal misbehavior is a temporary and transitory thing. All in office in the Executive Branch need to bear in mind at all times that they are trustees for our evolving reputation. We must, and we will, strive to do better, very soon. Downhill is not the right direction for the United States of America!
This Memorial Day is a good time to think back, bring our minds up to date, and fly that beautiful flag while humming or singing one of our moving, patriotic songs. For this quite aged American, it remains “God Bless America” all the way.
|
933b02f20942f8b397dece5c26002f1f | https://historynewsnetwork.org/article/172060 | How About a Peace Race Instead of an Arms Race? | How About a Peace Race Instead of an Arms Race?
In late April, the highly-respected Stockholm International Peace Research Institute reported that, in 2018, world military expenditures rose to a record $1.82 trillion. The biggest military spender by far was the United States, which increased its military budget by nearly 5 percent to $649 billion (36 percent of the global total). But most other nations also joined the race for bigger and better ways to destroy one another through war.
This situation represents a double tragedy. First, in a world bristling with weapons of vast destructive power, it threatens the annihilation of the human race. Second, as vast resources are poured into war and preparations for it, a host of other problems―poverty, environmental catastrophe, access to education and healthcare, and more―fail to be adequately addressed.
But these circumstances can be changed, as shown by past efforts to challenge runaway militarism.
During the late 1950s, the spiraling nuclear arms race, poverty in economically underdeveloped nations, and underfunded public services in the United States inspired considerable thought among socially-conscious Americans. Seymour Melman, a professor of industrial engineering at Columbia University and a peace activist, responded by writing The Peace Race, a mass market paperback published in 1961. The book argued that military spending was undermining the U.S. economy and other key aspects of American life, and that it should be replaced by a combination of economic aid abroad and increased public spending at home.
Melman’s popular book, and particularly its rhetoric about a “peace race,” quickly came to the attention of the new U.S. President, John F. Kennedy. On September 25, 1961, dismayed by the Soviet Union’s recent revival of nuclear weapons testing, Kennedy used the occasion of his address to the United Nations to challenge the Russians “not to an arms race, but to a peace race.” Warning that “mankind must put an end to war―or war will put an end to mankind,” he invited nations to “join in dismantling the national capacity to wage war.”
Kennedy’s “peace race” speech praised obliquely, but powerfully, what was the most ambitious plan for disarmament of the Cold War era: the McCloy-Zorin Accords. This historic US-USSR agreement, presented to the UN only five days before, outlined a detailed plan for “general and complete disarmament.” It provided for the abolition of national armed forces, the elimination of weapons stockpiles, and the discontinuance of military expenditures in a sequence of stages, each verified by an international disarmament organization before the next stage began. During this process, disarmament progress would “be accompanied by measures to strengthen institutions for maintaining peace and the settlement of international disputes by peaceful means.” In December 1961, the McCloy-Zorin Accords were adopted unanimously by the UN General Assembly.
Although the accelerating nuclear arms race―symbolized by Soviet and American nuclear testing―slowed the momentum toward disarmament provided by the McCloy-Zorin Accords and Kennedy’s “peace race” address, disarmament continued as a very live issue. The National Committee for a Sane Nuclear Policy (SANE), America’s largest peace organization, publicly lauded Kennedy’s “peace race” speech and called for “the launching of a Peace Race” in which the two Cold War blocs joined “to end the arms race, contain their power within constructive bounds, and encourage peaceful social change.”
For its part, the U.S. Arms Control and Disarmament Agency, created by the Kennedy administration to address disarmament issues, drafted an official U.S. government proposal, Blueprint for the Peace Race, which Kennedy submitted to the United Nations on April 18, 1962. Leading off with Kennedy’s challenge “not to an arms race, but to a peace race,” the proposal called for general and complete disarmament and proposed moving in verifiable steps toward that goal.
Nothing as sweeping as this followed, at least in part because much of the subsequent public attention and government energy went into curbing the nuclear arms race. A central concern along these lines was nuclear weapons testing, an issue dealt with in 1963 by the Partial Test Ban Treaty, signed that August by the U.S., Soviet, and British governments. In setting the stage for this treaty, Kennedy drew upon Norman Cousins, the co-chair of SANE, to serve as his intermediary with Soviet Premier Nikita Khrushchev. Progress in containing the nuclear arms race continued with subsequent great power agreements, particularly the signing of the nuclear Nonproliferation Treaty of 1968.
As is often the case, modest reform measures undermine the drive for more thoroughgoing alternatives. Certainly, this was true with respect to general and complete disarmament. Peace activists, of course, continued to champion stronger measures. Thus, Martin Luther King, Jr. used the occasion of his Nobel Peace Prize lecture in Oslo, on December 11, 1964, to declare: “We must shift the arms race into a ‘peace race.’” But, with important curbs on the nuclear arms race in place, much of the public and most government leaders turned to other issues.
Today, of course, we face not only an increasingly militarized world, but even a resumption of the nuclear arms race, as nuclear powers brazenly scrap nuclear arms control and disarmament treaties and threaten one another, as well as non-nuclear nations, with nuclear war.
Perhaps it’s time to revive the demand for more thoroughgoing global disarmament. Why not wage a peace race instead of an arms race―one bringing an end to the immense dangers and vast waste of resources caused by massive preparations for war? In the initial stage of this race, how about an immediate cut of 10 percent in every nation’s military budget, thus retaining the current military balance while freeing up $182 billion for the things that make life worth living? As the past agreements of the U.S. and Soviet governments show us, it’s not at all hard to draw up a reasonable, acceptable plan providing for verification and enforcement.
All that’s lacking, it seems, is the will to act.
|
5f6c9d1beb9045c6171245d0035c8f2a | https://historynewsnetwork.org/article/172064 | Leadership and Mimicry: What Plutarch knew about Elizabeth Holmes | Leadership and Mimicry: What Plutarch knew about Elizabeth Holmes
Founder of the biotech company Theranos, Elizabeth Holmes is currently awaiting trial for cheating investors and deceiving her clients. She claimed that her company was building a device that would revolutionize healthcare by running dozens of lab tests on a single drop of blood. This device, called the Edison, was to become widely available in a nationwide chain of drug stores, providing nearly every American with quick, affordable access to important information about their health. Holmes appeared to be doing the impossible, and nearly everyone believed in her, from seasoned Silicon Valley entrepreneurs to wealthy investors to former Secretaries of State. By the time she was thirty she had accomplished one of her childhood dreams: she had become a billionaire. But quick and easy blood testing, it turns out, really is impossible. While a legal decision about her behavior as CEO lies in the future, the verdict on her character appears to be in. Elizabeth Holmes is a fraud.
In the last year alone, Holmes has been the subject of a book (soon to be a movie), countless newspaper and magazine articles, an HBO documentary, and an ABC News podcast (soon to be a television series). This entrepreneur, once celebrated as a genius, is now more often called names like “disgraced fraudster,” and her career has repeatedly been cast in highly moral terms, with a rise-and-fall trajectory that seems already to have completed its arc. The way to explain the collapse of Theranos, it seems, is to study the deficiencies in Holmes’ character.
This approach to telling Holmes’ story calls to mind the Greek philosopher Heraclitus, who claimed that “character is destiny.” This ancient saying remains popular in our modern world. The New York Times editorial board used it just last year, for instance, to describe the downfall of Eliot Spitzer and to speculate about the future of Donald Trump. John McCain selected it as the title for his 2005 book, which contains stories of successful historical figures who demonstrated admirable character. Character alone, McCain argues in the introduction, determines the courses of one’s life and career. And so, according to both the ancient philosopher and the modern statesman, there is no pre-ordained path that we are obliged to follow, nor should we look for external guidance as we navigate our careers. We deserve full credit for our successes, but we must also take full responsibility for our failures.
Long before the rise and fall of Elizabeth Holmes, however, philosophers and ethicists were contemplating the implications of Heraclitus’ dictum. Plutarch of Chaeronea, for one, knew this principle well and wrote at length about the fundamental importance of character, especially for people in positions of power. He composed treatises on leadership, but his most ambitious project was the Parallel Lives, a lengthy series of biographies of Greek and Roman leaders that demonstrate how their virtues and vices affected their political and military careers.
For Plutarch, good character was fundamental to becoming an authentic leader. In his essay To an Uneducated Leader, he laments that most people who aspire to positions of power fail to realize that they must prepare themselves ethically. “And so,” he writes, “they imitate the unskilled sculptors who believe that their colossal statues appear great and strong when they fashion their figures with a mighty stride, a straining body, and a gaping mouth.” By emphasizing appearance over character, such leaders fool everyone, including themselves, into thinking they are the real thing because they “speak with a low-pitched voice, cast a harsh gaze, affect a cantankerous manner, and hold themselves aloof in their daily lives.” In fact, such leaders are just like the statues, “which on the exterior possess a heroic and divine facade but inside are filled with earth and stone and lead.” Plutarch is imagining statues made of bronze, which were cast over a clay core that remained inside. The statues, at least, could rely on this internal weight to keep them upright, while uneducated leaders “are frequently tripped up and toppled over by their innate foolishness, because they establish their lofty power upon a pedestal that has not been leveled, and so it cannot stand upright.” That pedestal, in Plutarch’s view, is character, and so a leader who forgoes ethical development is destined to fail.
Plutarch believed he could show that character was destiny by examining historical examples. In his biography of the Athenian orator and politician Demosthenes, for example, he presents an inauthentic leader who is publicly exposed as hollow. Demosthenes modeled himself on Pericles, an Athenian leader of an earlier generation who in both ancient and modern times has been portrayed in ideal terms. Demosthenes was selective in following his model, however, imitating only his style of speaking, his public demeanor, and his habit of getting involved in only the most important matters, “as though Pericles had become great from these practices alone” (Dem. 9.2). Now Demosthenes did indeed become a great speaker, and he used his oratorical prowess to organize resistance to Philip of Macedon, whose military might posed an existential threat to the independent Greek cities. He talked his way into a leadership position, but when the united Greek armies met Philip’s forces in battle, Demosthenes could not live up to the image he had created. “To this point he had been a brave man,” Plutarch explains. “In the battle, however, he did nothing that was honorable or that corresponded with his words, but he abandoned the formation, running away most shamefully after casting off his arms” (Dem. 20.2). Throwing away one’s arms, especially the heavy shield, was the proverbial sign of cowardice in Greek warfare. Thus, in this single act, Plutarch found all the proof he needed of Demosthenes’ deficiency in character.
The modern story of Elizabeth Holmes is one that Plutarch would surely have recognized. ABC News in particular has focused on Holmes’ efforts to shape her public persona and so to conceal the clay inside. When the company was new, the young entrepreneur had no shortage of admirers. “Don’t worry about the future. We’re in good hands,” declares Bill Clinton in the podcast’s first episode. He is followed by an exuberant newscaster who compares Theranos to Amazon, Intel, Microsoft, and Apple, before gushing, “It could be that huge.” But Holmes was not who she pretended to be. In order to make her company more like Apple, she hired away Apple’s employees. And then she went a step further, donning a black turtleneck in deliberate imitation of Steve Jobs, “as though Jobs had become great by wearing the turtleneck alone,” Plutarch would have added. The black shirt, it turns out, was a metaphor for the black box that was supposed to be testing blood but never really had the right stuff inside. In ABC’s version of the story, neither Holmes nor the Edison was ever more than a shell.
In business and in politics, then, philosophers and reporters tell us that no one can hide deficiencies in character forever. “It is, of course, impossible for vices to go unnoticed when people hold positions of power,” Plutarch writes in To an Uneducated Leader (7). Then he adds this example: “When jars are empty you cannot distinguish between those that are intact and those that are damaged, but once you fill them, then the leaks appear.” So how do we avoid giving our money to an Elizabeth Holmes, or putting a Demosthenes in charge of our government, only to find out too late that they are not up to the challenge? The answer for jars is to fill them with water and check for leaks before we use them to store expensive wine or oil. Just so, Plutarch, and before him, Heraclitus, would surely have suggested that we ought not give millions of dollars to a first-time entrepreneur, or place an untested politician in high office. In those situations, their character may be their own, but their destiny is ours.
|
8f8e06dcc0c7acf24636cbd0223b2015 | https://historynewsnetwork.org/article/172078 | Edmund Morris, known for his biography of Reagan, dies at 78 | Edmund Morris, known for his biography of Reagan, dies at 78
Presidential biographer Edmund Morris, best known for writing a book about the life of Ronald Reagan, has died. He was 78.
Morris died Friday in a hospital in Danbury, Connecticut, a day after suffering a stroke, his wife, Sylvia Jukes Morris, told The Associated Press on Monday.
“We at Random House mourn this loss with all who knew him and loved him, and with those who read his remarkable books. Our deepest sympathies are with his beloved wife Sylvia,” read a statement from Andy Ward, Morris’ editor.
Morris was a polished prose stylist whose career took off with the success of his first book, “The Rise of Theodore Roosevelt,” which won the Pulitzer Prize in 1980. But what cemented his legacy was “Dutch: A Memoir of Ronald Reagan.”
Aides to Reagan, who took office in 1981, thought Morris an ideal candidate for a book on him. Morris received a seven-figure contract from Random House and access most historians would only dream of: ongoing time with a sitting president, from meetings to private interviews, including with Reagan’s family.
“He had the guts to let somebody come in from outside, stare at him, read his mail, go off and talk to his children,” Morris said in 1991. “Whatever you say about Ron Reagan, he has guts.”
But as Morris began work on the book, he realized that Reagan himself was a puzzle — an amiable man unknowable even to those closest to him.
|
fb926980af59f424cb7bcbfe553db47b | https://historynewsnetwork.org/article/172152 | Photos From the 1986 Chernobyl Disaster | Photos From the 1986 Chernobyl Disaster
"As the HBO miniseries Chernobyl comes to a conclusion tonight, viewers will have been taken on a dramatic trip back to 1986, experiencing the horror and dread unleashed by the world’s worst-ever civil nuclear disaster. Thirty-three years ago, on April 26, 1986, a series of explosions destroyed Chernobyl’s reactor No. 4, and several hundred staff and firefighters tackled a blaze that burned for 10 days and sent a plume of radiation around the world. More than 50 reactor and emergency workers were killed in the immediate aftermath."
|
483456b4d5875e1451810e3654c97261 | https://historynewsnetwork.org/article/172330 | How the Debate Over the Use of the Term ‘Concentration Camp’ was Amicably Resolved in 1998 | How the Debate Over the Use of the Term ‘Concentration Camp’ was Amicably Resolved in 1998
When on June 18th, the Jewish Community Relations Council of New York (known locally as the JCRC) addressed an open letter of complaint to Rep. Alexandria Ocasio-Cortez for calling migrant detention centers “concentration camps,” the JCRC was reflecting how emotionally charged this term is for Jews. In subsequent statements, Ocasio-Cortez made it clear that she was not drawing an analogy to Nazi-era death camps. The JCRC’s letter compounded what might be considered community-relations malpractice by patronizingly offering “to arrange a visit to a concentration camp, a local Holocaust museum, hear the stories of local survivors, or participate in other educational opportunities in the hopes of better understanding the horrors of the Holocaust.”
But lost in the controversy was a resolution of a parallel dispute in 1998 that redounded to the credit of all concerned. At that time, Japanese-American organizers were preparing a museum exhibit at Ellis Island entitled “America’s Concentration Camps: Remembering the Japanese American Experience,” on the forced relocation and imprisonment of Japanese Americans by the United States government during World War II. Instead of criticizing the exhibit’s curators, the American Jewish Committee (AJC) conferred with them and amicably arrived at an arrangement that satisfied both understandable Jewish sensibilities regarding the memory of the Holocaust and the right of other Americans to commemorate the injustice they endured during those very same years. This was explained in their joint press release:
An exhibit—entitled America’s Concentration Camps: Remembering the Japanese American Experience—chronicling the shameful treatment of Japanese Americans during World War II, will soon open at the Ellis Island Immigration Museum. Thousands have already seen the exhibit, which was created by and, in 1994, shown at the Japanese American National Museum in Los Angeles. Today, our sights are trained on the importance of such an exhibit in teaching about episodes of intolerance. We strongly urge all who have the opportunity to see the exhibit to do so and to learn its critical lessons.
A recent meeting between Japanese American and American Jewish leaders in the American Jewish Committee’s New York City offices led to an agreement that the exhibit’s written materials and publicity include the following explanatory text:
“A concentration camp is a place where people are imprisoned not because of any crimes they have committed, but simply because of who they are. Although many groups have been singled out for such persecution throughout history, the term ‘concentration camp’ was first used at the turn of the century in the Spanish-American and Boer Wars.
“During World War II, America’s concentration camps were clearly distinguishable from Nazi Germany’s. Nazi camps were places of torture, barbarous medical experiments, and summary executions; some were extermination centers with gas chambers. Six million Jews were slaughtered in the Holocaust. Many others, including Gypsies, Poles, homosexuals, and political dissidents were also victims of the Nazi concentration camps.
“In recent years, concentration camps have existed in the former Soviet Union, Cambodia, and Bosnia.
“Despite differences, all had one thing in common: the people in power removed a minority group from the general population and the rest of society let it happen.”
The meeting and the agreement about the text also reinforced the close and constructive relationship that has long existed between the Japanese American and American Jewish communities. Jewish community groups, especially the American Jewish Committee, were among the first and most vocal outside the Japanese American community calling for the U.S. government to offer an apology and monetary redress for its treatment of Japanese Americans during World War II.
In 1988, Congress and President Reagan passed legislation that formally granted the redress and apology to Japanese Americans who were incarcerated. Both communities have been among America’s leading voices advocating for strong civil rights, anti-discrimination and hate crimes laws. The meeting’s participants were encouraged to continue the work of preserving the memories of our communities’ experiences and helping others learn from them.
The exhibit represents a precious opportunity for those who must tell its story—Japanese Americans and other victims of tragic intolerance—and for those who must hear it. The story is one of betrayal; betrayal of Japanese Americans, who were deprived of protections that all Americans deserve; and betrayal of the American soul, which is defined by its unique commitment to human rights. The best insurance that we will never again commit such acts of betrayal is to use history of this sort as an object lesson for Americans today and in the future.
We know that today’s iteration of this dispute over terminology and history is political in ways that the 1998 episode was not, as exemplified by partisan brawling on the meaning and motives behind Ocasio-Cortez’s words. Still, it’s good to know that communities and individuals can come to an accord over such a sensitive matter when they exercise prudent judgment.
|
fd4409c2ab1cc5b591863eeead6dd970 | https://historynewsnetwork.org/article/172406 | Adding a Citizenship Question to the Census Will Return It to Its Racist Origins | Adding a Citizenship Question to the Census Will Return It to Its Racist Origins
For most Americans, the census is something we rarely think about, for good reason. It comes around only once every ten years, and then it’s gone again. Beyond a few stories on the changing demographics of the country, and possible changes in the number of representatives any given state is allotted, the census usually sits in the back of people’s minds.
That, however, changed with the arrival of the Trump administration. Seemingly out of nowhere, it was announced that the 2020 Census would be modified in a simultaneously subtle and monumental fashion. Wilbur Ross, the Secretary of Commerce, whose department oversees the Census Bureau, announced in 2018 that the next census would include a question asking for respondents’ citizenship status. Almost immediately the proposed question was met with a bevy of criticism, support, and lawsuits.
Those lawsuits have culminated this week in the Supreme Court handing down a decision in the citizenship census case. The Court essentially held that the government’s reasons for adding the question were inadequate at best, and lies at worst. However, the question is not yet settled. The Court explicitly left the door open for the government to come back with better reasons. It should be remembered that the Court has shown it is more than willing to look past the Trump administration’s stated reasons for the actions it takes, and to fabricate constitutionally legitimate reasons in order to uphold government actions. Trump has now stated he wants to delay the 2020 Census until the Court reconsiders (and submits to) this administration’s demand for a citizenship question.
What may seem to some a fairly innocuous question could actually have quite drastic consequences for what our country looks like for the next ten years. By adding the question, the administration hopes to scare non-citizens into not filling out the census. The logic is that non-citizens, whatever their immigration status, would be too intimidated to announce themselves to an administration which has made anti-immigrant ideology a central plank of its governance. States with heavy immigrant populations, which also coincidentally happen to be strongholds for the Democratic Party, would likely lose seats in Congress, funding, and a host of other things as their “official” populations shrank. The fear among many immigrants is certainly well founded, considering that the President of the United States opened his campaign with violently racist remarks against immigrants, and his administration has made it a point to manufacture a humanitarian crisis on the southern border.
Many Americans might ask: What’s the big deal? The census is meant to count the number of American citizens there are in the country, right? Wrong.
Nowhere in the Constitution does it say the census should count the number of citizens. Instead, the Constitution goes for a much broader category: “persons.” That was no mistake, either: the inclusion of persons, or people, in the Constitution was the result of a deliberate concession to slaveholders made during the Constitutional Convention.
Almost from the outset of the Constitutional Convention, when delegates began debating how representation would be decided for what would become the House of Representatives, a major sectional divide emerged among slaveholding and non-slaveholding members. Simply put, those who enslaved black Americans thought their slaves should count towards representation, while those who did not thought the opposite. Non-slaveholding Northerners reasoned that by counting enslaved people the South would gain a disproportionate amount of power in the new national government. If enslaved people could not formally be a part of the society, then why should they count towards Southern representation?
Eventually, after many debates, much anger, and several threats, the delegates finally decided on a compromise: representation would be based on “adding to the whole Number of free Persons, including those bound to Service for a Term of Years, and excluding Indians not taxed, three-fifths of all other Persons.” The delegates decided on this language quite purposefully. During a time when it was not expected that most people would be citizens, much less would be allowed to be citizens, counting people was the best route towards compromise. It was the only way to count enslaved people, in any fashion, while holding onto some semblance of enslaved people’s debased status under the institution of chattel slavery. Imagine if the Constitution said “three-fifths of all other ‘citizens’” would count towards representation when speaking of enslaved people.
The compromise paid dividends for Southerners. The Three-Fifths Clause allowed Southern slaveholders to possess far more political power in the national government than they should have, compared to their largely free Northern counterparts. From the adoption of the Constitution until the beginning of the Civil War, Southerners would enjoy a padded population number under the census, at the expense of their enslaved population. It wouldn’t take long, either, for the investment to pay off. Thomas Jefferson, for example, would have never been elected president in 1800 had it not been for the Three-Fifths Clause, and the census which helped to actualize it.
Now, over 200 years after the adoption of the Constitution, and over 150 years after Emancipation, conservatives wish to return the census to its racist origins. By adding a citizenship question, the census would once again be used to misappropriate political power in the country. Once deemed the best way to grab political power, the use of the word “people” in the Constitution now poses a threat to those who struggle to hold onto power. The Trump administration is now attempting to do exactly what slaveholders did in the eighteenth century: use the Constitution to inflate the political power of a vocal minority.
|
fcdbbbe1343f03e8a402c3c83df095b7 | https://historynewsnetwork.org/article/172428 | Why Tom Ikeda, a historian of Japanese Internment, is Protesting Trump Over Immigrant Detention | Why Tom Ikeda, a historian of Japanese Internment, is Protesting Trump Over Immigrant Detention
When Tom Ikeda, a Seattle historian, learned that 1,400 migrant children will be detained this summer at Fort Sill in Oklahoma, the symbolism hit home.
Pause on that number. One thousand, four hundred kids. That’s how many students attend Ingraham High School in north Seattle.
Another number: 40,900, the number of migrant children detained in the last year. That's the number of Seattle kids in kindergarten through eighth grade.
Fort Sill was first the site from which the Indian Wars were fought.
Many years later, during World War II, it was where 700 Japanese-Americans were imprisoned, accused of being spies and traitors.
Ikeda traveled to Oklahoma to protest. It was the first time he had taken part in a demonstration. What follows is his interview with KUOW’s Kim Malcolm.
Four years ago, another round of presidential campaigns started. At that point candidate Donald Trump began describing people coming from the southern border as rapists and murderers invading our country.
There were similarities to how people were talking about Japanese during World War II – as potential terrorists and spies.
I remember raising the red flag, “Let’s be careful about this rhetoric. This will lead to bad things.”
At that point, I think people thought it was curious and made those historical connections but pretty much said, “We’re America, this is campaign rhetoric, don’t worry about it.”
And then, as we’ve watched, it’s gotten worse and worse. We’ve seen policies and now these camps, which are so similar to what we experienced as a community.
I work with Densho, and we’re the story keepers for the Japanese American community, in particular for what happened in World War II.
Nine years ago I was in Kona, Hawaii, interviewing this 80-year-old man, and he told me this very painful story. He was about 11, maybe 12 years old during World War II.
The day after Pearl Harbor, his father was picked up by the FBI. He was picked up because he was a community leader; he helped other families fill out paperwork for the Japanese consulate.
The FBI picked him up, this father of 11 children, prominent businessperson in Kona, put him in a military camp first in Hawaii, and then Fort Sill.
The son told me that his father snapped — mentally snapped — in the middle of the day.
His father ran to the fence, and started climbing it, yelling out, “I want to go home.”
The guards came out with guns and started shooting at him. As the shots were fired, the man paused and turned back toward the internees. At that point, one of the guards shot him in the back of his head and killed him.
As his son told me the story, I could see his face transform. You could see as he goes back to being that 11-year-old boy … the pain in his face as he talked about what it meant to him and his family to lose his father at that time.
I carry the memory of that story with me. That also made it very personal for me to go to Fort Sill to honor this man and his family.
|
b2da883d70a9a029155efae1b2787054 | https://historynewsnetwork.org/article/172535 | Angered by This Roosevelt Statue? A Museum Wants Visitors to Weigh In | Angered by This Roosevelt Statue? A Museum Wants Visitors to Weigh In
There’s a quote that takes up its own wall at the American Museum of Natural History’s newest exhibition: “It’s more important to tell the truth about the president — pleasant or unpleasant — than about anyone else.”
The words were written, in fact, by a president: Theodore Roosevelt. A century later, it’s hard to know if Roosevelt expected his words could be used in a context that highlights unpleasant truths of his own.
The exhibition, titled “Addressing the Statue” and opening Tuesday, is the museum’s way of contextualizing a monument of Roosevelt that towers outside its Central Park West entrance. With the president seated high astride a horse, flanked by a Native American man and an African man standing below, people who look at the statue often see a legacy of colonialism and a visually explicit racial hierarchy.
The statue was installed to honor Roosevelt, a staunch conservationist whose ties to the Natural History museum trace back to his father, a founding member of the institution. But Roosevelt’s own racist views, including statements about Native Americans and Africans, complicate the monument’s implications even further.
With the national conversation about monuments and who we choose to honor reaching a fever pitch, the “Equestrian Statue of Theodore Roosevelt” was one of four controversial memorials in New York that a city commission reconsidered in 2017. The commission was split, and the city decided to leave the statue up and to add context. The resulting exhibition is not permanent, but the museum is looking at ways to incorporate parts of it in other areas of the institution.
|
205dfdea29323d06e6fc050ac596734c | https://historynewsnetwork.org/article/172611 | Dear Moderators of the Presidential Debates: How About Raising the Issue of How to Avert Nuclear War? | Dear Moderators of the Presidential Debates: How About Raising the Issue of How to Avert Nuclear War?
You mass media folks lead busy lives, I’m sure. But you must have heard something about nuclear weapons―those supremely destructive devices that, along with climate change, threaten the continued existence of the human race.
Yes, thanks to popular protest and carefully-crafted arms control and disarmament agreements, there has been some progress in limiting the number of these weapons and averting a nuclear holocaust. Even so, that progress has been rapidly unraveling in recent months, leading to a new nuclear arms race and revived talk of nuclear war.
Do I exaggerate? Consider the following.
In May 2018, the Trump administration unilaterally withdrew from the laboriously-constructed Iran nuclear agreement that had closed off the possibility of that nation developing nuclear weapons. This U.S. treaty pullout was followed by the imposition of heavy U.S. economic sanctions on Iran, as well as by thinly-veiled threats by Trump to use nuclear weapons to destroy that country. Irate at these moves, the Iranian government recently retaliated by exceeding the limits set by the shattered agreement on its uranium stockpile and uranium enrichment.
At the beginning of February 2019, the Trump administration announced that, in August, the U.S. government would withdraw from the Reagan-era Intermediate-Range Nuclear Forces (INF) Treaty―the historic agreement that had banned U.S. and Russian ground-launched intermediate-range missiles―and would proceed to develop such weapons. On the following day, Russian President Vladimir Putin declared that, in response, his government was suspending its observance of the treaty and would build the kinds of nuclear missiles that the INF treaty had outlawed.
The next nuclear disarmament agreement on the chopping block appears to be the 2010 New START Treaty, which reduces U.S. and Russian deployed strategic nuclear warheads to 1,550 each, limits U.S. and Russian nuclear delivery vehicles, and provides for extensive inspection. According to John Bolton, Trump’s national security advisor, this fundamentally flawed treaty, scheduled to expire in February 2021, is “unlikely” to be extended. To preserve such an agreement, he argued, would amount to “malpractice.” If the treaty is allowed to expire, it would be the first time since 1972 that there would be no nuclear arms control agreement between Russia and the United States.
One other key international agreement, which President Clinton signed―but, thanks to Republican opposition, the U.S. Senate has never ratified―is the Comprehensive Test Ban Treaty (CTBT). Adopted with great fanfare in 1996 and backed by nearly all the world’s nations, the CTBT bans nuclear weapons testing, a practice which has long served as a prerequisite for developing or upgrading nuclear arsenals. Today, Bolton is reportedly pressing for the treaty to be removed from Senate consideration and “unsigned,” as a possible prelude to U.S. resumption of nuclear testing.
Nor, dear moderators, does it seem likely that any new agreements will replace the old ones. The U.S. State Department’s Office of Strategic Stability and Deterrence Affairs, which handles U.S. arms control ventures, has been whittled down during the Trump years from 14 staff members to four. As a result, a former staffer reported, the State Department is no longer “equipped” to pursue arms control negotiations. Coincidentally, the U.S. and Russian governments, which possess approximately 93 percent of the world’s nearly 14,000 nuclear warheads, have abandoned negotiations over controlling or eliminating them for the first time since the 1950s.
Instead of honoring the commitment, under Article VI of the 1968 nuclear Nonproliferation Treaty, to pursue negotiations for “cessation of the nuclear arms race” and for “nuclear disarmament,” all nine nuclear powers are today modernizing their nuclear weapons production facilities and adding new, improved types of nuclear weapons to their arsenals. Over the next 30 years, this nuclear buildup will cost the United States alone an estimated $1,700,000,000,000―at least if it is not obliterated first in a nuclear holocaust.
Will the United States and other nations survive these escalating preparations for nuclear war? That question might seem overwrought, dear moderators, but, in fact, the U.S. government and others are increasing the role that nuclear weapons play in their “national security” policies. Trump’s glib threats of nuclear war against North Korea and Iran are paralleled by new administration plans to develop a low-yield ballistic missile, which arms control advocates fear will lower the threshold for nuclear war.
Confirming the new interest in nuclear warfare, the U.S. Joint Chiefs of Staff, in June 2019, posted a planning document on the Pentagon’s website with a more upbeat appraisal of nuclear war-fighting than seen for many years. Declaring that “using nuclear weapons could create conditions for decisive results and the restoration of strategic stability,” the document approvingly quoted Herman Kahn, the Cold War nuclear theorist who had argued for “winnable” nuclear wars and had provided an inspiration for Stanley Kubrick’s satirical film, Dr. Strangelove.
Of course, most Americans are not pining for this kind of approach to nuclear weapons. Indeed, a May 2019 opinion poll by the Center for International and Security Studies at the University of Maryland found that two-thirds of U.S. respondents favored remaining within the INF Treaty, 80 percent wanted to extend the New START Treaty, about 60 percent supported “phasing out” U.S. intercontinental ballistic missiles, and 75 percent backed legislation requiring congressional approval before the president could order a nuclear attack.
Therefore, when it comes to presidential debates, dear moderators, don’t you―as stand-ins for the American people―think it might be worthwhile to ask the candidates some questions about U.S. preparations for nuclear war and how best to avert a global catastrophe of unprecedented magnitude?
I think these issues are important. Don’t you?
|
58a47d825a1d4cf34af0240eb81dacc2 | https://historynewsnetwork.org/article/172675 | Who Owns Theodore Roosevelt? | Who Owns Theodore Roosevelt?
Theodore Roosevelt died a century ago in January, but his political legacy remains up for grabs — today, perhaps, more than ever. Everybody wants him on their team. Senator Elizabeth Warren, a Democrat, has repeatedly cited Roosevelt as her favorite president and her “dream” running mate because, like her, he pushed a progressive agenda that included taking on industrial trusts. “Man, I’d like to have that guy at my side,” she told Ari Melber of MSNBC in March.
Roosevelt was, of course, a Republican, at least until 1912, when he ran as a third-party candidate. Still, many in that party continue to claim him as one of their own — even as a forebear of President Trump. “I think the United States once again has a president whose vision, energy, and can-do spirit is reminiscent of President Teddy Roosevelt,” Vice President Mike Pence said in 2017.
At a gathering earlier this month dedicated to building a nationalist conservative movement — called, appropriately, the National Conservatism Conference — Senator Josh Hawley of Missouri approvingly cited Roosevelt in a speech about the rising threat of “cosmopolitanism.” Mr. Hawley, one of the many young Republicans jockeying to take over the party once Mr. Trump leaves office, knows what he’s talking about: He wrote a well-received book about Roosevelt’s political philosophy.
Like a handful of other figures in American history — Washington, Lincoln, King — Roosevelt inspires admiration across the political spectrum, in part because his own politics are so hard to place. Through his career and his voluminous writings, he can appear as a reformer, a nativist, an imperialist, a trustbuster, a conservative and a progressive — often at the same time.
And it makes sense that Roosevelt is in such demand. He came to prominence in the late 19th century, a time marked by many of the same challenges we face today. Immigration was reshaping the population. Technology and globalization were tearing down old industries and building new ones. Corporate power was at an apogee. The Republican Party was at war with itself.
|
523fdef97af218b623db9d7960d4684a | https://historynewsnetwork.org/article/172695 | The Professor Who Was Ostracized for Claiming the Civil War Was About Slavery – In 1911 | The Professor Who Was Ostracized for Claiming the Civil War Was About Slavery – In 1911
The Battle of Williamsburg, Kurz and Allison
Sometimes when we’re poking around in an archive, we come across century-old documents that are strangely relevant. That’s what the story of Enoch Marvin Banks became to me. An aging letter from 1911 that I found in the Columbia University archive revealed a story that could be in today’s headlines: people in the Jim Crow South tried to capture the memory of the Civil War for political gain.
My main research involves Progressive-Era economic thought, and John Bates Clark was one of America’s foremost economists. Sifting through his papers, I came across the usual letters of economic theories and perspectives, but then something unexpected: A long letter from Enoch Marvin Banks dated April 2, 1911 (the quotes below come from this letter). Banks was a professor of history and political economy at the University of Florida, and he seemed distressed. He was “being violently assailed”, evidently over an article he’d written. I didn’t have the article at the time, but I could understand its context from the hints Banks gave. Basically, Banks had committed the crime of blaming the Civil War on slavery. Southern leaders, he stated, had made “a grievous mistake in failing to formulate plans for the gradual elimination of slavery from our States.” In his view, wise leadership would have ended slavery slowly, kept the union intact, and avoided the catastrophe of civil war.
With a Google search, I later found the article in question, “A Semi-Centennial View of the Civil War” in The Independent (Feb. 1911). Upon reading it, I discovered that Banks was even more explicit in print: “The fundamental cause of secession and the Civil War, acting as it did through a long series of years, was the institution of Negro slavery” (p. 300). Banks didn’t stop there. He attacked the South’s leadership as well, praising Abraham Lincoln and criticizing Jefferson Davis as a statesman of “distinctly inferior order” (303). Such views were incendiary in the Jim Crow South, and the cause of Banks’ distress.
Banks’ views touched off a firestorm in his native South (he was born in Georgia and spent most of his life in the South). Confederate veterans’ groups responded with widespread criticism. Banks included a clipping from the United Confederate Veterans Robert E. Lee Camp No. 58 in his letter to Clark. The clipping took the University of Florida to task for having a staff member who sought to “discredit the South’s intelligence and to elevate the North and to falsify history.” “Shall such a man continue in a position as teacher where he will sow the seeds of untruth and make true history a falsifier?” they asked. The veterans demanded Banks be removed from the university and replaced with “a man who will teach history as it is and not mislead and poison the minds of the rising generation.”
As Banks told Clark, he simply couldn’t stand the controversy and pressure. He obliged these demands by resigning from the university and retreating back to Georgia. He died only a few months later. Some suspected that the strain of the ordeal worsened his already frail health and led to his eventual death.
This moment reflected the ongoing battle over the legacy of the Civil War and the ideology of the Jim Crow South. As Banks wrote his article, the South was building and codifying its system of racial segregation. Part of this project involved capturing the war’s historical memory. Confederate leaders had to be presented as noble warriors fighting for a lost cause. Jefferson Davis, who was attacked then and now for incompetence, was “one of the noblest men the South ever produced,” according to the Confederate veterans’ group. That’s why they blamed Banks for distorting history, as he challenged the history that was being constructed. As Fred Arthur Bailey wrote in one of the few articles dedicated to this affair: “This tragic incident was but a small part of a large, successful campaign for mind control. Self-serving, pro-Confederate historical interpretations accomplished their purposes” (17). I can’t help agreeing with Bailey’s conclusion.
This ordeal seems to me a perfect example of how history becomes a battlefield. It’s no secret that the historical memory of the Civil War became contentious almost as soon as the war ended. In a world where debates about Confederate statues and flags frequently make headlines, I can only conclude that the battle is very far from over.
|
7aa28fe841dabbbd8c11f2590867d103 | https://historynewsnetwork.org/article/172711 | Trump’s Tariff War Resembles the Confederacy’s Failed Trade Policies | Trump’s Tariff War Resembles the Confederacy’s Failed Trade Policies
Current efforts by the United States to put tariff pressures on China resemble the Confederacy’s efforts to pressure Great Britain during the American Civil War. In the early 1860s the Confederate leaders’ strategy backfired, damaging the southern economy and weakening the South’s military. Recent developments in the tariff fight with China suggest that President Trump’s strategy could backfire as well. America’s tariff negotiators should consider lessons from the record of Confederate missteps.
In the Confederates’ dealings with Britain and the Trump administration’s tensions with China, booming economies gave advocates of a tough negotiating stance exaggerated feelings of diplomatic influence. Southerners thought the robust cotton trade provided a powerful weapon in their efforts to secure official recognition from the British. President Trump expresses confidence that America’s flourishing economy can withstand a few temporary setbacks in order to win major trade concessions from the Chinese. In both cases, leaders failed to recognize that their gamble had considerable potential for failure.
During the 1850s, southern cotton planters enjoyed flush times. Sales of cotton to English textile mills brought huge profits to the owners of cotton plantations. “Our cotton is . . . the tremendous lever by which we can work out destiny,” predicted the Confederate Vice President, Alexander Stephens. Southerners thought English textile factories would have to shut down if they lost access to cotton produced in the United States and that closures would leave thousands of workers unemployed. To the Confederates, it seemed that the English would have no choice but to negotiate with them. Britain needed “King Cotton.” The Confederate government did not officially sponsor a cotton embargo, but the southern public backed it enthusiastically.
Presently, a booming economy has put the Trump administration in a supremely confident mood, too. President Trump’s advisers and negotiators on tariff issues, Peter Navarro and Robert Lighthizer, hope China will buckle under American pressure. They expect tariffs on China’s exports will force a good deal for the United States. President Trump encouraged the tariff fight, asserting trade wars are “easy to win.”
Economic pressures hurt the British in the 1860s and the Chinese recently, but in both situations coercive measures led to unanticipated consequences. During the American Civil War, some textile factories closed in Britain or cut production, yet British textile manufacturers eventually found new sources of cotton in India, Egypt, and Brazil. Now the Chinese are tapping new sources to replace lost trade with the United States. China is buying soybeans from Brazil and Argentina, purchasing beef from Australia and New Zealand, and expanding commercial relationships with Canada, Japan, and Europe.
The failed strategy of embargoing cotton represents one of the great miscalculations of the South’s wartime government. If the Confederacy had continued selling cotton to the English during the early part of the Civil War – before the Union navy had enough warships to blockade southern ports – it could have acquired precious revenue to purchase weapons of war. The absence of that revenue contributed to a wartime financial crisis. Inflation spiked. Economic hardship damaged morale on the home front. Many Confederate soldiers deserted after receiving desperate letters from their wives. Fortunately for African Americans, ill-conceived Confederate diplomacy speeded the demise of slavery.
Many economists now blame President Trump’s trade fights with China and several other nations for volatility in stock markets. They attribute a recent global slowdown in commerce largely to President Trump’s protectionist policies. More troublesome, though, may be the long-term consequences of the administration’s policy. Much like the South’s foolish cotton embargo, America’s tariff war is forcing the Chinese to seek commercial ties with other countries. China appears to be moving away from close relationships with American business.
That shift could prove costly. American prosperity in recent decades owes much to commerce with China and the eagerness of Chinese investors to buy American stocks and bonds, including U.S. government debt. If the present conflict over tariffs leads to reduced Chinese involvement in American trade, the Trump administration’s risky strategy may be a reiteration of the Confederates’ foolish gamble on the diplomatic clout of King Cotton.
|
774b7c2a81ca915b674ccdb2401b5352 | https://historynewsnetwork.org/article/172743 | The Most Dangerous American Idea | The Most Dangerous American Idea
Last week, the historian Timothy Naftali revealed a 1971 conversation between Richard Nixon, then the president of the United States, and Ronald Reagan, then the governor of California, in which Reagan referred to African United Nations delegates as “monkeys” who are “still uncomfortable wearing shoes.” Reagan was expressing anger over those African nations that voted to recognize the People’s Republic of China as the legitimate government of China, rather than Taiwan, which had held the seat since the UN was founded, in 1945.
The bald racism of the remarks makes it hard to look beyond the words themselves and focus on the worldview they expressed. Reagan and Nixon were declaring their belief that the African delegates were rendered unfit for participation in world affairs by virtue of their ethnic background, a perspective that inevitably reflects on the rights of black people in the United States. No belief in American history has been more threatening to democracy, or consumed more American lives, than the certainty that only white people are fit for self-government, and the corresponding determination to exclude other citizens from the polity. A man acting on that belief last weekend drove 600 miles from Dallas to El Paso, Texas, to kill 20 people, in the name of stopping an “invasion” of Texas by the people who have lived in Texas since before there was a Texas.
It’s also a belief that continues to shape American politics, one held by the current occupant of the White House. President Donald Trump’s racial conception of American citizenship, and his denigration of nonwhite immigrants from “shithole countries,” is but an extension of Reagan’s and Nixon’s logic. It forms the moral core of much of Fox News’s programming, which warns white Americans day in and day out that their country is being stolen by minorities. And it justifies the long-term efforts of Republican elites to rig democracy to their advantage, by limiting or diluting the political power of nonwhite Americans through gerrymandering and disenfranchisement. The significance of Reagan’s and Nixon’s remarks is not simply what they said. It is that their conversation shows that, although Trumpism is in some ways unique, what Americans confront today is not nearly so alien to their history and culture as many might assume.
When the former abolitionist Horace Greeley turned against Reconstruction, he nearly took the whole country with him.
The powerful owner of the New-York Tribune was once a reliable Republican partisan. But in May 1871, an anonymous correspondent in the Tribune attacked the Reconstruction government in South Carolina as emblematic of other Republican-controlled state legislatures throughout the South, a place where black Americans, “a class just released from slavery, and incompetent, without guidance, to exercise the simplest duties of citizenship,” had become “the governing class in South Carolina, and a class more totally unfit to govern does not exist upon the face of the earth.”
Those words reflected Greeley’s own views, according to the historian Eric Foner, who writes in Reconstruction that Greeley saw black people as an “easy, worthless race, taking no thought for the morrow.”
Although he expressed them as an attack on the governing abilities of the freedmen, Greeley’s true objections were ideological, as the historian Heather Cox Richardson writes in The Death of Reconstruction. After the Civil War and emancipation, black Americans sought to enjoy their newfound liberty. They wanted to run their own businesses, they wanted to establish schools for their children, and they wanted to tend their own land and manage their own fates.
But white elites still held economic power in Southern states, even if their political power had been diminished by the enfranchisement of the freedmen. For the masses of the freedmen to become more than a captive—if nominally free—labor force for white employers would require government intervention. And for wealthy, conservative Republicans like Greeley, and the white-supremacist Democrats who had lost power in the South, that kind of state intervention on behalf of workers and the poor was antithetical to the American system of government.
|
f60d4668433cf0ab5a4e1b7bc17fd270 | https://historynewsnetwork.org/article/172752 | Why Donald Trump is just following in Ronald Reagan’s footsteps on race | Why Donald Trump is just following in Ronald Reagan’s footsteps on race
Since his campaign, President Trump has pushed race to the center of American politics. It started with his stance on immigration and intensified with his response to the violence in Charlottesville. Over the past few weeks, he has sharpened the focus through attacks on minority members of Congress dubbed “the Squad,” as well as Rep. Elijah E. Cummings (D-Md.) and the city of Baltimore.
It is easy for some to see Trump’s blatant and openly racist statements as an aberration for GOP politics. But the recent disclosure of a phone call between President Richard M. Nixon and California Gov. Ronald Reagan in October 1971 — during which Reagan referred to African leaders as “monkeys” who are “still uncomfortable wearing shoes” — challenges that narrative.
For Reagan, such rhetoric wasn’t an aberration, either, especially when you look at his long record. Along with advisers such as Pat Buchanan, he understood how to use racially coded language, derived from staunch segregationists such as Strom Thurmond and George Wallace and deployed successfully by figures including Nixon to bring Southern voters and working-class urban whites in the Midwest into the Republican Party.
It didn’t stop with such coded language. His policy record on civil rights and racial issues explains why many African Americans continue to view the former president with great disdain. Although he is remembered fondly by the GOP, racist politics played a significant role in Reagan’s political success. The same is true of Trump.
Racial issues were central to Reagan’s political success during the 1966 California gubernatorial election. He denounced the Civil Rights Act of 1964 and the Voting Rights Act of 1965 while running radio ads referring to urban areas as “jungles.” Regarding fair housing, he emphasized: “If an individual wants to discriminate against Negroes or others in selling or renting his house, it is his right to do so.” This resonated well with white conservative suburban voters in places such as Orange County, Calif.
As governor, he singled out African Americans for condemnation, particularly activists such as Angela Davis. He joked publicly about Africans and cannibalism, and he verbally accosted an African American protester in 1968 at the Republican National Convention.
This boosted his national reputation as he became a darling of the conservative movement and crisscrossed the South and white working-class and suburban enclaves all over the country. Nixon understood racial politics as the root of Reagan’s appeal, noting how he played on the “emotional distress of those who fear or resent the Negro, and who expect Reagan somehow to keep him ‘in his place.’ ”
But as scholar Jeremy D. Mayer notes, Reagan’s anti-federalism gave him a “plausible deniability on race” that was “perhaps Reagan’s greatest appeal to many racist whites.” He understood how to frame race in terms of states’ rights rather than the blatant racist rhetoric of segregationists such as Thurmond and Wallace.
The goal, however, was the same: to appeal to white Southerners and other working-class whites, bringing this demographic into the GOP. During Reagan’s effort to win the Republican presidential nomination in 1976, this became apparent when he supported a constitutional amendment to end busing and denounced affirmative action.
|
ae318b89e6e7730f8aa5005172507bfd | https://historynewsnetwork.org/article/172849 | Israel’s 2007 Decision to Attack a Syrian Nuclear Facility is History and Warning | Israel’s 2007 Decision to Attack a Syrian Nuclear Facility is History and Warning
Jerusalem Post editor Yaakov Katz probably had no way of knowing that this would be a perfect time to release his briskly-selling Shadow Strike—Inside Israel’s Secret Mission to Eliminate Syrian Nuclear Power (St. Martin’s Press). Or did he?
The world’s attention is once again focused on the nuclear threat from Iran, generating kaleidoscopic theories about a potential military strike to disable Tehran’s program. Katz’s case study of the run-up and run-down to the Jewish state’s clandestine destruction of Syria’s nuclear attempt in 2007 is now an imperative read. Many will recall that for a long time, Israel refused to confirm that its jets had pre-emptively bombed a mysterious Syrian installation at Al Kibar. Only in 2018 did Jerusalem admit that the installation was an undeclared, North Korean-aided nuclear site. To destroy the site, the Israeli Air Force (IAF) deployed a crack 8-plane squad of F-16s, F-15s and electronic jamming aircraft that hijacked Syrian radar, blinding the country’s defenses.
Katz flexes both his editorial sinews and his prior government connections as a former senior policy advisor to deliver a suspenseful chronicle, bolstered by rapid-fire precision and continuous in-room details. Understandably, this volume will be consulted time and time again by military theorists and diplomatic observers who wonder how another nuclear site destruction might be done—just in case it must be done again.
From the first “you are there” opening scene that details Mossad Chief Meir Dagan’s White House presentation on the Syrian threat, Katz skillfully surrounds each personality in the story with a rich biography and a functioning profile in the time frame. Readers are enveloped in more than the historical facts. They are transported to the tense, unfolding world of personalities, events, clashes, countdowns, and decision-making that resulted in the successful Syrian takedown.
For example, in Chapter 3, when introducing Israeli security cabinet official Rafi Eitan, the author ensures we know Eitan is more than just a security functionary taking notes. We are told that Eitan captured Adolf Eichmann and visited an American reactor when 200 kilograms of highly enriched uranium disappeared and perhaps, who knows, found their way to Israel’s ambiguous nuclear program in Dimona. Eitan also worked ground operations against the PLO in Lebanon and recruited and managed Israel’s infamous spy in the American naval establishment, Jonathan Pollard.
So intense is the detailing of the decision process that the actual bombing of Syria’s reactor is but briefly reported in a few paragraphs, almost as a fait accompli of the narrative.
It might be easy to conflate the 2007 Syrian challenge with the current Iranian crisis. But Syria was only taking preliminary steps toward nuclearization. Iran now has the essentials for a nuclear bomb that can be assembled and deployed within weeks, according to many experts.
Tehran’s endless centrifuge arrays have spun off enough kilograms of 99 percent Highly Enriched Uranium that can be compressed into an unstable and dense core encased in an R-265 Shock Generator configured in a bifurcated sphere lined with 5mm grooves filled with PETN explosive that can be ignited with microsecond precision to create the synchronous implosion that will be absorbed by an exploding bridgewire, sturdy enough to transduce and focus the massive implosion force triggering a neutron initiator to fire one particle into the warhead core to create the atomic chain reaction that will clap forth a murderous mushroom cloud.
Additionally, Iran has developed a fleet of mobile Shahab-3 missiles derived from the North Korean No Dong, each with a nose cone large enough to carry the nuclear warhead. Tehran also possesses the flight guidance and ignition control to detonate such a warhead precisely 550 meters above the ground—mimicking the bombing of Hiroshima—thus unleashing a ferocious nuclear inferno. In fact, Iran recently test-fired such a Shahab-3 as a reminder that it still knows how to pull the trigger.
As an added factor, Iran wields a Russian S-300 missile defense system fully capable of protecting its nuclear program and strike assets. All this just raises the stakes on decision-making and decisions.
When reading Katz’s book, we are reminded that as complex and difficult as the Syrian strike was, any similar action on Iran’s nuclear capability would be infinitely more daunting and riskier.
Katz’s mastery of the facts and his relentless assemblage of puzzle pieces, together with his knowledge of the players and the potentialities, make Shadow Strike a powerful read. The volume also demands that Katz write another. No one knows if such a sequel will chronicle yet another shadow strike upon another nation to the north.
|
4cbbaaf6771fe65b843f5a2f9a08ee88 | https://historynewsnetwork.org/article/172892 | How history textbooks reflect America’s refusal to reckon with slavery | How history textbooks reflect America’s refusal to reckon with slavery
...
“Textbooks are supposed to teach us a common set of facts about who we are as Americans ... and what stories are key to our democracy,” said Alana D. Murray, a Maryland middle-school principal and author of The Development of the Alternative Black Curriculum, 1890-1940: Countering the Master Narrative.
As textbooks show — through omissions, downright errors, and specious interpretations, particularly regarding racial issues — not everyone enjoys the perks of civic belonging or gets a fair shake in historical accounts. This is even true of textbooks used today — 400 years after Africans’ 1619 arrival, more than 150 years after emancipation — with narratives more interested in emphasizing the compassion of enslavers than the cruelty endured by the enslaved.
Textbooks have long remained a battleground in which the humanity and status of black Americans have been contested. Pedagogy has always been preeminently political.
From fast facts to black inferiority: how slavery has been portrayed historically in textbooks
The Hazen textbook framed Jamestown and its role in the development of US slavery as an inevitable matter of labor demand and economic pragmatism, a common argument in US school materials at the turn of the 20th century.
Yet that was just one school of thought. After slavery’s end in this country, many Southern-focused textbooks promoted a Lost Cause approach to Jamestown and slavery writ large, portraying the institution as part of a natural order. White Southerners created ideologically driven narratives that yearned for the Good Ole Days when whites sat atop the hierarchy and African Americans were faithful slaves. In this racist revisionism, they didn’t have to reckon with the new black citizen, voter, or legislator as nominal equals.
|
aecd31d1a755212367a92f6cb2e9fffb | https://historynewsnetwork.org/article/172910 | The Latest G-7 Summit Showcases Trump's Foreign Policy Failures | The Latest G-7 Summit Showcases Trump's Foreign Policy Failures
The just-concluded G-7 summit featured a tour de force of diplomatic dexterity and leadership, as French President Emmanuel Macron strove desperately to bring President Donald Trump back into the real world.
On all the key issues of the day—climate change, Iran, and the trade war with China—Macron used all the Parisian charm he could muster to nudge Trump off the arrogant and amateurish perch from which he has thus far conducted his diplomacy. Macron succeeded in raising the possibility of renewed negotiations with Iran and China, but whether anything comes of this after the unpredictable US President re-crosses the Atlantic is anyone’s guess.
Unless Macron or someone else can succeed in bringing to the surface a thus far deeply internalized sense of realism in Trump, he is well on his way to becoming arguably the biggest foreign policy disaster ever to inhabit the White House. Here are the top three reasons why:
Reason No. 1: Withdrawal of the United States from the Paris Agreement (2015) on climate change.
President Trump’s empty chair at the G-7 summit meeting on climate change spoke volumes. When civil war-battered Syria signed on to the Paris Agreement in November 2017, Trump’s announcement earlier that year of unilateral US withdrawal made the United States the only nation in the world refusing to participate in combating the scientifically verified, potentially devastating, and daily snowballing effects of global warming. This is the single most reckless, damaging, and rudderless action taken by Trump. If he is reelected, the American people will be voting for nothing less than the destruction of life as we know it on the planet Earth.
Reason No. 2: Abandoning arms control treaties with Iran and Russia.
One of the great achievements of the right-wing Reagan presidency was the INF Treaty, signed with Soviet leader Mikhail Gorbachev in December 1987, which eliminated an entire class of intermediate-range missiles in Europe and moreover established rigorous verification procedures on compliance. One of the great achievements of the centrist Obama presidency was the 2015 Iran nuclear treaty, more formally the JCPOA, or Joint Comprehensive Plan of Action, in which Iran agreed to rigorously verifiable limitations on its ability to enrich uranium for bomb-making capacity in return for the lifting of US-sponsored international economic sanctions.
The born-rich real estate tycoon President with zero foreign policy experience summarily terminated both treaties.
Other members of the UN Security Council (China, Russia, Britain, France) as well as the European Union signed off on the treaty with Iran, which is why Macron is trying desperately to get it back on track. To its credit, Iran appears willing to reopen negotiations as well. Thanks to the French president and the Iranians, Trump has a chance to rethink his precipitous and reckless action.
Trump’s termination of the INF Treaty frees his pal Russian President Putin as well as the venerable US military-industrial complex to rev their engines and restart the nuclear arms race, which has been quiescent for decades. Another critical treaty with Russia, the New START Treaty, also denounced by Trump, expires in 2021.
The termination of these arms control treaties ranks near the top of Trump’s foreign policy failures because, well, as the old bumper sticker read, “One nuclear bomb can ruin your whole day.”
Reason No. 3: Recognizing Jerusalem as the “eternal capital” of Israel and the Golan Heights as Israeli territory.
Because of the overweening influence of AIPAC, the American Israel Public Affairs Committee—one of the top two or three lobbies in the country—over the US Congress and the American public, all too many Americans may not find these actions, or the termination of the Iran treaty, objectionable. But anyone with broader knowledge of the Middle East conflict understands that Trump is playing with dynamite.
Jerusalem is a holy city for Christians and Muslims as well as Jews and cannot be dominated by any one if there is ever to be hope of peace. Under the moribund two-state solution, East Jerusalem was to be the capital of a Palestinian state.
As the UN and the international community have repeatedly affirmed, Israel has no legitimate claim to either the Golan Heights or the West Bank, both of which it nonetheless has been settling for decades in blatant violation of international law. Trump is not the first president to bow to AIPAC and Christian fundamentalists on Middle East policy, but he has taken it to a new level by signing off on Israel’s sole occupation of Jerusalem.
Alas, There’s More . . .
The top three above strike me as the most serious Trump foreign policy failures because their consequences can be catastrophic. But there is a litany of failure on the part of this president.
Trump has alienated all of Africa by calling it a continent full of “shithole” countries; he has done nothing to calm the India-Pakistan dispute playing out today in Kashmir, at the risk of escalation between two nuclear-armed powers; he has given little rhetorical support to the pro-democracy movement in Hong Kong while launching a trade war with China; he has anointed a new ruler who can’t actually come to power in Venezuela; and he has inspired neo-fascist movements all over the globe, as men such as Brazilian President Jair Bolsonaro and Philippine president Rodrigo Duterte praise his leadership.
There is still hope that Trump can make a breakthrough with North Korea’s Kim Jong Un, but his diplomacy thus far--if that’s what you call the three highly publicized meetings with Kim--has done nothing but give visibility and legitimacy to a ruthless autocrat. A deal could be secured if the United States and its allies offered a trade along the lines of terminating offshore military maneuvers, which make Kim feel threatened, in return for denuclearization. But for reasons known only to Trump, he has chosen instead to strike up a pointless friendship that legitimates a petty dictator while achieving no tangible results.
Where on this list, some may wonder, is Trump’s indifference to Russian meddling in American elections? Clearly it is true that Russia does meddle--but the fact of the matter is, so do we, all over the world. So, while it is not “fake news,” Russian meddling is also not an existential threat to world peace as the issues discussed above are. Sorry, Democrats, Russia did not decide the election--the Electoral College and too many naïve American voters did that to themselves.
There is still time for Trump to reverse his legacy of diplomatic ineptitude. If he fails to do so he may well go down in history as the most feckless foreign policy president in American history.
|
4629d846a60a24d5441c437ec2093591 | https://historynewsnetwork.org/article/172987 | National Security Archive Publishes New Documents on the First Soviet Nuclear Test Offering New Information on Beginning of Nuclear Arms Race | National Security Archive Publishes New Documents on the First Soviet Nuclear Test Offering New Information on Beginning of Nuclear Arms Race
Seventy years ago, on 9 September 1949, Director of Central Intelligence Admiral Roscoe Hillenkoetter handed President Harry Truman a carefully worded report of “an abnormal radio-active contamination” in the Northern Pacific that greatly exceeded normal levels in the atmosphere. While uncertain as to the cause, the DCI’s first hypothesis was “An atomic explosion on the continent of Asia.” This proved to be accurate – it was the first Soviet test of a nuclear device.
Moscow’s success in building a nuclear bomb was a monumental development made all the more alarming for U.S. strategists by the fact that it occurred one-to-four years sooner than analysts had expected. The White House chose to preempt possible Kremlin triumphalism by announcing the finding to the world on 23 September 1949, a move that evidently came as a shock to the Soviets who had no idea the U.S. had the capability to isolate and identify the signs of a nuclear blast.
Hillenkoetter’s memo, never before published, is at the core of a new posting today by the National Security Archive offering previously classified information and context surrounding the U.S. discovery of the landmark Soviet test. The documents are an update to an earlier Archive compilation and focus on the state of U.S. intelligence about the Soviet nuclear program before and after the test. They help address lingering questions about the unexpected abilities of U.S. nuclear detection technology but also about the disturbing failure to predict the Soviet atomic breakthrough more accurately.
|
353d2efd34bc2aa6920da3b2af92ed10 | https://historynewsnetwork.org/article/173099 | A Family History of the Red Scare | A Family History of the Red Scare
Who gets to decide what it means to be an American? It's a question of some urgency these days, and one that Pulitzer Prize-winning journalist David Maraniss addresses in A Good American Family: My Father and the Red Scare as he recounts the historical experiences of his father, Elliott. The elder Maraniss was a devoted husband, kind father and dedicated newspaperman, whose patriotic bona fides came under fire during the Red Scare of the early 1950s. The book is an interesting addition to Cold War historiography, combining analysis of major features of the Red Scare and a heartfelt family story.
The book opens on March 12, 1952, in room 740 of the Federal Building in Detroit, Michigan, as Elliott appears before the House Committee on Un-American Activities (HUAC), which had come to the Motor City ostensibly to root out alleged Communist subversives in the automobile industry. Elliott refused to “name names” and declined to answer several questions posed by members of the committee. Afterwards, he was fired from his position at the Detroit Times and blacklisted from any meaningful work in journalism for several years. However, this book is not simply an account of a man wrongly accused and punished for his radicalism. Rather, Maraniss attempts to understand and reconcile his memory of his father as a good man and patriotic American with the fact that he was also a member of the American Communist Party (CPUSA) and in many ways an apologist for Joseph Stalin’s tyrannical regime.
The son of Jewish immigrants who fled Odessa, Ukraine, in 1890 to escape the Russian Empire’s rising anti-Semitism, Elliott grew up in Brooklyn where he attended Abraham Lincoln High School. The school’s principal—a fascinating New York University alum who wrote his Ph.D. thesis on Baruch Spinoza—helped stoke Elliott’s idealism by encouraging him to find inspiration in Ralph Waldo Emerson’s “heart-stirring, untraditional, iconoclastic words about initiative, conformity, consistency, truth-telling, prayer, and independence.” After graduating in 1936, Elliott followed a fellow Lincoln High alum, Arthur Miller, to the University of Michigan where he wrote for the school’s paper, the Michigan Daily, and became active in radical politics along with his future wife, Mary Cummins. Maraniss describes his parents as young idealists: “They loved the promise of America, were disoriented by the economic collapse of the U.S. economy during the Depression, were seeking answers to the chaos of the world, and at the same time wanted to believe in a virtuous, peace-seeking, equality-minded Soviet Union.” While he is fulsome in his praise of his parents’ idealism, Maraniss simply cannot understand how they could have also supported the Soviet system: “They thought they were working toward a true and open American democracy even as they were rationalizing the actions of what was in fact a ruthlessly totalitarian foreign power.”
The CPUSA’s craven servility to Stalin’s agenda in the 1930s is well known. American communists ignored, denied, or attempted to justify Stalin’s horrific crimes such as the forced famine in Ukraine and the Great Purges. In foreign policy, the CPUSA adhered to the Comintern-led Popular Front strategy, by which Stalin ordered all communist parties around the world to work within any government to oppose Mussolini and Hitler’s aggression, an approach Elliott enthusiastically supported. When the party abandoned its stated anti-fascist policy and defended the Nazi-Soviet Pact of 1939, many American communists quit the party. Elliott did not. He embraced the new Stalinist line and advocated American neutrality in World War II. He even defended the 1939 Soviet invasion of Finland that crushed the Scandinavian democracy. To his credit, Maraniss does not hide his father’s staggering willingness to justify Stalin’s machinations, and quotes an editorial Elliott co-authored describing the war as a “clash of rival imperialisms.” In a later edition, Elliott added: “This is not a war against fascism, it is not a people’s war and does not attack the vital causes of war.” Trying to explain the inexplicable, Maraniss simply says: “… he was stubborn with ignorance.”
The Japanese attack on Pearl Harbor apparently changed his mind. Two weeks after the attack, Elliott enthusiastically volunteered to serve in the U.S. Army, which Maraniss contends demonstrates his father’s patriotism. Though he concedes Elliott’s commitment to the war effort came only after Hitler launched Operation Barbarossa against the Soviet Union in June 1941, he defends his father’s motivations: “It was not that my father felt more strongly about the Soviet Union than his own country. In all his previous writings, he showed a deep belief in America and the American promise. He was a patriot in his own way.” Unfortunately, he provides no insights into Elliott’s reaction to Barbarossa or the way he rationalized yet another shift in his views on the fascist challenge.
Elliott’s military service got off to a rocky start. The Military Intelligence Division of the War Department investigated him and concluded that he was “communistic” and potentially “disloyal.” (This was hardly a rash assessment. Subsequent revelations from the Venona decryptions and Soviet archives demonstrated the CPUSA’s many ties to Soviet intelligence, a fact Maraniss fails to note.) Elliott did, however, serve with distinction, commanding an African American salvage unit in the segregated army, which suited his commitment to racial equality. Throughout the chapter on World War II, we hear Elliott’s views on the war and on the United States through the many letters he sent home to Mary and their newborn son: “Jimmie [the author’s older brother], you know that I love you and your Mother very much. I wouldn’t be a very good father, nor much of a man, if I didn’t stand up and fight against those Japs and Nazis. That is why I am in the army and that is why I am going far away for a long time. Your mother, who is not only very beautiful but also very brave and very intelligent, understands all this. And I know that you will understand too, Jimmie.”
Returning to the U.S. after the war, Elliott resumed his work for the CPUSA, secretly writing and editing two of the party’s periodicals while working for the Detroit Times, a breach of journalistic ethics. His commitment to the CPUSA was confirmed by Beatrice Baldwin, an FBI informant who provided Elliott’s name to HUAC and landed him in front of the committee in 1952. Maraniss devotes considerable attention to the backgrounds of HUAC’s members, especially Chairman John Stephens Wood (D-GA), a committed segregationist who had flirted with membership in the KKK. Maraniss explores Wood’s shady past not to justify his father’s actions, but to raise the intriguing question about who gets to decide who is American and who is un-American. Although he never provides a clear answer, Maraniss clearly suggests that Wood and the other members of HUAC were hardly representative of American virtues.
Elliott’s testimony (which Maraniss includes in full) was not especially dramatic. The committee peppered him with questions that he often refused to answer, citing his Fifth Amendment rights, a tacit admission of guilt in the eyes of many Americans during the Cold War. He was immediately fired from the Detroit Times and was unable to find work as a journalist for five years, supporting his family by taking whatever odd jobs he could find.
In 1957, as fears of the Red Scare dissipated, Elliott moved his family to Madison, Wisconsin, where he worked as a reporter and editor for the Capital Times for many years, earning the respect and admiration of his colleagues, family, and friends for his work. Maraniss proudly shares the anecdote of Ben Bradlee, the legendary editor of the Washington Post, saying: “There’s Elliott Maraniss, a great editor.” As for his father’s ideology, Maraniss explains, “His politics changed, from radical to classic liberal, but not his values or belief in America—a generous spirit that he had carried with him since his days at Abraham Lincoln High School and that he expressed so powerfully in his letters to my mother during the war.”
Throughout the book, Maraniss struggles to explain just how his father could have been so deeply committed to admirable causes like racial and economic justice, free and open inquiry, and peaceful international relations while simultaneously following the dictates of Stalin. He speculates that loyalty to friends and family, naiveté, and unbridled idealism, could have been factors, but ultimately he finds no satisfactory answer, in large part because his father never explained why he believed and acted as he did. In the end, Maraniss and his readers are left wondering how a man of such vision could have been so blind for so long.
|
5fc2c4adb3a5c255257080ca266e6faf | https://historynewsnetwork.org/article/173149 | What Does History Tell Us About Impeachment and Presidential Scandal? Here's 7 Historians' Analysis. | What Does History Tell Us About Impeachment and Presidential Scandal? Here's 7 Historians' Analysis.
What a week! I—like many of you, I’m sure—have been glued to my phone, constantly hitting refresh on Twitter and the Washington Post’s website.
This morning alone, lawmakers released the whistleblower report at the center of the ongoing controversy over a phone call between President Trump and Ukrainian President Volodymyr Zelensky. Joseph Maguire, the acting intelligence chief, is testifying before the House Intelligence Committee. Trump’s response? “THE DEMOCRATS ARE TRYING TO DESTROY THE REPUBLICAN PARTY AND ALL THAT IT STANDS FOR. STICK TOGETHER, PLAY THEIR GAME, AND FIGHT HARD REPUBLICANS. OUR COUNTRY IS AT STAKE!”
Perhaps most significantly, Speaker of the House Nancy Pelosi announced on Tuesday that she would launch a formal impeachment inquiry. 218 members of Congress now support an impeachment inquiry—a majority of the 435 members of the House of Representatives.
As we wait for the next big development to break, it’s valuable to pause and zoom out to the bigger picture of American history. To help us to do that, students in my “History and the News” class asked several historians to offer their perspective on this week’s events. Some are established historians; some are in graduate school. All provide valuable insights into the historical context of this momentous week.
“I expect a tumultuous next year”
Eladio B. Bobadilla (@e_b_bobadilla), Assistant Professor at the University of Kentucky
Historically, impeachment has been exceedingly rare. Only two presidents have been impeached—Andrew Johnson and Bill Clinton—and neither was convicted.
The two cases were starkly different. In the background of Johnson’s impeachment lay critical debates about Reconstruction, a defining period for millions of newly-free African Americans and their descendants. The case of Clinton, by contrast, revolved around allegations of personal (specifically, sexual) misconduct. What the two did have in common—beyond the failure of the Senate to convict (narrowly in the case of Johnson, less so in the case of Clinton)—was that they proved to be captivating political theater and that they reflected deep divisions in both government and the population.
While historians are hardly equipped to predict the future, I expect a tumultuous next year, especially as we head to an election season that will be defined by impeachment proceedings and political, legal, and social battles.
Impeachment has historically “made it difficult for the president to keep control of the public conversation”
Varsha Venkat (@varsha_venkat_), PhD student at the University of California, Berkeley
An impeachment inquiry is more than just an indictment of the president’s actions; historically, it has made it difficult for the president to keep control of the public conversation. In our current technological age, with its 24-second news cycle, this is probably the most important consequence of the House’s decision to begin an impeachment inquiry. Johnson, Nixon, and Clinton each experienced this loss of control as well. It has also been the case that historically, during the turmoil of an impeachment process, the president has been more receptive to compromise and even restraint. Though President Trump has not revealed this capacity during his time as president so far, one can hope that even the process of daily hearings and public discussion about his removal will lead him to reconsider some of his more problematic decisions. History shows us that impeachment has been a political decision, and it is important to keep in mind the political consequences for both the Democratic and Republican parties during this process, not just the potential consequences for the president.
“The rule of law is legitimately threatened at this moment”
Jennifer Mercieca (@jenmercieca), Associate Professor at Texas A&M University
The rule of law is threatened by right-wing nationalists all over the world. Yesterday we saw institutions in two nations make political decisions to uphold the rule of law. Most people watching the events unfold yesterday in Britain and the US probably didn't notice that both the UK Supreme Court and the US House of Representatives justified their decisions based upon the need to defend the rule of law. That's historic both because the rule of law is legitimately threatened at this moment and because institutions are acting to defend it. I hope, we all hope, that this isn't the moment that future historians will point to as the rule of law's "last stand."
Impeachment “comes with political and historical red flags”
Andrew Hartman (@HartmanAndrew), Professor of History at Illinois State University
Getting impeached would be just deserts for Trump, someone who has always skirted legal boundaries for selfish purposes, a habit that did not end when he became president (in fact, it got worse because the stakes became much higher). But I would qualify this in two ways: 1) There's a great deal of uncertainty as to whether this will work to the benefit of Democrats, since it will embolden Trump's sense of victimhood, which he has played to his advantage for the last four years. Some would argue that political calculation should not factor into impeachment considerations because the law's the law, but this is naive since politics always and especially factors into something as serious as impeachment. Moreover, if people are that concerned with law and order in the White House, there are hardly any presidents who have not deserved impeachment, which leads me to my second qualification. 2) What does it say when our willingness to impeach is correlated with potential political advantage and not with matters of war? Nixon should have been impeached for illegally bombing Cambodia, a much more serious offense than Watergate. Reagan should have been impeached for the CIA illegally mining Nicaraguan harbors under his watch. The list goes on and is not limited to Republican presidents. The immoral and unjustifiable Vietnam War was ultimately a Democratic war for which LBJ deserved impeachment or worse. In short, impeaching Trump would feel good to those of us who find him a disaster of a human being and an embarrassment to the nation. But it comes with political and historical red flags.
P.S. I wrote the above before the whistleblower complaint was made public. It seems more likely than ever that Trump might finally be in trouble. The dam seems to have finally broken. Which is good, and perhaps changes the political calculation (though not the historical calculation).
“The law is whatever those in power make it”
Derek Litvak (@TheTattooedGrad), PhD student at the University of Maryland
“No one is above the law.” When Speaker of the House Nancy Pelosi said this, a question I often ask myself came to mind. What is the law? Studying legal and constitutional history has brought me to the same conclusion on many occasions: The law is whatever those in power make it. And therefore, the Constitution, as the “supreme law of the land,” follows suit. However, as much as history has taught me this lesson, it has also shown me how much those not in power can effect change. From the abolition of slavery, to women’s rights, to marriage equality, the law and our Constitution have always been a battleground. Some battles are won. Many are lost. I remain hesitant to put much stock in Speaker Pelosi’s “impeachment inquiry,” as opposed to formal articles of impeachment. But perhaps Democrats are now willing to at least fight the fight, and stop letting those in power make the law what they wish.
“The sustained grassroots pressure…seems to have finally borne fruit”
David Walsh (@DavidAstinWalsh), PhD Candidate at Princeton University
To me, one of the biggest questions for historians when thinking about the impeachment saga is, "why now?" After all, Trump committed essentially the same set of offenses during the 2016 election and actively obstructed justice in office while attempting to cover it up -- which ultimately culminated in what essentially amounted to an impeachment referral by Special Counsel Robert Mueller to a Democratic House less than six months ago. And the Democrats punted. So what changed? There are two factors: one, this is an *active* attempt to solicit foreign aid by the sitting president for his ongoing re-election bid. The smoking gun, as it were, is still in the president's hand. That makes this scandal distinctive from Watergate, Iran-Contra, and even Trump's criminal behavior in 2016. Two, the fact that congressional Democrats punted on impeachment six months ago has fueled a tremendous amount of anger at the party from its voting base. The sustained grassroots pressure on Democratic incumbents, in the words of Congresswoman Rashida Tlaib, to "impeach the motherfucker," seems to have finally borne fruit.
Trump’s behavior is “exactly what the Founding Fathers worried” about
Rick Shenkman (@rickshenkman), founder and former editor-in-chief of the History News Network
Republicans are, it is safe to assume, going to disparage Speaker Pelosi’s decision to begin a formal impeachment inquiry. But given what we know already about President Trump’s behavior an impeachment inquiry is exactly what the Constitution calls for. At the heart of the latest Trump violation of historical norms is the allegation that he turned to a foreign government for dirt on potentially the strongest opponent he is likely to face in the upcoming 2020 election, according to current polls. This goes Richard Nixon one better. To take out his perceived strongest opponent (Ed Muskie) in the election of 1972 Nixon merely resorted to a series of dirty tricks. Trump, in contrast, has shown a willingness to involve a foreign government in his machinations. This is exactly what the Founding Fathers worried might happen and is the reason they went to such great lengths to insulate the presidency from foreign influence. There is a reason the Founders included in the Constitution the clause forbidding a foreign government from bestowing on any federal officeholder "any present, Emolument, Office, or Title, of any kind whatever” without the approval of Congress. They worried deeply that our politics could be upended by foreign meddlers, as in fact happened almost immediately when French Citizen Genet saw fit to involve himself in our affairs in George Washington’s second term. The present case is a kind of reverse Genet Affair. All accounts suggest in this situation the foreign power in question has wanted nothing to do with this nasty business. Rather, it is the president of the United States who instigated it. That surely compounds the crime.
|
e2b06ce3ffce52d5b48ed46feab6b65c | https://historynewsnetwork.org/article/173167 | How Do We Address a Statue of President Roosevelt That Affirms Racist Hierarchies? | How Do We Address a Statue of President Roosevelt That Affirms Racist Hierarchies?
Almost two years after the 2017 fascist rally at Charlottesville around a mediocre statue of Robert E. Lee, the American Museum of Natural History (AMNH) opened its exhibition Addressing the Statue for an unspecified run. The statue in question is James Earle Fraser’s massive “Theodore Roosevelt Equestrian Memorial,” situated outside the Museum’s main entrance, depicting Roosevelt flanked by his gun carriers, a stereotyped Plains Indian and a generic African.
It wasn’t the museum’s fault that another “replacement theory” white supremacist, like those at Charlottesville, carried out a terrorist attack in El Paso just a few days later. But it wasn’t unconnected either. From hate groups like Identity Evropa to Trump’s evocation of “beautiful” statues, the classicized statue has become a symbol of White supremacy.
In this moment, what does it mean to “address” the Roosevelt statue? In poet Claudia Rankine’s Citizen, the narrator attends a lecture by the philosopher Judith Butler, who is asked why words are hurtful: “Our very being exposes us to the address of another, she answers. We suffer from the condition of being addressable.” The narrator observes: “After considering Butler’s remarks, you begin to understand yourself as rendered hypervisible in the face of such language acts.”
The racist murder of Heather Heyer at Charlottesville made it clear that Confederate statues are such speech acts — and hate speech at that. Their address in public space makes people vulnerable, albeit unevenly and unequally so. In response, cities like New Orleans and Memphis took down at least some of their Confederate statues. Nationwide only some 60 statues out of a total of over 700 were removed.
|
af9838a2e06e8012ac09c59433b86250 | https://historynewsnetwork.org/article/173226 | Incognegro: How Law Enforcement Spies on Black Radical Groups | Incognegro: How Law Enforcement Spies on Black Radical Groups
In the center of the photo - Herb Callender (front, dark suit, Black male) and Ray Wood (sunglasses, bow tie, white suit, Black male)
Attorney General Barr recently gave a speech to the country's largest law-enforcement organization in which he criticized "social justice reformers" and contended there must be "zero tolerance for resisting police". A few days earlier, The Young Turks broke the story that the Federal Bureau of Investigation (FBI) listed "Black Identity Extremists" (BIE) as a top priority in its fight against terrorism. The FBI considered BIE a higher priority than Al Qaeda and White Supremacists. According to the FBI documents they acquired, a program codenamed “IRON FIST” planned to use undercover agents to counter these BIE.
The news reminded some of the FBI's Counterintelligence Program (COINTELPRO) of the 1960's and 70's that targeted Black activism. It is important to understand how government agencies, in particular law enforcement, work to obstruct Black people from organizing. The history of COINTELPRO illuminates why so many Black radical and grassroots organizations were previously destroyed and why it remains so difficult to organize today.
COINTELPRO was created to maintain "the existing social and political order". It sought to "prevent the long range growth of militant Black organizations, prevent such groups from gaining respectability". In practice, this amounted to political repression and flagrant violations of first amendment rights to speech, to peacefully assemble, and to petition the government for redress of grievances.
COINTELPRO often worked in conjunction with and received information from local law enforcement "red squads" such as the Bureau of Special Services (BOSS), a special division of the New York City Police Department (NYPD). Also known as the Bureau of Special Services and Investigations, its job was to monitor and surveil political radicals. These divisions often relied on informants and undercover agents. Ray Wood, a.k.a. Ray Woodall, was one such informant who, as a police officer for BOSS, infiltrated the Bronx chapter of the Congress of Racial Equality (CORE) in the early 1960s. Some historians like Ward Churchill and Susan Brownmiller have written about Wood, but new information provides a clearer picture of how undercover agents contributed to COINTELPRO.
One of the most significant organizations of the Civil Rights movement, CORE was the first of the non-violent direct action groups. It pioneered many of the tactics and techniques that have come to characterize the movement. Bronx CORE was one of the most militant chapters and was best known for its 1963 demonstrations against employment discrimination at White Castle restaurants. At the height of the campaign, more than 1,000 counter protesters carrying Nazi and Confederate flags and symbols faced off against roughly two dozen CORE demonstrators.
During this same period, Wood infiltrated Bronx CORE, successfully gained its members’ trust, and became the chair of its housing committee. Wood was a delegate for Bronx CORE to CORE’s 1964 national convention, giving him the opportunity to meet CORE leaders from all over the country. Locally, Wood was a well-known figure and even dated women from other CORE chapters.
While secretly a police officer, Wood often was involved in campaigns that publicly challenged the police. Wood worked closely with Bronx CORE’s chairman Herb Callendar who launched a campaign against police brutality after the White Castle demonstrations. The NYPD commissioner criticized the campaign and characterized Callendar, along with Malcolm X and rent strike leader Jesse Grey, as the "three most irresponsible Civil Rights leaders in the city".
In 1964, Callendar, Wood and another Bronx CORE member carried out the chapter's most audacious action to date: a "citizen's arrest" of the Mayor of New York City (NYC). All three CORE members were themselves arrested but Callendar received an especially harsh punishment and was placed in the psychiatric wing of Bellevue Hospital. Callendar would later tell CORE's national director James Farmer that the idea to arrest the mayor came from Wood.(1)
Wood suggested illegal acts to radical groups in the hope the organizations would act on his suggestions and provide police the ability to arrest Black radicals. For example, Wood likely planted the idea to blow up the Statue of Liberty. In 1965, Wood was outed as an agent when a story appeared in the New York Times about his role in instigating a plot by another radical group, the Black Liberation Front (BLF), to blow up the Statue of Liberty.(2) This was a plan previously suggested to members of East River CORE. Even though he was not a member, Wood went out of his way to attend East River CORE meetings and to be involved and helpful. As he got to know members, he suggested they get involved in more militant actions like the Statue of Liberty plan and robbing liquor stores to raise funds for East River CORE. Such actions would have done enormous damage to the reputation and public support of the entire national organization, as CORE stressed non-violent action. Unbeknownst to the chapter, Wood was spying on members and eventually reported chapter head Blyden Jackson to the House Un-American Activities Committee as a possible member of the Communist Party.
Further, Wood’s connection to East River CORE, combined with documented connections between members of the BLF and CORE in NYC, suggests Wood used CORE as a stepping-stone into other Black radical circles. Other historians, for example, have discussed how he infiltrated the Revolutionary Action Movement (RAM) in Queens and testified at the Panther 21 trial in the late 1960’s.(3)
By the 1970’s, Wood’s name had faded into obscurity. Historian Garrett Felber’s 2015 Guardian article about Wood reignited interest in him. Felber argued Wood might have been the mysterious second man arrested at the Audubon Ballroom when Malcolm X was assassinated. As part of the research team for Manning Marable's biography of Malcolm X, A Life of Reinvention, Felber found notes from Yuri Kochiyama, a member of CORE and the Organization of Afro-American Unity (OAAU) who witnessed the assassination. In OAAU meeting notes, Kochiyama wrote that "Ray Woods" was "seen running out of (the) Audubon, was one of two picked up by police".
As Felber himself admits, the timing of Wood’s outing in the press (New York Times, February 17, 1965) and the date of Malcolm X's assassination (February 21, 1965) mean it is possible that word had already spread among activists about Wood's true identity. While this makes it less likely he would have been at the Audubon, it does not mean this theory should be completely discounted. His interactions with RAM and the Black Panthers in the late 1960’s suggest other Black activists in the NYC area may not have known he was an undercover police officer.
Regardless of if he was at the Audubon, Wood and other undercover agents like him had a detrimental impact on organizations like CORE. Wood was not just spying on CORE but was on a mission to "misdirect, disrupt, discredit and neutralize" CORE's leadership. He was acting as a provocateur. What resulted from these operations were not just literal assassinations such as the police murder of the Black Panther's Fred Hampton in Chicago, but character assassinations as in the case of CORE's Callendar and Blyden. Such operations demonized Black leaders in the public eye and thereby de-legitimized both the Civil Rights and Black Power movements.
While BOSS maintains its job was not to facilitate, instigate or provoke the committing of illegal actions, as time goes on, mounting evidence suggests that is exactly what such agencies were doing. What remains to be determined is if Wood's activities were unusual or the standard for BOSS and COINTELPRO.
Today, many of the activities of intelligence programs like COINTELPRO have been deemed illegal after the Church Committee hearings and the Handschu agreement. Nevertheless, recent programs like IRON FIST provide a warning that the FBI’s mission is unchanged.
In 2017, historian Robin Spencer wrote that the FBI’s labeling of current activists as 'Black Identity Extremists' was the latest version of COINTELPRO. While BIEs are described as individuals who "use force or violence in violation of criminal law in response to perceived racism and injustice in American society", they are also more vaguely defined as those interested in "establishing a separate black homeland or autonomous black social institutions, communities or governing organizations within the USA". This broad definition could potentially include rappers such as Jay-Z, Black Christian church groups, the Nation of Islam, and the Black Lives Matter (BLM) movement.
The Black Lives Matter movement has characterized itself through non-violent direct action. The Guardian, however, ran another story in 2017 detailing how law enforcement used social media to keep BLM activists under surveillance and infiltrate their groups. These techniques fuel distrust within groups and make it much harder to organize. Creating such an atmosphere of suspicion and paranoia was the goal of programs like COINTELPRO and agents like Ray Wood.
This distrust also affects scholars as it complicates efforts to research the history of the Civil Rights and Black Power movements. Some older activists of the 1960's and 1970's can be distrustful and refuse to be interviewed because of the history of journalists and scholars who have acted as agents. This frustrates efforts to learn about what effect these programs ultimately had on organizations such as CORE.
The history of COINTELPRO is essential to understand both for historians and current activists. There is still a great need to understand what happened to the activists and movement during this time period. We still do not understand the breadth of such programs and the damage done.
Currently, we cannot learn more because many of the program’s records are still classified. Releasing the records of these domestic spy programs could give scholars the tools needed to decipher this important time period in the Black Freedom Movement and give us a better understanding of how the legacy of these programs affects such activists today.
Footnotes
1. Farmer, James. Lay Bare the Heart, page 263. TCU Press, 1985.
2. “4 Held in Plot to Blast Statue of Liberty, Liberty Bell and Washington Monument.” Homer Bigart, New York Times, Feb 17, 1965.
3. Grady-Willis, Winston. “The Black Panther Party: State Repression and Political Prisoners,” The Black Panther Party (Reconsidered). Baltimore: Black Classic Press, 1998; Zimroth, Peter L. Perversions of Justice—The Prosecution and Acquittal of the Panther 21, pg. 48. New York: Viking Press, 1974.
|
cff56187525dcc97d76cdc855ef7bdd0 | https://historynewsnetwork.org/article/173282 | The History Briefing on Pregnancy Discrimination: What Historians Had to Say About This Week’s News | The History Briefing on Pregnancy Discrimination: What Historians Had to Say About This Week’s News
Editor's note: This is part of a series called The History Briefing. Contributors, primarily HNN interns, historically contextualize the week's top headlines by summarizing how different historians have added their unique perspectives to enhance news coverage.
United States Senator and Democratic presidential candidate Elizabeth Warren made headlines this week over a story she has told on the campaign trail. In the early 1970s, Warren was fired from a teaching job because she was pregnant. Though she was initially offered an extension of her job for the next year, once she was visibly pregnant the job went to someone else. This account was originally questioned by a writer for Jacobin magazine, was then picked up by right-wing websites, and eventually was covered in the national news. Not only has Warren’s story sparked a conversation among the media and women who experienced similar discrimination, but it has also inspired many historians to add clarity and context to the history of pregnancy discrimination.
Historian and writer Joshua Zeitz tweeted his input earlier this week. He explains that “to believe Warren is lying, one need be blind to the history of discrimination in 'pink collar' professions…” In the remainder of his tweets, Zeitz describes the New Jersey state laws that prohibited the expulsion of pregnant teachers, which were put in place a few years after Elizabeth Warren left her teaching job. While many pregnant teachers in the late 1960s, or even early 70s, faced less workplace discrimination in comparison to other professions, the fact that laws had to be enacted to legally bar pregnancy discrimination against teachers shows it was rampant.
History professor and Huffington Post contributor David M. Perry added nuance to the historical analysis of pregnancy discrimination via Twitter. Perry tweeted:
Actual news: Over 60 women have now accused Donald Trump of sexual harassment, abuse, assault, and rape.
NYT A1: We're launching a week by week coverage of Elizabeth Warren's pregnancy as it happened all those years ago, probably
Satirical in nature, Perry’s joke carries a serious message about the treatment of women not only in the workplace, but also in the media. The rampant sexual harassment and assault women experience is treated as a minor story, while questioning the veracity of a woman's experience of discrimination is headline news. While Perry's tweet is short, it helps us question the media's narrative and how it is rooted in historical sexism.
Historians and journalists have discussed the issue of pregnancy discrimination long before it made headlines this week. In Lily Rothman's article “The Complicated History Behind the Fight for Pregnant Women’s Equality,” Rothman chronicles the history of discrimination that soon-to-be mothers have faced in the workplace and emphasizes that pregnancy discrimination is still a big problem today. In 2014, for example, the Supreme Court heard the case of Peggy Young, who sued UPS after she was placed on unpaid leave when, on her doctor's recommendation, she could not perform the laborious tasks her job required while pregnant. As Rothman explains, the case hinged on the Supreme Court's interpretation of the Pregnancy Discrimination Act of 1978.
Nancy Woloch, a professor of History at Barnard College, previously wrote a piece for HNN on paid maternity leave. Woloch's article outlines the evolution of legal protections for people working while pregnant. Title VII of the Civil Rights Act of 1964 prevented “employment discrimination on the basis of sex.” The Family and Medical Leave Act of 1993 ensured that pregnant women can take time off work without fear of retribution from the company.
|
5d7e58c9eda491d5eed74fee798907e9 | https://historynewsnetwork.org/article/173284 | Why We Must Impeach | Why We Must Impeach
With a single telephone call, Donald Trump betrayed the presidency in ways almost unimaginable until that moment. During the call, he attempted to pressure a foreign leader to help him smear and destroy both a chief political opponent and that opponent’s political party to benefit himself in a presidential election. This offense differs from all his other transgressions, venal corruptions, and daily degradations of the office. It is an attack on the foundations of our republic, turning diplomacy into a weapon of personal and partisan political power. The nation’s founders understood, having fought a revolution against monarchy, that no government of the people was invulnerable to such egregious abuses of power. They were particularly concerned, as Alexander Hamilton wrote in the Federalist Papers, that a president, through “cabal, intrigue, and corruption,” might help “foreign powers to gain an improper ascendant in our councils.” In their wisdom, they created a mechanism to halt this disloyal corruption in its tracks: impeachment.
Impeachment is a severe measure of last resort, which ought to be used only in the most extreme cases. In the United States, the voters are supposed to decide who governs. That’s what the Framers of the Constitution had in mind when they formed a new government in which, at every level, ultimate sovereignty lay in “We, the People.” Elections legitimately won cannot be illegitimately undone at the whim of a faction or party. They should only be undone by throwing the bum out at the next election.
What happens, though, if a president uses the powers of office to disrupt the next election? What if that president does so by brazenly enlisting the aid of a hostile foreign power? Or if he does so by trying in secret to extort cooperation from a foreign ally threatened by that same hostile power? What if the president has denied the existence of an ongoing systematic cyberattack from the hostile power, which every U.S. intelligence service calls a clear and present threat to our democracy? What if the actions of that president raise urgent questions about the legitimacy of the next election and cast a darker cloud over how he gained the office in the first place?
There have been earlier impeachments and interferences with democratic institutions in our history, but nothing like this one. In this, as he likes to say, Trump truly stands alone. He has assaulted American democracy, claimed he has the authority to do so, and dared anybody to do anything about it, dismissing with contempt Congress’ clear constitutional authority to oversee and check the executive branch. He thinks he can use the office of the presidency as a personal instrument, along with private emissaries, to desecrate the rule of law and then protect himself from the consequences. He even declares in public that he is the law, claiming that, according to Article II of the Constitution, “I have the right to do whatever I want as president.” Not even the most corrupt and criminal of our previous presidents has tried to pervert our most sacred institutions as openly as Trump has.
At the dawn of the republic, during the troubled 1790s, the incumbent administration of President John Adams took extraordinary actions against a mounting opposition led by Vice President Thomas Jefferson, arguably interfering with national politics more directly than Trump. By signing the repressive Alien and Sedition Acts in 1798, Adams outlawed public criticism of himself or any member of Congress. More than 20 Republican newspaper editors went to jail, as did a Vermont congressman. Yet as Congress had initiated the new laws, there was never a question of impeaching Adams. What Jefferson called “the reign of witches” would end only when Adams very narrowly lost re-election to the Virginian in 1800.
|
e93372da6246bafdb438cb66db52ad13 | https://historynewsnetwork.org/article/173306 | A newspaper accused the president’s family of profiting from a foreign deal. The president sued. | A newspaper accused the president’s family of profiting from a foreign deal. The president sued.
The president was furious over “scurrilous and libelous” newspaper articles alleging wrongdoing by men linked to his administration. Somebody should sue the newspapers for libel, he suggested. Maybe even the federal government should do it.
The president was Theodore Roosevelt, who in 1908 was stirred to anti-press rhetoric that foreshadowed the anger of President Trump about what he calls the fake-news media. That anger has ramped up as Trump, his personal attorney Rudolph W. Giuliani and their associates have come under scrutiny for an alleged foreign influence campaign of their own.
But back in 1908, Roosevelt campaigned for fellow Republican William Howard Taft to be his successor. A month before Election Day, Joseph Pulitzer’s New York World disclosed charges that a secret American “syndicate” made “huge profits” from the U.S. purchase of the Panama Canal property from France for $40 million. Those involved allegedly included relatives of both Roosevelt and Taft.
The day before the election, the Indianapolis News published an editorial asking, “Who Got the Money?” The result was the biggest presidential attack on the press since John Adams jailed journalists under the notorious Alien and Sedition Acts, which barred criticism of the president.
In contrast to Trump, Roosevelt generally enjoyed friendly relations with the press after the former vice president succeeded President William McKinley, who was assassinated in September 1901. On returning from McKinley’s funeral, the 42-year-old Roosevelt met in the White House with reporters from three major news services. “I shall be accessible” to you for information, Roosevelt said. But he added, “If you even hint where you got it, I’ll say you are a damned liar.”
|
af13628038f85b2aa479d09bf2544e6b | https://historynewsnetwork.org/article/173344 | Melania Trump Just Restarted a 100-Year-Old Political Controversy: The White House Tennis Court | Melania Trump Just Restarted a 100-Year-Old Political Controversy: The White House Tennis Court
On Tuesday, October 8, with impeachment speculation swirling and increasingly disturbing reports coming out of Syria, First Lady Melania Trump broke through the noise to share some good news: Ground was being broken for the construction of a new tennis pavilion at the White House.
The 1,200-square-foot pavilion, we learned, will replace a small, lattice-covered bathroom structure currently on the site. The White House tennis court itself, in its current location for the last 40 years and retrofitted most recently with a basketball hoop and court lines for Barack Obama, will remain mostly untouched.
“It is my hope that this private space will function as a place to gather and spend leisure time for First Families,” the First Lady said in a statement. She also clarified that this “Legacy Project” would be funded entirely with private donations. Like the First Ladies who preceded her, Melania intended to leave the White House a better place than she found it.
Not surprisingly, the announcement caused the Twittersphere to lose its collective mind.
@TonyPosnanski’s tweet captured the general mood of those who responded to @FLOTUS.
“Thousands of our allies are being attacked because we abandoned them because of your husband, your husband is attacking the Constitution, and your husband is bullying Americans, but congrats on the new tennis court. Seriously, [expletive] you. You are an embarrassment.”
The response to the announcement became as much a story as the project itself. “Melania Trump Trolled Over Her ‘Legacy Piece’: Does the White House Need an Entire Tennis Court Pavilion?” Newsweek asked.
If the Trump White House is looking for advice on how to handle the blowback—It’s just a tennis pavilion; We just wanted to make the White House grounds a bit more beautiful—perhaps it should look back to the administration of Theodore Roosevelt. After all, it was TR who brought tennis to the White House grounds in the first place.
Shortly after ascending to the Presidency following the assassination of William McKinley in 1901, Roosevelt requested funds from Congress to overhaul the White House. The executive mansion needed the attention; problems varied from exposed wiring (fire hazard) to cramped office space.
As part of the renovation, the landscaping crew—working under Edith Roosevelt’s watchful eye—installed the first White House tennis court. They placed it immediately adjacent to the President’s executive office, on the spot where the Oval Office sits today.
As the court neared completion, both the Washington Post and New York Times picked up on the story. The “President’s Children to Have a Model Playground Adjoining his Office,” the Post reported.
Not everyone approved. Roosevelt’s critics seized upon the White House’s new tennis court as a sign that the President was out of touch with the average American. One Tennessee newspaper, for example, suggested that the nation could hardly afford to keep Roosevelt in the White House. “The White House has been enlarged at an expense of $500,000,” the paper wrote, “a $2,500 tennis-court has been built for his children, and the living expenses have been about triple.” The paper called for an end to “this carnival of graft and extravagance.”
A debate about the court ensued. Roosevelt’s steadiest literary supporter, Outlook, argued in defense of the tennis court. “Is the President Extravagant?” No. “It is true that there is a tennis-court on the White House grounds, but it cost less than … the greenhouses under the previous administrations.”
The editors of Outlook put forth an early form of life-balance counseling. “We think there can be no serious objection on the part of any decent American to the President playing tennis with his children, and it is impossible for them to play tennis except on the White House grounds.” The Republicans liked the Outlook article so much, they entered it into the Congressional record.
Puck, a devilishly satirical publication, questioned what exactly would transpire on the White House tennis court. “The mutter of conspiracy is heard,” Puck editorialized. Perhaps the White House tennis court simply provided cover for other activities. “Who questions the happy outcome of conference or confab, the parties of which have previously lobbed and smashed, volleyed and served together on a common level the smooth delightful level of the White House tennis court?”
For Roosevelt, however, the White House Tennis Court eventually went from being a political liability to an asset.
Stories of Roosevelt’s long, sweaty matches—sometimes against unprepared foreign dignitaries—came to bolster his reputation as a purveyor of “The Strenuous Life.” The group of 30 or so regulars at the court became known as Roosevelt’s “Tennis Cabinet.”
The fact that Roosevelt went public at times about his struggle to keep his weight under control, and thus felt compelled to fit tennis (or boxing or hiking) into his schedule, also resonated in a nation struggling with the realities of urbanization and industrialization.
While Melania declared her tennis pavilion a gift to future White House inhabitants, it seems likely that TR’s tennis court came about as a spousal nudge from Edith. Edith was concerned about her husband’s growing waistline. Life in the White House, Roosevelt admitted as the tennis court was being finished, “has been very conducive to me getting fat.” Edith certainly noticed. Once complete, the court, just steps from the President’s desk, served as a reminder to TR to get out and exercise.
Effort trumped expertise on TR’s White House tennis court. “My impression is that father didn’t play a great game, but played very hard,” Roosevelt’s always-candid daughter Alice explained. Or as another observer put it: “He played tennis vigorously on the White House courts, though he never became very expert, there being no danger at any time of the President’s entering the National Tennis Tournament at Newport.”
As Roosevelt’s administration neared its end, the narrative regarding the White House tennis court took on an exuberantly positive tone. No journalist portrayed Roosevelt as a tennis snob playing on his own taxpayer-provided court; rather the press fought amongst itself to see who could best capture the image of a President of the United States valiantly competing on the court even though he had a country to run. The President finds time for exercise, the thinking went, thus so should you.
After Roosevelt left the White House, William H. Taft took over and oversaw the bulldozing of TR’s court in order to make room for further West Wing improvements. Taft cared little about the change; he preferred golf to tennis anyhow. Landscape architects configured a new court into the south lawn area of the grounds. The court was moved several times before taking its current position. In 1989, President George H.W. Bush signed off on improvements to the tennis court, which, until Obama retrofitted it for basketball, made the court what it is today.
The tennis court snagged other victims along the way. It was on the tennis court, so the story has long gone, that Calvin Coolidge Jr. got a blister, which then got infected and led to the teenager’s death from blood poisoning in 1924.
For Jimmy Carter, the White House tennis court became a symbol of a weak, distracted, micro-managing President. Late in his term, a White House insider wrote a tell-all accusing Carter of personally managing all requests to use the tennis court. Carter denied the story, but it stuck.
At a press conference on April 30, 1979, after talking about energy conservation, and the Soviet threat, and strategic arms limitations, Carter tried to put the tennis court issue to bed:
“The White House tennis court: I have never personally monitored who used or did not use the White House tennis court. I have let my secretary, Susan Clough, receive requests from members of the White House staff who wanted to use the tennis court at certain times, so that more than one person would not want to use the same tennis court simultaneously, unless they were either on opposite sides of the net or engaged in a doubles contest.”
Needless to say, the non-denial denial did nothing to help Carter’s image.
The lesson in all of this? Beware of the White House tennis court. Or more directly: Presidents, be wary of associating with country club sports during times of political crisis.
As Theodore Roosevelt explained it: “I myself play tennis, but the game is a little more familiar; besides you never saw a photograph of me playing tennis.”
Perhaps just for someone like President Donald Trump, Roosevelt expounded even further. “I am careful about that,” Roosevelt said of publicity regarding his connections to sports. “Photographs on horseback, yes; tennis, no. And golf is fatal.”
|
f9eb03565113bace9a1c2324aaa4301c | https://historynewsnetwork.org/article/173359 | The Internet At 50: How the Dot-Com Bubble Burst | The Internet At 50: How the Dot-Com Bubble Burst
This is the second article in a series reflecting on the Internet at 50. For the first article, click here.
As the new millennium began, greed, ignorance, and misplaced hopes within the tech world nearly destroyed the financial potential for the internet. But as described by Harlan Lebo, author of 100 Days: How Four Events in 1969 Shaped America (Amazon, Barnes & Noble), the real message that emerged after the dot-com bubble burst had even more important implications for the role of the internet as an enduring global force.
* * * * * * * *
“When will the Internet Bubble burst? For scores of 'Net upstarts, that unpleasant popping sound is likely to be heard before the end of this year.”
– Jack Willoughby, Barron’s, March 2000
* * * * * * * *
It was too good to last.
By the late 1990s, the internet had evolved beyond anything that the pioneers of digital technology could have imagined 30 years earlier. From the first crude connections that had linked computers for academics and government agencies, the internet had blossomed into a dynamic and wildly-popular technology for a rapidly-growing public audience.
And with that popularity, the internet became a river of investment opportunity and potential profits for dot-com developers and entrepreneurs.
The formation of online companies – quickly dubbed “dot-coms” – became the business trend of the decade. With the almost-daily unveiling of new dot-com enterprises, multi-million-dollar investment deals, and even bigger stock offerings, the prospects for a new era of internet-based business never looked brighter.
From the mid-1990s until 2000, investing in budding dot-coms was the wildest of rides, expanding within an aura of wealth, power, and optimism – the hallmarks of the go-go internet world.
Lavish spending on marketing reached a high-profile peak on January 30, 2000, when 14 dot-com companies each paid more than $2 million to advertise during Super Bowl XXXIV – inspiring the game to be called the “Dot.com Super Bowl.”
But behind the extravagant spending and flashy deals festered a problem – a simple, disaster-provoking problem: for the most part, neither the new dot-com companies nor the investors who bought into them had the slightest idea what they were doing.
* * * * * * * *
Much of the “growth” of new internet companies was a façade, an industry fed by novelty and perceived investment potential – but in most cases without planning or financial evidence to back up the talk. Hard-boiled financiers threw common sense out the window, investing in companies that, with even a moment of consideration, would have been viewed as the most absurd folly.
In retrospect, investment mistakes are always crystal-clear, but even so, the depth of the miscalculations in the late 1990s now seems unfathomable.
“Investors desperately, desperately wanted the dot-coms to succeed,” said Jeffrey Cole, director of the Center for the Digital Future at USC Annenberg. “Company management offered promises about the potential for their startups, and backers had expectations that had nothing to do with reality.
“The dot-com bubble,” Cole said, “was business plans written on the backs of napkins.”
The problem for many of the start-up companies was demonstrated in a single question from editor Rich Karlgaard to a young vice-president of “business development” at a start-up. When Karlgaard asked if the dot-com was profitable, the executive said, “We’re a pre-revenue company.”
In 2000, the bubble burst.
* * * * * * * *
What pin had pricked the surface? Some warnings had been coming from calmer voices, but the reckless types viewed the alerts as unwelcome noise. With legions of companies operating with no rational business plans for short-term survival – let alone long-term success – and most roaring ahead with a “grow big, grow fast” mentality, the collapse was inevitable.
On March 10, 2000, the prices of dot-com stocks peaked – and the slide began.
An indisputable alarm came on March 20, 2000, when Barron’s, the weekly financial magazine, splashed its cover with drawings of mounds of cash on fire behind the headline “Burning Fast.” The issue featured a study of more than 200 internet firms, with the publication’s analysis of “which ones could go up in flames, and when.”
“When will the Internet Bubble burst?” asked columnist Jack Willoughby in his column titled “Burning Up” that preceded the study. “For scores of 'Net upstarts, that unpleasant popping sound is likely to be heard before the end of this year.”
Barron’s followed up the original story three months later, this time with “burn rates” for internet companies that were blazing through their cash at the end of 1999; by the time the list appeared in Barron’s, the problems were much worse. At the top of the list of companies draining their reserves were such now-forgotten names as Netzee, CDnow, Boo, Beenz, eToys, Flooz, Kozmo, and Netivation; none would survive. For many other dot-coms as well, the cash from investors was beginning to run out.
By April 6, dot-com stocks had lost nearly $1 trillion in stock value.
* * * * * * * *
The consequences of the bubble’s burst dragged on for several years – the worst of them in 2000 and 2001 – as a growing list of dot-com companies floundered under the weight of too-high expectations and too-low revenue.
The fate of two companies in particular tells much of the story of the business misjudgments and the misplaced investor enthusiasm that created the dot-com collapse. Perhaps the most high-visibility example of the peak and downfall was Pets.com, which called itself “a new breed of pet store.”
Pets.com debuted in February 1999 – with financing from some of the premier venture capital firms – selling a full line of supplies for America’s pet owners. Marketing for Pets.com was backed by plenty of traditional print advertising, but it was the company’s mascot, a sock puppet of a ragged-eared dog that appeared in dozens of television commercials, that became the company’s high-profile face to the public.
The puppet (voiced by comedian Michael Ian Black) became instantly popular with a celebrity presence that extended far beyond corporate marketing: the puppet was “interviewed” on talk shows, and had his own giant helium balloon in the 1999 Macy’s Thanksgiving Day parade.
But within months, the puppet would become the poster child for the entire meltdown.
Even with such a high-visibility position in retailing, Pets.com was never a sustainable enterprise. The company lost money almost every time a purchase was made, as it sold millions of dollars’ worth of products for as little as one-third of their cost in the hopes that customers could be converted to high-margin buying.
In spring 2000, Pets.com spent $17 million on sales and marketing, at the same time bringing in half that much in revenue. By autumn, the company was spending $158 for each customer it acquired.
(Perhaps the Pets.com leadership should have heeded the words of their own mascot; among the puppet’s many antics in commercials, it could often be heard singing the first line from the song, “Spinning Wheel,” by Blood, Sweat, and Tears: “what goes up, must come down….”)
Later, many would ask: what could explain why investors sank money (literally) into the company?
“Perhaps venture capitalists should have been leery of Pets,” wrote tech columnist Mike Tarsala, “since even off-line retailers barely make any margin on pet food – the company's staple seller. The money came rolling in anyway.”
The company’s strategy could not last; less than a year after the puppet balloon floated through Manhattan, on November 9, 2000, Pets.com stopped taking orders, and the company laid off most of its 320 employees. In June 2008, CNET named Pets.com as one of history’s greatest dot-com disasters.
The demise of Pets.com may have been a high-profile debacle, but other meltdowns were even more costly, including several that showed just how unaware dot-com investors could be – even when alerted to problems.
Possibly the worst of all was Webvan.com, the grocery delivery service, which opened in 1996, operated by a team of executives – not one of whom had management experience in the supermarket industry.
When Webvan stock went on sale in November 1999 – and in spite of public notices that the company had already lost more than $65 million for the year and warned of losses for “the foreseeable future” – the stock sold for 65 percent over its initial offering price.
With huge expenses – at one point committing $1 billion for construction of distribution centers and delivery trucks – Webvan expanded too quickly, its costs far outstripping its revenue by millions, then hundreds of millions. The prospects for attracting customers were unrealistic and the returns were low; on July 8, 2001, the company website carried the notice, "We're sorry. Our store is temporarily unavailable while it is being updated. It will be available again soon."
The next morning, 2,000 Webvan employees were laid off, and the company closed – some 20 months after the initial stock offering. Overall, the company lost $830 million – reportedly the largest of the dot-com disasters.
* * * * * * * *
But many of the more responsible dot-coms survived the bubble relatively unscathed, including eBay, Priceline, Craigslist, Monster, WebMD, and others that still thrive today. All were companies that had not over-promised and had not over-expanded, and each had something that almost all of the failed dot-coms had lacked: a thoughtful business model based on solid financial planning and realistic projections.
After the bubble, there were some well-earned opportunities for “I-told-you-sos.” In 1999, superstar investor Warren Buffett had warned early investors – those whose stock had risen based on unreasonable expectations – to get out before the end came.
"After a heady experience of that kind," Buffett said of the gains in previous years, "normally sensible people drift into behavior akin to that of Cinderella at the ball. They know that overstaying the festivities...will eventually bring on pumpkins and mice."
Buffett – whose purchases of companies in 2000 did not include a single technology firm – was pummeled by critics for his seeming lack of vision. But in 2001, with his investments intact, he looked back on the fallout, saying, “The fact is that a bubble market has allowed the creation of bubble companies – entities designed more with an eye to making money off investors rather than for them.”
When the dot-com dust had cleared, the results were gruesome: by 2004, more than half of new dot-coms – hundreds of companies – had failed. About $5 trillion in stock value was lost. Hundreds of cocky start-up executives who through initial stock offerings had been made instant millionaires – on paper at least – found themselves penniless.
And thousands of employees – some estimates as high as 85,000 – confident that they had joined exciting and viable ventures, were abruptly on the street. The ripple effects also damaged the value of other successful dot-coms, and of computer and software companies as well.
Perhaps worse – but understandable given the financial debacle – investors temporarily lost faith in new dot-com investments, whether they were sustainable or not: in 1999, 107 start-ups doubled their stock value on the first day; in 2000, the number dropped to 67; by 2001, the number was zero.
Of the 14 dot-coms that advertised during the 2000 Super Bowl, five were gone in less than a year. For the next Super Bowl, E-Trade, a company that survived the bubble, produced a commercial that showed a chimp riding a horse through a ghost town of defunct dot-coms. The ad ended with the single line: “Invest Wisely.”
* * * * * * * *
As a cautionary tale and a business school lesson about irrational investor expectations, no modern example proved better than the dot-com bubble. But even more telling about the role of online technology in the American experience was the perception that emerged after the disaster – a hope, to some – that the internet was going to wither, if not completely disappear.
“After the bubble burst,” said Cole, “it was amazing to see how many people in industry assumed that the collapse meant the end of the internet itself.”
“We had been studying the internet since the early 90s,” Cole remembered, “and at meetings I would be asked, ‘now that this internet thing is over, what are you going to do now?’ They assumed that when the bubble burst, the usefulness of the internet had ended – and as a result they wouldn’t have to relearn how the business world works.
“And I wasn’t just hearing this view from leadership in retail – it was journalists, advertising executives, and people in other fields as well.
“But we knew,” Cole said, “that in spite of the bubble burst, a failure of the internet could not be farther from the truth.”
Those who watched the online world could see that not only was ‘the internet thing’ still relevant, but it was more popular than ever.
Even while the dot-com debacle festered as daily news between 1999 and 2002, internet use did not decline at all – in fact, going online continued to increase. By 2001, at the peak of the crash, more than 70 percent of Americans were internet users, and they were spending an increasing amount of time online every day, at home and at work.
Even after the collapse of many dot-com retailers, the number of Americans who bought online grew as well; by 2001, half of internet users had also become internet buyers – and continued to buy online.
In spite of the burst of the dot-com bubble, the message was clear: America had no intention of giving up on the internet.
|
32f425c693948f0f7dea2265ef0486f0 | https://historynewsnetwork.org/article/173413 | The Internet at 50: The Night the Internet Was Born | The Internet at 50: The Night the Internet Was Born
Above: The laboratory’s logbook from the night the Internet was born
This is the third article in a series reflecting on the Internet at 50. For the first article on the four developments that created the world wide web, click here. For the second on the dot com bubble burst, click here.
On October 29, 1969, computers at UCLA and the Stanford Research Institute were connected in the first tentative experiment that would later be recognized as the birth of the internet. But as Harlan Lebo, author of 100 Days: How Four Events in 1969 Shaped America (Amazon, Barnes & Noble), points out, 50 years ago there was no expectation of where the achievement would lead, and the real impact would not begin to be understood for almost 20 years.
____________________
“Here is a question: how many revolutions do you know that you can tell the exact minute when the revolution began?”
– Leonard Kleinrock, UCLA
UCLA
October 29, 1969
9:30 p.m.
Boelter Hall is a nondescript but pleasant enough brick building on the UCLA campus, framed by California olive trees and bordered by the grass-lined walkway known as the Court of Sciences in the south section of the university.
During the day, Boelter Hall teems with engineering students. But in the evening, with most student housing far across the campus, the building descends into quiet solitude – an ideal setting for work that requires time and focus.
October 29 was a perfect night to change the world.
Charley Kline, a graduate student in the engineering school’s computer science department, viewed the late hours as a time to work in Boelter free of distractions.
“I was a tech guy who liked to program at all hours,” Kline said, “and it was much easier for me to stay focused in the middle of the night.”
But “to program” in 1969 was vastly different from what it means for today’s UCLA engineering student, pecking away on a two-pound laptop while sitting in a coffee shop in Westwood a few blocks from campus. For Kline, and a generation of students studying in the young field of computer science, a PC of any size or price was almost a decade away; “to program” meant working in an on-campus laboratory, where “computers” were room-filling systems and “keyboards” usually meant cumbersome stand-alone terminals shrouded in sheet steel.
That night, Kline’s project would be relatively simple: testing a new system that had been installed at UCLA almost two months earlier – an experimental project funded by the federal government to create links between computers in locations across the country.
Simple – if it worked.
For Kline, such assignments were departures from the traditional education that most computer science students pursued in the 1960s. Although some opportunities in computing involved government or academic systems that were used for scientific research and calculation, in that era, jobs in computer science generally meant support for large systems that served as giant calculators and billing machines for banking and other industries.
But Kline sought different types of opportunities.
“I was interested in exploring the problems that were emerging in a world where computers worked independently with some success, but had a great deal of trouble communicating with each other,” Kline recalled. “Our goal was to determine how to make them talk.”
Kline found an opportunity for that mission in the laboratory of Leonard Kleinrock, who at 35 was already recognized for developing a mathematical theory of the methods to create communication pathways between computers at different locations.
In 1969, Kleinrock’s principal project to validate his theoretical discoveries was to participate in building an experimental network, a system that would, according to a July 3, 1969, press release, “for the first time, link together computers of different makes and using different machine languages.”
“As of now, computer networks are still in their infancy,” Kleinrock explained in the release, “but as they grow up and become more sophisticated, we will probably see the spread of ‘computer utilities,’ which like present electric and telephone utilities, will service individual homes and offices across the country.”
Creation of the network, reported the release, “represents a major forward step in computer technology and may serve as the forerunner of large computer networks of the future.”
Almost 50 years later, Kleinrock recalled, “In simplest terms, we were trying to shift the thinking from everyone using a large stand-alone computer, to a linked network that could exchange information.”
But for the moment, functional networks were still to come; first came learning how to create practical connections between computers, with a goal of linking systems at universities, government agencies, and scientific institutions so they could communicate and exchange information. To start, four computers would serve as the foundation of the system: machines at UCLA, the Stanford Research Institute, UC Santa Barbara, and the University of Utah.
* * * * * * * *
At UCLA, just taking delivery of the computer in 1967 was a logistical headache; the computer – a Sigma 7 built by Scientific Data Systems of Santa Monica – was eight feet wide and almost six feet tall, so cumbersome that no elevator in Boelter Hall could accommodate it. Moving the Sigma 7 into the building required a forklift on the loading dock at the back side of the building to raise the plastic-wrapped computer to the third floor, where a section of railing had been ripped out to allow the equipment to slide through.
In the lab, the Sigma 7 was connected to a Honeywell DDP-516, a “mini-computer” (merely the size of a refrigerator). The Honeywell was chosen not only for its price and performance, but also for its rugged structure built to military specifications; to demonstrate to visitors the computer’s physical strength, Kleinrock would pound on the cabinet with his fist.
The Honeywell was equipped with additional technology created by the consulting firm of Bolt, Beranek, and Newman, a Cambridge, Massachusetts company that added the parts to transform the computer into an “Interface Message Processor,” better known as an IMP (this was the first device now called a “router”).
When attached to the Sigma 7, the IMP would – everyone hoped – serve as an all-purpose gateway that would link computers in many locations, built by separate manufacturers, created for a range of purposes, and all using different types of programming languages. It would be an ambitious project.
The first IMP – today still identified with the tag that marked it as node #1 in the national network to come – had been delivered to UCLA on August 30; three days later, Kleinrock’s team successfully linked the IMP to the Sigma 7. With each IMP requiring a month to construct, the second was ready late in September. On October 1, it was delivered to the Stanford Research Institute in Menlo Park, 350 miles north of UCLA. Later, the third and fourth IMPs would be sent to UC Santa Barbara and the University of Utah, completing the equipment for the quartet of computers that would be the start of the new network.
The next step was to encourage the machines to talk, listen, and respond.
Around 9 p.m., Kline walked across the Court of Sciences to the entrance of Boelter, and then downstairs to room 3420, the home to the UCLA Network Measurement Laboratory, where the new computers had been shoehorned through the door.
Kline sat down at the industrial-metal desk next to his terminal, picked up the phone, and dialed a number in Menlo Park.
* * * * * * * *
At the Stanford Research Institute, Bill Duvall was waiting for Kline’s call. Duvall, at 25, was working full-time at the institute, a nonprofit research organization in Menlo Park spun off from Stanford University. The Institute (now known as SRI) had been established in 1946 to serve as a “center of innovation” – an organization-for-hire that conducted research, designed products, and developed plans for civic agencies and private industry.
For Kline and Duvall, the task that night was clear.
“Our goal was to test the capability of the UCLA machine to log in to the computer at SRI,” said Kline.
It would have seemed a simple experiment, but in practice, the process was much more complicated. That night they would try it.
* * * * * * * *
At 9:30 p.m., Kline and Duvall, each on a telephone headset, powered up their equipment and activated their experimental operating systems that would allow Kline to connect.
Just after 9:30, Kline typed a letter.
"The first letter I typed was an L," Kline said. On Duvall’s terminal, the “L” appeared.
Kline typed the next letter, the “O.”
“I got the ‘O,’” Duvall reported.
But that was all; the computer at SRI overloaded and the connection crashed. Two letters was as far as they got.
But two letters were enough. For at least a moment, the connection had worked. The first communication between the computers was "LO” – an inadvertent, almost-biblical declaration of the beginning of a new age.
“We couldn’t have planned a more powerful, more succinct, more prophetic message,” Kleinrock remembered.
Duvall was able to quickly fix the problem, and an hour later the two machines were again connected, with Kline successfully logging in. At 10:30 p.m., Kline duly recorded the moment by writing the result in the laboratory’s logbook (see image at top of article), and then went home to bed.
Duvall did not see the need for festivities either. He stopped by a local hangout for a burger and a beer.
“It was no celebration,” Duvall said. “I was hungry.”
* * * * * * * *
Kline and Duvall did not mark the moment because they did not realize they had a reason to celebrate. The pair viewed their work that night as simply another step in what they knew would become a long and complex series of technological events.
But leading to what? The link between computers at UCLA and SRI was never intended to become the indispensable technology used daily by billions. Kline, Duvall, Kleinrock, and hundreds of other computer scientists developing 1960s technology had hopes of building a system that would, perhaps at most, connect computers so they could easily exchange information and allow their users to communicate with each other.
At a time when the first personal computers would not appear until the late 1970s, and public access to the internet would not be available for almost 15 years after that, a future filled with billions of websites, online shopping, social media, and instant global access to information was not even the remotest practical consideration. Such miracles were being pondered only as fanciful theory – and on a much more limited scale – by a handful of visionaries.
The birth of the internet – if one technological link in a long chain can be described as a “birth” – had no emotion associated with it; there were no ticker-tape parades, no drama of Thomas Edison watching the first light bulb burn while he contemplated the enormity of what he had done. But the connection achieved on October 29, 1969 was, if nothing else, the starting point of a journey leading to technology that not only succeeded in its original objective, but would evolve into a phenomenon for communication beyond anyone’s most outrageous expectations.
Like all great technological achievements, the internet as we know it exists thanks to a serendipitous intersection of events, people, and inspiration. Over the decades since, some of those combinations would thrive and change the most fundamental activities of work, play, and human interaction; others would fail spectacularly.
Perhaps the most astonishing aspect of the internet would be the extraordinary story of its emergence, as it progressed from a fragile connection between two computers in 1969, with a modest intended function, into the most pervasive communications tool of its age – possibly of any age – affecting everything we do, everything we say, and everything we achieve.
The internet serves as an instrument for soaring to creative heights, and spotlights troubling questions about the lowest forms of human depravity. It produces unprecedented opportunities for social interaction while raising deep questions about personal privacy and national security. And because of the internet, perhaps more than any other human advancement, the world is now a much different place than before it arrived, and continues to be reshaped as the technology evolves.
But on October 29, 1969, all of that was years in the future. If there was a single moment that would define the start of the technology that would become the internet – this was it: the future was born.
|
a30b0f10c53a3b3ffbe609b961e1d8e3 | https://historynewsnetwork.org/article/173429 | Washington Post Quotes Historian Ryan Swanson in Article about Teddy Roosevelt and Baseball | Washington Post Quotes Historian Ryan Swanson in Article about Teddy Roosevelt and Baseball
...
“Father and all of us regarded baseball as a mollycoddle game,” his daughter, Alice Roosevelt Longworth, once said. “Tennis, football, lacrosse, boxing, polo, yes — they are violent, which appealed to us. But baseball? Father wouldn’t watch it, not even at Harvard.”
This drove the lords of baseball insane, a story vividly and rather hilariously told by University of New Mexico sports historian Ryan Swanson in his new book “The Strenuous Life: Theodore Roosevelt and the Making of the American Athlete.”
“He doesn’t think it fits into what he thinks sports should be,” Swanson said in an interview. “Roosevelt thinks sports should make Americans better citizens. They should test themselves physically.”
Roosevelt never admitted it, but Swanson suspects his opinion of the sport might also have developed from his inability to play it. Roosevelt had poor eyesight even before he lost the use of one eye during a White House boxing match.
Stepping into a batter’s box might have gotten Roosevelt killed.
“At one point, he says he fears nothing like he fears a baseball coming at him in the dark,” Swanson said.
Whatever the reason, baseball officials went to extraordinary lengths to turn Roosevelt into a baseball fan, an “effort anchored,” Swanson wrote, in “a broader plan meant to link the president to baseball.”
|
8b87e51a7378fe9146628a3467a7a04f | https://historynewsnetwork.org/article/173440 | National Security Archive Publishes New Briefing Book on Nuclear Weapons and Turkey Since 1959 | National Security Archive Publishes New Briefing Book on Nuclear Weapons and Turkey Since 1959
The current crisis with Turkey over Syria has raised questions, yet to be resolved, about the security of 50 U.S. nuclear weapons stored at Incirlik Air Base. These questions have been posed before, going back almost to the start of nuclear deployments in Turkey in 1959. How the United States responds carries implications for the region, for U.S.-Turkey relations, and for NATO. Today, the National Security Archive is posting a selection of declassified documents from various sources, including the Digital National Security Archive, in order to provide historical context to the situation.
The facts of nuclear deployments in Turkey have been an official secret for decades, but it is no secret that they are a legacy and a relic of the Cold War. It is also no secret that the deployments caused anxiety in Washington when the U.S.-Turkish nuclear relationship began in the late 1950s. For example, according to a declassified memorandum of conversation, in late 1960 staffers with the congressional Joint Committee on Atomic Energy (JCAE) told State Department officials that they saw a “real threat in Turkey” because of the possibility that leaders of an Army coup “might seize control of one or more of the inadequately protected weapons.”
The JCAE staffers also stated that they had learned that, during the 1960 military coup, the Turkish “situation was so unstable that twice [Supreme Allied Commander General Lauris] Norstad almost ordered all the weapons to be evacuated.” Norstad later denied that any such thing had happened, but the JCAE remained concerned about overall stability in Turkey.
|
b9618b3668f02017d559037371bb31f3 | https://historynewsnetwork.org/article/173550 | The Overlooked Aftermath of the Chernobyl Nuclear Disaster | The Overlooked Aftermath of the Chernobyl Nuclear Disaster
April 26, 1986. Ukrainian Republic of the Soviet Union. The Number 4 reactor at the Chernobyl nuclear power plant exploded. The blast propelled a massive amount of radioactive material into the atmosphere. This fallout covered a wide area of what is now Ukraine and Belarus, and western Russia. Soviet officials put the death toll at no more than 54 people.
Within weeks, the Soviet government declared that the radioactive fallout posed no danger to human health, and it offered reassurance to affected citizens as it distributed numerous manuals with recommendations on continuing to live in the contaminated regions.
Eventually international agencies such as the United Nations also minimized the human health and environmental aftereffects of the Chernobyl explosion. Those who complained of problems from nuclear contamination were labeled “radiophobic.”
Acclaimed historian Professor Kate Brown embarked on an unrivaled journey of scholarly investigation to learn more about the aftermath of this devastating nuclear disaster. In her impassioned and lively new book, Manual for Survival: A Chernobyl Guide to the Future (WW Norton), she recounts her findings based on extensive archival research, travels in the “Zone of Alienation” and beyond, and numerous interviews of scientists, officials, factory workers, farmers, health care professionals, radiation monitors, and others.
In her exploration, Professor Brown found evidence of extensive medical and environmental damage from radioactivity in Ukraine, Belarus, and beyond. She also unraveled an international effort to minimize public awareness about the dangers posed by nuclear power, nuclear testing, and nuclear weapons research. She traces similar efforts to downplay damage from radioactive contamination since the 1945 atomic bombings of Hiroshima and Nagasaki, and warns of the dangers that the nuclear radiation presents after almost eight decades of nuclear weapons and energy.
Based on her investigation, Professor Brown learned of dramatic increases in cancer, birth defects, and other medical problems linked to Chernobyl. As documented in archives and as reported to her by scientists and other professionals, she found that tens of thousands of people—not a few dozen—died as the result of radiation from the massive nuclear explosion. She also describes ongoing medical and environmental problems that persist in the aftermath of the disaster. And, as clean energy initiatives often prescribe nuclear energy as an alternative to carbon-based fuels, Professor Brown calls for careful consideration of what happens when technology fails and we are left to live in the wake of nuclear disasters. Her book raises profound environmental concerns based on careful investigation, in the vein of Rachel Carson’s iconic volume Silent Spring.
Kate Brown is currently a professor in the Science, Technology and Society Program at the Massachusetts Institute of Technology. She is renowned for research that illuminates the convergence of history, science, technology, and bio-politics. She has written three other award-winning books, including A Biography of No Place: From Ethnic Borderland to Soviet Heartland; Dispatches from Dystopia: Histories of Places Not Yet Forgotten; and Plutopia: Nuclear Families in Atomic Cities and the Great Soviet and American Plutonium Disasters. Plutopia earned many awards, including the American Historical Association’s Albert J. Beveridge and John H. Dunning Prizes for the best book in American history; the George Perkins Marsh Prize from the American Society for Environmental History; the Ellis W. Hawley Prize from the Organization of American Historians (OAH); the Wayne S. Vucinich Book Prize of the Association for Slavic, East European, and Eurasian Studies; and the Robert G. Athearn Prize from the Western History Association.
Professor Brown’s teaching and research are also widely recognized. For example, she has received numerous fellowships and, in 2017, she was awarded the Berlin Prize by the American Academy in Berlin. Her current research focuses on the history of “plant people”: indigenes, peasants, and scientists who understood long before others that plants communicate, have sensory capacities, and possess the capacity for memory and intelligence.
Professor Brown generously responded by telephone to questions about her work as a historian and her groundbreaking new book, Manual for Survival.
Robin Lindley: Congratulations on your groundbreaking book on the Chernobyl disaster, Professor Brown. You are a recognized expert in Soviet and Russian history. What sparked your interest in this field? Was there something in your family background or in your childhood that drew you to this history?
Professor Kate Brown: I don't have any Slavic heritage or anything that I know of. But I remember I went to a movie called Red Dawn about the Soviets attacking Colorado. And fortunately, the Coloradans had guns and they could defend themselves. It’s a stupid movie and I recognized it to be a cult movie or propaganda, but I was upset that the kids in the movie theater were cheering every time a Communist was killed. And I went home and I was complaining to my parents about it. My mom was there smoking a cigarette and she says, “Well, do something about it. Study Russian and change the world.” And I decided, well dammit, I'll just do that.
The very next week I signed up for classes in everything Russian. Russian history, Russian grammar, and Russian literature mostly in translation. My aim was to go to Russia and see if it really was an Evil Empire. I guess I’ve always liked to know things for myself rather than relying on someone else's knowledge.
Robin Lindley: And then you traveled to Russia.
Professor Kate Brown: Yes. By 1987, I had enough Russian grammar and language to study in Leningrad as an exchange student. Just then all kinds of interesting things were starting to happen between Gorbachev and the United States. After that, I just kept going. I was part of a crowd of Westerners who worked in the USSR at the end of that polity. Gorbachev liberalized visas and politics, which made it easier to spend time in Soviet Union and carry out joint programs.
Robin Lindley: Did you also work as a journalist?
Professor Kate Brown: Yes, I did. When I was in Seattle in a graduate history program at the University of Washington, I initially didn't have funding for my studies, so I worked through a work study program. I worked for KCTS, the public television station, on their weekly news magazine. And then I worked at KUOW, a National Public Radio station where I was a beat reporter. The job wasn’t complicated. I’d get to work at eight in the morning. They'd say, Boeing's on strike or there is a problem over water rights at the Snoqualmie Falls. And I'd go off and I'd get some interviews and I'd have my story on the radio by 4:00 PM, whatever the story. That was a real crash course both in figuring out how to get a lot of information really fast and how to organize material into a news story. I learned to stick a microphone in people's faces. And I learned to tell a story with a narrative arc. And then I met some people at the TV station who went on to make documentary films. They hired me as a researcher and scriptwriter for their documentaries.
And in 1992 I ended up in Munich working for Radio Free Europe. And then I went to Moscow and I did some stories for Radio Free Europe from there in the fall of 1992.
I have that kind of experience, and I enjoyed it. But then I thought I wanted to do longer form journalism and I wanted to write my own books. I didn't want to just write a story that's on in the course of a day. And I wanted more control over the stories I told and how I told them, as opposed to the rigid format, whether it was TV or in short form journalism. So, I chose a career in academia, which meant I would have a smaller audience but I could have more autonomy in what I wrote.
Robin Lindley: And you pursued grad school and earned a doctorate in history, but you wanted to write for scholars and the general public.
Professor Kate Brown: Academia is full of all kinds of wonderful ideas and fantastic research-driven, creative work. But sometimes academic writing turns off popular readers. And so that was one other missing part for me. Was it possible to do nuanced, creative research and then tell about it in a way that's compelling and can reach any kind of high-school level reader? That's always been my mission. So I wrote my dissertation as a first-person travelogue. I got some trouble for it because dissertations are usually more formally written, but I was stubborn and finally my advisors just said do whatever you want.
Robin Lindley: Did that desire to put your own brand on scholarship bring you to history graduate school?
Professor Kate Brown: I didn't think of it so much as a brand, but as a license to liberate myself from the constraints that I saw imposed on grad students and academics – and we often put these constraints on ourselves.
Robin Lindley: Your journalism background served you well. Your writing is very engaging and accessible. I believe you have described yourself as a partisan historian. In Plutopia, your book on the plutonium cities of Hanford, Washington, and Ozersk, USSR, you included information from non-expert people you interviewed who actually lived in the areas you wrote about in addition to your archival research. You also broke from most scholarly writing with first-person reporting on your research.
Professor Kate Brown: When you work in the archives, you get kind of a sketch or an outline of what real life is like in whatever period you're studying. And, when you go to a place, especially if it's going to a place for recent history, you can see what it looks like. You can see how the physical world is also an actor in your stories: the way the rivers flow, how the soils soak up water.
You can read the archival record, but it really helps to get the fine grain detail by going to a place and then talking to people. People can really clue you into their local knowledge that is so important. And they know things that experts don't know. They know things that you only get a glimmer of working in the archives.
I don't just take people's word for it. After talking to people, I can go back to the archives and try to cross-check what they tell me. Often, I have a whole new understanding of an event after hearing firsthand accounts.
Robin Lindley: And you also did extensive interviews for your new book on Chernobyl.
Professor Kate Brown: With Chernobyl, I did a good number of oral histories. What I found in the archives is that the officials were having arguments among themselves. Some doctors and scientists who studied the accident were reporting major health problems. But experts in nuclear medicine who were in Moscow, Vienna, Paris, or New York were saying that, with the kinds of emissions and the kinds of estimated doses that they calculated people received, they didn’t expect major health problems. They would explain the rise in the frequency of disease by saying that these people were anxious, or they drank too much, or they had a poor diet and a poor economic situation. They basically devised an alternative narrative to explain those health problems, though I didn’t see hard evidence of drinking or a rise in anxiety. So, what helped, I think, is that I would just go talk to people and get their stories, and confirm one version or another of the oral histories with the material in the archives. But then I still wasn't sure.
So, in this project, I took yet another step and enrolled myself as a participant observer with two biologists – the only two scientists I could find who had regularly worked in the Chernobyl Zone twice a year since 2000. Like clockwork, they arrive in the Chernobyl Zone in June and in September. They use the contaminated Chernobyl Zone as a natural experiment, a massive field lab. I started going along with them, and I learned a lot. I learned forensic methods to detect radioactivity in the natural environment as I traveled with them. Later, outside the Zone, when I went to Chernobyl-contaminated areas where people continued to live, I could see evidence of damage in the environment using techniques that I had learned from the biologists.
And that was a third way to cross check the story, which I knew was going to be controversial. I was really intent on verifying the stories I was getting. And, as I was talking to people, I figured I could use science also. People lie and archives lie, but maybe trees don't.
Robin Lindley: Thanks for describing your approach to research. Did your research for Plutopia on those plutonium cities spark your book on Chernobyl?
Professor Kate Brown: For sure it did. I felt like this was a bit of a sequel for Plutopia. I started out with a very different set of questions for Plutopia, but I kept running into these farmers, whether they were in Siberia or in Eastern Washington, who had very similar stories to tell me about their health problems. And I knew that they weren't talking to each other and they didn't share a common language.
I tried to do as much research as I could in archives, but those cities were both military sites. The American government wasn't terribly curious about what happened off site of the nuclear reservation. And the Russians kept some studies of people living down river and downwind of the plant who were exposed, but those studies were off limits to me as a researcher. So I figured Chernobyl might be a good place to look and try to get more about that health story because it was a civilian site and it happened later.
I walked into the archives in Kyiv (Kiev) one day. I asked what they had from the Ministry of Health on the medical consequences of the Chernobyl disaster. They said that was a censored topic during the Soviet period and I would not find anything. I asked to look anyway. Sure enough, it took three seconds to find a whole document collection entitled “The Medical Consequences of the Chernobyl Catastrophe.” Big bound collections. I started reading them and I realized that the archivists didn't know about these files. They discouraged me not because they were trying to deceive me, but because nobody else had ever pulled them before.
Robin Lindley: It surprised me those files had been untouched until you came in.
Professor Kate Brown: Yes. And over and over again, I had that experience in Minsk and Zhytomyr, Gomel and Mogilev. To be the first to check out the files. With two research assistants, we found files down to the county level. In sum we found thousands of records that described, in one way or another, environmental exposures and health problems.
I was also convinced that I had come across untapped archives in the Belorussian Academy of Sciences. The Belorussian government was doing a great job of ignoring the contamination story and pretending Chernobyl didn’t exist, but scientists at the Academy had privately become very worried and started their own case-control studies on several topics, mostly related to children's health and the health of pregnant women. And those studies are really convincing. They had all the relevant data, such as dose estimates. I guess that's when I determined I believed what are called the alarmist stories.
Robin Lindley: What were some of your major findings on the medical and environmental consequences of the Chernobyl catastrophe? The official death toll was about 50 people but you learned that radiation illness probably contributed to tens of thousands of deaths.
Professor Kate Brown: Yes. The official death toll most often cited in the big publications is that 33 to 54 people died from the Chernobyl explosion, but that’s just from the acute effects of radioactivity. Those were firemen and plant operators who were exposed massively during and right after the accident, and most of them died within the next couple of months in one hospital in Moscow.
But I found the death toll was much higher. I found that not 300, the official count, were hospitalized after the accident for Chernobyl exposures, but 40,000 – 11,000 of them kids hospitalized for exposures in the summer after the accident.
The Ukrainian government gave compensation to 35,000 women whose husbands died from causes related to radiation. Now that number is limited. It included just men who had documented exposures. It doesn't include children or women or babies. And that's the number just for Ukraine, which got the least amount of any radioactive fallout, while Belarus received far more.
We tried really hard to get some kind of count for Belarus and Russia about fatalities from Chernobyl, but there simply is no kind of official count. So, 35,000 is the lowest possible number. On the thirtieth anniversary at the Chernobyl visitor center, the official tour guide said that the death toll was at least 150,000 in Ukraine alone.
Robin Lindley: How again did you come up with the figure for men who died?
Professor Kate Brown: About 35,000 wives received compensation because their husbands died from a Chernobyl-related radiation illness. That means that these men did some kind of work in which they were monitored so they had a film badge or some other dose estimate. Their doses were recorded or reconstructed, and then their illnesses were on a list of illnesses that were attributed to Chernobyl contamination. They died leaving their widows some income as compensation. So that's how that number was created.
Robin Lindley: What evidence did you find on birth defects?
Professor Kate Brown: There's all kinds of evidence in the book. The evidence I had was a study here and a study there and observations here and there. But the one study that's been done that fits standardized Western protocols was by Wladimir Wertelecki, who teaches at the University of South Alabama. He carried out a study in the northern province of Ukraine. He found that there was a six times higher rate of neural tube birth defects (a category that includes spina bifida and anencephaly) in people who live in those northern regions. He also found elevated rates of cesium in the bodies of people in that northern Rivne region.
Anencephaly and spina bifida are also big problems in Eastern Washington [the site of Hanford]. In 2010 the State of Washington became alarmed because there was a 10 times higher than expected number of babies with anencephaly in Eastern Washington, especially in the three counties around Hanford. They did a study and wrote a report that you can get online. To the best of my knowledge, this little epidemic is not over and the numbers continue to be high. The Washington State epidemiologist wrote in this report that they don't know what's causing these defects. He said they looked into all kinds of things such as nitrates and pesticides and genetic factors and radiation from Hanford.
They reported that they were told by the Department of Energy that radioactivity does not leave the Hanford site. If you know anything about Hanford, you know that’s a statement that only a very gullible person would believe. So that's largely a silent, unexplored topic, but one we see in areas where people have been exposed to radioactivity.
Robin Lindley: What a tragedy for those families. Another issue that's related to the physical health consequences of the Chernobyl disaster of course is the mental health of citizens after the catastrophe itself. I think you mentioned cases of posttraumatic stress disorder and just the general stress of living in that situation.
Professor Kate Brown: United Nations bodies first said that the health problems were from the fear of radiation. But some researchers and scientists find that real neurological damage caused by exposure to radioactivity can cause emotional disorders. There are also people who work in microbiology who have found that when your gut microbiome is disordered, damaged by some toxin, whether a chemical toxin or a radioactive contaminant, that can trigger emotional problems, as the gut serves as a sort of second emotional brain. A lot of how we feel every day has to do with our microbiome and our gut.
These cases suggest a lot of unanswered questions. A purpose of the book is to urge citizens to ask their leaders and public health officials to get more curious about the long-term effects of chronic low doses of radioactivity. We know a lot about high doses of radioactivity and human health, but researchers repeat that they know next to nothing about low doses. We know about high doses from the Hiroshima studies. We don't know about these low-dose effects because we have never really studied people who live in those conditions.
Robin Lindley: Your book serves as a call for further research. What were some of the environmental consequences of the disaster that struck you? You mention harm to animals and plants and even decreased pollination.
Professor Kate Brown: What was really striking was that when I went from the Ministry of Health records to the State Committee for Industrial Agriculture records, I saw that the people in the Soviet Union did their best to monitor food supplies and levels of radioactivity in soils, water, and air. And when they found high levels of radioactivity, they went in with bulldozers and they scraped away the topsoil and dumped it far away from the villages. And they scrubbed down surfaces and asphalt and buildings with chemical solvents to try to remove any radioactivity.
They could get these villages to a level of making them livable, but then they would come back two weeks later, and the radiation levels would be nearly as high. And they realized that radioactive isotopes could mimic minerals that plants and animals need to survive – isotopes that go from soils, air, and water into the plants, then into the animals. And then, because humans sustain themselves on plants and animals, they take in these materials and bring them to their villages. And as they go into the villages with their shoes or their tractor wheels, they bring in dust and dirt from the forest and the field. And all those contaminants gather in human population points.
So the exposures for humans were consistently from ingesting contaminants. Once you ingest radioactive isotopes, the natural biological barrier of your skin and your body no longer helps, and beta and alpha particles penetrate your skin. Once they're inside your body, they can do a lot of damage to lungs, hearts, various organs, and inside the joints. They wreak havoc on bone marrow.
But before these acute problems, people have subacute problems. And interestingly enough, we don't much care about those. Few journalists have asked me how many people had digestive tract disorders or respiratory problems. Mostly, they want to know more about cancers and deaths and birth defects—the acute problems. But subacute problems mount in a body. A person may have that one chronic disease, but maybe two or three subacute problems. A family would have several people with chronic health problems. They're still alive and not in the death toll, but their lives are shorter and far more painful. They are not able to be productive as members of the community in terms of work and a creative life.
None of us would wish this kind of medical history on our own families and communities. And that's I think something that we don't statistically track because we have failed to ask this question.
Robin Lindley: The damage to the immune system must be serious.
Professor Kate Brown: Yes.
Robin Lindley: One of the themes of your book is the international effort to minimize evidence about the medical and environmental consequences of the Chernobyl disaster. You also note that there's a long history, even in United States, of covering up problems with radiation illness. That goes back to the 1945 bombing of Hiroshima and Nagasaki when General Groves refused to disclose the effects of radiation. You recount a history of similar cover up efforts since then.
Professor Kate Brown: Unfortunately, we have a real track record in the United States of minimizing the record of radiation exposure and illness. We have a long-term life span study of the survivors of the Japan bombings, but that study doesn't take into account radioactive fallout. It estimates that the dose survivors received was that of one big x-ray that lasted a second. But the other exposures, the Chernobyl-like exposures, with people living in these environments who take in radioactive contaminants through the air in their lungs or through the food cycle, were never considered as part of that study. There were also exposures of Marshall Islanders and people from the Nevada test sites. We really don’t know what happened in those cases, for a lack of curiosity.
Finding out about radiation is a real threat [to nuclear power advocates]. In 1987, a group of health physicists, specialists in nuclear medicine, had a convention in Columbia, Maryland. A lawyer from the Department of Energy addressed them. He said that after Chernobyl, the biggest threat to the nuclear industry was lawsuits. He announced that they were going to break out into small groups with lawyers from the Department of Justice to train them on how to become expert witnesses in defending the US government against lawsuits. These scientists then served as “objective” witnesses in lawsuits where Americans took corporations to court for their exposures in the production and testing of nuclear weapons. It comes as no surprise that few won those lawsuits.
Other nuclear powers including Great Britain, France, and Russia, were facing similar lawsuits. If industry scientists could say that Chernobyl was the world's worst nuclear accident and only 33 or 54 people died, then those lawsuits could and indeed did go away.
Robin Lindley: When you were traveling through the Zone of Alienation and when you were finding information that a lot of people probably didn't want you to have, were you threatened? Did you feel that your safety was endangered at all?
Professor Kate Brown: No, I didn't. Since I published the book there have been a couple of people who are industry scientists and a guy who runs two pro-nuclear NGOs, and they make a living by promoting the nuclear industry. They’ve been attacking me and my book but they're not disinterested parties. Other than that, I haven't really endured any hardships.
Robin Lindley: I'm glad. With the KGB involved and a series of cover ups, your book reads like a thriller. It’s a compelling scholarly exposé with popular appeal on what happened after Chernobyl.
Many witnesses you spoke with were women who were close to the ground level in areas of contamination – the sort of people you wanted to hear from who'd lived through this experience. They included doctors, teachers, and women who worked in the wool and leather factories, among others. Their contribution to your book was impressive.
Professor Kate Brown: Yes. Well, women are the ones who take care of the kinship networks. They're the ones who normally, especially in Soviet society, take care of family members when someone is sick. And women are also the ones who staffed hospitals. Being a doctor in the Soviet Union was a low-paying job usually left to women. Men were researchers who worked in institutes and universities. So, it was women who noticed these trends in poor health and they're the ones who are there in the book. The women were the ones sitting around in the waiting rooms, exchanging information there day after day for hours, and they started to see trends.
Robin Lindley: Thanks for that personal insight from your travels. What was the political fallout of the Chernobyl disaster in terms of the future of Mikhail Gorbachev and the fall of the Soviet Union?
Professor Kate Brown: Gorbachev said at some point after the fall that the main cause of the collapse of the Soviet Union was Chernobyl, but I'm not sure Gorbachev is the most reliable person to ask on this point because everybody else in the former Soviet Union blames him for the collapse. So it makes sense that he was looking for outside factors to deflect attention away from himself. But, as I worked through archives, I took note of the incredible resources that the Soviet government spent to try to deal with this disaster, from cleaning up this huge territory to putting a sarcophagus over the reactor itself. In Ukraine alone, they sent out 9,000 medical staff to look at everybody they could find who might have been exposed to contaminants. They dealt with the medical fallout, and set up studies of the ecology and the human health problems. And on and on.
Chernobyl was a huge drain at a time when the Soviet Union was experiencing a collapse in oil prices and oil exports, the main source of hard currency revenue. And so Chernobyl was certainly a confounding factor.
And then they kept this all under wraps and they weren’t honest with people. When, in 1989, they finally published the first maps of radioactivity showing the high levels of radioactivity in places where people had been living for three and a half years, residents were furious. People poured out into the streets in June 1989. There were marches and strikes and pilgrimages and protests, and new people started to run for office.
Robin Lindley: You recount some of the history of previous nuclear accidents in the Soviet Union. Wasn’t the Soviet military using nuclear weapons to stop forest fires, or is that story apocryphal?
Professor Kate Brown: The story I report in my book, which we have from archival sources and one eyewitness, was that there was a gas fire in a well when someone digging tapped into underground flows of gas, and that caused a fire in the ground. They couldn’t extinguish it. They tried for a year to put out the fire this way and that, and finally a team came from a closed military establishment in Russia. They dug down 200 meters, right next to the gas fire, and they dropped a nuclear bomb in there and blew it up, expecting to spill a big mound of dirt on top of the gas fire and just snuff it out. But what happened instead is that somehow the blast went sideways, and not onto the gas fire. And then the plume from the nuclear bomb went up through the gas well and made this huge column a mile into the sky from the explosion. And then fallout rained down. They had to evacuate Russian soldiers and villagers nearby. It was not far from Kharkiv.
That was in the 1970s, the same year that a nuclear explosion for civilian purposes became an experiment that went disastrously wrong. And there are lots of incidents like that.
They had 104 accidents at the Chernobyl plant in the five years before the big accident in 1986. It was a tottering enterprise to run. Lots can go wrong and lots apparently did.
Robin Lindley: What have you learned about the nuclear accident in Northern Russia this past August, where there was an explosion perhaps involving a missile experiment?
Professor Kate Brown: I only know what we all read in the papers. I've been reading a little bit in the Russian papers and they don’t have much more than what's in the English papers but this seems like a case of press the replay button from Chernobyl with denials that it happened. And then, seven Russian scientists died. That's significant. There’s a lot of secrecy about the situation. It doesn't appear to have created anywhere near the levels of radioactivity and fallout as Chernobyl. They were trying to develop some kind of weapon, but we don't know exactly what. There's some speculation about a weapon for a nuclear submarine or some kind of missile. So it's unclear.
Robin Lindley: Congratulations on your new role at MIT as a professor in an interdisciplinary program.
Professor Kate Brown: I’m teaching in a program in science, technology and society. At MIT we train future scientists and future engineers so it’s a wonderful place to think about science and to talk with students about not just creating beautiful machines, but also thinking about how they will be used in the worst and the best of all possible situations. It’s an exceptional opportunity.
Robin Lindley: Are you continuing your research on Russia and on nuclear issues?
Professor Kate Brown: No. I thought I'd move on from that. I feel like I’d just start repeating myself. Now I'm interested in what I call “plant people.” Western scientists have now validated the notion that plants have distributed intelligence and communicate with one another and across species, and I thought to myself, peasants have known that for hundreds of years. So I am going back in time and looking at people who had these insights. I want to know what else they have known that we have missed.
Robin Lindley: I’d like to conclude by asking you how you decided on the title of your book about the aftermath of Chernobyl: Manual for Survival: A Chernobyl Guide to the Future?
Professor Kate Brown: I found in the archives all kinds of [post Chernobyl disaster] manuals for how to live on a radioactive landscape. There was a manual for doctors who treated exposed patients, another manual for the meat packing industry, and one on how to deal with high and low levels of radioactive farm crops, and manuals for the dairy industry, for the leather industry, and for the wool industry, and manuals for farmers who were going to live here.
The manuals were to reassure citizens, and said we've checked the radiation in your population point and everything's fine. No need to worry. There are just a few things you need to keep in mind. Take all your topsoil and remove it and bury it somewhere far from your village. Don't eat any berries or mushrooms. In fact, it’s better not to enter the forest at all. They go on and on like that. Clearly everything was not fine.
That's where I got the idea of the manual. I decided to call it Manual for Survival because I considered the people who lived there to be survival experts. This place had suffered through the revolution, the Russian civil war, the First World War, the Second World War, and famine, and purges. They'd seen it all. And then, they tried to make it better by building a nuclear power plant to bring cheap energy to the villages. And then it blew up.
So they'd seen all the calamities the twentieth century had to offer. And I thought, as we talk about coming threats because of the ecological crisis, that it might be good to know something about how to survive a severe ecological crisis. And so that's what I was looking for: the everyday heroes.
I did find lots and lots of people who resisted the bosses who told them to fudge the numbers or to overlook troubling facts or not report radioactivity in the water or the land. And these people stood up to power and said, No, I'm not going to do that. I don't care what you do to me, but I'm going to do what's right. And I found that extremely inspiring.
Nobody got shot and they weren't throwing people in jail for resistance. Some people got docked in pay and other people faced more demands on the job or were demoted. But they continued. So it was possible to be courageous and they actually did a great deal of good. I was purposely looking for that story.
Robin Lindley: Thank you for your thoughtful responses Professor Brown and congratulations on your new book and your new position at MIT. I wish you the best.
|
55680b747d98cde230f10f219c8914f9 | https://historynewsnetwork.org/article/173570 | Berlin Didn’t Want a Reagan Statue—but It’s Getting One Anyway | Berlin Didn’t Want a Reagan Statue—but It’s Getting One Anyway
Ronald Reagan loved Berlin. For the German capital, things are more complicated.
For years, the city’s successive governments have resisted gentle prodding by U.S. dignitaries to erect a statue of the late president, who famously called in a 1987 speech here on his Soviet counterpart Mikhail Gorbachev to tear down the Wall that had divided the city since 1961.
|
ebe7beeedacc074885fad72f1813a3fc | https://historynewsnetwork.org/article/173710 | Fourth Spy Unearthed in U.S. Atomic Bomb Project | Fourth Spy Unearthed in U.S. Atomic Bomb Project
The world’s first atomic bomb was detonated on July 16, 1945, in the New Mexican desert — a result of a highly secretive effort code-named the Manhattan Project, whose nerve center lay nearby in Los Alamos. Just 49 months later, the Soviets detonated a nearly identical device in Central Asia, and Washington’s monopoly on nuclear arms abruptly ended.
How Moscow managed to make such quick progress has long fascinated scientists, federal agents and historians. The work of three spies eventually came to light. Now atomic sleuths have found a fourth. Oscar Seborer, like the other spies, worked at wartime Los Alamos, a remote site ringed by tall fences and armed guards. Mr. Seborer nonetheless managed to pass sensitive information about the design of the American weapon to Soviet agents.
The spy fled to the Soviet Union some years later; the F.B.I. eventually learned of his defection and the espionage but kept the information secret.
His role “has remained hidden for 70 years,” write Harvey Klehr and John Earl Haynes in the current issue of Studies in Intelligence, the C.I.A.’s in-house journal; their article is titled “On the Trail of a Fourth Soviet Spy at Los Alamos.” In separate interviews, the sleuths said they were still gathering clues regarding the exact character of Mr. Seborer’s atomic thefts.
Mr. Klehr is an emeritus professor of politics and history at Emory University, and Mr. Haynes is a former historian for the Library of Congress. Both have written books on Soviet spies and American communism, often together. Their tale has an eerie resonance at a time when Russian intelligence agencies are again at the center of American life.
|
c2d7fbc6ca6f3d4614f71e0c24361e13 | https://historynewsnetwork.org/article/173732 | William Barr’s Upside-Down Constitution | William Barr’s Upside-Down Constitution
Attorney General William Barr’s November 15 speech before the Federalist Society, delivered at its annual National Lawyers Convention, received considerable attention. Barr attacked what he views as progressives’ unscrupulous and relentless attacks on President Trump and Senate Democrats’ “abuse of the advice-and-consent process.” Ironies notwithstanding, the core analysis of his speech is a full-throated defense of the Unitary theory of executive power, which purports to be an Originalist view of the Founders’ intent.
This defense, however, reveals the two fundamental flaws of the Unitary view: first, that it is built on a fictional reading of constitutional design; and second, that its precepts attack the fundamental tenets of the checks and balances system that the Founders did create.
Barr’s speech begins with his complaint that presidential power has been weakened in recent decades by the “steady encroachment” of executive powers by the other branches. Even allowing for congressional resurgence in the post-Watergate era of the 1970s, no sane analysis of the Reagan era forward could buttress Barr’s ahistorical claim. Ironically, the presidents in this time period who suffered political reversals—Bill Clinton’s impeachment and the thwarting of Barack Obama’s agenda by congressional Republicans in his final six years of office—nevertheless emerged from their terms with the office intact in powers and prestige.
Attorney General Barr’s reading of colonial history claims that the Founders’ chief antagonist during the Revolutionary period was not the British monarchy (which, he claims, had been “neutered” by this time) but an overbearing Parliament. Had Barr bothered to consult the definitive statement of American grievances, the Declaration of Independence, he would have found the document to direct virtually all of its ire against “the present King of Great Britain.” The lengthy list of grievances detailed in the document charges “He,” George III, with despotism and tyranny, not Parliament (some of whose members expressed sympathy for the American cause). Barr’s message? Legislatures bad, executives not so much.
Barr insists that by the time of the Constitutional Convention there was “general agreement” on the nature of executive power and that those powers conformed to the Unitary vision—complete and exclusive control over the Executive branch, foreign policy preeminence, and no sharing of powers among the branches. Barr dismisses the idea of inter-branch power-sharing as “mushy thinking.” Yet the essence of checks and balances is power-sharing. As the political scientist Richard Neustadt once noted, the Founders did not create separate institutions with separate powers, but “separate institutions sharing powers.”
And as if to reassure himself and other adherents, Barr insists that the Unitary view is neither “new”—even though it was cooked up in the 1980s by the Meese Justice Department and the Federalist Society—nor a “theory.” Barr says, “It is a description of what the Framers unquestionably did in Article II of the Constitution.” Yet aside from the veto power, he fails to discuss any actual Article II powers. And in the case of the veto, he fails to note that this power is found in Article I, and is generally understood as a legislative power exercised by the executive. Shouldn’t an Originalist take a passing interest in original text? Nor does he explain why Article II is brief and vague, compared to Congress’s lengthy and detailed Article I powers. What we know about that brevity and vagueness is that it reflected two facts: the Founders’ difficulty and disagreement in defining presidential powers, and the wish of a few Founders who favored a strong executive to leave that door open, hoping that future presidents might help solidify the office. That wish, of course, came true.
Most of the latter part of Barr’s speech is devoted to a condemnation of the judiciary, which has not only set itself up as the “ultimate arbiter” of interbranch disputes but, worse, has “usurped Presidential authority” by the very act of hearing cases and ruling against asserted presidential powers. Underlying these complaints is the Unitary tenet that the courts have no right to rule in any area of claimed executive power. Barr vents his frustration at the extent to which Trump administration decisions and actions have found themselves tied up in court. Experts continue to debate what issues and controversies are or are not justiciable. But to assert by Unitary theory fiat that the courts cannot rule is to make an assertion found nowhere in the Constitution. And Barr also misses the fact that court rulings historically have most often favored executive powers.
The Trump administration’s many questionable actions have raised both new and old concerns about the extent and reach of executive power. There is plenty of blame for abuses of power to spread around, most certainly including to Congress. But the Unitary theory offers no remedy to the power problems of the present era. And the idea that it somehow is an Originalist reading of constitutional powers would be laughable if it didn’t have so many adherents in the seats of power.
|
4bfa727e2303a006d2ef599692472888 | https://historynewsnetwork.org/article/173928 | Americans Are Ready for a Different Approach to Nuclear Weapons | Americans Are Ready for a Different Approach to Nuclear Weapons
Although today’s public protests against nuclear weapons can’t compare to the major antinuclear upheavals of past decades, there are clear indications that most Americans reject the Trump administration’s nuclear weapons policies.
Since entering office in 2017, the Trump administration has withdrawn the United States from the nuclear agreement with Iran, scrapped the Intermediate-Range Nuclear Forces (INF) Treaty with Russia, and apparently abandoned plans to renew the New START Treaty with Russia. After an overwhelming majority of the world’s nations agreed on a landmark UN Treaty on the Prohibition of Nuclear Weapons in July 2017, the Trump administration quickly announced that it would never sign the treaty. The only nuclear arms control measure that the Trump administration has pursued―an agreement by North Korea to abandon its nuclear weapons program―appears to have collapsed, at least in part because the Trump administration badly mishandled the negotiations.
Moreover, the Trump administration has not only failed to follow the nuclear arms control and disarmament policies of its Democratic and Republican predecessors, but has plunged into a renewed nuclear arms race with other nations by championing a $1.7 trillion program to refurbish the entire U.S. nuclear weapons complex. Perhaps most alarming, it has again and again publicly threatened to initiate a nuclear war.
These policies are quite out of line with U.S. public opinion.
Polling Americans in July 2018 about Trump’s withdrawal of the United States from the Iran nuclear agreement, the Chicago Council on Global Affairs found that 66 percent of respondents preferred remaining within it. In February 2019, when the Chicago Council surveyed Americans about U.S. withdrawal from the INF Treaty, 54 percent opposed the action. Moreover, when Americans were presented with arguments for and against withdrawal, opposition to withdrawal rose to 66 percent.
The Center for International & Security Studies at the University of Maryland also reported overwhelming public support for nuclear arms control and disarmament agreements. Polling Americans in early 2019, the Center found that two-thirds of respondents (including a majority of Republicans) favored remaining within the INF Treaty, while eight out of ten respondents wanted the U.S. government to extend the New START Treaty. Indeed, more than eight out of ten U.S. respondents backed new nuclear arms control treaties with Russia―findings similar to those of the Chicago Council, which reported that 87 percent of American respondents to a poll in early 2019 wanted the United States and Russia to secure a further nuclear arms limitation agreement.
But just how much arms control and disarmament do Americans want? It might come as a shock to the many pundits in the mass media who have never mentioned the 2017 UN Treaty on the Prohibition of Nuclear Weapons, but roughly half the U.S. population supports nuclear abolition along the lines of the treaty. According to a YouGov opinion survey done in late September 2019, 49 percent of American respondents thought the United States should work with other nations to eliminate all nuclear weapons in the world. Only 32 percent disagreed, while 19 percent said they didn’t know.
When it comes to actual use of nuclear weapons, Americans are even clearer in their preferences. A YouGov/Huffington Post poll in August 2016 found that 67 percent of American respondents thought the U.S. government should never initiate a nuclear attack. In mid-2019, Zogby Analytics surveys of American respondents in key primary states also discovered very high levels of opposition to first use of nuclear weapons.
Not surprisingly, Donald Trump’s angry, impulsive behavior, coupled with his threats to launch nuclear attacks upon other nations, has left many Americans uneasy. This might help to explain why 68 percent of Americans surveyed in early 2019 by the Center for International & Security Studies backed congressional legislation requiring that a president, before ordering a nuclear attack upon another nation, consult with Congress and secure a congressional declaration of war upon that nation. As the U.S. Congress has not passed a declaration of war since 1941, this opinion, too, provides a substantial challenge to current U.S. nuclear policy.
There are other indications, as well, that the American public wants a new approach. In July 2019, the U.S. Conference of Mayors, at its 87th annual meeting, unanimously passed a resolution calling on all U.S. presidential candidates “to pledge U.S. global leadership in preventing nuclear war, returning to diplomacy, and negotiating the elimination of nuclear weapons.” Calling for negotiations to replace the INF Treaty and to extend or replace the New START Treaty, the resolution demanded that candidates support the Treaty on the Prohibition of Nuclear Weapons and renounce the option of first use of nuclear weapons.
Yet another sign of public discontent is the emerging Back from the Brink campaign, supported by numerous peace, environmental, religious, health, and other organizations. Endorsed by dozens of cities and towns across the country, it has also received the official backing of the state legislatures of California and Oregon, as well as of the New Jersey State Assembly and the Maine State Senate. The campaign calls on the U.S. government to “lead a global effort to prevent nuclear war” by: “renouncing the option of using nuclear weapons first”; “ending the sole, unchecked authority of any U.S. president to launch a nuclear attack”; “taking U.S. nuclear weapons off hair-trigger alert”; “cancelling the plan to replace its entire nuclear arsenal with enhanced weapons”; and “actively pursuing a verifiable agreement among nuclear-armed states to eliminate their nuclear arsenals.”
Looked at from the standpoint of most Americans and, indeed, survival in the nuclear age, this departure from the dangerous direction of U.S. nuclear policy makes a lot of sense. Looked at from the standpoint of candidates seeking election to national office, it would also make good politics.
|
b754a63177b6cb19775ff0e3a1eadca3 | https://historynewsnetwork.org/article/173977 | Iran Air Flight 655: Iran’s president invokes 1988 tragedy many Americans have forgotten | Iran Air Flight 655: Iran’s president invokes 1988 tragedy many Americans have forgotten
On Saturday, President Trump invoked history when tweeting out a threat to destroy “52 Iranian sites … some at a very high level & important to Iran & the Iranian culture.” He said the potential targets represent the 52 Americans who were held hostage there for 444 days from 1979 to 1981.
On Monday, Iranian President Hassan Rouhani invoked history right back in response to Trump’s threat.
His hashtag “#IR655” refers to Iran Air Flight 655, a commercial jet shot down by the U.S. military by mistake on July 3, 1988, killing all 290 civilians and crew on board, including 66 children.
Although the incident is nearly forgotten in the United States, it is etched deeply in memory in Iran, where the country is mourning the U.S. airstrike that killed Iranian military commander Qasem Soleimani.
In 1988, the long war between Iraq and Iran was close to ending. At the time, the United States supported Iraq and its leader Saddam Hussein in their fight against Iran. U.S. Navy ships patrolled the Persian Gulf to protect shipping routes.
On the morning of July 3, the cruiser USS Vincennes was engaged in a skirmish with Iranian gunboats in the Strait of Hormuz. Not far away, in the coastal city of Bandar Abbas, an Iran Air commercial jet took off for a routine flight to Dubai. This flight was frequently packed with weekend shoppers going to Dubai for jewelry and electronics, The Washington Post’s Valerie Strauss reported.
|
d3eedd88eeb4f9aa3842077a2ecf1fee | https://historynewsnetwork.org/article/174090 | Was Martin Luther King Jr. a Republican or a Democrat? The Answer Is Complicated | Was Martin Luther King Jr. a Republican or a Democrat? The Answer Is Complicated
Martin Luther King Jr.’s influence on American politics and his views about policy issues are a perennial topic of discussion around the time of his January 15 birthday and the Martin Luther King Jr. Day federal holiday. However, the civil-rights leader’s personal political party affiliation remains a mystery.
His niece Alveda King, an Evangelical supporter of President Donald Trump, has argued that her uncle was a Republican, like his father Martin Luther King, Sr., who was also a Baptist minister. That idea has been repeated often, but videos that claim to show that Martin Luther King, Jr. was a Republican have been proven not to do so. King’s son Martin Luther King III said in 2008 that it’s “disingenuous” to insist he was when there is no evidence of him casting a Republican vote. “It is even more outrageous to suggest that he would support the Republican Party of today,” the younger King added, “which has spent so much time and effort trying to suppress African American votes in Florida and many other states.”
The idea that King would have been a registered Republican is not far-fetched, given the party’s history and its position in national politics in the 1950s, but scholars and those who knew him best say they can’t imagine that he would have supported Republican presidential candidates in the 1960s. In fact, King himself said he voted for Democrat Lyndon B. Johnson for President in 1964.
“I know of no one who has verified MLKJ’s party registration,” says Clayborne Carson, editor of King’s autobiography and Professor of History and Founding Director of The Martin Luther King, Jr., Research and Education Institute at Stanford University. “[He] may have been registered as a Republican and voted Democratic [in national elections].”
|
22f10d7ad2f285b8fa30c82d669388f7 | https://historynewsnetwork.org/article/174106 | The History of Presidential Misconduct Beyond Watergate and Iran-Contra | The History of Presidential Misconduct Beyond Watergate and Iran-Contra
In 1974, James Banner was one of 14 historians tasked with creating a report on presidential misconduct and how presidents and their families responded to the charges. The resulting report was a chronicle of presidential misconduct from George Washington to Lyndon B. Johnson. It documented moments of corruption episode by episode, with no connective tissue between administrations.
The historians delivered the report in eight weeks and it was prepared for distribution to the House Judiciary Committee. But then Richard Nixon resigned and the hearings it was designed for never happened. The report was published as a book but it got very little attention. Very few American historians even heard of it. “And thus it lay,” stated Banner.
Then, in August 2018 historian Jill Lepore called Banner and asked him about the book. Afterwards, Lepore wrote about the 1974 report in the New Yorker and the press and surrounding political events ignited interest in it. To bring the chronicle of presidential misconduct up to date, Banner identified and recruited seven historians to write new chapters so that a new version of the book would end with Barack Obama. Presidential Misconduct: From George Washington to Today was published by the New Press in July 2019.
At the American Historical Association’s annual meeting in early January, Banner moderated a panel with three of the book’s new authors: Kathryn Olmstead, Kevin M. Kruse, and Jeremi Suri. Examining the presidencies of Richard Nixon, Jimmy Carter, and Ronald Reagan, the historians discussed how the recent past shapes the current discussion of presidential misconduct.
Kathryn Olmstead examined presidential misconduct beyond Watergate in the Richard Nixon administration and argued abuse of power and law-breaking were central to Nixon’s presidency.
Dr. Olmstead urged historians to remember just how unusual Nixon was. Popular accounts of Watergate often minimize the crimes and focus on the subsequent cover-up, but Nixon’s crimes were substantial and began before he was elected.
During the 1968 election, President Lyndon B. Johnson announced that if the North Vietnamese made certain concessions, LBJ would halt bombing campaigns and begin negotiations with the North Vietnamese. Nixon publicly agreed with LBJ’s stance, but privately he took action to sabotage the plan. Nixon enlisted Anna Chennault, a prominent Republican fund-raiser, as a go-between to encourage South Vietnam not to accept the negotiations. When North Vietnam signaled it would make the necessary concessions, many thought the war would end, and Hubert Humphrey, the Democratic candidate for president, started to do better in the polls. Then, the South Vietnamese indicated that the concessions would not be sufficient for them to negotiate.
LBJ knew Nixon influenced the potential negotiations because he instructed the FBI to listen to Nixon’s phone calls. LBJ believed Nixon’s actions amounted to treason but he did not have definitive proof to show Nixon knew about the entire operation so LBJ did not reveal the information publicly.
It is likely the Chennault Affair contributed to the paranoia that eventually led to the Watergate break-in. F.B.I. Director J. Edgar Hoover informed Nixon that LBJ knew of Nixon’s role in sabotaging negotiations with North Vietnam and Nixon became obsessed with the idea that the Democrats had information that could hurt him.
Once in office, Nixon’s illegal behavior snowballed. Nixon authorized secret bombings of Cambodia, warrantless wiretaps on news reporters, and created the infamous “Plumbers.” The Committee to Reelect the President raised 20 million dollars--much of it acquired through bribery and extortion--and then used the funds for massive harassment and surveillance of Democrats during the 1972 election.
Thus, illegal behavior was central to Nixon’s conception of the presidency. Nixon himself explained this to David Frost in 1977. Nixon insisted he should have destroyed the tapes and maintained that “when a president does it that means it’s not illegal.”
Princeton historian Kevin Kruse discussed the presidency of Jimmy Carter. While Carter had a pronounced commitment to ethical government, a closer look at Carter’s presidency shows that even those who tried to meet anticorruption standards could be brought low by their efforts.
Even before Watergate, Carter presented himself as a political outsider. After Watergate, Carter capitalized on American concerns about a lack of morals in politics. Carter wanted to seem as trustworthy as possible on the campaign and after he was elected he worked to maintain the public’s trust. Carter famously put his peanut farm into a blind trust and the White House implemented stricter rules regarding conflicts of interest and financial disclosure.
Because of these promises, Carter’s family came under intense scrutiny and the constant hunt for dirt hurt the administration.
Carter asked Bert Lance, the incoming director of the Office of Management and Budget, to put his stocks in a blind trust. Lance agreed, but then the stocks began to plummet, so Lance delayed doing it. In response, a Senate committee and the Comptroller General opened investigations into Lance. The investigations revealed sloppiness but concluded there was no wrongdoing. Nonetheless, Carter’s administration was consumed by the Lance investigation, and Carter stuck by his friend despite his staff’s recommendation to fire Lance. Finally, in the fall of 1977, Carter pressed Lance to resign, and in retrospect Carter realized he should have done so sooner.
Next, a scandal emerged centered on Carter’s peanut warehouse, which had been put in a blind trust. As the business had fallen on hard times, it was revealed that Bert Lance had once given a loan to the warehouse. Many were concerned that the funds were diverted to Carter’s campaign. A team of investigators reviewed 80,000 documents and Carter even gave a sworn deposition (the first time a president was interviewed under oath in a criminal investigation). The investigation concluded there was no evidence that funds had been diverted.
The third scandal of Carter’s administration centered on Billy Carter. Billy ran a gas station and when the business started to struggle, Billy tried to capitalize on his brother’s fame. In September 1978, Billy took a trip to Libya seeking to make a deal with Muammar Gaddafi. While there, Billy made anti-Semitic comments and urinated on the airport tarmac. This all caused a great deal of embarrassment for Jimmy Carter. Worse, it was soon revealed that Billy had received hundreds of thousands of dollars in loans from Libya. The scandal was dubbed Billygate. After a Senate investigation, a bipartisan report concluded Billy had not done anything illegal.
These sloppy practices each invited close investigation, but in each case officials concluded the acts were not criminal. Nonetheless, these scandals overshadowed much of Carter’s presidency and demonstrate that Carter’s actions never lived up to his high-minded intentions.
Jeremi Suri, a historian at the University of Texas at Austin, discussed presidential scandal during the Ronald Reagan administration. Suri noted that he was surprised at how little attention historians have paid to presidential misconduct, likely because historians like to stay away from scandal and research more “serious” events. Suri, however, thinks that misconduct was central to policy for Reagan.
Scandal under Reagan is a paradox because Reagan personally was not corrupt—he did not personally get money from misconduct—and was averse to discussions he thought were unseemly. Nonetheless, because of his personal qualities, Reagan’s policies were built on a pyramid of misconduct or a “cocktail of corruption” centered on the intersection of deregulation, ideological and at times religious zealotry, lavish resources, and personal isolation from the daily uses of those resources.
In other words, the institutional structure of the executive branch created incentives for corrupt behavior. Strikingly, over 100 members of the Reagan administration were prosecuted and 130 billion dollars were diverted from taxpayer uses.
Dr. Suri focused on a few particular scandals, starting with the Environmental Protection Agency scandal. Reagan appointed Anne Gorsuch, the mother of Supreme Court Justice Neil Gorsuch, as the administrator of the EPA. Gorsuch directly negotiated contracts with land developers, and when she was investigated, Gorsuch refused to turn over documents or testify and was held in contempt of Congress. Her deputy served two years in prison.
Secretary of the Interior James Watt was forced to resign after he made explicitly racist comments. After resigning, Watt used his connections to work with the Department of Housing and Urban Development and lobby for his friends to get contracts to build affordable housing that wasn’t actually affordable. Watt was indicted on 25 felony counts.
As Reagan approached his second term, many advisors resigned and became lobbyists, flagrantly going past the legal limitations on lobbying. Those who continued to work in the White House continued the corruption that plagued the administration in its first term.
Attorney General Edwin Meese combined petty corruption with the gargantuan. Meese would try to get reimbursed twice for the same expenses. He took out personal loans from people who were bidding for government contracts, did not disclose the loans, and then lobbied for the lenders to get the contracts. Meese was not convicted but resigned.
Assistant Secretary of the Navy Melvin Paisley stole 622 million dollars from the government. The F.B.I. concluded this was a consequence of large-scale appropriations with insufficient oversight.
Suri argued that the Savings and Loans Crisis and the Iran Contra scandal emerged from an administrative culture that was unregulated, permissive in the misuse of resources, and lavish in spending. He concluded that this structural corruption has not gone away.
To conclude the panel, James Banner gave a thoughtful comment that connected the history discussed to the present impeachment of President Donald Trump. Nixon was a pioneer in orchestrating misconduct from the Oval Office. Reagan pioneered allowing a shadow administration to implement policies that were not approved by Congress. To Banner, it seems that the Trump administration is doing both of these things at the same time.
|
d34cf2586a655e1de8210ea564645901 | https://historynewsnetwork.org/article/174142 | Could the Climate Crisis Be “The Good News of Damnation”? | Could the Climate Crisis Be “The Good News of Damnation”?
On August 12, 1945, six days after the U.S. government obliterated the city of Hiroshima with a single atomic bomb, Robert Hutchins, the president of the University of Chicago, delivered a remarkable public address. Speaking on his weekly radio program, the Chicago Roundtable, Hutchins observed that Leon Bloy, a French philosopher, had referred to “the good news of damnation” under the assumption that only the fear of perpetual hellfire would motivate moral behavior. “It may be,” Hutchins remarked, “that the atomic bomb is the good news of damnation, that it may frighten us into that Christian character and those righteous actions and those positive political steps necessary to the creation of a world society.”
According to Hutchins, this world society would serve as the foundation of a world government, and, in the context of the existential danger posed by nuclear war, he was totally committed to creating it. “Up to last Monday,” he said, “I didn’t have much hope for a world state.” But the shock of the atomic bombing, he added, crystallized “the necessity of a world organization.”
In the following months, Hutchins created and then presided over a Committee to Frame a World Constitution―a group of farsighted intellectuals who conducted discussions on how best to overcome humanity’s ancient divisions and thereby move beyond nationalism to a humane and effective system of global governance. In 1948, they issued a Preliminary Draft of a World Constitution, with a Preamble declaring that, to secure human advancement, peace, and justice, “the age of nations must end and the era of humanity begin.”
The Chicago committee constituted but a small part of a surprisingly large and influential world government movement that, drawing on the slogan “One World or None,” flourished during the late 1940s. In the United States, the largest of the new organizations, United World Federalists, claimed 46,775 members and 720 chapters by mid-1949. The goal of creating a world federation was endorsed by 45 major national organizations, including the National Grange, the General Federation of Women’s Clubs, the United Auto Workers, the Junior Chamber of Commerce, the Young Democrats, the Young Republicans, and numerous religious bodies. That year, 20 state legislatures passed resolutions endorsing world government, while 111 members of the House of Representatives and 21 Senators sponsored a congressional resolution declaring that the new United Nations should be transformed into “a world federation.” Much the same kind of uprising occurred in nations around the world.
Although this popular crusade waned with the intensification of the Cold War, as did the hopes for a sweeping transformation of the nation-state system, the movement did secure a number of vital changes in the international order. Not only did the United Nations begin playing an important part in global peace and justice efforts, but the original impetus for the world government movement―the existential danger of nuclear war―began to be addressed by world society.
Indeed, a massive, transnational nuclear disarmament movement, often led by former activists in the world government campaign, emerged and rallied people all around the planet. In this fashion, it placed enormous pressure upon the world’s governments to back away from the brink of catastrophe. By the mid-1990s, national governments had reluctantly agreed to a sweeping array of international nuclear arms control and disarmament treaties and were no longer threatening to plunge the world into a nuclear holocaust.
More recently, however, that world society has been crumbling thanks to a dangerous return of nationalism. From the United States to Russia, from India to Brazil, numerous countries have been swept up in xenophobia, triggering not only a disastrous revival of the nuclear arms race, but an inability to work together to challenge the latest existential threat to human survival: climate change. Championing their own narrow national interests―often based on little more than enhancing the profits of their fossil fuel industries―these nations have either torn loose from the limited international environmental agreements of the past or, at best, shown their unwillingness to take the more significant steps necessary to address the crisis.
And a crisis it is. With the polar ice caps melting, sea levels rising, whole continents (such as Australia) in flames, agriculture collapsing, and storms of unprecedented ferocity wreaking havoc, climate catastrophe is no longer a prediction, but a reality.
What can be done about it?
Clearly, just as in the case of heading off nuclear annihilation, no single nation can tackle the problem on its own. Even if a small country like the Netherlands, or a large country like the United States, managed to quickly develop a system of 100 percent renewable energy, that action would be insufficient, for other countries would still be generating more than enough greenhouse gases to destroy the planet.
So there really is no other solution to the onrushing climate catastrophe than for people and nations to forget their tribal animosities and start behaving as part of a world society, bound together by an effective system of global governance. The climate crisis, like the prospect of nuclear annihilation, really is “the good news of damnation.” And we can only overcome it by working together.
One world or none!
|
55f7d85b76ee265be93faafeb1d6feb7 | https://historynewsnetwork.org/article/174199 | Historians Struggle to Understand Oral History Written in Forgotten Shorthand | Historians Struggle to Understand Oral History Written in Forgotten Shorthand
Scholars at a Utah university are trying to unlock a mystery after discovering a nearly 70-year-old transcript of an interview with a notorious brothel owner that is written in a shorthand style that few people can read today.
The interview was with madam Rossette Duccinni Davie, who ran the Rose Rooms brothel in Ogden with her husband in the 1940s and 1950s. Today, the location is home to the nightclub Alleged, the Standard-Examiner reported.
The interview with former Standard-Examiner reporter Bert Strand was hidden inside a box of 1970s photos from the newspaper, said Sarah Langsdon, head of the Weber State University’s special collections.
The pages could be a treasure trove of material for historians in Ogden, a city of about 88,000 located 40 miles (64 kilometers) north of Salt Lake City.
But there’s a problem: The 1951 transcription is written in a decades-old shorthand style that few people use today. “It’s definitely a lost art,” Langsdon said.
|
27a1dcd30a51c847e0b2c887985a55b3 | https://historynewsnetwork.org/article/174213 | Tulsa plans to dig for suspected mass graves from a 1921 race massacre | Tulsa plans to dig for suspected mass graves from a 1921 race massacre
Nearly a century after a race massacre left as many as 300 people dead, the city plans to dig for suspected mass graves that may have been used to dispose of African American bodies.
Archaeologists searching for mass graves connected to the 1921 Tulsa Race Massacre agreed Monday night that Tulsa will conduct “limited excavations” in a city-owned cemetery to determine whether a “large anomaly” detected by ground-penetrating radar contains human remains.
“We do not propose to exhume any human remains during this phase of the investigation,” the committee of archaeologists and forensic anthropologists said. “However, any human remains that are uncovered during the excavation will be treated respectfully and with reverence.”
The excavation is set to begin in April.
The decision comes two months after a team of forensic archaeologists, led by the Oklahoma Archaeological Survey at the University of Oklahoma, revealed that they had found “possible common graves” at two sites in Tulsa. They identified the sites as the Canes, located on a bluff along the Arkansas River near Highway 75, and the Sexton area of Oaklawn Cemetery, which is a few blocks from Greenwood, the black community that was destroyed during one of the worst episodes of racial violence in U.S. history.
|
4e9fac76cecdd9e698b131fc3ea5c6b0 | https://historynewsnetwork.org/article/174256 | Re-Animating the 1619 Project: Teachable Moments Not Turf Wars | Re-Animating the 1619 Project: Teachable Moments Not Turf Wars
Who wins when distinguished historians, all white, pick fights over the history of slavery with prominent New York Times journalists, all black, who developed the newspaper’s 1619 Project? Beginning last year, a stream of well-known scholars has been objecting publicly to the journalists’ contention that slavery, white racism and African American resistance so fundamentally define the American past that they act as our history’s (so to speak) prime mover. The historian/critics’ basic reply: Wrong. It’s more complicated!
One of these scholars, Sean Wilentz, quite recently detailed his objections in the January 20, 2020 issue of The Atlantic. At about the same time a dozen additional dissidents published their critiques (to which the 1619 Editors responded) in the History News Network. Meanwhile, out-front bigots like Newt Gingrich, Tucker Carlson, Michael Savage and Rush Limbaugh hijacked the controversy. They have captured the headlines, dominated the news cycle and-- as they would have it-- taken home the trophy and delivered it to Donald Trump. The New York Times journalists have emerged with collateral damage and the historians as unmitigated losers in the court of public opinion. Lost as well was a rare opportunity for a substantial evaluation of slavery’s role in shaping our shared American experience.
But here’s what’s most important. Those of us who value the 1619 Project can reclaim our “teachable moment” by excavating beneath the heated rhetoric. There we will discover that the journalists and the historians embrace conflicting but equally valuable historical truths regarding slavery’s power to shape our nation’s past and present. I will soon articulate why this is so and what we can learn as a result.
First, however, we must move beyond the conflict that erupted when Wilentz, joined by James M. McPherson, Gordon Wood, James Oakes, and Victoria Bynum, eminent scholars all, forgot that they also have an obligation to serve us as educators, not as censors. By so harshly attacking the credibility of the 1619 Project in their letter to The New York Times, they squandered the “teachable moment” that the Project itself intended to create. Instead, these scholars appointed themselves gatekeepers charged with the heavy enforcement of their personal versions of high academic “standards.”
Instead of constructively dissenting and inviting dialogue, they berated the 1619 journalists for pushing “politically correct” distortions grounded in Afro-centric bias. “The displacement of historical understanding by ideology” is how one of them phrased it. They demanded retractions, worked assiduously (and failed) to recruit scholars of color to their cause, and sent their complaints directly to the top three editors of the Times and its publisher, A.G. Sulzberger. That looks a lot like bullying. Dialogue dies when one contending party publicly attempts to undercut the other with his/her bosses.
The historians, however, were not alone in criticizing the 1619 Project. Newt Gingrich proclaimed that “The NYT 1619 Project should make its slogan ‘All the Propaganda we want to brainwash you with.’” Ted Cruz fulminated that “There was a time when journalists covered ‘news.’ The NYT has given up on even pretending anymore. Today, they are Pravda, a propaganda outlet by liberals, for liberals.” The Trumpite commentators who had been attacking the 1619 Project since last August seized the distinguished historians’ arguments and repeated them on FOX News. Erick Erickson’s reactionary website, The Resurgent, freely appropriated them (without acknowledgement). Though the Times has defended the Project’s integrity, while other media outlets have highlighted the general controversy and The Atlantic has published Wilentz’s academic critique, the Trumpistas have hijacked the conversation.
So thanks to the triumph of Team Bigotry we have yet to discover what the historians, the journalists and the 1619 Project more generally can actually teach us. But we can make a strong start by reflecting on the contrasting points of view signaled by these titles: John Hope Franklin’s From Slavery to Freedom and August Meier’s and Elliot Rudwick’s From Plantation to Ghetto. Back in the 1960s, when African American history was first establishing itself as a mainstream field of study, these two dominated the textbook market. Together they presented sharply differing alternatives for teaching about slavery and its legacies. Each is as influential today as it was back then.
Pick From Slavery to Freedom and you develop a history course around a text that foregrounds sustained activism that produced significant change, presumably for the better. Select Plantation to Ghetto and you prepare students for a sobering overview of racist continuity that has persisted across the centuries despite all the struggles against it. Martin Luther King perfectly captured the spirit of Franklin’s text when affirming that “The arc of history bends toward freedom.” Amiri Baraka (Leroi Jones) did exactly that for Meier’s and Rudwick’s text when lamenting the “ever changing same” of African American history.
Guided by both King and Baraka we hit bedrock. Their conflicting insights, partial though they are, carry equal measures of truth. Heeding King and the historian/critics who share his perspectives, let’s inquire: Was African American history replete with life-changing moments, epochal struggles, liberating ideologies, unexpected leaps ahead, and daring cross-racial collaborations? History’s reply is “of course.” Heeding the 1619 journalists who share Baraka’s perspective, let’s ask: Was African-American history determined by a white racism so intense that it repeatedly crushed aspirations, inflicted terrible violence, undercut democratic values, made victories pyrrhic and pulled leaps ahead rapidly backwards? The answer again is “of course.” By making these two affirmations we have finally identified a central question that can reanimate our “teachable moment.”
Imagine the Times journalists and an open-minded group of historian-critics debating this question: “To What Degree Has the Arc of African American History Bent toward Freedom?” Also imagine them pursuing this discussion in a spirit of informed collegiality in, say, a nationally televised PBS forum. And since we are in the business of reanimating, let’s finally imagine that we have summoned the ideal Moderator/Commentator: a journalist who made history, and who did more than any other survivor of slavery to educate Americans about the depth of white racism and black resistance. Frederick Douglass. Can you imagine a more “teachable moment?”
This scene, fanciful as it is, amply suggests what should have taken place had the historian-critics chosen to be teachers, not gatekeepers. It also provokes us to realize that we can searchingly interrogate our nation’s complex chronicle of racial injustice while acknowledging the areas in which we have made palpable progress. Opportunity still awaits us to snatch back the trophy from Team Bigotry and push back together, journalists and historians alike, against our own rising tide of white supremacy.
Editor's note: The History News Network is attempting to create a forum to discuss how historians can invite the public to learn and reflect on the pain and paradoxes of African American history and how journalists and historians might learn from and collaborate with one another in addressing this history. We hope to update readers soon.
|
a8a7803aea144e0aaf650be38406dd72 | https://historynewsnetwork.org/article/174263 | These 19 black women fought for voting rights | These 19 black women fought for voting rights
August 18, 2020 marks 100 years since the ratification of the 19th Amendment guaranteeing all American women “suffrage,” or the right to vote. The dominant narrative about the women’s suffrage movement is framed through the experiences of white women (and to some extent, abolitionist Frederick Douglass, a noted and outspoken supporter of women's rights). But African-American women played a major role in obtaining the right to vote even though many of them would not truly enjoy the right themselves to the same extent until decades later.
In 1872, Susan B. Anthony attempted to vote in the presidential election and was arrested and tried in Rochester, New York. In Battle Creek, Michigan, Sojourner Truth demanded a ballot and was turned away. The suffrage movement was in full swing.
Women’s rights activists like Elizabeth Cady Stanton and Susan B. Anthony, who championed gender equity, didn't feel the same about race. While many white suffragists worked to help eradicate the institution of slavery, they did not work to ensure that former slaves would have citizenship or voting rights.
“Black women were not accounted for in white women’s push for suffrage. Their fight wasn’t about women writ large. It was about white women obtaining power – the same power as their husbands, black women and black men be damned,” says Howard University Assistant Professor Jennifer D. Williams.
...
“There was a concerted effort by white women suffragists to create boundaries towards black women working in the movement,” says historian and author Michelle Duster. “White women were more concerned with having the same power as their husbands, while black women saw the vote as a means to improving their conditions."
|
b01100f7b99a60f3849d0c7c58f0406a | https://historynewsnetwork.org/article/174269 | Chernobyl disaster worse than we were told, historian Kate Brown says | Chernobyl disaster worse than we were told, historian Kate Brown says
The 1986 Chernobyl nuclear plant explosion may seem far away in time and place to Montanans, but radioactivity from that disaster and from decades of nuclear bomb tests can be traced to every American and we still don’t really know the health consequences.
That was one of the messages from Kate Brown, a science historian from MIT, who spoke to a packed crowd Thursday at the Museum of the Rockies as part of Montana State University’s Science Matters lecture series.
“All humans have radioactivity in their bodies,” Brown said. She is the author of “Manual for Survival” on Chernobyl and “Plutopia,” on disasters and cover-ups at plutonium plants from Russia to Hanford, Washington.
Chernobyl should be seen, not as a one-time accident, Brown argued, but as part of a larger, still ongoing story of global nuclear contamination and Russian, American and the international officials hiding the truth.
|
e08bbe48088ab9907ddb44db59697d4e | https://historynewsnetwork.org/article/174273 | Carrying Community: The Black Midwife’s Bag in the American South | Carrying Community: The Black Midwife’s Bag in the American South
The classic 1953 documentary film All My Babies features the life and work of Mary Coley, a legendary African-American “granny” midwife.1 The film follows Coley as she travels around her rural Georgia community carrying her ever-present black satchel. In one memorable scene, the exhausted midwife returns home after a long night of “catching babies.”2 As Coley falls into bed, the camera pans to her midwifery bag, which she has hastily discarded on a trunk alongside her coat. Drifting off to sleep, Coley hears a voice in her head: that of the local white doctor, who, in an earlier training session, reminded the lay midwives that infection could result if “something wasn’t clean.” Despite her fatigue, Coley gets up and puts on the kettle. Emptying her midwifery bag of its contents, she boils and scrubs each tool in it. When she finally returns to bed, the sun has come up. Immediately, there is a knock at Coley’s door: another woman has gone into labor. There will be no rest for this baby-catcher on this morning.
The film’s intended message is about hygiene. The Georgia Department of Public Health produced All My Babies to help instruct black “granny” midwives on modern medicalized (read: white and male) birthing practices, particularly sanitation. The film thus was part and parcel of early- to mid-twentieth-century attempts to surveil and regulate lay midwives, most of whom were black, in the American South.3 This project privileged white, male, allopathic medical knowledge over the woman-dominated and community-based health traditions of black communities. Lay midwifery ended completely in most places by the 1960s, when state regulations finally shut black midwives out of business.4 Still, up to the 1960s, white hospitals in the South prevented black women from accessing reproductive services. These women continued to receive healthcare from the “grannies” who learned their trade by apprenticeship. Women like Coley, then, worked, and even thrived, in an era of transition.
The policing of midwives’ bags and what was in them was central to the mission that would ultimately destroy black women’s traditional health networks. What midwives carried in their medical satchels, and how they took care of their supplies, featured prominently in reform movements. In 1920s Virginia, for example, regulations stated that “A midwife’s bag was only supposed to contain certain items: soap, clean towels or cloth, a white apron and hat, scissors to cut the umbilical cord, silver nitrate to prevent blindness, and birth certificate forms.” A white Georgia nurse employed to educate black lay midwives in the late 1920s later recalled: “I had classes at least once a month. I taught them how to pack their supplies, wrap them and bake them in the oven at a low temperature.”5
|
abfa0d160f47996be8ff0e96bb2e7687 | https://historynewsnetwork.org/article/174287 | When White Women Wanted a Monument to Black ‘Mammies’ | When White Women Wanted a Monument to Black ‘Mammies’
In 1923, a group of white women wanted to build what they called a “monument to the faithful colored mammies” in Washington. These women, members of the United Daughters of the Confederacy, pressed lawmakers in Congress to introduce a bill. The Senate passed it, but the bill stalled in the House after fierce opposition from black women, including Mary Church Terrell and Hallie Quinn Brown, members of the National Association of Colored Women.
The fight over a proposed monument to black “mammies” exposes the lie of those who describe Confederate monuments as innocuous celebrations of Southern heritage. Lost Cause memorials are hurtful public symbols of white supremacy. Consider that most Confederate monuments were not erected by grieving widows or relatives immediately after the Civil War. A majority were put up in the 1890s and early 1900s by Southern whites hoping to justify the spread of Jim Crow while erasing the legacy of Reconstruction, a time when African-Americans had gained citizenship and voting rights.
There are now more than 1,740 Confederate monuments, statues, flags, place names and other symbols in public spaces across the country, not counting more than 2,600 markers, battlefields, museums and cemeteries that commemorate the Confederate dead or the many hundreds of statues of staunch segregationists. To date, only about 115 have been removed. In stark contrast, fewer than 100 monuments pay tribute to the civil rights movement.
Southern white women in organizations like the United Daughters of the Confederacy, founded in 1894, raised funds to build many of the Confederate memorials and placed “loyal slave” plaques nearby. These celebrations of the “loyalty” of formerly enslaved people implied they had been happier in subordination, were still unequal and so should be segregated and treated as inferiors. In addition to the plaques, many formal monuments to “faithful slaves” were proposed; three were built, including the Faithful Slave Monument in Fort Mill, S.C., in 1895.
|
6fdfeb19a4472906c4c342d5382ba072 | https://historynewsnetwork.org/article/174290 | New England Historic Genealogical Society Receives Collection of Roosevelt Family Papers from the Theodore Roosevelt Association of Oyster Bay, New York | New England Historic Genealogical Society Receives Collection of Roosevelt Family Papers from the Theodore Roosevelt Association of Oyster Bay, New York
Deeper glimpses into the fascinating lives of the extended family of Theodore Roosevelt (1858–1919), the 26th president of the United States, have been added to the collection of American Ancestors | NEHGS through a generous donation of Roosevelt family papers from the Theodore Roosevelt Association (TRA). The TRA, chartered by Congress in 1920, is a historical and public service organization dedicated to perpetuating the memory and ideals of Theodore Roosevelt.
Letters, previously unpublished family photographs, a handwritten multi-page genealogy, period family ephemera, and scrapbooks of hundreds of items are included in papers documenting the lives of an eclectic Dutch New York family whose members have included merchants, bankers, politicians, soldiers, explorers, and socialites. Many in the family became prominent in New York City business and political circles, where “to be a Roosevelt was to be something distinctive.” The world followed the family’s exploits closely for decades as two Roosevelts, Theodore and Franklin Delano Roosevelt (1882–1945), rose to national prominence with their election to the presidency, and another, Anna Eleanor Roosevelt (1884–1962), became the nation’s longest-serving first lady and a prominent diplomat.
Ryan Woods, EVP and COO of American Ancestors, announced the donation of the collection from the Theodore Roosevelt Association of Oyster Bay, New York. “We are delighted to be the recipient of these materials, collected by the TRA since the 1920s,” said Woods. “We will make them readily accessible to the members of American Ancestors and other family historians, recognizing the immense value this collection will have to scholars around the globe.”
|
cd43d4c2e253ae98313c63ad8d2f3af2 | https://historynewsnetwork.org/article/174308 | Shifting Collective Memory in Tulsa | Shifting Collective Memory in Tulsa
As a native Tulsan, I felt like I was dreaming when I saw a hologram of Henry Louis Gates Jr. greeting visitors at the Greenwood Cultural Center in Tulsa, Okla. But this wasn’t real life — it was the parallel reality of HBO’s “Watchmen,” where the fictional Gates character is the treasury secretary who helps descendants of the Tulsa Race Massacre of 1921 trace their history and claim reparations for what was — in the show and in reality — the country’s worst act of racial violence since the Civil War.
I’d been to the real Greenwood Cultural Center many times, and there were no holograms, no virtual reality displays of what happened in Tulsa almost a century ago. In the cavernous shell of the actual Greenwood Center, there is a no-frills, but emotionally chilling, history of a place once called Black Wall Street. A decidedly low-tech exhibition features black-and-white photo reproductions of a war zone, charred bodies and menacing white looters. Also in the real Tulsa: Reparations for 1921 are far more implausible than a Robert Redford presidency.
Since the debut of “Watchmen,” Greenwood has become a destination for celebrities hoping to see the real Black Wall Street. Michael Bloomberg and Cory Booker swung through the neighborhood while campaigning for the Democratic nomination. Phil Armstrong, a project director of the 1921 Tulsa Race Massacre Centennial Commission, told me that tourism, virtually nonexistent a few years ago, is thriving. The show, however, was mostly filmed in Georgia, and visitors are often perplexed by the massive overpass bisecting Greenwood, a product of a later plan for “urban renewal.”
More important than Tulsa’s pop culture moment are the African-American community’s efforts to change the narrative of the massacre that has been ingrained in the city since the last fires of 1921 died out. For almost one hundred years, Tulsa called the events of 1921 a “race riot,” when the city mentioned the event at all. As a kid in predominantly white Tulsa schools in the 1980s and 1990s, I never learned anything about the invasion and destruction of black Tulsa. The silence around the tragedy was broken two decades ago when a state commission was formed to study it. But its first recommendation — reparations to survivors — was never taken up. In 2018, one of the last survivors died, just as the battle over the historical memory of the “riot” became more visceral than ever.
|
8aef97bc1716e60e82a91f5dba45c1ac | https://historynewsnetwork.org/article/174337 | In Pursuit of Knowledge: A New Book about Black Women’s Educational Activism | In Pursuit of Knowledge: A New Book about Black Women’s Educational Activism
The story of school desegregation in the United States often begins in the mid-twentieth-century South. Drawing on archival sources and genealogical records, Kabria Baumgartner uncovers the story’s origins in the nineteenth-century Northeast and identifies a previously overlooked group of activists: African American girls and women.
In their quest for education, African American girls and women faced numerous obstacles — from threats and harassment to violence. For them, education was a daring undertaking that put them in harm’s way. Yet bold and brave young women such as Sarah Harris, Sarah Parker Remond, Rosetta Morrison, Susan Paul, and Sarah Mapps Douglass persisted.
In Pursuit of Knowledge argues that African American girls and women strategized, organized, wrote, and protested for equal school rights — not just for themselves, but for all. Their activism gave rise to a new vision of womanhood: the purposeful woman, who was learned, active, resilient, and forward-thinking. Moreover, these young women set in motion equal-school-rights victories at the local and state level, and laid the groundwork for further action to democratize schools in twentieth-century America. In this thought-provoking book, Baumgartner demonstrates that the confluence of race and gender has shaped the long history of school desegregation in the United States right up to the present.
|
6b81fc25ad6709e89308cd8dc301fcf9 | https://historynewsnetwork.org/article/174342 | How a Fake Priest Duped Oxford and a World-Famous Historian | How a Fake Priest Duped Oxford and a World-Famous Historian
A con man is only as good as his charm. Frank William Abagnale, reincarnated by Leonardo DiCaprio in “Catch Me if You Can,” inhabited half a dozen identities by the time he was 21 and did so with such brio that he was able to fool hundreds. Charles Ponzi was a dapper operator who tooled around in a Locomobile. And Ronnie Cornwell, the father of the novelist John le Carré, was an insurance fraudster who later became the model for the charismatic Rick Pym in le Carré’s “A Perfect Spy.” The most famous image of Cornwell shows him in a top hat and buttonhole striding confidently through a top-class English crowd, with a look of knowing concentration mixed with an offhand breeziness. You can feel the charm coming off the image.
In Adam Sisman’s amusing and elegantly written biography of the midcentury British impostor Robert Parkin Peters, excitedly styled “Peters the Parson” and “Romeo of the Church” by the yellow press, the subject is a curious and relatively harmless man of many faces who managed to attract the attention of one of Britain’s most august modern historians. The suavely aristocratic and yet strangely gullible Hugh Trevor-Roper first encountered Peters at Oxford in 1958 when, then the Regius Professor of Modern History, he received a letter from an unknown supplicant on behalf of a Mr. and Mrs. Peters. They were young academics suffering “vindictive persecution from outside the university.” Could the professor help?
Trevor-Roper was curious. Having made his name in 1947 with “The Last Days of Hitler,” which drew on his wartime work with MI6, he was, at 44, one of the most famous men at Oxford and indeed in the country. He offered his assistance to Peters and eventually agreed to meet him. “Peters,” Sisman writes, “was a small, chubby-cheeked, bespectacled man with thinning hair and an earnest manner, who spoke with a slight lisp.” He was a graduate student in divinity at the prestigious Magdalen College, which had for some reason neglected to go through his application materials with its customary diligence.
Although Trevor-Roper did not know it at the time, Peters had been born with a skeletal deformity that had forced him into a metal frame during his formative years. He claimed to be 34 but was most likely 40. Pugnacious, yearning to be a genuine theological academic, Peters struck the historian as curiously impressive. He said he had been persecuted by the bishop of Oxford “in the most unaccountable manner,” barred from officiating for reasons unknown. Intrigued, Trevor-Roper agreed to look into the matter — and by doing so opened a door into the parallel life of Robert Peters, bigamist extraordinaire, false priest, phony academic and, for a time, a respected member of Magdalen College. Not to mention erstwhile husband to at least seven women, none of whom suspected that he was not an upright man of religion.
|
9c5af7fe049f3e2fa0ade423b0611807 | https://historynewsnetwork.org/article/174385 | Bernie Sanders, Social Democracy, and Democratic Socialism | Bernie Sanders, Social Democracy, and Democratic Socialism
Bernie Sanders is the best kind of social democrat and sort of a democratic socialist. He roars against economic corruption, severe inequality, and donor class politics, protesting that Wall Street Democrats dominate the Democratic Party. He espouses a welfare state politics of economic rights, which he calls, with a modicum of historical warrant, democratic socialism. Sanders wants the U.S. to institute policies that social democratic governments achieved long ago in Denmark, Finland, Norway, Sweden, and Germany, and he opposes authoritarian forms of socialism, contrary to an incoming avalanche of red-baiting. He resolved forty years ago to wear the socialist label as a badge of honor since people were going to call him one anyway. This decision served him well until now. Now he is obliged, with much at stake, to explain more than ever what the s-word does not mean to him.
I say this as a longtime supporter of Sanders and as a current supporter of Elizabeth Warren who believes that she is the best candidate to unite the Democratic Party and attract various others to it. Taking ethical responsibility for the likeliest electoral outcome of November 2020 is a very serious business. But my subject is the democratic socialism of Sanders, which looms ever larger in American politics after the Iowa and New Hampshire primaries.
Classic democratic socialism calls for centralized public ownership of essential enterprises, or worker ownership, or mixed forms of public and worker ownership, either decentralized or not. But Sanders has never pushed for any of these things. The closest that he comes to classic democratic socialism is his plank calling for worker control of up to 45 percent of board seats and 20 percent of shares. Similar planks in European platforms have long marked the boundary between modern social democracy and democratic socialism. In Germany, all companies employing more than 1,000 workers have been required since 1951 to institute a supervisory board consisting of 50 percent worker representatives and 50 percent management representatives. German codetermination has been a firewall against runaway manufacturing plants, since workers do not sabotage their own jobs. The Swedish version of codetermination, a union capital fund called the Meidner Plan for Economic Democracy, would have crossed the line eventually from social democracy into democratic socialism. But it was scuttled in 1992 after a 10-year run, confirming that even in Sweden the guardians of neoliberal capitalism do not tolerate transitions to a different kind of system.
Social democracy and democratic socialism have never completely overlapped. Originally, “social democracy” named all socialists who rejected anarchism and ran for political office, while “democratic socialism” named the flank of social democrats who insisted on the liberal democratic road to socialism. Democratic socialists contended, against their Marxian comrades, that democracy is the indispensable road to socialism, not a byproduct of achieving it. In short, social democracy named the broad socialist movement, and democratic socialism named its liberal-democratic flank.
But “social democracy” acquired a different meaning after European socialists compiled records in electoral politics and shed much of their Marxian background. Their prolonged struggle against Communism was formative and defining, as was their swing away from collective ownership after World War II. “Social democracy” came to signify what democratic socialists actually did when they ran for office and gained power. They did not achieve democratic socialism, the radical idea of social and economic democracy. They added socialist programs to the existing system, building welfare states undergirded by mixed economies.
“Social democracy” became synonymous with European welfare states in which the government pays for everyone’s healthcare, higher education is free, elections are publicly financed, and solidarity wage policies restrain economic inequality. In the United States, meanwhile, healthcare depends on what you can afford, many have no health coverage at all, students enter the workforce with crippling debt, private money dominates the political system, and severe inequality worsens constantly. Sanders has long challenged the U.S. to aspire to social democratic standards of social decency. In his early career he called it “socialism” because he was a stubborn type who didn’t flinch at red-baiting and the social democratic parties no longer talked about economic collectivism. Then he found himself speaking to a generation that grew up under neoliberalism and does not remember the Cold War.
Occupy Wall Street, in 2011, was a harbinger that people were fed up and that a breaking point had been reached. Forty years of letting Wall Street and the big corporations do whatever they want yielded protests against flat wages, extreme inequality, the specter of eco-apocalypse, and the neoliberal order. The way that Sanders describes democratic socialism is prosaic in comparison to its history and in the context of the global rebellion to which he speaks. He conceives democratic socialism as the fitting name for his belief that a living wage, universal healthcare, a complete education, affordable housing, a clean environment, and a secure retirement are economic rights.
These six economic rights come from Franklin Roosevelt’s 1944 Economic Bill of Rights. The much-dreaded “radicalism” of Sanders is in fact a throwback to FDR’s State of the Union Address of 1944. Sanders lines up with FDR, Martin Luther King Jr., and Catholic social teaching in believing that real freedom includes economic security. In November 2015 at Georgetown, he reeled off 32 paragraphs about what democratic socialism means to him. All were FDR themes in social democratic language: “Democratic socialism means that we must create an economy that works for all, not just the very wealthy. Democratic socialism means that we must reform a political system in America today which is not only grossly unfair, but, in many respects, corrupt.”
Today he is building upon the spectacular campaign he ran in 2016, rewarded for decades of never wavering on behalf of equality. The 2016 campaign showed that Sanders didn’t know many black or Latino organizers and had spent his career speaking mainly to white Vermont audiences. This time his campaign feels much less white and Old Left. The Sanders campaign is a social movement, as he says, not a conventional campaign. Sanders is a magnet for many progressives who hold no interest in joining the Democratic Party and who trust that he is no more of a Democrat than they are. My qualm about his candidacy is that he will not be able to unify the party if he is nominated; Sanders has been a terribly solitary figure in the Senate for a long time. But the party bosses must let this play out without cheating him as they did last time. And they really must resist red-baiting the person who may emerge as their nominee.
|
0966ae5e921bc91fec6581d628017b16 | https://historynewsnetwork.org/article/174396 | California Plans to Apologize to Japanese-Americans Over Internment | California Plans to Apologize to Japanese-Americans Over Internment
Nearly 80 years after President Franklin D. Roosevelt authorized the military to move thousands of Japanese-Americans into internment camps, California plans to formally apologize for its role in their detention.
The resolution, introduced by State Assemblyman Al Muratsuchi on Jan. 28, is expected to receive broad support this week from the rest of the Assembly. While many welcomed the measure, the latest step in the United States’ long reckoning with its imprisonment of American citizens during World War II, some Japanese-Americans said it was long overdue.
The resolution said the California Legislature “apologizes to all Americans of Japanese ancestry for its past actions in support of the unjust exclusion, removal, and incarceration of Japanese-Americans during World War II, and for its failure to support and defend the civil rights and civil liberties of Japanese-Americans during this period.”
The resolution also states that “given recent national events, it is all the more important to learn from the mistakes of the past and to ensure that such an assault on freedom will never again happen to any community in the United States.”
|