| id | url | title | contents |
|---|---|---|---|
595dbe11c8db540efe98d1d13cd1bd8e | https://historynewsnetwork.org/article/154924 | Maximilian Schell Changed the Way We Think About the Holocaust. Really. | Maximilian Schell Changed the Way We Think About the Holocaust. Really.
Image via Wiki Commons.
The recent death of actor Maximilian Schell prompted recollections of his Oscar-winning performance in the 1961 film Judgment at Nuremberg. One of the most prominent films made by producer-director Stanley Kramer from an original screenplay by Abby Mann, Judgment at Nuremberg was a fictional film based on factual events. The film restaged and reworked the 1947 Nuremberg trial of judges and legal officials of the Nazi regime, the so-called “Justice Case.” Nonetheless it related closely to events in 1961, chiefly the trial of Adolf Eichmann and the building of the Berlin Wall. Mann’s screenplay began as an episode on television’s Playhouse 90 and also won him an Academy Award. Schell reprised his television role in the film, playing Hans Rolfe, the defense lawyer for the four German judges on trial for their crimes during the Nazi era.
Without a doubt the Austrian-born Schell, whose own family fled to Switzerland after the Nazi occupation and annexation of Austria, gave a compelling, convincing performance as Rolfe. But the historical message Schell’s character communicated also deserves recognition. In defense of his clients, Rolfe presents a historical interpretation of the causes of Nazi crimes that emphasizes what in Holocaust studies has come to be called “universalism,” as opposed to “particularism.” Instead of holding the German people solely responsible for Nazi crimes, or seeing Germany as uniquely evil, Rolfe claims many people and nations were to blame, including the United States.
Following a confessional outburst from the lead defendant Ernst Janning (played by Burt Lancaster), admitting that he and every other German knew about the Final Solution, Rolfe attempts to salvage his case and save his client. “Ernst Janning says he is guilty. If he is, Ernst Janning’s guilt is the world’s guilt. No more and no less.” Rolfe cites damning evidence of international collaboration with Hitler and the Nazi regime, such as the Vatican’s 1933 Concordat, the Soviet Union’s 1939 Non-Aggression Pact, and American weapons manufacturers, arguing, “The whole world is as responsible for Hitler as Germany.”
Rolfe also calls into question the right of the former Allied nations to judge Germany, given their own past atrocities. His main target is the United States. When discussing the Nazi sterilization of one of the witnesses at the trial Rudolph Petersen (played by Montgomery Clift), Rolfe offers a 1927 quote from Supreme Court Justice Oliver Wendell Holmes in support of a similar eugenic policy in Virginia: “Three generations of imbeciles are enough.” In rejecting U.S. moral superiority, Rolfe evokes the tragic consequences of the atomic bombs dropped on Hiroshima and Nagasaki. “Thousands and thousands of burnt bodies! Women and children!” Bringing even more emotional force to these arguments is Schell’s powerful delivery in these scenes. Time magazine called his final speech “a pulverizing passage of eloquence.”
Although Schell’s Rolfe makes a strong case for universal guilt and responsibility, Abby Mann’s script and Stanley Kramer’s film additionally convey the responsibility of the German defendants for Nazi crimes, as Janning’s confession indicates. “I am Jewish,” Kramer later recalled. “I wanted to film Judgment at Nuremberg because those trials said something that I didn’t think the world had fully grasped.” Also Jewish, Mann wrote his original teleplay because “I wanted to pierce the lie; the big lie [in Germany] was, ‘we didn’t know about it.’” This interpretation is consistent with “particularism,” and critics at the time and scholars since have pointed out that this muddied the film’s message. Yet, in offering an interpretative mix of particularism and universalism, the film more fully portrayed the complexity of history and the multiple causes of historical events.
As a result, Judgment at Nuremberg cannot be categorized as an example of the so-called “Americanization of the Holocaust,” the process by which a catastrophic European event became an American moral touchstone. Crucial to this process were American movies, and scholars characterize and criticize Hollywood films for offering redemptive universal narratives about the Holocaust and avoiding confronting audiences with the grim reality of Nazi genocide. Other films appearing in the same years, such as The Diary of Anne Frank (1959), helped to establish the optimistic Hollywood approach to the Holocaust. But viewers of Judgment at Nuremberg did not leave the theater buoyed by a comforting message about human nature or absolved from responsibility. Instead, Kramer, Mann, and Maximilian Schell made a movie about the Nuremberg trials that did not deny the unbearable facts of the larger history.
|
5f47055ed42783f7acca166690ad08bc | https://historynewsnetwork.org/article/154940 | George Takei Reflects on Travel, Both Painful and Pleasurable | George Takei Reflects on Travel, Both Painful and Pleasurable
George Takei is a man of many speaking engagements. He talks at “Star Trek” conventions about his role as Mr. Sulu. He tells soldiers and students about the role of Japanese-Americans during World War II, a subject in which he is well versed. In a new musical, “Allegiance,” based on his family’s experience living in internment camps during the war, he actually sings the story....
In May, Mr. Takei, 76, will do a speaking tour at universities in Japan and Korea, organized by the State Department, to talk about his life and career as an openly gay Asian-American....
Q. Why were you speaking at military bases in Bavaria a few years back?
A. May is Asian-Pacific American Heritage Month, and about five years ago the military decided they needed to know more about the Asian-American contribution to our military. I spoke at every American base in southern Bavaria and described some of the battles. I talked about the absolutely amazing heroism of the Japanese-Americans during World War II, young men who came from behind barbed-wired internment camps, who were labeled enemy non-aliens. They even took the word “citizen” from us. And yet, a year later, when they opened up service to us, thousands went to fight in Europe, in a segregated unit. When they came back, they were welcomed back on the White House lawn.
Where was your family interned during World War II?
I will never forget that scary day. My parents were packing, and I saw two soldiers with bayonets march up to our front door, and we were marched out simply because of our ancestry. We were taken to a swamp in Arkansas and later to Wyoming....
|
7e0f45a466aaeb94f2daf5b146d37e39 | https://historynewsnetwork.org/article/155725 | What Would George Kennan Say About Ukraine? | What Would George Kennan Say About Ukraine?
“We must be gardeners and not mechanics in our approach to world affairs” (George F. Kennan)
The spectre of Russian expansion is once again haunting Europe. The longer the Ukrainian crisis rumbles on, the louder become the voices in favour of reviving the cold war policy of containment. Putin may be an authoritarian nationalist rather than a totalitarian communist, but those voices contend that -- like his Soviet predecessors -- the Russian President is intent on creating a sphere of influence to challenge western values and political systems.
Putin has even been compared to Hitler and his critics ask: after Russia’s absorption of the Crimea, what next?
The original architect of containment was George F. Kennan, a hitherto obscure diplomat in the US embassy in Moscow who captured the public imagination when, in 1947, he published an article in Foreign Affairs entitled “The Sources of Soviet Conduct.” His article was published anonymously but the author’s identity soon became known and Kennan became a celebrity commentator on Soviet affairs.
Kennan’s analysis captured the mood of the moment. He explained why efforts to negotiate a postwar peace settlement had failed in the face of Soviet expansionism in central and eastern Europe. Power was the only language the Kremlin understood, argued Kennan. The only way to stop the Soviets and their communist allies was through deploying countervailing power.
Less well noted was Kennan’s comment in the same article that containment was not a moral posture and “had nothing to do with outward histrionics: with threats or blustering or superfluous gestures towards toughness.” It was a policy tool to protect vital American interests. The Soviet Union was an ideological state committed to spreading communism, he noted, but it was also a great power with its own interests and sensibilities. Soviet leaders were not beyond considerations of prestige and, as with leaders of other great nations, they should be given ways to save face.
Kennan saw containment as fundamentally a political strategy. Military power should be reserved for protection not projection. The Soviet foe would be vanquished in a contest of values and ideas. In the late 1940s Kennan was disturbed by what he saw as the militarisation of his concept of containment – the establishment of NATO, the division of Germany and the ever-deepening cold war divide in Europe.
Kennan opposed the 1950s version of today’s regime-change policy, the so-called liberation strategy of Eisenhower’s Secretary of State, John Foster Dulles. Kennan argued the communist bloc would change as a result of internal processes not through the force of external threats or intrigues. Liberationist rhetoric would only entrench Soviet hardliners. “We must be gardeners and not mechanics in our approach to world affairs,” urged Kennan in his lectures on The Realities of American Foreign Policy at Princeton University in 1954.
Kennan was particularly irked by the western failure to understand Soviet anxiety about NATO and the rearming of West Germany in the 1950s – it was, after all, less than a decade since the end of a war in which millions of Soviet citizens had been massacred by the Germans. While Soviet perceptions of a western military threat were exaggerated, their underlying fears were genuine. Western leaders seemed unable to grasp how their own fears were being mirrored by those of the Soviets.
When Kennan was appointed Ambassador to the Soviet Union in 1952 he recorded in his diary that he felt “we were expecting to gain our objectives without making any concessions whatsoever to the views and interests of our adversaries. Our position seemed to me to be comparable to the policy of unconditional surrender.” From Moscow he cabled the State Department that “if one were able to strip away…propagandistic distortion and maligning of foreign intentions, one would find that there remained a certain hard core of genuine belief in the sinisterness of western intentions.”
Kennan’s vision of containment included a degree of US military disengagement from Europe so as to open an American-Soviet dialogue based on an acceptance of differences in perspectives and interests. The United States need not fear that it would be subverted or weakened by such a dialogue. America only had to be true to itself to win the cold war, Kennan believed. In his Reith Lectures in 1957 Kennan advocated Soviet and Western withdrawal from West and East Germany and the reunification of the country as a neutral state – an act which he believed would in time help loosen the Kremlin’s grip on the communist bloc.
As a realist rather than an idealist Kennan was fond of quoting John Quincy Adams that America “goes not abroad, in search of monsters to destroy.” While the United States “was the well-wisher to the freedom and independence of all”, it was through example rather than force that America should lead the world. If it pursued force the United States would undermine its own values and beliefs.
The cold war ended much in the way Kennan envisaged – through a process of internal change within the Soviet bloc led by Mikhail Gorbachev. In the 1990s Kennan opposed taking too much advantage of the collapse of communism and the Soviet Union. He believed NATO’s expansion to Russia’s borders was “the greatest mistake of the entire post-Cold War period.”
Kennan died in 2005 but his likely advice on the Ukraine crisis would be threefold.
First, understand Putin’s point of view about the vital Russian interests he believes to be at stake in Ukraine – a country in Russia’s backyard, not America’s.
Second, defend America’s vital interests but pursue broader, transformational goals through a process of constructive engagement with Russia.
Third, learn the negative as well as the positive lessons of cold war history. Do not allow containment to become an instrument for the isolation of Russia that may turn a potential ally in world affairs into a dedicated foe. A new cold war is certainly not in the interests of the people of Ukraine, who need not the mutual enmity of Russia and the United States but rather to benefit from aid and collaboration with them both.
|
d76318fd401b8c52d058e848009c5392 | https://historynewsnetwork.org/article/156419 | Journalist Michael Wolraich says he wrote his new book about the Progressives to teach Americans how to do liberal politics | Journalist Michael Wolraich says he wrote his new book about the Progressives to teach Americans how to do liberal politics
HNN Editor: This excerpt features an interview with Michael Wolraich, the author of the new book, Unreasonable Men: Theodore Roosevelt and the Republican Rebels Who Created Progressive Politics (Palgrave Macmillan).
What inspired you to write this book? Why look so closely at this era of U.S. history?
The idea for the book started off fairly simply. There was, several years ago … a resurgence of interests … and concern about economics and economic inequality and corporate influence on government … Particularly back during Occupy Wall Street protests, there was a lot of rhetoric that mimicked the rhetoric of the early progressive movements; the criticisms of Wall Street and corporate control and the whole 99 percent echoed class arguments that Democrats hadn’t really emphasized for a long time. And what struck me was that many of the people who were employing these arguments really had little sense of where they came from and where they started. There’s a fair amount of understanding and recognition and appreciation for FDR and the mid-twentieth century. But in talking to people, both in the protests and the blogosphere, I just get the sense that people don’t really know how progressivism started.
So my original idea was just to write a book, partly for the left, partly for the country as a whole, to remind us of how that movement started, why we have a lot of the progressive laws that were passed in the first place and why it’s important today. Then just as a follow-up, as I was digging into it, figuring out what exactly I was going to write about and I started moving towards Robert M. La Follette, one of the early progressive leaders from Wisconsin, and I was fascinated with his relationship with Theodore Roosevelt, which I had only a broad understanding of before I started to write the book. I was fascinated by the story and fascinated about what it can teach us about politics today.
Let’s talk about those two guys, not only their personalities but what you feel they embodied at the time and what they still represent today. Start with the one that I’m sure people have heard less about, Robert M. La Follette. What’s his story and what about him do you think is so interesting and relevant to today?
La Follette was governor of Wisconsin. Well, he was originally a congressman and then governor of Wisconsin, and then ran for senator of Wisconsin. He was a lifelong Republican but he was very critical about the practices and ideology of the Republican Party at the time. You could call the Party conservative — they didn’t really use that terminology at the time — but he had ideas about progressive reform that were shut down. He was particularly upset with the corruption that was endemic to both parties. There was a lot of corporate influence and old-school bribes. A lot of it was very similar to today where corporations would fund political campaigns and then politicians once elected would do favors for their benefactors.
La Follette decided to lead a fight against this. And he was shut down by the state Republican Party in Wisconsin. So he mounted an insurgency against them. He was a very eloquent and inspirational public speaker and he went around the state talking about the power of the corporations, particularly the railroad corporations, and the corruption in the Party. He ran three campaigns for governor before he finally won. Then [he] had another fight the next four years where his progressive insurgents — they called them “half-breeds” at the time — took over the state party. He was in many ways the Ted Cruz of his day, particularly when he got to the Senate. He mounted primary challenges against fellow Republicans. He did sensational filibusters Roosevelt regarded as pointless. He refused to compromise. I’d say he was the mirror image of Ted Cruz because he was arguing not for conservative ideology, but progressive ideology.
Who constituted La Follette’s “base”? Cruz has the Tea Party — mostly a group of older-than-average and richer-than-average white people, many of whom could be called petty bourgeois because they own their own small businesses or property. Who did La Follette have?
La Follette’s base was very rural. It was farmers, small accounts people, people who felt at the mercy of East Coast elites, as they thought of them. There was, to a growing extent, laborers in the cities [that] became a part of his base. … He would speak often about the “common man.” That was his base. That’s who he spoke to. That’s who drove his movement.
You mentioned that he had to run three times before winning the governorship in Wisconsin. What changed in the state between his first and second runs and his eventually successful third attempt?
During the early progressive movements, many people — even Woodrow Wilson, who had been more conservative almost until he ran for election — they often use the word awakening [to explain their embrace of progressivism]. La Follette very much saw himself as an educator, and he would help voters to understand the problems that were happening in the country and understand they could, by mobilizing, fix the problems with government that were preventing reform legislation from advancing the country forward....
|
cbc0271a2c246a93a8517a3eded2d4dd | https://historynewsnetwork.org/article/156779 | There's no such thing as a gentle execution | There's no such thing as a gentle execution
Did Joseph Wood suffer when he was executed in Arizona this summer?
Some witnesses reported that Wood gasped over 600 times during his July 23 execution by lethal injection, which took nearly two hours. But one official said that Wood "appeared to be snoring," while another stated flatly that the inmate "did not endure pain." We'll never know.
But here's what we do know: The quest for a pain-free mechanism of capital punishment is a fool's errand. As Amherst professor Austin Sarat shows in a terrific new book, "When the State Kills," we have spent two centuries trying to put people to death without putting them in discomfort. And it hasn't worked.
Start with hanging, which Englishmen brought to our shores during the colonial era. But they did not import professional hangmen, so executions fell to sheriffs and other local officials.
Too often, they had no idea what they were doing. Poorly tied ropes snapped, earning the condemned a return trip to the gallows. Or sometimes the rope wasn't taut enough, so executioners had to pull on the prisoner's legs until he expired.
So communities erected higher gallows, in the hopes that a so-called "long drop" would ensure a quick and painless demise. They also devised pulley systems to jerk the prisoner's head upward, instead of relying on gravity alone.
But many inmates still gurgled and choked for long stretches of time; others were decapitated, generating lurid newspaper headlines.
So Americans turned to the new technology of electricity, which promised to execute prisoners "in a less barbarous manner," as New York's governor declared in 1885...
|
f62794c5cea3c50e945b54ba665fbb3e | https://historynewsnetwork.org/article/157795 | This Is What Ken Burns Neglected to Tell You About Eleanor Roosevelt | This Is What Ken Burns Neglected to Tell You About Eleanor Roosevelt
International Human Rights Day (December 10) and the recent airing of the Ken Burns series The Roosevelts: An Intimate History have focused renewed attention on the life and work of Eleanor Roosevelt. While such interest is welcome and long overdue, the fact remains that Burns overlooked much of ER’s life and work in a series that purported to include her as an equal to her uncle Theodore and her husband Franklin.
As a savvy producer and consumer of television, ER would have been the first to appreciate Burns’s series on her family. She would have welcomed his interest in their lives and accomplishments but she would have been puzzled and dismayed at the amount of time devoted to her private life. She would have been particularly unhappy about the portrayal of the last seventeen years of her life (a mere 35 minutes in a fourteen-hour program). This period is a complete mystery to most Americans who usually associate ER with Franklin and assume that her role in American life ended with his death in 1945 or that her postwar life merely echoed his New Deal.
Neither of these statements is true. From 1945 until her death in 1962, ER took the ideas about community, inclusion and democracy that she, her husband, and uncle espoused, and pushed them much farther than Theodore or Franklin ever dreamed. However, because she usually exercised political power indirectly and often played down or obscured her own achievements, ER’s contributions are often overlooked and undervalued.
Nevertheless she left a voluminous record, and it is possible to tell the story of ER’s post-White House life in a compelling, coherent fashion because she based much of what she did on three key concepts: political courage, civic education and citizen engagement. In terms of political courage, ER’s postwar career coincided with the early years of the Cold War. Americans of that era feared communist incursion and nuclear attack. ER fought those politicians who preyed on those fears, speaking out against the ravages of McCarthyism at home and urging her fellow Americans to spend more time improving democracy and less time witch hunting. At a time when many Americans believed non-aligned countries like India were communistic, she was among the few to argue that befriending nations who refused membership in either the US or the Soviet camp would be smart politics.
When the US government sharply limited travel to Eastern Europe and the Soviet Union, she dared to visit Yugoslavia once and the Soviet Union twice. Her 1957 interview with Soviet Premier Nikita Khrushchev was front page news and carefully monitored by both the State Department and the FBI, neither of whom cared much for her views or her politics. In fact, ER’s progressive politics and insistence on freedom of speech and association made her a target of the FBI and its director J. Edgar Hoover. For almost forty years, beginning in 1924 when she supported the US’s entrance into the World Court and American participation in the League of Nations, the Bureau kept a file on her. At her death it ran to thousands of pages, much of which remains redacted.
Civic education for ER took many forms. Besides “My Day,” her syndicated newspaper column, which ran six days a week for more than twenty years, ER wrote 27 books and hundreds of magazine articles on topics ranging from Cold War politics to raising children. During her White House years alone she hosted six different sponsored radio programs of her own and appeared on countless others. In the last seventeen years of her life she hosted two additional radio programs and three public affairs television programs including one, “Prospects of Mankind,” that ran on the precursor to PBS.
ER’s view of citizen engagement was similarly expansive. On a partisan basis, citizen engagement meant doing all she could to shore up the liberal wing of the Democratic Party---everything from helping to found Americans for Democratic Action to mediating the fight between Democratic conservatives and Democratic liberals over the issue of civil rights at the 1956 Democratic convention. At the same time she continued to work with Republicans on issues of mutual concern.
Another important aspect of ER’s civic engagement philosophy was her support for American labor. ER did more than foster the labor movement; she actually joined it. In 1937, one year after she started writing “My Day,” she became a member of what is today the Newspaper Guild, AFL-CIO. Despite allegations that her membership implied communist affiliation, she remained a member for over twenty-five years. Indeed, her union card was in her wallet when she died. ER also numbered many union leaders among her personal friends. She was particularly close to United Auto Workers Union president Walter Reuther. Reuther and ER worked and relaxed together---staying at each other’s homes and befriending each other’s families.
During the postwar years, ER gradually became a strong supporter of public sector unions, and vigorously led an effort to defeat so-called “right-to-work laws” in six states. She was a keynote speaker at the AFL-CIO merger convention in 1955, a merger she had championed for twenty years. When A. Philip Randolph, president of the Brotherhood of Sleeping Car Porters, asked her to join the National Farm Labor Advisory Committee in 1959, despite failing health she agreed. She attended meetings, wrote columns and testified before Congress on behalf of migrant farm workers.
As for the Universal Declaration of Human Rights (UDHR), Burns rightfully noted ER’s centrality to its creation and passage, but failed to mention its significance to subsequent history. According to Harvard Law Professor Mary Ann Glendon, author of a well-regarded book on ER and the UDHR, “the most impressive advances in human rights--the fall of apartheid in South Africa and the collapse of the Eastern European totalitarian regimes--owe more to the moral beacon of the Declaration than to the many covenants and treaties now in force.”
To the very end of her life, political courage, civic education and citizen engagement governed ER’s activities. Her last major undertaking, the chairmanship of President John F. Kennedy’s Commission on the Status of Women, yielded a report in which the federal government documented for the first time the inequities women faced in the home and in the workplace. The report called for an end to discrimination in all walks of life and recommended family supports such as paid maternity leave and quality, affordable child care. It reinforced the importance of unions, and it challenged the United States to become a world leader in the struggle for human rights.
Clearly ER was a critical link between her uncle Theodore and her husband Franklin. She was also an integral part of Franklin’s presidency. Yet her own achievements were substantial, long-lasting and, it can be argued, critical to the lives of millions of people around the globe. Ken Burns was right to include ER in his portrait of the Roosevelts. He just didn’t do her justice.
|
897f1dadce3a2046f384ffe4b4241bdc | https://historynewsnetwork.org/article/158384 | A member of the Warren Commission staff now worries he was a victim of a “massive cover-up” | A member of the Warren Commission staff now worries he was a victim of a “massive cover-up”
Half a century after the Warren Commission concluded there was no conspiracy in John F. Kennedy’s assassination, the commission’s chief conspiracy hunter believes the investigation was the victim of a “massive cover-up” to hide evidence that might have shown that Lee Harvey Oswald was in fact part of a conspiracy. In new, exclusive material published today in the paperback edition of a bestselling history of the investigation, retired law professor David Slawson tells how he came to the conclusion, on the basis of long-secret documents and witness statements, that the commission might have gotten it wrong....
Slawson’s most startling conclusion: He now believes that other people probably knew about Oswald’s plans to kill the president and encouraged him, raising the possibility that there was a conspiracy in Kennedy’s death—at least according to the common legal definition of the word conspiracy, which requires simply that at least two people plot to do wrongdoing. “I now know that Oswald was almost certainly not a lone wolf,” Slawson says.
Slawson is not describing the sort of elaborate, far-fetched assassination plot that most conspiracy theorists like to claim occurred, with a roster of suspects including the Mafia, Texas oilmen, anti-Castro Cuban exiles, southern segregationists, elements of the CIA and FBI, and even President Johnson. Slawson did not believe in 1964, and does not believe now, that Fidel Castro or the leaders of the Soviet Union or of any other foreign government were involved in the president’s murder. And he is certain that Oswald was the only gunman in Dealey Plaza.
What Slawson does suspect is that Oswald, during a long-mysterious trip to Mexico City only weeks before the assassination, encountered Cuban diplomats and Mexican civilians who were supporters of Castro’s revolution and who urged Oswald to kill the American president if he had the chance. “I think it’s very likely that people in Mexico encouraged him to do this,” Slawson told me. “And if they later came to the United States, they could have been prosecuted under American law as accessories” in the conspiracy.
|
04e19ee54d2aa8b7238696b078a2c6b8 | https://historynewsnetwork.org/article/158716 | Obama and Netanyahu Differ on History, Not Just Iran | Obama and Netanyahu Differ on History, Not Just Iran
Related Link Benjamin Netanyahu’s Long History of Crying Wolf About Iran’s Nuclear Weapons
The differences between Israeli Prime Minister Benjamin Netanyahu and President Barack Obama extend beyond their views of the current negotiations to limit Iran’s capacity to develop nuclear weapons. The two leaders draw on fundamentally different lessons from history as they shape their countries’ respective positions in regard to a nuclear Iran. In fact, their whole approach to using history in making their case on whether or not to engage Iran diplomatically differs.
Speaking before the Joint Meeting of Congress, the Prime Minister engaged in historical selection in looking backward to predict what might lie ahead. After the obligatory political thank you’s, Netanyahu started his speech by citing the Old Testament story of Esther, a “courageous Jewish woman” who warned the Jewish people of a plot to destroy them conceived by a Persian viceroy. He drew a direct line from the religious holiday of Purim commemorating the story of Esther to what he sees as yet another plot by a Persian ruler to destroy Israel, citing the sitting Ayatollah’s tweets as evidence. Another Old Testament figure that Netanyahu chose to ignore is the Persian King Cyrus, who ended the Babylonian captivity, called for the rebuilding of a “house of God” in Jerusalem and restored the religious vessels that his predecessor Nebuchadnezzar had taken when he destroyed the temple and the city.
Netanyahu also cites more recent history in highlighting the case of North Korea, which reneged on its commitments to an international agreement hammered out through diplomatic negotiations to forestall the acquisition of a nuclear bomb. North Korea, like Iran, was a signatory to the Nuclear Non-Proliferation Treaty. To date, North Korea is the only country to have withdrawn from the treaty, and from all obligations under it to limit nuclear technology to peaceful purposes and prevent the development and spread of nuclear weapons. Here again, another historical selection overlooked by Netanyahu is South Africa, which abandoned its nuclear weapon program and signed the non-proliferation treaty in 1993.
The point is not that the appropriate historical parallels ought to be Cyrus and South Africa, but that historical precedents abound and can land on any side of a political debate.
President Obama, on the other hand, uses his understanding of history not to find past precedents but to look for present, transformative opportunities that could reshape history, as understood only by looking backward years from now.
Obama certainly knows of such historic moments. It’s not hard to anticipate the histories written 50 or 100 years from now, citing his election and re-election as the first African-American President. They may also include his willingness to open a new chapter in U.S. relations with Cuba or to tackle the issue of affordable health care in this country, just to mention two. In announcing the opening of diplomatic talks with Iran in September 2013, Obama looked ahead and envisioned the possibilities of not only “a major step forward in a new relationship between the United States and the Islamic Republic of Iran,” but also one that would “help us to address other concerns that could bring greater peace and stability to the Middle East.”
Obama also recognizes that individuals who seek such opportunities run enormous risks, not just for their own political careers but for the nations that they lead. It’s why he is leaving the door open for tougher sanctions and even the use of military options should Iran decide to use these negotiations as a cover to work towards producing nuclear weapons. He does not want those histories in the future to write of him as another architect of appeasement. The roads he has chosen to walk are littered with obstacles and critics. However, he also acknowledges that the alternatives on Iran, however politically expedient they may be in dealing with crises, do not offer to resolve them but simply to delay them, perhaps for another President or Prime Minister. His model, after all, is Lincoln, not Buchanan.
Both leaders take a long view of history, but while Netanyahu’s view goes backward, Obama’s looks forward. When Netanyahu looks forward, he sees only weeks ahead, to the next election in Israel on March 17, or to opportunities to inject himself into the current political stalemate in Washington. David Remnick’s profile of Obama in the pages of the New Yorker in January 2014 quoted aides repeatedly discussing Obama’s sense of understanding his actions under the long telescope of history. Obama told Remnick that “at the end of the day we’re part of a long-running story. We just try to get our paragraph right.”
There is a great distance to go before reaching any accord with Iran over its nuclear program and its commitments under the Non-Proliferation Treaty, and an even greater distance to restoring diplomatic relations with Iran. But history is the story of change, and that change does not happen without taking the first steps, as risky as they might seem. Netanyahu uses history to avoid the first step; Obama looks way down the road to see where that first step might lead.
|
31d1cc7a0bf2e9faf8d7098a02bddc20 | https://historynewsnetwork.org/article/159221 | This Is Microhistory? | This Is Microhistory?
Historians might want to think about becoming active participants in the “bookternet,” the Internet’s social media communities for book-lovers. Book-publicist Rachel Fershleiser describes the bookternet as a way to form community around the solitary pursuits of reading and writing. Publishers Weekly featured a forum on the subject earlier this year in which columnist Clive Thompson and others talked about how difficult it is to “monetize conversations” among “far-flung communities” of knitters and snow-boarders. It’s also a way for writers to connect to their readers. One important node on the bookternet is Goodreads, a site where readers share reviews and commentary about what they are reading or plan to read. This website allows readers to follow writers they like and create genre-based lists. They can also join fan groups who read various genres, from the group for “mysteries and thrillers” to one entitled “History is not boring.”
One of Goodreads’ more popular history lists was created in 2008 with the title, “Microhistory: Sweeping Social Histories of Just One Thing,” by “Blueguitar411,” a public librarian in New York, who was developing a book display for her local branch library. Her definition of the word microhistory builds on Wikipedia’s: “an intensive historical investigation of a well-defined smaller unit of research (most often a single event, the community of a village, family or a person)” but substitutes the words “general concept” and “trend” for “family or a person.” Many people, judging from the 1158 “likes” for the list, and the number of voters for each individual book on it, agree that some of the best exemplars of microhistory are Mark Kurlansky’s Salt: A World History and Mary Roach’s Stiff: The Curious Lives of Human Cadavers. The list also includes titles on the history of marriage or cancer because they focus on a single topic. Right now, the list contains nearly 900 books. Carlo Ginzburg’s The Cheese and the Worms: the Cosmos of a 16th Century Miller, the book most historians identify as the paradigmatic example of the genre, is way down the list with just six votes.
But there’s a problem. The problem is that the definition of microhistory sounds plausible enough to be persuasive, but is oxymoronic. I’ve thought of various analogous mis-definitions: “Marxist Historiography: Books about How Ideas Can Change the World,” or “Post-Modernism: Books that Tell the Truth about the World Today.” The definition is also too broad. Almost any book of non-fiction could be defined as being about “just one thing” -- whether that one thing is a battle, a disease, the Mediterranean world, your favorite cocktail, a year in world history, or the story of a 16th century heretic known as Menocchio.
This Goodreads list has become the definitive source for general readers of nonfiction and is the reference to which a number of online articles about microhistory and public library bibliographies now point. This year, the popular bookish online community BookRiot created a 24-category reading challenge (read a book by a GLBTQ author; read a book by an author from Africa, etc.; you can find it here) in which thousands of eager readers, myself included, are participating. Task ten on this list is microhistory, and it also refers to this Goodreads list for its definition, which is how I found it.
Recalling how excited I was when I first read The Cheese and the Worms or how much I learned from Patricia Cline Cohen’s The Murder of Helen Jewett, I went on a quixotic mission to introduce the original definition of microhistory to BookRiot and their followers. I was able to draw on a number of resources, including Wikipedia. It was in 2005 that the definition of the word began to change, which you can see in discussion around the Internet. At Wikipedia’s “talk microhistory” link, the site’s editors explain why the work of “Kurlansky et al” is not microhistory.
In 2005, the History News Network noted Arthur Krystal’s use of the term “microhistory” for commodity histories like Kurlansky’s, and in 2006, featured an article by Sigurdur Gylfi Magnusson entitled, “What is Microhistory?” reconnecting the term to the 1970s generation of writers interested in the “normal exception” to the story of broad historical patterns unearthed by the Annales school. Magnusson headed the Microhistory center and journal based in Iceland, on the web at microhistory.org. There are also a few historians who have blogged about the subject. (See for example, Brodie Waddell’s blog here).
Another historian joined me in the effort to revise the definition. Responses to our comments were varied, some interested, but many defensive. I was called a “book snob.” As the argument got heated, BookRiot’s moderator deleted one of my new friend’s comments for “name-calling.”
Meanwhile, at Goodreads, other readers started trying to fix the microhistory bibliography in September 2014. They started deleting books that weren’t “sweeping social histories of just one thing.” One librarian remarked, “Biographies shouldn’t be on here, nor should accounts of particular historical incidents.” I don’t know how many microhistories were deleted, but I suggested that rather than deleting books, it would be better to change the list title, and then start a new microhistory list with a different definition. I renamed the old list “commodity history.” An infuriated Goodreads user switched the title back, referring to my action as “brazen,” my definition “stringent.” Microhistory should be defined broadly enough, she argued, to include both commodity histories and books like A Short History of Rudeness by literary scholar Mark Caldwell. By the popular vote, it’s true, that book beats both The Return of Martin Guerre and The Cheese and the Worms.
For historians, seeing this list and talking to its supporters is like entering a strange land. Popular books about the very same subjects that preoccupied the Annales school historians (diseases and diet through the ages, and how these have changed world history on a grand scale) are now the exemplars of microhistory. It’s as if a mystery-writer came across a group of readers who really liked what you and I call science fiction, but lacked a name for the genre. This group then decided that these books should be called “mysteries” because the subjects are strange, and made a list for science fiction books with the title, “Mystery: a narrative work about fictional characters dealing with the unknown, most often about outer-space.” Then other readers started adding their favorite novels until the list came to embrace the entirety of fiction (because you don’t know the ending until you get to the end). At that point, another group of people, identifying themselves as in-the-know, would attempt to bring the list back to its original definition by expunging such inappropriate books as The Adventures of Sherlock Holmes because they are not set in outer-space or another dimension. If any actual mystery writers chimed into the discussion, they would be utterly ignored, seen as exotic representatives of some obscure knowledge community, or castigated as annoying language police who want to ruin mystery reading for everyone.
However, not every reaction has been defensive. Some people have begun to ask me, “Is X a microhistory?” I want to be supportive, but I’ve noticed that many of the “Xs” are focused studies of current events. This micro-battle has taught me something about how general readers think about history itself. If we look at Amazon’s “history” best-sellers list, we’ll see few works by academic historians and a plethora of journalistic studies of current events. “Five minutes ago” is the historical past and historical research means interviewing people about what they did last year. While I believe that oral history is a valid method, I also think there’s a value in learning about eras whose players are all dead and about whom sources are hard to come by and tricky to interpret.
I’ve been recommending books to people and I think a couple of them are going to read Melton McLaurin’s Celia, a Slave. Another person, using my definition, found Greg Grandin’s Empire of Necessity. It’s been worth the price of the insults. As the Publishers Weekly article I mentioned above suggests, historians could find new readers for important but forgotten books by participating in social media spaces like Goodreads and BookRiot. There we will meet book-lovers of all types. Many of them are interested in history, but they may not agree with us about what it is.
|
b732db812f3e42d536a7e3672914c754 | https://historynewsnetwork.org/article/159433 | Could a Movie Help Lead to the Departure of Scotland from the UK? | Could a Movie Help Lead to the Departure of Scotland from the UK?
Among the surprising results in the United Kingdom’s elections on May 7 was the striking success of the Scottish National Party. The SNP won 56 of Scotland’s 59 parliamentary seats. In some important respects, the nationalists’ prominence in Scotland can be attributed to the popularity of a Hollywood movie, Braveheart. That film, directed by and starring Mel Gibson, gave new energy in 1995 to the Scottish separatists’ campaign for independence. Now, two decades later, demands for self-rule threaten to dismember the United Kingdom.
Braveheart is just one among many generators of nationalist ardor in Scotland, and it is certainly not the principal stimulus to Scotland’s separatist movement. But the movie’s impact on politics in the British Isles is a significant chapter in the recent history of Scotland’s extraordinary political transformation.
Braveheart drew attention to Scotland’s mythic hero, William Wallace. Much of modern-day knowledge about William Wallace comes from a 15th century poem by Blind Harry. That document served nicely as a springboard for cinematic storytelling. It gave the screenwriter, Randall Wallace, abundant opportunities to exercise artistic license. In the tradition of Hollywood’s epic historical dramas, Randall Wallace populated his story with heroes and villains. He designed a morally uplifting tale about poor Scottish commoners struggling against the rich and powerful English. Some wealthy Scots are villains in the screenplay as well, since they profit from alliances with the English. Braveheart shows Scotsman William Wallace leading a rebellion against English domination after his wife is raped and executed. Many Scots join him in the fight. Eventually Wallace is betrayed. Rejecting an offer of mercy before his execution, the defiant hero shouts “Freedom!”
Some scholars, movie reviewers, and politicians blasted the production for historical inaccuracies. They criticized the writer and director for demonizing the English and presenting a simplistic, rose-colored perspective of William Wallace and his soldiers. Allan Massie, a conservative Scottish journalist, vilified the screenplay, claiming, “It would be a perversion of truth to call [Randall Wallace’s] way with history cavalier. He has no way with history at all.” Detractors pointed out, for example, that Braveheart shows William Wallace having an affair with Isabella, Princess of Wales, yet Isabella did not arrive in Scotland until a few years after Wallace’s execution. Additionally, kilts and other apparel worn by Scottish fighters in the movie did not come into fashion until centuries later.
Many Scots cared little about historical inaccuracies. They loved the movie’s story about a righteous struggle. Scottish audiences cheered during the screenings and gave the movie a standing ovation when it ended. Books about Scottish history sold vigorously after the film’s release. Thousands of Scots from around the world traveled to Stirling, Scotland, where Wallace’s famous victory occurred. A few months after Braveheart’s screenings, a headline in the Glasgow Herald reported, “Labour’s Popularity Plummets, SNP [the Scottish National Party] Rides High on Back of Braveheart.” It was an early sign of the election pounding the Labour Party received this year in Scotland.
Alex Salmond, principal leader of the Scottish National Party, observed that the movie created a “sea-change” in public attitudes. Before 1995, Scotland’s nationalists enjoyed occasional gains, but they could not make steady progress. Some Scots favored a proposal for devolution in 1979 aimed at creating a Scottish Assembly, but that proposal failed to attract sufficient voter support. Margaret Thatcher’s election victory in 1979 made the nationalists’ situation difficult. Thatcher and her allies staunchly opposed the separatists’ cause. Many left-oriented Scots viewed Thatcher’s brand of free-market economics with disdain. They blamed her conservative policies for accelerating the decline of Scotland’s industrial heartland.
In 1995 Scottish nationalists faced a difficult political environment, but leaders of the SNP detected rich opportunities for a fresh start when Braveheart arrived at local theaters. They stood outside the movie houses, handing out pamphlets and postcards that associated their campaign with the film. One of their leaflets said, “You’ve seen the movie . . . now face the reality.” Alex Salmond, the SNP’s leader, referred to Braveheart often when speaking to the public. Salmond announced he was “head and heart” with William Wallace.
For a while, it looked like members of the Labour Party could successfully put out the flames of local nationalism by offering concessions to the Scots. In the late 1990s, Prime Minister Tony Blair rewarded Scottish political allies with a promise of “devolution.” The Scots opened their own Parliament in 1999. Nevertheless, calls for independence grew louder in the early 2000s, and the question of independence came up for a vote in 2014. The nationalists lost, receiving approximately 45%, while the pro-union forces garnered 55%. Then, in the elections of just a few weeks ago, resounding victories by the Scottish National Party showed that the question of independence had not been settled. “By Friday morning,” reported the New York Times, “Scotland and England looked and felt like different countries, and many wondered whether a breakup had become inevitable.”
It remains to be seen whether the Scots can achieve independence, but Braveheart surely gave their separatist movement a powerful lift. Before the movie’s appearance in Scotland, the nationalist cause had been substantially marginalized. After the film excited intense local interest in Scottish history and culture, talk about independence amplified. Braveheart’s impact on the people of Scotland reveals the potential of film to shape public opinion and agitate national politics.
|
1a7bd83703c9cccbb0b46085cdb0d630 | https://historynewsnetwork.org/article/159662 | Y’all: It’s Older Than We Knew | Y’all: It’s Older Than We Knew
Few
words in the English language have been the subject of as much
research and debate as y’all.
For a historian, perhaps the most interesting part of this
scholarship has looked at the origin and development of the word: How
and when did y’all
come about? Until recently, the word was assumed to have a short
history, at least in the literary record. The second edition of the
Oxford English
Dictionary traced it
back no further than 1909.
In
2006, we found that y’all
existed at least half a century before that. In a brief article in
American
Speech,
I described two nineteenth-century examples of y’all.
One was from the New
York Times,
which in 1886 ran a piece titled “Odd Southernisms: A Few Examples
of Quaint Sayings in South Carolina” that included the following in
its penultimate paragraph: “‘You
all,’ or, as it
should be abbreviated, ‘y’all,’ is one of the most ridiculous
of all the Southernisms I can call to mind.” My second example,
nearly three decades earlier, was from the April 1858 issue of
Southern Literary
Messenger. The piece
was written by “Mozis Addums,” penname of George William Bagby, a
mid-nineteenth-century American writer who specialized in dialect
humor. “Mozis” described the crowded conditions in the
Washington, D.C. boarding house where he was living: “Packin uv
pork in a meet house, which you should be keerful it don’t git hot
at the bone, and prizin uv tobakker, which y’all’s Winstun nose
how to do it, givs you a parshil idee, but only parshil.”
These earlier citations for
y’all
would have been almost impossible to discover without the use of new
databases that present millions of pages of old documents in a
full-text and searchable format. Prior to the digitalization of the
New York Times,
for example, someone trying to find the first time the newspaper used
a particular word would have been faced with a lifetime’s work. It
took me a few minutes to find these antedatings to the OED’s
first citation. I found the 1858 y’all
in the Southern
Literary Messenger
through American
Periodical Series Online—like
the Historical New York
Times, a ProQuest
database.
A few months after the American
Speech article
appeared, Barry Popik, famed for his work tracking down the origin of
the phrase “The Big Apple” to describe New York City, found a
couple examples that beat my 1858 citation by two years. Both were in
a novel by Alfred Arrington, The
Rangers and Regulators of the Tanaha, or, Life among the Lawless
(1856). Popik found this through Wright
American Fiction,
another new database—or rather the digitalization of an older
database, a bibliography of American fiction that was begun in the
mid-1950s. As a result of this discovery, the OED
currently offers 1856 as the earliest y’all.
With the proliferation of these
databases, lexicographers, both scholarly and otherwise, have been
able to find new early citations for many words and phrases. Hardly a
day goes by on ADS-L, the email discussion list of the American
Dialect Society, without someone reporting an antedating of some word
or another. And as new databases are made available (or as databases
become more easily accessible), new antedatings are sure to show up.
This has happened with y’all.
Eighteenth Century Collections
Online (ECCO), from Gale Digital Collections, contains over 180,000
titles (books, pamphlets, and the like, mainly British). Early
English Books Online (EEBO), a ProQuest database, contains over
125,000 titles published between 1473 and 1700. An examination of
these two databases reveals “new” citations for y’all
that are much earlier than those previously known.
One of the thousands of volumes
in ECCO is a poetry anthology, appropriately titled A
Collection of Poems,
published in London in 1702. The book includes “Prologue to The
Fate of Capua,”
a light poem that assesses the play by that name by Irish dramatist
Thomas Southerne. The following delightful couplet contains an
example of y’all
from a century and a half before the OED’s
current earliest citation:
To
Write well’s hard, but I appeal to y’all,
Is’t not much
harder not to Write at all.
Boyle
meant “I appeal to you all” (or perhaps “ye all”—you
and ye
were both in use in the early eighteenth century, and both appear in
the book), but “you all” has one syllable too many to fit the
line, so he used a contraction: “y’all.” He did the same thing,
and for the same reason, at the beginning of the next line: “is it”
does not fit, so he used the contraction “is’t.”
Another
early example of y’all
from ECCO is found in The
Spanish Curate, a
comedic play written by John Fletcher and Philip Massinger in the
1620s. The play appears in The
Works of Mr. Francis Beaumont and John Fletcher
(1750):
But
I know y’all for merry Wags, and ere long
You shall know me too
in another fashion.
This
is especially interesting because in an earlier version of the play
(1711), the same publisher had given these two lines as:
But
I know ye all for merry Wags, and e’er long
You shall know me
too in another fashion.
In
1711, the publisher spelled out “ye all”; in the 1750 edition,
the publisher used the contraction to improve the poem’s
readability.
In
an unattributed poem in The
Scarborough Miscellany
(1734), Miss Copen, a girl “not yet full ten,” apologizes for her
youth, reminds the men that “Rose-buds are as sweet as Roses
blown,” and concludes:
---Thus
to Y’all I make my fond Address,
Excuse my Faults, accept my
Will to please,
Else!---May ye lose your Loves by being
thrifty,
---And ne’er kiss Woman---under nine and fifty.
In
another poetry collection (“by the Author of a poem on the
Cambridge Ladies,” [1733]), an unattributed “Satyrical Poem on
the Beggers Opera” alludes to London’s Lincoln’s Inn Fields
Theatre, where the play was first produced in 1728:
And
all ye Heroes of the Roman
Line,
Whose God-like Actions will for ever shine,
Attend, and
listen to the Muses Call,
Enrol at Lincoln,
else G---- ruins y’all.
The
examples above are from the eighteenth century, but we can also find
examples of y’all
in the seventeenth. John Dryden’s Conquest
of Granada by the Spaniards
(1672) has been noted as having the first known use of the phrase
“the noble savage,” but it also has an early usage of y’all.
In one scene, Lyndaraxa speaks to Prince Abdalla about the attention
given the woman engaged to King Boabdelin: “Heav’n, how y’all
watch’d each motion of her Eye.”
At
the beginning of a translation of Plautus’ Amphitryon,
published in 1694, Mercury offers some background to the play:
Amphitryon, a Theban general, is returning home from war, unaware
that Jupiter (Mercury’s father), having taken Amphitryon’s form,
has been sleeping with his wife. “Im sure y’all know my Fathers
Good Nature, his large Allowance upo’ these Occasions, and how much
he makes of a Sweet Bit,” says Mercury.
The
earliest of these newly-found examples of y’all
is in William Lisle’s The
Faire Æthiopian
(1631, 225 years before the current citation in the OED),
a re-telling of Heliodorus’ history of Ethiopia. In the relevant
section, Thyamis, the leader of a band of thieves, has captured
Chariclea, with whom he has become smitten. Thyamis assembles his
men, reminds them that he has always given them the strongest and most
servile from among those they have kidnapped in the past, and asks if
in this case he might keep Chariclea for his own. (They assent to his
request.)
The
captiue men of strength I gaue to you,
The weaker sold; and this
y’all know is true,
The free-borne women ransom’d, or set
free
For pittie sake, the seruile sort had yee.
With
the help of digitized databases, we can now show that y’all
has a considerably earlier origin in the literary record than we
previously realized. What does this mean for our understanding of the
history of the word? We will leave that to the linguists, but it is
perhaps worth noting a few quick points. First, these antedatings
offer not just a chronological expansion of the word’s use, but
more importantly, a geographical expansion: these are the first
historic uses of the word from outside the United States. In fact,
all these newly-found examples are from England. Second, these older
examples originated in a more formal context than did later examples
from the American South. Most were necessitated by the demands of a
metered line of poetry, and hence might be thought of more as a
simple contraction than a pronominal phrase. (Douglas Patton, writing
in Southern Living,
called y’all
“the quintessential Southern pronoun”; these British examples are
neither southern nor strictly a pronoun.) The presence of y’all
in a poem does not mean that we might expect to see frequent uses of
the word in other eighteenth-century English contexts (just as we
would not expect to see “is’t” except in a poem). Third, there
is almost a century-long gap between the last known usage of this
British version of y’all
and the first known usage of the American version, certainly an
important consideration when discussing the relationship between the
two. In fact, scholars may well decide that these two versions of
y’all
are essentially two different words.
Those
questions aside, this discussion of the history of y’all
shows the usefulness of digital databases: not only do they provide
easy access to largely unavailable texts, they can serve as a useful
tool for lexical analysis—even if this particular brief essay has
ended up raising more questions than answering them.
|
bbbb86c06a67345160b3c52b22a6f958 | https://historynewsnetwork.org/article/161010 | A Historian’s Report Card on the Fed | A Historian’s Report Card on the Fed
"Marriner S. Eccles Federal Reserve Board Building" by AgnosticPreachersKid - Own work. Licensed under CC BY-SA 3.0 via Commons.
Eight
years after plunging into the worst financial crisis since the 1930s,
the United States economy continues to suffer from numerous
structural
issues, including a weakened
middle class, an aging infrastructure, and unjustifiably high
CEO pay. Yet amidst these troublesome conditions, the monetary
policy pursued by Ben Bernanke and Janet Yellen at the US Federal
Reserve deserves recognition.
Historically,
central banks like the Fed have varied widely in their regulatory
responsibilities. The western world’s first central banks, which
originated in late seventeenth century Europe, possessed only limited
ability to control the money supply. By the early nineteenth century,
their fiscal and monetary powers had grown. Alexander
Hamilton and Nicholas
Biddle tailored their banks to serve as the nation’s “lender
of last resort,” rescuing smaller banks from collapse to
prevent a larger contagion from spreading.
It
was not until the Great Depression of the 1930s, however, that
central banks took on their more modern
and familiar form, breaking free from the rigid laissez-faire
shackles of the international gold standard to pioneer new methods of
currency management. Indeed, the
prevailing orthodoxy among academic economists holds that the
gold standard, far from being the stable and self-equilibrating force
envisioned by its proponents, actually
contributed to and lengthened the Depression. Currency
devaluation proved surprisingly effective at halting the downward
spiral in prices, wages, employment, debt defaults, and bank
collapses. The creation of the Federal
Open Market Committee (FOMC) in 1933, moreover, established a
more managed economy where the Fed could more actively pursue its
dual mandate of price and employment stability by buying and selling
treasury bonds in the open market.
Fast forward to
2008. Decades of financial deregulation, which stoked an
unsustainable housing bubble built on the widespread brokering of
risky collateralized
debt obligations, had put capitalism itself on the precipice.
Financial
meltdown was imminent. The Fed immediately injected emergency
funds into the system, flooding the country with paper money. It
lowered the discount rate – the percentage it charged to lend money
to commercial banks – to an unprecedented low near zero. For the longer term,
the Fed embarked on a multi-part program of monetary stimulus known
as “quantitative easing.” Expanding its balance sheet immensely,
the Fed started buying up not just treasury bonds, but
mortgage-backed securities and other assets. The
most recent phase of quantitative easing, QE3, increased
the Fed’s assets to $4.48 trillion. Repeated cash infusions on
the scale of hundreds of billions of dollars drove down long-term
interest rates and had a propitious impact on the stock market. From
a low of 6,400 in March 2009, the Dow Jones Industrial Average nearly
tripled in value to a record high of 18,300 in May 2015.
Fed critics
contended that permanently low interest rates would fuel asset
bubbles and lead to hyperinflation. In 2009, conservative economist
Arthur Laffer predicted
an inflationary environment that would make the 1970s look tame. That
Laffer and other inflation hawks were spectacularly
wrong did not seem to matter. Wittingly or not, they spoke for
the wealthy, whose large savings accounts and ability to profit from
high-interest loans would lose out under inflation. The wealthiest
1%, who frame
much of the political and media agenda, constantly shouted
“theft,” “socialism,” and “redistribution” at the
slightest hint of monetary expansion. And yet, since roughly 1980,
almost all gains in wealth, income,
and GDP have gone
to this very same 1%. Talk about redistribution!
For those of us who
are concerned for more than just the nation’s oligarchs,
maintaining an inflation rate of 2% makes sense. The money supply
required for this target rate drives down interest rates, which
makes it easier for companies, governments, and individuals to
pay off debts. Student
loan debt currently tops $1.2 trillion and many countries today
suffer from high public-debt-to-GDP ratios. Interestingly, there is
some
historical precedent – namely the post-WWII era – for
demonstrating how inflation can help reduce countries’ public
debts. Moderate inflation also stimulates spending. If people think
that their money will lose value, they are likely to spend it, which
is crucial since consumer spending comprises
70% of all economic output in the United States. One could even
argue that the Fed should raise its target rate to 4%, especially
because the US dollar appreciated
18% against a basket of other currencies between mid-2014 and
mid-2015. A strong dollar makes US manufacturers and exporters less
competitive for overseas consumers, but a higher inflation rate would
mitigate this effect. As economist Dean
Baker has argued, a weaker dollar would add manufacturing jobs
and also reduce the trade deficit.
To be clear,
inflation is not the silver bullet to solve complex problems. It
could hurt minimum-wage workers and some pensioners, and its salutary
effects on employment, wealth inequality, and public debt would
encounter diminishing returns once bondholders started demanding
higher interest rates. One critical issue is that our paralyzed
political system has failed to produce sensible fiscal policy, which
puts even more pressure on the Fed to act alone. Since its takeover
of the House in 2010, the Tea
Party has maintained a puerile
obsession with government shutdowns and austerity
at all costs.
We
should be thankful that the Tea Party does not run the Fed. Ben
Bernanke, who had published
on the Great Depression before chairing the Fed, heeded the
lessons of the 1930s. Yellen has continued ably where Bernanke left
off and the two have kept price inflation in the last ten years
remarkably
close to the target rate of 2%, seen GDP rise between 2% and 4%,
and helped to lower
the unemployment rate from 10% to 5%. Even as calls have mounted
for the Fed to begin
raising rates, Yellen has aptly
understood that the global economy remains weak and prices have
failed to rise significantly. The point here is not to glorify the
Fed, which may not have anticipated all the warning signs leading up
to 2008. Rather, it is to underscore the wisdom of constructing
monetary policy based on empirical evidence, scholarship, and
historical experience.
|
3d217a07de9974cdc8fbc6ae3e16de48 | https://historynewsnetwork.org/article/161483 | It’s Been 125 Years Since Wounded Knee. The Lakota Are Still Seeking Justice. | It’s Been 125 Years Since Wounded Knee. The Lakota Are Still Seeking Justice.
Big Foot's camp three days after Wounded Knee Massacre; with bodies of four Lakota Sioux wrapped in blankets in the foreground; U.S. soldiers amid scattered debris of camp
This
year marks the 125th anniversary of the massacre at
Wounded Knee Creek, South Dakota, where the U.S. Seventh Cavalry
killed the Lakota Chief Big Foot and more than two hundred members of
his band on December 29, 1890, ostensibly for their adherence to the
Ghost Dance religion. Wounded Knee is an internationally-recognized
symbol representing past massacres and genocide, as well as
indigenous demands for recognition and sovereignty. Dee Brown’s
1970 Bury My Heart at Wounded Knee, for example, was a New
York Times bestseller and has been translated into dozens of
languages. The American Indian Movement’s (AIM) 1973 occupation of
the massacre site ensured that the name Wounded Knee appeared
regularly on the nightly news in connection with AIM’s demands for
the United States to honor treaties. In 2015, the Healing
Hearts at Wounded Knee (HHAWK) initiative called upon all
people throughout the world to remember not only those slain at
Wounded Knee, but also the victims of all atrocities, in hopes that
such remembrance will lead to the eradication of violence, massacre,
and genocide.
Wounded
Knee, however, has not always been remembered primarily as a horrific
massacre. In the years after 1890, the U.S. Army made Wounded Knee a
central event in American public memory, awarding 20 Medals of Honor
to the Seventh Cavalry and erecting a monument to the soldiers killed
in the engagement. Wounded Knee was heralded as the final victory in
the 400-year “race war” between civilization and savagery, the
event that laid the foundation for the American nation’s subsequent
prosperity. Since the 1890s, Wounded Knee has undergone a
breathtaking transformation in public memory, from a heroic battle to
a horrific massacre of historic significance that could be readily
invoked by Dee Brown, AIM, and the HHAWK initiative. In Surviving
Wounded Knee: The Lakotas and the Politics of Memory I place the
Lakota survivors of Wounded Knee at the center of this
transformation. The book tells the story of the survivors’
half-century pursuit of justice that culminated in their appearance
before the United States Congress in the late 1930s to testify in
support of a bill intended to “liquidate the liability of the
United States for the massacre of Sioux Indians at Wounded Knee.”
The
Lakota survivors—traumatized, impoverished, and confined to
reservations—nevertheless found multiple ways to challenge the
army’s “official memory” of Wounded Knee. In accordance with
traditional Lakota approaches to conflict resolution, the Lakotas
sought compensation from the government for their human and property
losses, filing several claims in the years following 1890. These
claims resulted in a series of government inquiries, in which
bureaucrats attempted to reconcile official records with the
survivors’ memories. In each investigation, the bureaucrats
concluded that because government officials had classified Big Foot
and his people as “hostiles” in 1890, due to their association
with the Ghost Dance, the Lakotas were ineligible for compensation.
Undeterred, the survivors dictated to white interlocutors their
accounts of Wounded Knee. In addition, in 1903 the Lakotas erected a
monument at the Wounded Knee mass grave “in memory of the Chief Big
Foot Massacre,” thereby ensuring that Lakota perspectives would
shape future interpretations of the site.
What
emerges in these written sources is clear evidence of Lakota
engagement in the politics of memory, as the survivors challenged the
official explanations for the killings. Wounded Knee was not a heroic
battle that ended centuries of race war, but rather a brutal massacre
of historic proportions. Big Foot was not a hostile, but a peaceful
chief seeking nonviolent solutions to the conflict. Wounded Knee was
not the result of a Ghost Dance-inspired insurrection against
American sovereignty, but was instead a violation of the 1868 Treaty
of Fort Laramie, which guaranteed peaceful relations between the
United States and the Lakota nation.
In
the late 1930s, South Dakota Representative Francis Case introduced a
bill in Congress calling for $1,000 compensation for each Lakota
survivor of Wounded Knee or heir of a victim. Two survivors, Dewey
Beard and James Pipe on Head, testified in support of the bill before
a House Indian Affairs subcommittee in 1938. Beard had lost his wife,
infant son, parents, two brothers, and a niece on December 29, 1890;
Pipe on Head, who was just ten years old in 1890, had witnessed the
soldiers shoot and kill his pneumonia-stricken grandfather, Big Foot,
as he lay on the ground. Both men described the traditional Lakota
practice of murderers offering compensation to a victim’s family in
order to seek reconciliation, arguing that the government should
offer compensation for the murders committed at Wounded Knee. This
would help restore the peaceful relations that had been established
by the 1868 treaty and would permit the survivors, as Beard
testified, to “forget the whole Wounded Knee affair.” In 1940, as
the 50th anniversary of Wounded Knee approached, the House Indian
Affairs Committee voted to approve the Wounded Knee compensation
bill. However, with attention increasingly focused on the
international scene, the bill was unable to receive a full hearing on
the floor.
As
Surviving Wounded Knee argues, the Lakotas’ pursuit of
justice nevertheless laid a foundation for activists who subsequently
drew upon Wounded Knee’s symbolic power to promote indigenous
sovereignty and objectives. For example, Dee Brown relied almost
entirely on the survivors’ accounts of the massacre in his Wounded
Knee chapter. AIM’s 1973 occupation of Wounded Knee included the
mass grave, which is still primarily interpreted by the survivors’
1903 monument. The survivors’ descendants—many of whom are
involved with the HHAWK initiative—have sustained a strong
collective memory, orally preserving their ancestors’ stories and
passing them on to the next generation. As Surviving Wounded Knee
demonstrates, the survivors and their prolonged engagement in the
politics of memory remain central to the symbolic power of Wounded
Knee.
|
dd42fcf4eebce693039500ac59d20e0d | https://historynewsnetwork.org/article/161551 | UCLA Condemns Anti-Semitic Facebook Post | UCLA Condemns Anti-Semitic Facebook Post
The University of California at Los Angeles last week condemned an anti-Semitic comment that a UCLA student posted on the Facebook page of Mayim Bialik, the actress. Bialik, a UCLA alumna, wrote on Facebook about her pride in being Jewish and Zionist. The student's comment -- widely discussed on the UCLA campus -- was apparently addressed to Jews who immigrated to the United States from Europe.
The comment verbatim (with language that may be upsetting to some): "If you're of Euro ancestry and you were born in the Americas, you are still a white immigrant, the way you call us brown people immigrants and aliens in our own damn space. YOU people invades our space and used your bogus gods to justify taking land that was never yours. I don't know how that's different from what's happening in Palestine -- you come into their land, crying persecution and diminished numbers, and instead of returning to your own homes in Poland, Germany and Russia, your people chose to invade another culture's homeland, invoking your bullshit sacred pacts with your gods and massacring an entire culture unless they bend to your will. GTFOH with all your Zionist bullshit. Crazy ass fucking troglodyte albino monsters of cultural destruction."
|
f6c9fa971f005b09d30c097881292c4d | https://historynewsnetwork.org/article/161838 | Did Racism Taint Woodrow Wilson as a Historian? | Did Racism Taint Woodrow Wilson as a Historian?
Related Link: The Puzzling Apologies for Woodrow Wilson’s Racism in A. Scott Berg’s Recent Biography, by Sheldon Stern
The greater part of Woodrow Wilson’s career was spent in
the academic world, as a professor and later as president of Princeton. And, he
was the only American president to hold a Ph.D. (in political science). In 1902, he published a handsomely
illustrated, five-volume History of the
American People, [hereafter HOAP]
which was a surprising commercial success that made his name familiar outside
academia. Indeed, his 2013 biographer, A. Scott Berg, declares that the books
“made him one of the best known historians in the country” and “the
Presidency’s most accomplished student of American history and politics.” [Theodore
Roosevelt doth protest!]
Wilson himself was far more modest about this work. “I am
only a writer of history … a fellow who merely tried to tell the story, and is
not infallible on dates. … I wrote the history of the United States in order to
learn it.” Later, in the White House, he candidly admitted to family and close
aides, “I have never been proud of that History. I wrote it only to teach
myself something about our country.” It is amusing to speculate about how
Wilson would have reacted if he had known that one of the most renowned scholars
he personally recruited for Princeton later dismissed the HOAP as “a gilt-edged pot boiler.”
Regardless of these disagreements about the overall historical
merits of HOAP, it turns out that Wilson’s analysis of slavery and
Reconstruction provides striking insights into the racial mindset he brought to
the White House. Professor Wilson’s interpretation of American racial history was
entirely formulaic and unoriginal—in fact, little more than a summary of the
dogma of several generations of Southern apologists. He took for granted the
reigning assumption that only “the African could stand” working “in the wet southern rice fields, upon the
broad acres of tobacco, amidst the sugar cane, and out in the hot furrows of
grain.” “The indolent slaves,” he declares, “did not work as free laborers
would have worked, and could not be made to.” But, for the slave owners: “The
care of the slaves, their maintenance like a huge family of shiftless children,
remained a duty and a burden which the master could not escape, good season or
bad, profit or no profit.” He concludes that principle and self-interest
motivated most masters to be indulgent, humane, and committed to the well-being
of their charges, especially house slaves who “were treated with affection and
indulgence.”
The Civil War, he insists, did not significantly alter
the loyalty of the slave population: “How quiet, how unexcited, how faithful
and steady at their accustomed tasks, how devoted in the service of their
masters the great mass of the Negro people had remained amidst the very storm
and upheaval of war.” Inevitably, as the war produced rumors of emancipation:
the
negroes ignorantly … dreamed that the blue-coated armies which stormed slowly
southward were bringing them, not freedom only, but largess of fortune as well;
and now their dream seemed fulfilled. The government would find land for them,
would feed them and give them clothes. It would find work for them, but it did
not seem to matter whether work was found or not: they would be taken care of.
They had the easy faith, the simplicity, the idle hopes, the inexperience
of children. Their masterless, homeless freedom made them the more pitiable,
the more dependent,
because under slavery they had been shielded, [and] had never learned
independence or the rough buffets of freedom.
Of course, worse was yet to come when defeat in the Civil
War was followed by military Reconstruction. “It was a menace to society itself,”
Wilson insists, “that the negroes should thus of a sudden be set free and left
without tutelage or restraint. Some stayed very quietly by their old masters
and gave no trouble; but most yielded, as was to have been expected, to the
novel impulse and excitement of freedom. … The country filled with vagrants,
looking for pleasure and gratuitous fortune … and the vagrants turned thieves
or importunate beggars.” The ignorant and credulous freedmen “were easily
taught to hate the men who had once held them in bondage.”
Inevitably, white southerners rose to defend their accustomed
way of life: “There were men … who could not sit still and suffer what was now
put upon them. It was folly for them to give rein to their impulses; it was
impossible for them to do nothing.” The night-riding comrades of the Ku Klux
Klan and the Knights of the White Camelia made it their goal “to silence or
drive from the country the principal mischief-makers of the reconstruction
regime, whether white or black. … It threw the Negroes into a very ecstasy of
panic to see these sheeted ‘Ku Klux’ move near them in the shrouded night; and
their comic fear stimulated the lads who excited it to many an extravagant
prank [!]. … [But] the Negroes were generally easy enough to deal with: a
thorough fright usually disposed them to … do anything their ghostly visitors
demanded.”
Over the last half-century, this kind of “history” has
been exposed as racist rubbish. A vast literature, often based on slave
sources, has documented how black Americans survived, coped, and resisted
during more than two centuries of slavery and an additional century of violence,
lynching, racism, and de-facto and legal segregation. But, we would be guilty
of presentism, pure and simple, if we arraigned Wilson for not interpreting the
past as we do. However, that does not mean that we should simply excuse his
blatantly biased writing.
In fact, it was possible during Wilson’s lifetime for a
historian who shared his racial assumptions to nonetheless dig further and
deeper into the evidence about slavery and Reconstruction. Ulrich B. Phillips,
born in Georgia, spent the bulk of his academic career at the Universities of
Wisconsin and Michigan. Phillips shared Wilson’s racial outlook; his writing about
black Americans is filled with adjectives like “inert,” “backward,” “sluggish,”
“inept,” “lazy,” “passive,” and “dependent.” He believed that negroes were inherently
limited “by the fact of their being negroes … [who] were by racial quality
submissive rather than defiant, light-hearted instead of gloomy, amiable and
ingratiating instead of sullen, and whose very defects invited paternalism
rather than repression.”
Nonetheless, Phillips’s careful investigation of
plantation records led him to conclude that “the lives of the whites and the
blacks were partly segregated partly intertwined. … [but] the slaves themselves
were by no means devoid of influence [and] ‘a negro understands a white man
better than the white man understands the negro.’ ” Phillips found countless
examples of slave initiative: theft of food and other necessities, sabotage,
desertion, neglect of tools, protection of loved ones, etc. He also recognized
the decisive role of slave preachers and foremen in mitigating harsh
conditions. Phillips even found cases in which slaves assumed virtually
complete management responsibilities amounting to self-government: “The slaves
were encouraged to earn money for themselves every way they might, and the
discipline of the plantations was vested in courts composed wholly of slaves,
imposing penalties to be inflicted by slave constables.” On occasion, he contended,
grievances led slaves, much like free wage-earning laborers, to strike as a
group. “Their power of renewing the pressure could not be ignored. A happy
ending for all concerned might be reached by mutual concessions and pledges. … The
slaves themselves would not permit
indifference [italics added] even if the masters were so disposed.” The
evidence revealed as well that, far from being the helpless and dependent
post-emancipation wanderers described by Wilson, the freedmen struggled to obtain
land, establish schools and reunite grandparents, parents, and children.
Thousands of former slave couples “flocked to get their already socially
recognized marriages legally sanctioned and approved.” Clearly, Wilson’s professed
modesty about the HOAP was fully
justified.
In 1915, in a well-known episode, President Wilson agreed
to screen D.W. Griffith’s racially-charged Birth
of a Nation (based on The Clansman,
by Wilson’s former college associate, Thomas R. Dixon) in the White House for
his family and cabinet. After the film, he allegedly said, “It is like writing
history with lightning. And my only regret is that it is all so terribly true.”
Wilson’s recent biographers, John Milton Cooper, Jr., and A. Scott Berg, have
argued convincingly that there is no reliable evidence that he ever made that
remark. Cooper even concludes that the controversy over the film made Wilson’s
“racial views look worse than they were.” But,
Wilson’s writings on slavery and Reconstruction, used in the film’s written subtitles
and cited above, leave no doubt that regardless of whether he actually made that
remark, his racial views were entirely in sync with the historical message of
the film.
President Wilson would learn, to his apparent surprise
and irritation, that black Americans had been content neither to be slaves nor,
after 1865, to remain second-class citizens.
|
db0f3f10c43a3329d685eec0c9b07bdf | https://historynewsnetwork.org/article/161881 | This Is How February Became Black History Month | This Is How February Became Black History Month
It was in 1964 when the author James Baldwin reflected on the shortcomings of his education. “When I was going to school,” he said, “I began to be bugged by the teaching of American history because it seemed that that history had been taught without cognizance of my presence.”
Baldwin’s thoughts echoed those of many before and after him. Half a century earlier, when Carter G. Woodson had the same frustration, he set the foundation for what would become today’s national Black History Month, observed each February.
In the early 20th century, while he earned a master’s degree from the University of Chicago and a Ph.D. from Harvard, both in history, Woodson witnessed how black people were underrepresented in the books and conversations that shaped the study of American history. According to the way many historians taught the nation’s past, African Americans were barely part of the story—a narrative that Woodson knew was not true. So in 1915, he and Jesse E. Moorland founded the Association for the Study of Negro Life and History (now the Association for the Study of African American Life and History, or the ASALH). The organization would promote studying black history as a discipline and celebrate the accomplishments of African Americans.
|
f07be003a2305b3e6811428deb481fd7 | https://historynewsnetwork.org/article/161883 | Teddy Roosevelt Defends the Presidential Primary | Teddy Roosevelt Defends the Presidential Primary
On March 20, 1912, Carnegie Hall was filled to capacity. Only two months earlier, Theodore Roosevelt’s opponent and rival for the Republican presidential nomination Senator Robert La Follette had delivered a powerful address to an overflow crowd of cheering supporters in the same venue. Notwithstanding La Follette’s dramatic primary victory in North Dakota a day earlier, more than 3,000 people crammed into the lavish main hall to hear what the press was calling Roosevelt’s first speech of the campaign.
The balcony was filled with activists and social workers while the main floor glittered with men and women in evening dress. Another crowd of Roosevelt supporters filled the Carnegie Lyceum, a smaller theater and recital hall, and as many as 5,000 others, who could not gain admission, stood cheering in the street. The main floor had the excitement and flavor of opening night of the opera season.
When Roosevelt bounded onto the stage, the building exploded with wild cheers. “Teddy, O! Teddy,” people shouted, waving handkerchiefs and hats from the galleries.
Standing on the raised platform, Roosevelt waved his hand, asking his admirers to take their seats. But a shout came from the back of the hall, and most of those in the crowd jumped back to their feet and cheered for another two minutes.
Roosevelt had been working on his speech for more than a week. Some of his closest advisors urged him to stress his conservative, pro-business credentials. They were trying, in part, to offset the impact of the proposal for the recall of judicial decisions in his Columbus speech, which had sent tremors through the business world, including many of his staunchest supporters. “All your friends and the committee here unanimously and strongly urge that your Carnegie Hall address be mainly a charter of business prosperity,” his former Secretary of the Navy wired from Chicago. The treasurer of Roosevelt’s New York campaign pleaded for such a statement: “If a strong chord of sympathy can be struck in your Carnegie Hall address with the aims of those who are working conservatively to develop large business enterprises on the basis of the square deal, I think it would do more for our cause between now and election time than any other subject which could be discussed there.” TR’s treasurer sat in a private box, hoping that Roosevelt would heed his advice. ...
|
fedbce2631d775aab384778f6971d4c0 | https://historynewsnetwork.org/article/162082 | Fresh evidence of declining American interest in historic sites | Fresh evidence of declining American interest in historic sites
Along with library use and museum attendance, historic site visitation is another common form of public engagement with the humanities. The percentage of people reporting at least one such visit in the previous year fell by more than a third from 1982 to 2012, with declines across most age groups.
Findings and Trends
● In 2012, 24% of Americans age 18 or older had visited a historic site in the previous year. This was 13 percentage points lower than in 1982, with the bulk of the decline occurring from 2002 to 2012 (Indicator V-13a).
● The decline in historic site visitation from 1982 to 2012 was largest in the 25-to-44-year-old population, an age group that includes many parents of young and adolescent children. However, because no reliable national data on children’s visits to historic sites currently exist, establishing whether a corresponding decline occurred in the percentage of children who visited historic sites is not possible.
● Over the 30-year period studied here, the differences among age groups with respect to rates of historic site visitation substantially decreased. For example, in 1982, the rate of visitation among 25-to-34-year-olds (the group most likely to visit a historic site in that survey) was approximately 11 percentage points higher than that of the youngest age group (18-to-24-year-olds), and more than 17 points higher than that of people ages 65–74. By 2012, however, the visitation rates of 25-to-34-year-olds had dropped to within five percentage points of the younger cohort and fell slightly behind the rate for the older cohort. In 2012, the age group most likely to have visited a historic site was 55-to-64-year-olds, but their visitation rate was only six percentage points higher than that of 18-to-24-year-olds, the group least likely to visit.
● The data reveal generational differences with respect to Americans’ tendency to visit historic sites (Indicator V-13b). With each birth cohort, Americans of all ages have been less likely to visit historic sites. For example, those born from 1938 to 1947 had a 45% likelihood of having visited a historic site in the previous 12 months when they were ages 35–44, while those who were born in the 1968–1977 period had only a 23% likelihood of having visited a historic site when they were the same age.
● As people aged they were less likely to visit a historic site. In each of the three cohorts for which the most complete data are available, the drop-off in historic site visitation over the life course is at least 25%.
|
97e0ef24e79ce6485a2cb75d7c904567 | https://historynewsnetwork.org/article/162411 | After Hack by Neo-Nazi Group, Anti-Semitic Fliers Appear on Campus Printers | After Hack by Neo-Nazi Group, Anti-Semitic Fliers Appear on Campus Printers
Students at various colleges nationwide were stunned and upset Friday to find anti-Semitic fliers (below) on campus printers. Many students initially assumed that someone in their library or residence hall had printed the flier. But as the day went on, more campuses reported the same flier on their printers. The flier -- including two swastikas -- accuses Jews of "destroying your country through mass immigration and degeneracy." A neo-Nazi group that runs the website called The Daily Stormer (named in the flier) took credit for hacking the various printers and expressed pleasure in the distress of students who found the fliers.
Colleges and universities that received the fliers denounced them and said that they were investigating how their printers were hacked and were taking steps to try to prevent further such hacking.
Among the institutions where the fliers showed up were Brown University, California State University at Long Beach, Clark University, DePaul University, Northeastern University, Oregon State University, Smith College, Princeton University, the University of California at Davis, the University of Massachusetts at Amherst, the University of Oregon, the University of Rhode Island and the University of Southern California.
|
3a83f08838a8f9b089db269a3cfa3eee | https://historynewsnetwork.org/article/162476 | Lessons from 1912: Why Trumpmania Probably Won’t Last | Lessons from 1912: Why Trumpmania Probably Won’t Last
This year’s Republican convention appears to be primed for a rupture of a kind we haven’t seen since Teddy Roosevelt broke dramatically with the party in 1912. The parallels are striking: The party is riven between establishment and insurgents. The people’s choice is prone to intemperate remarks and hotheaded declarations—to the delight of his followers and the frustration of party leaders. And, as Roosevelt did, this year’s front-runner claims he’s being screwed by the establishment—even having delegates stolen—and is vowing to do something about it. Donald Trump has openly flirted with the idea of running as a third-party candidate, and on Tuesday he even took back his pledge to support the GOP nominee if he’s not chosen. “[If] I go,” he warned earlier this month, “I will tell you, these millions of people that joined, they’re all coming with me.”
But if history is any guide, even a Trump exodus may be less consequential than many are imagining.
Running as a third-party candidate, TR certainly put up a strong fight, placing second overall. But, beyond that, his eventual decision to run on a third-party ticket had few lasting effects. It didn’t radically change the character of the GOP. It didn’t even present its members with a credible alternative beyond 1912. Like many third parties in American history, the Progressive Party that Roosevelt created was held together primarily by its adherents’ love for its standard-bearer. And that cult of personality wasn’t enough to make up for the history, infrastructure and deep bench of talented leaders that his movement lacked.
At the time, the high drama of the situation suggested that it might turn out otherwise. On Saturday, June 15, 1912, two days before the Republican National Convention was to begin, Theodore Roosevelt, the former and would-be president, pulled into Chicago’s La Salle Station. Despite the late-afternoon summer heat and humidity, huge crowds spilled into the railroad yards. Cheerfully flapping his cowboy hat, Roosevelt rode to the Congress Hotel on Michigan Avenue, several blocks east. Brass bands blared and diehard supporters jogged alongside his car. When his legions remained massed in Grant Park across the street, TR spoke from his hotel room balcony, vowing that William Howard Taft—the incumbent president and TR’s rival for the nomination—wouldn’t get away with stealing delegates (some 72 were in dispute). Two days later, in a formal speech, he said much the same thing, warning that if the delegates weren’t fairly tallied, Republicans shouldn’t feel bound to support the convention’s choice. “We stand at Armageddon,” he bombastically declared, “and we battle for the Lord!” By the time the convention was over, Roosevelt—rebuffed by the party establishment—resolved to form a new political party of his own.
One can imagine a similar scene unfolding at the GOP convention this summer in Cleveland. This year’s struggle, however, may also prove to be long on theatrics but short on consequence. If Trump is denied the nomination and runs as an independent—or even if the #NeverTrump crowd loses and mounts its own third-party run—the GOP will more likely than not remain the same uncomfortable alliance of business conservatives and right-wing cultural populists that it has been, more or less, since the time of Richard Nixon. Whether in 1912 or 2016, one man, no matter how charismatic, strong-willed, or iconoclastic, cannot invent or remake a political party. ...
|
d3e9fd252ff3616e17a16bfa68b245ae | https://historynewsnetwork.org/article/162625 | Why Did MacArthur Become a Hero? In a Crisis We Are Desperate for Leaders. | Why Did MacArthur Become a Hero? In a Crisis We Are Desperate for Leaders.
Douglas MacArthur has been dead for
over fifty years yet he remains one of the most controversial
military figures of the twentieth century. His well-polished
reputation was far from unanimously accepted among his
contemporaries, and a half-century has done nothing to smooth the
contradictions of his personality and his career. There remains no
middle ground with Douglas MacArthur.
Eight hours after MacArthur received reports of
the Japanese attack on Pearl Harbor, his procrastination
resulted in half his air force being destroyed on the ground at Clark
Field in the Philippines. Blindsided by the speed of the ensuing
Japanese invasion, he belatedly ordered a retreat to Bataan but
failed to stockpile food and supplies on the dead-end peninsula.
As his besieged troops slowly starved,
the American public read a far different version. “MacArthur,”
proclaimed the Baltimore Sun—just one of many newspapers
singing his praises, “is something in the nature of a military
genius with the capacity to foresee contingents and make the best use
of resources at his disposal.”
At one level, the public’s
infatuation with MacArthur in the spring of 1942 defies logic. Within
a few short months, he went from the commander who had been caught
with his planes on the ground and ordered the withdrawal that
resulted in the largest surrender in American history, to become
America’s esteemed military hero. The phenomenon is a lesson in the
power of controlled media in unsettled times that continues to
resonate.
MacArthur’s staff tightly controlled
news from his headquarters, not only casting it in singular terms of
“MacArthur,” but also playing up the miracles the general
reportedly achieved with limited resources against enormous odds.
This gave rise to numerous parodies about his ubiquitous communiqués,
including the assertion that all the news on Judgment Day would go to
press in one of them.
The second reason for MacArthur’s
rapid rise to hero status was that the Philippines was then the only
theater of operations where Americans were readily—and
publicly—engaging the enemy. The Doolittle mission, for example,
was kept hush-hush for fear of betraying Allied capabilities and the
battle against German U-boats in the North Atlantic was heavily
censored because of merchant ship losses. By contrast, Americans
could keep their eyes on the Philippines. MacArthur’s dramatic
escape by PT-boat and pithy “I shall return” only added to his
image.
Most importantly, in a fragile period
of the American psyche when the general American public, still
stunned by the shock of Pearl Harbor and uncertain what lay ahead in
Europe, desperately needed a hero, they wholeheartedly embraced
Douglas MacArthur—good press copy that he was. There simply were no
other choices that came close to matching his mystique, not to
mention his evocative lone-wolf stand—something that has always
resonated with Americans.
MacArthur’s stature among the rank
and file left on Bataan was another matter. His starving soldiers
called him “Dugout Doug” and composed countless derogatory
ditties. The most oft-repeated version, sung to “The Battle Hymn of
the Republic,” proclaimed: “Dugout Doug MacArthur lies a-shaking
on the Rock, safe from all the bombers and from any sudden shock.”
There was not much truth to that, but just as with the glowing
positive reports, it was the perception—fueled by repetition—that
stuck.
When a wave of conservative sentiment
promoted MacArthur for the 1944 Republican presidential nomination,
those driving the MacArthur bandwagon found the differing
characterizations of MacArthur in the field and on the home front to
be disconcerting. One MacArthur backer complained that he found it
inexplicable that soldiers returning from the South Pacific were not
enthusiastic MacArthur supporters. A skeptical correspondent went so
far as to suggest that only anti-MacArthur veterans got furloughed
home.
Meanwhile, MacArthur’s positive
image was bolstered by his regular assertions that he was doing more
with less than anyone—despite being ignored by Washington in the
process. “Probably no commander in American history has been so
poorly supported,” MacArthur wrote late in 1943. “At times it has
looked as though it was intended that I should be defeated.” The
only thing more disingenuous was MacArthur’s assertion that his
opinions were rarely sought.
Nothing was further from the truth. In
addition to almost daily communications with the War Department,
Chief of Staff George C. Marshall always dispatched a high-ranking
officer to brief MacArthur on the results of global planning
conferences. After the 1943 Cairo conference, Marshall himself made
the ultimate show of support by returning to Washington via a
globe-girdling tour of the Pacific so that he might confer with
MacArthur in person. As for operating on a shoestring, so many
supplies flowed into the Southwest Pacific by 1944 that Marshall had
to urge MacArthur to get his ships unloaded more quickly lest they
clog the pipeline.
Examine the record and Douglas
MacArthur was as controversial in his times as he remains today. In
the years immediately after World War II, several of MacArthur’s
close associates wrote laudatory biographies and the general’s own
memoirs added an untarnished cap to his reputation. More recent works
have tended to repeat well-worn stories of dubious veracity without
analyzing their source or context. Among these: MacArthur getting
caught with his planes on the ground; true, but the whole story is
much more complicated. Franklin Roosevelt calling MacArthur “the
most dangerous man in America”; an intriguing anecdote—if indeed
Roosevelt ever said it. And George Marshall admonishing MacArthur
that the general had a court, not a staff; an utterance that Marshall
almost certainly never made.
Where then does Douglas MacArthur
stand three-quarters of a century after the four-year period of both
his greatest military defeats and triumphs? MacArthur was a
polarizing figure during World War II and he remains one today. His
most important contribution to history may well have been to appear
as the hero who rallied America and its allies when they were at low
ebb. He became the symbol of determined resolve so desperately needed
in the grim months of 1942. It was the role of a lifetime, and he
played it brilliantly—but not without controversy.
|
7ec2576f9ee9963a3b96c3be88adf2b5 | https://historynewsnetwork.org/article/162814 | Before Germans Slaughtered Jews They Slaughtered Africans | Before Germans Slaughtered Jews They Slaughtered Africans
Herero
prisoners of war in chains. C 1904
In
recent years, some in the African-American community have expressed a
disconnect from Holocaust topics, seeing the genocide of Jews as
someone else’s nightmare. After all, African-Americans are still
struggling to achieve general recognition of the barbarity of the
Middle Passage, the inhumanity of slavery, the oppression of Jim
Crow, and the battle for modern civil rights. For many in that
community, the murder of six million Jews and millions of other
Europeans happened to other minorities in a faraway place where they
had no involvement.
However,
a deeper look shows that proto-Nazi ideology before the Third Reich,
the wide net of Nazi-era policy, and Hitler’s post-war lega
|
48b862138ffa35a6477f1616094ef896 | https://historynewsnetwork.org/article/162889 | ISIS Destroys Ancient Adad & Mashki Gates in Nineveh, Iraq | ISIS Destroys Ancient Adad & Mashki Gates in Nineveh, Iraq
In a new photo report purportedly released by the Islamic State, ISIS confirms news reports of satellite imagery showing the destruction of the reconstructed Adad and Mashki Gates and a large portion of Nineveh's fortification wall. The photo report was released on ISIS terrorist channels on May 15.
|
824f1eac536017a47b51cd46cf24ab7a | https://historynewsnetwork.org/article/163194 | Yes, Yes to "No-No Boy," the New Play About a World War II Japanese-American Who Faces a Wrenching Decision in an Internment Camp | Yes, Yes to "No-No Boy," the New Play About a World War II Japanese-American Who Faces a Wrenching Decision in an Internment Camp
There have been several plays, movies
and television documentaries about the 100,000 or so Japanese-Americans confined at internment camps on the West Coast and
in some other towns during World War II. No-No Boy, a play by
Ken Narasaki presented by the Pan Asian Repertory Company and based
on the novel by John Okada, is the latest and it tells a troubling,
scorching story of one Japanese-American’s decision to refuse to
join the U.S. Army while he lived in one of the camps.
Now, just after
the war has ended, the man, Ichiro, is back in his home town,
Seattle, after a two-year stretch in prison for his refusal to serve
and runs into a blitz of criticism from his old friends and loving
support from his parents. No one really understands why he did not go
into the military, as did so many other Japanese-Americans. His
former friend, Elo, even spits at him. Taro, his younger brother,
outlines the stand of most Japanese-Americans when he tells Ichiro that
“we were born here, we play ball here, we listen to music here,
we’re gonna get married here, we’re gonna have kids here. We owe
this country something for that! I’m an American and I’m going to
fight like an American,” he said.
There are several monologues and
conversations in the play that underline the idea that many people
from other ethnic groups saw themselves as Americans and joined the
army. No group of people was interned in camps except the Japanese,
though. The playwright argues, as all Japanese-Americans argue, that this
was not right.
When he comes back, a friend hooks
Ichiro up with a woman whose husband has seemingly abandoned her, and a
romance starts. He also visits the parents of a boy who was killed in
front of him. He drives around town with another friend explaining
his current plight. His associations with friends, and his desperate
desire to connect with them once again, lead to an explosive ending
of the drama.
Narasaki’s play is yet another
indictment of American policy and we feel sorry that the government
prosecuted and jailed Ichiro and others because it believed, at the
time, that was proper policy and that Japanese-Americans were spies
and saboteurs. The playwright does not make Ichiro any kind of a
hero, though. He sees him as someone who followed his conscience and
rebelled against American policy and the cultural oppression many
Japanese Americans felt in the U.S., but at the same time sees him as
a man who, once he gave his big speech, probably should have joined
the army. Narasaki’s jury is still out on Ichiro.
One really successful part of the play
is Ichiro’s relationship to his father. The father is a lovable guy
and very much like all dads when it comes to his son. He is
constantly trying to give his son money to go out on the town, or to
spend on women. He forgives him just about everything and suffers
emotionally as his son goes to prison and encounters difficulty upon
his return to Seattle. The relationships between Ichiro and his
former friends are good, too, although some cannot forgive him for
what he did.
I have seen a number of plays about
the internment camps. My Dad served in World War II (radar in
England). I met a lot of his war buddies over the subsequent years.
That American story of the war, both in combat and at home for
“Americans” was quite different than the story of the
Japanese-“Americans.” It should have been the same and the great
tragedy is that it was not.
No-No Boy is a very good
play with a harsh, scalding beginning, but it has some weaknesses.
One big problem the play should have addressed was the legal standing
of Japanese-American men. If they were interned in camps, they could
leave by joining the army, and many did so. Some of those who refused
were prosecuted; others were not. Narasaki should have done more
with this and more with the overall backdrop of the internment camps.
We find out little about them. Many Asian-American writers make that
mistake. Asian-Americans know a lot about the internment camps, but
many other people know nothing about them. Playwrights need to fill
in the historical blanks.
The Pan Asian Repertory Company,
a superb theater troupe, did not build a decent set for this play (I
suppose because of budget). The actors sit around in a large
semi-circle and get up for dialogues and small scenes from the play.
The power of the play gets across, but it would work better with a
full set of some kind. Much of the success of the play is due to
spirited work by skilled director Ron Nakahara.
You have to give a yes-yes to this
No-No Boy production because it is stinging and because the
audience learns a great deal of history, and history about a sad
chapter in U.S. life, from it.
PRODUCTION: The play is produced by the
Pan Asian Repertory Company. Sets: Sheryl Liu, Lighting and
Projection Design: Douglas Macur, Costumes: Hahnji Jang, Sound: Ian
Wehrle. The theater company plans to stage the drama at different
national venues later this summer, with cities and dates to be
announced.
|
d24abd4f3df61a169ddae0b5d8cf8434 | https://historynewsnetwork.org/article/163500 | Review of Brian J. Snee's "Lincoln Before Lincoln: Early Cinematic Adaptations of the Life of America’s Greatest President " | Review of Brian J. Snee's "Lincoln Before Lincoln: Early Cinematic Adaptations of the Life of America’s Greatest President "
Every
four years during the Presidential election cycle, Republicans
proudly tout themselves the party of Lincoln. In 2016, however, this
familiar theme seems to be largely ignored as the Republican Party,
founded on principles of free labor and opposition to slavery
expansion, is led by a candidate who appears to have little interest
in history and the principles of inclusion once embraced by his
party. As Brian J. Snee suggests, however, there is considerable
interest among the American people in Abraham Lincoln, a fascination
which is often expressed in the cinema where many Americans learn
their history lessons. This popular interest in Lincoln was most
evident in the 2012 film Lincoln
directed by the legendary Steven Spielberg and which enjoyed both
commercial and critical success with Daniel Day-Lewis earning a Best
Actor Academy Award for his portrayal of the sixteenth President.
While Spielberg’s film has received considerable attention, Snee
focuses upon the major film and television productions on Lincoln
before Spielberg’s movie.
Although the image of Lincoln is addressed in numerous productions,
Snee’s decision to concentrate upon The
Birth of a Nation
(1915), Abraham
Lincoln
(1930), Young
Mr. Lincoln
(1939), Abe
Lincoln in Illinois
(1940), Sandburg’s
Lincoln
(1974-1976), and Gore
Vidal’s Lincoln
(1988) seems warranted. In addition, his thesis that these earlier
films portrayed Lincoln primarily as a champion of the Union rather
than as the great emancipator, a perception altered by Spielberg’s
emphasis upon the slavery issue, appears well grounded in a close
reading of the film texts. Nevertheless, Snee is a professor of
communication and media at Manhattanville College rather than a
historian. He provides readers with insights into critical film and
communications theory that are accessible to the general reader, but
one is often left wanting greater depth of analysis into the
historical Lincoln.
Snee
begins his analysis with the depiction of Lincoln contained in D. W.
Griffith’s controversial epic The
Birth of a Nation.
As the filmmaker’s career was faltering during the Hollywood sound
revolution, Griffith attempted to resurrect many of his ideas about
Lincoln with Abraham
Lincoln,
in which Walter Huston portrayed the title role, but Griffith was
unable to recapture the commercial success of his earlier work in the
silent cinema. Snee concludes, however, that the Lincoln Memorial,
completed in 1922, and Griffith’s Lincoln
“encouraged an entire generation of Lincoln lovers to remember
unity and not freedom as the greatest gift he had bequeathed to them”
(60). Employing the analytical tool of hyperreality, Snee argues
that Griffith’s portrayal of Lincoln in The
Birth of a Nation
as a martyr who gave his life to bring about national reconciliation
following the Civil War has become the accepted image rather than
simply the “reel” description. Snee, however, tends to
underestimate the power of Griffith’s imagery when he writes, “His
interpretation of the Civil War and Reconstruction has not been
adopted and is uniformly regarded as biased and racist” (41). This
statement might be true for academics today, but in the larger
popular culture the myth of the Southern Lost Cause continues to hold
sway as propagated by such Hollywood fare as The
Birth of a Nation
and Gone
With the Wind
(1939). Griffith’s interpretation of Reconstruction as the “rape”
of the South by carpetbaggers, scalawags, and freedmen also reflected
the historiography of leading historians such as William Dunning of
Columbia University and progressive leaders such as Woodrow Wilson.
These myths, despite the best efforts of revisionist scholars such as
Eric Foner, may still be found in some textbooks employed in Southern
school systems.
Lincoln
is also acknowledged as the “great commoner” in such films as
John Ford’s Young
Mr. Lincoln.
Snee observes that there is a degree of conflict in images of
Lincoln as simultaneously ordinary and extraordinary, but he
argues that Ford resolves this contradiction by borrowing from Soviet
filmmaker Sergei Eisenstein’s dialectical montage. It is difficult
to conceive of Ford as being influenced by Marxism, but Snee
maintains that the innate commonness of Lincoln (Henry Fonda), juxtaposed
with his emerging greatness, produced the synthesis of the great commoner.
On the other hand, Abe
Lincoln in Illinois,
based upon Robert Sherwood’s Pulitzer Prize-winning play and
directed by John Cromwell with Raymond Massey in the title role,
concentrates upon Lincoln’s years in Illinois before assuming the
Presidency and presents Lincoln as embodying the principles
established by the Revolutionary Founding Fathers. While the film
depicts Lincoln as a symbol of American democracy, it largely ignores
the divisive issue of slavery. Lincoln is also introduced as somewhat
of an American innocent, and Snee fails to analyze whether Americans
are really as innocent as they would like to be perceived. Also,
America was on the verge of entering a global conflict with fascism
when these films were made, and the historical/cultural context in
which the pictures were produced is not fully developed and analyzed.
In addition, Snee neglects to consider Frank Capra’s Mr.
Smith Goes to Washington
(1939), a film text that employs Lincoln as a symbol of American
democracy although Capra’s America is a white community with few
shades of color.
Snee
concludes his study by examining two television depictions of
Lincoln. Sandburg’s
Lincoln
(1974-1976) was a six-part miniseries adaptation of Carl Sandburg’s
two-volume best-selling biography of the President. Featuring Hal
Holbrook as Lincoln, the series explores both
the political and the private Lincoln, but it fails to follow a
chronological narrative with the exception of the final installment
that concentrates upon the President’s assassination. This time
Snee pays more attention to historical context, insisting that
Sandburg’s sentimentality provided a useful antidote to the
political unrest of the 1960s and early 1970s, from the Vietnam War
to Watergate, as the nation prepared to celebrate its bicentennial.
In the final analysis, Sandburg’s
Lincoln
continues the primary concept of Lincoln as the great unifier;
however, a television series such as Roots
(1977) indicates that Americans were beginning to interrogate the
historical reality of American slavery that cinematic images of
Lincoln failed to confront. Gore
Vidal’s Lincoln
(1988) challenged some of cinema’s heroic assumptions as the
iconoclastic novelist’s Lincoln (Sam Waterston) faces criticism from
Southerners who perceive the President as a tyrant unwilling to
compromise on the slavery issue, Northern politicians who believed
the President violated civil rights and the Constitution in the
prosecution of the war, and abolitionists impatient with the
President’s reluctance to move more quickly against the institution
of slavery. Thus, Snee concludes that Vidal was attempting to
present a more complicated Lincoln than “the simplistic
two-dimensionality of other representations” (135). Nevertheless,
Gore
Vidal’s Lincoln
still presents emancipation as a goal influenced by a concern for
union victories on the battlefield rather than the plight of enslaved
people. Spielberg’s Lincoln
depicts a man far more committed to the principle of equality, but
Snee fails to address the complexity of Lincoln’s views on race and
how these perceptions may have evolved over time; opposing the
expansion of slavery into the territories so that they could remain
open to white social mobility is considerably different from a
commitment to racial equality. Snee’s book would benefit from consideration of
these issues as well as additional discussion of Spielberg’s
Lincoln
as
the changing depiction of the President in this film is essential to
the author’s thesis. Also, the observation on page nine that the
Constitution rather than the Declaration of Independence states “all
men are created equal” is a little disturbing. Nonetheless, Snee’s
observations on the cinematic Lincoln offer some insights that will
be of use to historians, and perhaps a re-examination of Lincoln’s
image may foster further discussion as to what has happened to the
party of Lincoln.
|
99167d16812c8f3f541836f18c729ae2 | https://historynewsnetwork.org/article/163708 | Shoulda Woulda Coulda – A Presidential Role Model Donald Trump Desperately Needed | Shoulda Woulda Coulda – A Presidential Role Model Donald Trump Desperately Needed
If Donald Trump
hoped to win this election, he needed a new role model. He had
flirted with a half dozen, from Barry Goldwater to Ronald Reagan to
Richard Nixon. The latest changes in his campaign staff make it clear
that he has decided he only needs one model – the one he sees when
he looks in the mirror each morning.
Too bad.
Historian Daniel Ruddy has published a book about a new dynamic
model that could have won Trump the election. In a mind-bending tour
de force, Ruddy has demolished the standard image of Theodore
Roosevelt as an early 20th Century liberal. Again and again, Ruddy
smashes the myths about Roosevelt and reveals he was a conservative
reformer, fueled by dynamic energy and ferocious contempt for the
Democratic Party leaders of his day like William Jennings Bryan and Woodrow Wilson.
Seen from this
point of view, Roosevelt reveals an uncanny ability to back causes
that spoke directly to the people, even when his own Republican Party
opposed them. A good example was the franchise tax, which he pushed
through the New York State legislature when he was governor. Theodore
said it was intended to relieve the “improper and excessive portion
of the general taxes” paid by “the farmers, the market
gardeners and the mechanics and tradesmen” of the state. He
declared a corporation which “derives its powers from the state
should pay to the state a just percentage of its earnings as a return
for the privileges it enjoys.” One newspaper called this tax “the
most radical departure in tax legislation New York State has ever
known.”
The tax
established a political stance that would pervade TR’s presidency.
Again and again he enacted policies over the objections of the
organized leadership of the Republican Party but with the support of
Democrats. He calculated that the anger he generated among “machine
Republicans” would be more than offset by his increased popularity
among the people. By the time he left the governorship he had created
a tax system that increased the revenue of New York City alone by $15
million yearly. That was enough money to build a new Brooklyn Bridge
every year.
Next Ruddy tackles
the trustbuster myth. TR supposedly waged war against monopolistic
corporations, breaking them up in order to protect American consumers
from ravenous captains of industry who were intent on stifling
competition and amassing mountains of wealth for themselves. Ruddy
demonstrates that TR’s policy toward the trusts was far more
complicated. By the end of his presidency he was more a trust
regulator than anything else. He praised “good trusts,” as he
called large corporations that complied with the law. They were
encouraged to promote the nation’s economic progress under what
would later become the Federal Trade Commission.
During the
generation that preceded TR’s presidency, the United States had
undergone a tremendous economic and social transformation. The
industrial revolution wrenched the country out of its agricultural
past and thrust it into an unstable present where many Americans
lived in densely populated cities rather than on isolated farms. This
rapid change created new problems that soon found an outlet in
legitimate public complaints about the poor quality of life for those
on the lower rungs of society.
Worse, Roosevelt
noted that the rich flaunted their wealth. As a consequence public
opinion turned against the industrial magnates and Wall Street
financiers. Leftist firebrands generated widespread rage against the
monopolistic one percent.
Meanwhile Roosevelt
was forming his own opinion about the trusts. Catapulted into the
presidency by the assassination of William McKinley, TR noted with
mounting alarm the growth of popular unrest and anger about the power
of the trusts. He soon began saying that the government had a “right
to interfere” in their operations for the public good in the same
manner in which the government regulated banks. This call for
sensible legislation went nowhere.
TR decided he had
to show the American people he was on their side, that he was a
trustbuster as passionately devoted to their interests as the
Democratic populist William Jennings Bryan. TR picked a fight with
the man recognized as the paramount leader of the nation’s
plutocracy – the most powerful financier in the world, J.P.
Morgan.
In April 1901 a
newspaper headline blared “Octopus Gigantic: The Trust of all
Trusts Planned.” It was an announcement that Morgan was about to
create a new combination of wealth and power, joining together the
Northern Pacific and Great Northern railroads, to form the second
biggest corporation in the world after U.S. Steel.
TR’s response
created a sensation. He announced his Justice Department would begin
legal proceedings against J.P. Morgan’s Northern Securities Company
with the goal of dissolving the combination as an illegal “restraint
of trade” under the Sherman antitrust law. This law had been on the
books for a dozen years but had never been enforced.
The news shook
Wall Street like an earthquake. The market capitalization of Morgan’s
railroad combination dwindled from $400,000,000 to $55,000,000 in a
single day. An angry financier allied with the Morgan interests
observed wryly: “the business of the country appears to be
conducted at Washington now.”
As the stunned
Republican Party pondered TR’s decision to prosecute, its leaders were
forced to recognize that Roosevelt had landed a political
masterstroke. His old friend Sen. Chauncey Depew declared
“Roosevelt’s stock is going up with the people every day.”
The Democratic
Party was baffled by TR’s new policy. One of their leaders called
him “a weird magician of politics.” Meanwhile Wall Street reacted
with predictable wrath when it learned that the Republican Party was
no longer in its pocket. Refusing to accept the change with quiet
grace, J.P. Morgan and his allies used the newspapers they controlled
to unleash a barrage of criticism against Roosevelt.
TR brushed aside
Wall Street’s fulminations as nothing more than “wooden-headed
stupidity.” His self-confidence grew even stronger when the midterm
elections of 1902 gave the Republicans control of Congress. It was a
national referendum on TR’s leadership and he won with the popular
wind at his back. Soon he was attacking John D. Rockefeller’s
Standard Oil Trust, winning more cascades of public approval.
Meanwhile his
attack on J. P. Morgan’s Northern Securities railroad combination
went to the Supreme Court, which ruled that it was “a restraint of
trade” under the Sherman antitrust law, and must be dissolved. The
New York World,
a newspaper that was usually a fierce Roosevelt critic, ruefully
admitted: “Politically the effect of the decision can hardly be
exaggerated. It will greatly strengthen Roosevelt as a candidate.
People will love him for the enemies he has made.” With the scalps
of J. P. Morgan and John D. Rockefeller on his belt, Roosevelt
cruised to an easy victory in the 1904 presidential election.
Need we say more?
Here is the style and not a little of the content that Mr. Trump
needed to give him a fighting chance to become president of the
United States. Too bad he’s decided to go it alone.
|
10354b0fb299b769b2bea729fd055629 | https://historynewsnetwork.org/article/163914 | Silver Spring, Maryland Has Whitewashed Its Past | Silver Spring, Maryland Has Whitewashed Its Past
As journalists and academics swept into
American communities in turmoil over African American residents
killed by police officers, James Loewen urged these visitors to
question what they see. “When researching a town or county, if it
is overwhelmingly monoracial, decade after decade, ask why,” Loewen
wrote earlier
this year. The same could be said about a place’s history: if the
history omits African Americans, historians should ask why.
A pair of
Ethiopian restaurants on Georgia Avenue, one of Silver Spring’s
main streets.
I have written about the erasure
of African American history in an Atlanta, Georgia, suburb. Ever
since encountering the deliberate efforts to produce a historical and
historic preservation narrative that fits Decatur’s carefully
crafted municipal image, I now read history and historic preservation
documents with Loewen’s questions in mind. When I returned to
Silver Spring, Maryland, in 2014 after nearly four years in Georgia I
revisited the histories produced about the Washington suburb and
found disturbing parallels to what I was documenting in Decatur.
Silver Spring is an unincorporated
place in Montgomery County adjacent to the District of Columbia’s
northern boundary. Today it is a rich polyglot community with large
numbers of immigrants. The community has been dubbed “Little
Ethiopia” for the many restaurants and groceries that have
opened there in the past 20 years. It is easy to walk through the new
Veteran’s Plaza and hear conversations in Spanish, Amharic, French,
and Georgian.
Yet, Silver Spring hasn’t always been
such a heterogeneous place. It emerged in the first quarter of the
twentieth century as a sundown suburb knit together from several
dozen residential subdivisions that excluded African Americans
through racially restrictive deed covenants. Once a sleepy
agricultural hamlet, Silver Spring became a middle-class bedroom
community for Washington bureaucrats and real estate entrepreneurs.
Typical
Montgomery County racially restrictive deed covenant. Source:
Montgomery County Land Records, Liber 342, Folio 463 (January 23,
1924).
For much of the previous century, the
only African Americans who lived in Silver Spring’s core, the area
within a one-mile radius of the intersection of its two main streets
-- Colesville Road and Georgia Avenue – were domestic servants.
African Americans could work in Silver Spring but they could not live
there, worship there, go to school there, or play there. The
businesses willing to take black money – there weren’t many,
longtime area residents have told me – rigidly enforced Jim Crow
rules: no eating in, no using the front door, and no trying on
clothes or hats.
Silver Spring was a strictly segregated
Southern town that vigorously resisted integration well into the
1960s. The community’s Jim Crow past is well known by the African
Americans who grew up in neighboring suburban communities where
African Americans weren’t excluded and in nearby Washington.
But the fact that African Americans didn’t
historically live in Silver Spring doesn’t mean they were absent
from its historical landscape. They could be found in middle class
homes cleaning, gardening, and raising their employers’ children.
They worked in the back rooms of the stores that didn’t serve
African Americans. And, they built the streets, stores, and homes
that today’s historic preservation advocates celebrate.
The African American presence was just as
important a part of Silver Spring’s history as their absence from
other parts of it: the schools, the tax
rolls, and the growing entrepreneurial class. This was by design:
Silver Spring, a place without formal boundaries, was the creation of
a handful of white real estate speculators who owned the downtown
businesses and who bought the large former farms, turning them into
subdivided tracts that sprouted bungalows and period revival cottages
for thousands of middle class white residents.
The same entrepreneurs who founded the
Silver Spring Chamber of Commerce and who owned the real estate
companies were the boosters behind packaging 11 communities under the
marketing umbrella, “Maryland North of Washington.”
Silver Spring Chamber of Commerce advertisement published in the
Washington Evening Star, September 10, 1927.
The rules in Silver Spring’s
businesses were widely understood by the area’s African Americans
who knew by word of mouth where they were unwelcome. The blunt
racially restrictive covenants attached to more than 50 residential
subdivisions between 1900 and 1948 completed the barrier to African
American entry to Silver Spring society.
Key legal and legislative actions
helped deconstruct the exclusions to African Americans. In 1948, the
U.S. Supreme Court ruled racially restrictive covenants
unenforceable; in 1962, Montgomery County enacted an open
accommodations law prohibiting discrimination by businesses located
in the county; and, in 1967, Montgomery County enacted an open
housing law prohibiting discrimination based on race.
White flight in the 1960s and 1970s
that included the relocation of businesses to nearby shopping malls
turned Silver Spring’s business district into a distressed area
with many vacancies and neglected buildings and streetscapes.
Redevelopment efforts at the turn of
the twenty-first century brought new capital, new buildings, and a
new crop of residents who hailed from Central America, Europe,
Africa, and Asia into Silver Spring. Some people, like local author
George Pelecanos, called it “gentrification.” Others, like Silver
Spring’s historic preservation activists, saw it as a threat to the
community’s heritage.
But what was Silver Spring’s
heritage?
A Silver Spring
Heritage Tour marker located on Georgia Avenue.
To the Silver
Spring Historical Society, it is a narrowly defined part of
Silver Spring’s past: stores, restaurants, movie theaters,
churches, and homes built by and for the community’s white boosters
and residents in the first half of the twentieth century. The
society’s leaders have constructed a nostalgia narrative that –
like the Jim Crow policies from the past – omits African Americans.
“The heritage that the Silver Spring Historical Society …
[celebrates] and aim to preserve is frequently at odds with the
community's history remembered by older African American residents,”
wrote a University of Maryland graduate student in a 2005 Ph.D.
dissertation titled, “Imagined
pasts, imagined futures: race, politics, memory, and the
revitalization of downtown Silver Spring, Maryland.”
The bias towards a romantic,
whites-only history is evident in the two books about the
community written by historical society founder Jerry McCoy and in
the group’s historic preservation efforts. The latter has spilled
over into public policy and planning through the designation of
Montgomery County historic properties and in the Silver Spring
Heritage Tour signs mounted throughout the central business district.
A historic
resources survey of downtown Silver Spring memorialized the
Silver Spring Historical Society’s nostalgia narrative by excluding
African Americans from the discussion of Silver Spring’s history.
The 2002 survey, conducted by a consultant under contract to the
Montgomery County Planning Department, also did not identify or
discuss sites associated with Silver Spring’s civil rights
struggles in the 1950s and 1960s.
African Americans are similarly absent
from the heritage trail. Only one of the signs mentions African
Americans: a paragraph detailing a 1957 NAACP survey of business
discrimination. Even that brief mention appears to minimize Silver
Spring’s segregationist past. The organization, “conducted a
survey of 18 cafes,” reads the text in one marker at the site of a
former Little Tavern restaurant. “Six were cited for refusing
sit-down service to African-Americans, including the Little Taverns
in each community (they only offered carry-out)” [emphasis
added].
Silver Spring in 2016 looks nothing
like the Silver Spring of 1956 except in the community’s histories
and in the monuments to white supremacy (buildings and nostalgia
narratives) that those histories seek to preserve. Historians, public
officials, and the general public should heed Loewen’s charge to
ask why African Americans are missing. I did, and though I didn’t
like what I saw (or didn’t see), the inquiry opens up opportunities for
discussions about equity and for exploring ways to correct the
mistakes of both the distant and more recent pasts.
|
8224016716f4d3434b9c3b63f2ccbaab | https://historynewsnetwork.org/article/163970 | The Obvious Lesson We’re Ignoring from Our Internment of the Japanese During World War 2 | The Obvious Lesson We’re Ignoring from Our Internment of the Japanese During World War 2
Throughout
our history, when Americans have been attacked or felt threatened,
fear and vengeance sometimes have ruled. On more than one occasion
an entire ethnic group of native-born Americans has been branded a
threat. Entire communities have been forcibly uprooted without due
process in a passion first captured by Cicero when he wrote, “In
times of war, the laws fall silent.”
In
this century, 9/11 and more recent jihadist-inspired domestic
violence have spawned speculative calls for databases of Muslim
Americans, mosque closures, and broad banishment of ethnic immigrants
of similar faith. These draconian cries for action are hardly
precedent-setting.
In
the past, America’s retribution against ethnic groups lasted years
and sometimes became part of our culture. When Native
Americans were viewed as a threat to white settlement and expansion,
tens of thousands were forcibly moved onto more than 300
reservations.
The
attack at Pearl Harbor in 1941 led to unparalleled fear and anger
directed at thousands of our neighbors who were our classmates, who
ran restaurants, and who grew our food—simply because they were
Japanese Americans. President Franklin Roosevelt ordered the removal
of more than 100,000 Japanese Americans from the West Coast in 1942
solely because of their ethnicity. There was no due process. No
formal charges. Families were given only a few weeks’ notice to
sell their businesses, homes, personal belongings, and even family
heirlooms. “Japantowns” from San Diego to Seattle were gutted
within a few months.
Indeed,
Cicero proved prescient when our Japanese American neighbors were
sent to internment camps in some of the same desolate regions that
had become home to Native Americans. It was euphemistically called
“relocation” and “evacuation” at the time. But the reality
was far different. Most endured about two years in a prison-camp
environment of barracks as families lived in a single room. They
were surrounded by barbed wire and guarded by armed soldiers, weapons
turned inward.
A
year later, President Roosevelt authorized the segregated Japanese
American 442nd Regimental Combat Team and asked their sons to
volunteer for an army commanded by white officers and possibly die
for their country in Europe and the Pacific. Remarkably, 10,000
volunteers from Hawaii stepped forward. Together with about 1,500
volunteers from the internment camps and the draftees who followed,
they overwhelmed army recruiters.
The
442nd suffered horrendous casualties on questionable missions as it
compiled a remarkable war record. Ultimately the Japanese American
442nd became the most-decorated unit of its size in World War II.
One of its battalions, the 100th from Hawaii, brutally earned the
moniker “Purple Heart Battalion.” The 442nd ultimately earned
more than 18,000 awards for valor, more than one for every man. (Yet
Japanese Americans were denied Medals of Honor until President
Clinton issued 21 in 2000. Only seven were alive to receive them
personally.)
They
returned home after the war and some suffered continuing hatred from
their neighbors. Yet they endured and rebuilt their lives as
parents, teachers, merchants, church leaders, and mechanics. Even
though they had been treated as a faceless, homogenous, and undefined
internal threat against America, for the most part they suffered
silently as they rose above America’s fear and vengeance.
Today
their legacy sounds a cautionary note against partisan political talk
of Muslim American databases; muddled policy statements about Muslim
Americans abroad; and the dangers of American mosques. Today’s
sweeping characterizations of Muslim Americans are a dangerous echo
of America’s treatment of Japanese Americans nearly 75 years ago
when Oregon governor Walter Pierce stated, “Their [Japanese
American] ideals, their racial characteristics, social customs, and
their way of life are such that they cannot be assimilated into
American communities. They will always remain a people apart, a
cause of friction and resentment, and a possible peril to our
national safety.”
His
statement sounds eerily familiar today. It is a sentiment that
sullies the American spirit and one that, against
the backdrop of our history, should be stifled when thoughtful discussions about
national security take place in today’s America.
|
e5c6224e26a3aba62a91f593f6a255ad | https://historynewsnetwork.org/article/164050 | Theodore Roosevelt is finally getting a presidential library. It will be in North Dakota! | Theodore Roosevelt is finally getting a presidential library. It will be in North Dakota!
Related Link Website for the Roosevelt library
The rigorously authentic Elkhorn Ranch re-creation will be the centerpiece of the Theodore Roosevelt Presidential Library & Museum. So far, they have $15 million committed from the state and the city of Dickinson toward an estimated $85 million project. Its organizers hope to open the complex — which they project could draw 300,000 visitors or more annually — on the 27-acre site in time for the centennial of Roosevelt's death on Jan. 6, 1919.
They concede it will be a heavy lift, literally and metaphorically. And they are undaunted.
"The presidential library will be a facility we can be immensely proud of, it will bring many people to Dickinson and we will create something that will endure," said Dr. Bruce Pitts, a retired Fargo physician and board chair of the Theodore Roosevelt Presidential Library & Museum. He knows a two-year timeline is perhaps overly ambitious, but the board is in the process of hiring a national fundraising and marketing outfit and an architect.
"The biggest mistake is to think too small. This is Theodore Roosevelt after all, not Millard Fillmore," said author and scholar Clay Jenkinson. The Dickinson native is head cheerleader and a walking encyclopedia of Theodore Roosevelt's life. He can quote long sections of the 26th president's speeches by memory and portrays on stage the Rough Rider in period costume with teeth-snapping, fist-pounding swagger. His mellifluous voice and erudition are familiar to viewers of Ken Burns' documentaries.
|
84f7040037cceeb7faabc2c5e872f91a | https://historynewsnetwork.org/article/164447 | Are Trump Supporters Seriously Citing the Internment of Japanese Americans as a Model? | Are Trump Supporters Seriously Citing the Internment of Japanese Americans as a Model?
The
ghosts of internment are haunting political discourse once again.
Ever
since Donald Trump began singling out Muslims as a security threat,
some among his supporters have been frighteningly quick to note that
the federal government has done this before, with Japanese Americans.
What they tend not to cite is the congressional study that
definitively found that internment was not based on credible threats
but on fear.
Earlier
this week, Carl Higbie, a former spokesman for an independent
fund-raising committee that backed Mr. Trump, noted that the Supreme
Court upheld the rulings undergirding the internment. But Mr. Higbie
apparently neither knew nor cared that those cases were vacated by
the 9th Circuit Court of Appeals in the 1980s through a coram
nobis or
writ of error process. The court recognized that the Supreme Court
was given false and misleading information when the cases were first
tried.
For
over a decade it has been my honor to work with Japanese Americans
who were among the over 120,000 people removed from their homes and
incarcerated during World War II. My research site, Amache,
Colorado, is one of the ten primary incarceration camps built by the
War Relocation Authority. As an archaeologist, I was first drawn to
this work for the kinds of important anthropological topics the
material remains of the camp can address: for example, how do people
cope during incarceration and how do they remake hostile landscapes.
Indeed, the remains at Amache have revealed myriad strategies for
resilience and community building. Some of these acts were as simple
as making room for toys and porcelain rice bowls in the few pieces of
luggage they were allowed to bring. Others were as complicated as
amending the soil in their prison gardens or building a proper sumo
ring.
But
perhaps more importantly, these physical remains are tangible
touchstones for an often unspeakable history. The fragments of
artifacts evoke the fragmented lives of those who, innocent of any
crime, were nevertheless forced to live behind barbed wire. Site open
houses and museum exhibits provide a venue for former internees and
their descendants to share personal and family stories that have been
too long suppressed. They create safe spaces for talking about the
larger specter of how such sites came to be.
Our
research at Amache speaks directly to an American moment not so long
ago in which racist-fueled hysteria spun quickly out of control and
too few tried to stop it. My students, colleagues, and I share with
Amacheans and their descendants a conviction that such an internment
must never happen again.
As
a nation, we are experiencing the high cost of the fraying of civic
trust. But unlike the weeks following the bombing of Pearl Harbor, we
do have a precedent. It is one that tells us what NOT to do, how NOT
to proceed. We owe it to all in the body politic to remember those
lessons. But we owe it in particular to all those who survived
Japanese American internment and to their descendants. This
past cannot repeat itself on our watch.
|
4cdc8e0c7656f8170f4a02a1d6d57891 | https://historynewsnetwork.org/article/164525 | Historian says it’s a relief to see Trump supporter hammered for suggesting Muslims be registered | Historian says it’s a relief to see Trump supporter hammered for suggesting Muslims be registered
When Donald Trump and other Republican legislators proposed a ban on Muslim immigration to the United States last November, many commentators turned to history. My colleague Matt Ford argued that the incarceration of Japanese Americans during World War II, along with the jurisprudence initially used to justify it, shows why these kinds of ethnic- or religious-based policies are flawed. More recently, Trump and his aides have spoken in favor of reviving a registry for Muslims entering the United States and undertaking “extreme vetting” of Muslims fleeing persecution, including potentially creating holding areas for them outside of the United States.
In the wake of Trump’s election, some Americans fear the possibility that hate crimes and incidents of bigotry will multiply, enabled by the new president’s rhetoric and policies. The comparison between Japanese internment and policy proposals related to Muslims speaks more to this fear than to a significant chance of history being repeated. But Japanese Americans’ experiences are still instructive: They illustrate how America in 2016 resembles America in the 1940s, and show the ways that systematic discrimination can shape a minority group’s self-understanding.
Anne Blankenship, an assistant professor of history at North Dakota State University, recently wrote a book about Japanese incarceration, specifically focusing on the experience of Japanese Christians in the camps. Our conversation about her research and its renewed relevance in today’s politics is below; it has been condensed and edited for clarity.
Green: Before Thanksgiving, a spokesman for a pro-Trump PAC suggested that Japanese internment is a “precedent” for some of the Trump administration’s tentative proposals related to Muslim immigration and registries. As a scholar of Japanese internment, what have you been thinking about most in recent weeks—and throughout this election?
Blankenship: More than anything—and this might sound strange—relief is part of it, the fact that so many people are speaking out against it. And there’s a strong history within the Japanese American community to speak up for Muslims.
This wasn’t the case back in 1941 or 1942. Incarceration was widely accepted. People who were normally considered progressive heroes, like Dr. Seuss, who has all these books on progressive causes, drew cartoons showing caricatures of the Japanese lining up from Washington to California, picking up their bricks of TNT to go do their sabotage. The only people who did speak out against it were the church groups.
It’s nice to see, now, that it’s not just a limited number of people. ...
|
8ebaefca88055e0234e761e5b939b97e | https://historynewsnetwork.org/article/164750 | The Forgotten First Attempt to Plant a Colony on US Soil | The Forgotten First Attempt to Plant a Colony on US Soil
Less
than two generations after Columbus landed in 1492, and while his son
Diego served as governor of Spain’s New World headquarters on
Hispaniola, another Spanish colony was planted in South Carolina, the
first on what is now part of the United States. In the 490 years
since, there has been no official recognition of its existence or its
dramatic story. But its struggles and ultimate fate
speak to today’s unresolved racial issues.
In
June 1526, Lucas Vásquez de Ayllón, a wealthy Spanish official in
the city of Santo Domingo, Hispaniola, founded a colony at or near
the mouth of the Pee
Dee River in eastern South Carolina. Six decades before Roanoke
Island (1587), eight decades before Jamestown (1607), and almost a
century before the Mayflower landed at Plymouth Rock (1620), Ayllón
began his North American dream.
Ayllón’s
effort has been overlooked, perhaps because most people prefer to
believe that US life began with the arrival of English-speaking
Anglo-Saxons living under British law. Perhaps his settlement is
neglected because of its tragic fate— death by mismanagement,
disease, and slave revolt. Perhaps it is unmentioned because of its
unique rebirth in the woods by people not considered a worthy part of
the US heritage.
Ayllón
prepared for his great adventure in 1520 by sending Captain Francisco
Gordillo to locate a good landing site and build friendly relations
with the local inhabitants. The captain instead teamed up with a
slave hunter, Pedro de Quexos. While failing to survey a site or
build good relations with anyone, the two men captured seventy Native
Americans and brought them back to Santo Domingo as slaves. The first
European act on what is now US soil was making slaves of free men and
women.
The
two adventurers returned to Ayllón with enchanting tales of naming a
great river in honor of St. John the Baptist and cutting Christian
crosses in trees. Ayllón was not impressed with their seizure of
seventy Native Americans and brought the issue to the attention of a
commission presided over by Diego Columbus. The Indians were declared
free and ordered returned, but Spanish records do not show whether
the order was carried out.
But
they do show that Ayllón, to make amends with the natives who lost
their loved ones, sent back the slaver Quexos who started the
problem. Once again Quexos returned with captured natives, whom he
this time claimed had volunteered to serve as guides for Spain’s
expedition.
Ayllón
also retained one of the original seventy, Ferdinand Chicorana, as
his New World interpreter. Impressed with his skills and knowledge of
the mainland, he brought Chicorana to Spain to meet the king. After
this meeting the king issued an order permitting Ayllón to
sail for the coast of North America. The king’s orders forbade
enslavement of Indians, and added “you be very careful about the
treatment of the Indians.” Three Dominican missionaries were sent
to protect Native Americans from the Europeans.
With
this record as a backdrop, Ayllón prepared to launch his expedition
to North America. After some delays his fleet of six vessels sailed
from Puerto de la Plata. Aboard were five hundred Spanish men and
women, one hundred enslaved Africans, six or seven dozen horses, and
physicians, sailors, and several Dominican priests.
Mishap
and disaster dogged the enterprise as it landed on the wrong coast,
lost a ship, and Chicorana deserted. The other Indian interpreters
seized by Quexos also fled. The Europeans were on their own.
Determined
to succeed, Ayllón drove his people until they came to a great
river, which was probably the Pee Dee. Selecting a location in a low,
marshy area, Ayllón ordered his men to set up camp. He paused to
name his settlement “San Miguel de Gualdape.” When he ordered the
Africans to begin building homes, he launched black slavery in the
United States.
The
neighboring natives fled inland and stayed away. It was probably
enough for them that the Europeans who had seized seventy of their
loved ones had now returned with Africans in chains.
Europeans,
arriving to exploit land and labor, contrasted in many ways with the
peaceful natives. The Indians lived harmoniously with nature and
shared huge pine, weather-insulated homes that slept about three
hundred people each. Europeans tried to construct homes that kept men
and women in separate rooms. Europeans wrote that these Indians lived
long lives and “their old age is robust.” While European men
dominated their women, Indian women doctors served their people plant
juices to cure fevers.
While
native life moved peacefully ahead, the foreigners slipped
into crisis. Disease and starvation ravaged the colony and internal
disputes tore it apart. The river was full of fish, but few Europeans
were healthy enough to fish. Then an epidemic swept the settlement
and before housing was in place, wintry winds blew in. Ayllón became
gravely ill and died on October 18, after having named his nephew,
Johan Ramirez, his successor. But Ramirez was in Puerto Rico, and the
leaderless Spaniards split into bitter armed factions.
Men
drew swords and marched in howling winds to arrest and sometimes
execute those who wished to become leaders of the colony. Some
survivors complained that in the midst of their tribulations Africans
began setting fires, and Native Americans sided with them and made
trouble.
In
November a crisis erupted when Africans rebelled and fled to Indian
villages. One authority on slave revolts believes the revolt was
instigated by Native Americans angry over whites using their land.
Africans, used to freedom in their homeland, probably needed no
outside prodding to strike for liberty. They understandably fled
enslavement in a dying European colony to start new lives in the
woods among people who also rejected European enslavement.
The
surviving 150 Spanish men and women, no longer able to face a
freezing winter without shelter or their labor supply, packed up and
sailed back to Santo Domingo. It would be another quarter of a
century before Spanish colonists would arrive to build another North
American colony with slave labor.
San
Miguel de Gualdape was not a total failure as the first foreign
colony on US soil. The Europeans left after five months, but Africans
remained to build their society with Native Americans. In the
unplanned way that history meanders and careens, a new community
emerged in the woods – one that also included foreigners from
overseas, the Africans. This new mixed Indigenous and foreign
settlement became the model for many similar American communities,
often called Maroon colonies.
In
distant South Carolina forests, two and a half centuries before the
Declaration of Independence, two peoples of color lit the first fires
of freedom and exalted its principles. Though neither white,
Christian, nor European, they established the first settlement of any
permanence on these shores to include people from overseas. They
qualify as our earliest inheritance.
There
is no way of knowing how long this new settlement remained free of
European intervention
or how it carried on its life, customs and family survival.
Within a century the march of foreign conquest and colonization would
spill into their lovely streams and forests. But while this Black
Indian community lived, it provided the Americas its earliest example
of frontier hospitality, peace, and democratic camaraderie.
Some
will mourn the symbolic loss of the white pioneers of Roanoke Island,
Jamestown, or Plymouth. But can we not take heart from the daring
heritage bequeathed us by the African freedom fighters who fled San
Miguel de Gualdape and by the Native Americans who welcomed them in
as sisters,
brothers and family? The two peoples began a gallant American
tradition carried forward by other Americans at Concord Bridge and
Valley Forge.
This
new community of color says
that our vaunted democracy did not march into the wilderness with
buckled shoes and British accents. Rather it danced
around fireplaces in South Carolina wrapped in dried animal skins and
sang
African and native songs before any British colonists arrived. South
Carolina’s dark democracy lived in family groups before London
companies sent out settlers with muskets, Bibles, and concepts of
private property.*
The
Black Indians of the Pee Dee River
in 1526 also
became the first settlement on the continent to practice the belief
that all people— newcomer and native—are created equal and are
entitled to life, liberty, and the pursuit of happiness. Theirs is a
story worth remembering and teaching our children.
*The
nature and practices of the Black Indian settlements that followed
the departure of the Spanish colonists in 1526 need
further investigation.
Douglas T. Peck, “Lucas
Vasquez de Ayllon’s Doomed Colony of San Miguel de Gualdape,”
Georgia Historical Quarterly, Summer 2001, details its history under
Spanish rule, but not after.
|
ed85838ab92c0157ee1d5e7d56715c09 | https://historynewsnetwork.org/article/164999 | Another National Park Service tweeter has a not-so-subtle message for Donald Trump | Another National Park Service tweeter has a not-so-subtle message for Donald Trump
On the same day President Donald Trump is expected to order a temporary ban on immigrants from certain Muslim countries, as well as a hold on refugees, Death Valley National Park’s official Twitter account posted stark reminders of the U.S. internment of Japanese-Americans during World War II. “During WWII Death Valley hosted 65 endangered internees after the #Manzanar Riot. #JapaneseAmericanInternment,” the first tweet read, matter-of-factly. Following that, the account posted a picture of one particular internee, Togo Tanaka, accompanied by his famous quote against the racist internment policy: “We want the opportunity they have to prove their loyalty. We are asked to accept a denial of that privilege in the name of patriotism.”
|
0828123e103f34464bc712de40e92900 | https://historynewsnetwork.org/article/165095 | During World War II, the U.S. Saw Italian-Americans as a Threat to Homeland Security | During World War II, the U.S. Saw Italian-Americans as a Threat to Homeland Security
The incarceration of Japanese-Americans is the best-known effect of Executive Order 9066, the rule signed by President Franklin Roosevelt on February 19, 1942. And for good reason. The suffering and punishment placed upon innocent Japanese-Americans was a dark chapter in American history. But the full extent of the government order is largely unknown.
In addition to forcibly evacuating 120,000 Americans of Japanese background from their homes on the West Coast to barbed-wire-encircled camps, EO 9066 called for the compulsory relocation of more than 50,000 Italian-Americans and restricted the movements of more than 600,000 Italian-Americans nationwide. Now, the order has resurfaced in the public conversation about immigration.
Says Tom Guglielmo, a history professor at George Washington University: “It’s as relevant as ever, sadly.”
|
2e6f9310321824015141a4182112385b | https://historynewsnetwork.org/article/165204 | On this 75th anniversary of the order to intern Japanese-Americans, many are seeing parallels to the immigration ban | On this 75th anniversary of the order to intern Japanese-Americans, many are seeing parallels to the immigration ban
In 1988, the final full year of his second White House term, Ronald Reagan apologized to the 120,000 Japanese-Americans who’d been confined to internment camps during World War II, of which there were 10 around the nation, and of which Manzanar is the most notorious. The survivors of the camps also received reparations, a rare concession by the American government. “Here we admit a wrong,” Reagan said. “Here we reaffirm our commitment as a nation to equal justice under the law.” The announcement was made in San Francisco, whose Japantown was cleared out by internment, which began in 1942, about three months after Pearl Harbor.
Some of the survivors of the camps, many of them now aged, watched as Reagan, in a mustard-colored suit, apologized for the sins of Franklin Delano Roosevelt.
“I think this is a fine day,” the president added.
|
f2c0740d4dd00e7be2bfb18cc50f0eb9 | https://historynewsnetwork.org/article/165230 | The 75th Anniversary of FDR’s Order to Intern Japanese-Americans Is Frighteningly Timely | The 75th Anniversary of FDR’s Order to Intern Japanese-Americans Is Frighteningly Timely
February
19 marks the 75th anniversary of President Franklin Roosevelt’s
executive order authorizing the forced relocation of Japanese
Americans to internment camps following the outbreak of war with
Japan.
The
decision was a mistake. No internees were ever convicted of
espionage or sabotage. It was a shameful descent into tribalism. Yet
some would have us go down that path again.
Recently
the Hartnett
Gallery at the University of Rochester mounted an exhibit on the
Japanese-American internment camps and how Americans viewed the
Japanese during World War II. Among the lessons: dehumanizing a
whole category of people makes the most horrific violence against
them acceptable.
Anti-Japanese
propaganda postcards went on sale in the U.S. soon after Japanese
Imperial forces bombed Pearl Harbor. Bucktoothed, slant-eyed
Japanese soldiers, small, grimacing, and yellow, are shown being shot
at, punched, slapped, spanked, kicked, stomped, bombed, and
bayonetted. Their aggressors are various representations of the
U.S.A.—a towering and stern Uncle Sam, a smiling G.I. Joe – tall,
powerful, and white.
Wartime
propaganda postcards ridiculed all of the Axis powers, but the
anti-Japanese cards still shock us today because of how viciously—and
literally—they target an enemy of a different race. The violence
they depict reflects raw hatred, retribution, and revenge. The nation
was in shock, its borders breached. These cards offered reassurance
that the world could be put right again.
We
included in our exhibit a handheld dexterity game, a small box with a
glass lid framing a map of Japan. As you tilt the box, two weighted
plastic capsules roll across the map. The goal is to navigate the
capsules into two holes on the map labeled “Hiroshima” and
“Nagasaki.” This toy bears witness to the power of wartime
hysteria to block our capacity for rational thought.
The
heart of the exhibit is photography documenting what remains of 10
government incarceration camps. In the months after Pearl Harbor,
120,000 ethnic Japanese living on the West coast were forced to move
to them. Most were U.S. citizens, but they looked like the enemy.
Milton
Eisenhower, director of the War Relocation Authority, explained the
mass internment in a 1942 film by the Office of War Information:
“We
knew that some among them were potentially dangerous. Most were
loyal, but no one knew what would happen among this concentrated
population if Japanese forces should try to invade our shores.
Military authorities therefore determined that all of them, citizens
and aliens alike, would have to move.”
He
glosses over Roosevelt’s executive order authorizing the U.S.
government to exclude civilians from any area without trial or
hearing. Although the order did not specify any single ethnicity,
Japanese Americans on the West coast were the only group imprisoned
en masse. Eisenhower describes the camps as “full of opportunity”
but in fact they were desolate outposts. Internees were allowed to
take only what they could carry. Most people lost everything. For
three years they lived in barracks within barbed wire enclosures,
surrounded by armed guards.
In
1983, the Commission on Wartime Relocation and Internment of
Civilians described the conditions that led to the violation of so
many people’s rights as “race prejudice, war hysteria, and a
failure of political leadership.” The U.S. government formally
apologized and provided reparations to former internees as part of
the Civil Liberties Act of 1988.
But
now some in our nation would have us repeat the mistakes of the past.
In November 2015, David Bowers, mayor of Roanoke, Virginia, said:
“I’m reminded that President Franklin D. Roosevelt felt compelled
to sequester Japanese foreign nationals after the bombing of Pearl
Harbor, and it appears the threat of harm to America from ISIS now is
just as real and serious as that from our enemies then.” One month
later, Republican presidential candidate Donald Trump
hijacked history by choosing Pearl Harbor Day to call for a “total
and complete shutdown of Muslims entering the United States.” Last
month he acted to follow up on his Pearl Harbor Day promise.
Other
news in 2016 overshadowed two profoundly meaningful events: President
Obama and Japanese Prime Minister Abe visited Hiroshima and Pearl
Harbor together. They came to these sacred sites to reaffirm a
commitment to peace. President Obama spoke of choosing “a future in
which Hiroshima and Nagasaki are known not as the dawn of atomic
warfare but as the start of our own moral awakening.” History shows
that retribution is never a solution—people seize on retaliation
when they are fueled by nationalist and even racist sentiment. More
than ever, we need to think beyond borders and identities, to reframe
the issues at hand in terms of what it means to be human.
|
4ce7d2449eea3d664fbee16b6a0f9d24 | https://historynewsnetwork.org/article/165603 | Exactly One Year Before His Death Martin Luther King Denounced the War in Vietnam | Exactly One Year Before His Death Martin Luther King Denounced the War in Vietnam
Related Link When Martin Luther King Came Out Against Vietnam By David Garrow
“I
am sure that since you have been engaged in one of the hardest
struggles for equality and human rights, you are among those who
understand fully, and who share with all their heart, the
indescribable suffering of the Vietnamese people. The world’s
greatest humanists would not remain silent. You yourself cannot
remain silent.”—Letter from Thich Nhat Hanh to Dr. Martin Luther
King Jr., 1965.
On April 4, 1967,
Dr. Martin Luther King Jr. stood before a crowd of 3,000 expectant
listeners gathered at the Riverside Church in the City of New York.
King was no stranger to Riverside Church, known for its national and
global activism since opening its doors in 1930. For almost a decade,
the civil rights leader’s visit to the Neo-Gothic edifice had been an
annual event; but this evening was different, as King’s message was
a departure from domestic issues about race and economic inequality.
Instead he addressed the most pressing foreign policy issue of the
day: the war in Vietnam.
The even-toned
eloquence of King’s voice echoed throughout the soaring
cathedral as he patiently laid out his case against “my own
government” which he characterized as “the greatest purveyor of
violence in the world today.” Using the platform of the Clergy and
Laymen Concerned About Vietnam (CALCAV),
an interfaith organization created in October 1965, and the sponsor
of the Riverside Church event, King admonished his audience, “Silence
is betrayal.” Hence, “For the sake of those oppressed in the
[American] ghettoes, for the sake of this government, for the sake of
the hundreds of thousands trembling under our violence, I cannot be
silent.”
King’s nearly hour
long speech,
“A Time to Break Silence: Beyond Vietnam,” was a scathing rebuke
of the American government for its role in the escalating violence in
the small Southeast Asian country riven by civil war. The official
position was that U.S. intervention was necessary to prevent what
officials characterized as a “domino effect” for the potential
spread of communism. But King challenged this assertion by recounting
a laundry list of U.S. overreach in Vietnam spanning two decades
which included the financial backing of France’s post-WWII
effort to recolonize the peninsula (1946-1954); its disregard for the
Geneva
Agreement; its support of dictator Ngo Dinh Diem (1955-1963); its
deployment (beginning in March 1965) of an estimated 400,000 American
troops—disproportionately Black and/or poor—to the region thus
“testing out our latest weapons [Napalm] on them just
as the Germans tested out new medicine and new tortures in the
concentration camps of Europe”; and engaging in human rights
atrocities with impunity “destroying families and villages.” The
esteemed orator concluded that rather than a war against communism,
the Vietnam War was a war of imperialism.
That historic
evening served as a capstone of what had been a two year journey of
King’s move “to break the betrayal of my own silences,” which
began not with his membership in the CALCAV, but rather with a letter
from a young Buddhist monk in Saigon named Thich
Nhat Hanh.*
A prolific author,
calligraphy artist, social activist, and humanitarian, Thich Nhat
Hanh was born in Central Vietnam in 1926; he was ordained a Buddhist
monk in 1949, and established the Phuong Boi
(Fragrant Palm Leaves) Meditation Center in 1955. Hanh came
to the United States in 1961 to study and teach at Princeton and
Columbia Universities, but returned to his “fatherland” in 1963
after several monks self-immolated in order to bring international
attention to the ruthless policies of the Diem regime which included
government raids on Buddhist temples.
In his June 1, 1965,
letter
to King, “In Search of the Enemy of Man”—written three months
after the U.S. campaign Operation Rolling Thunder began in Vietnam—
Hanh began by refuting Western interpretations of self-immolation as
mere suicide which is caused by, among other things, “a lack of
courage to live and cope with life’s circumstances,” which “is
considered by Buddhism one of the most serious crimes.” But to
sacrifice one’s self—as the Buddha had done in the Jataka,
or Stories of the Buddha’s Former Births, when he allowed a
starving lioness to devour him instead of her cubs—the monks and
nuns believed they were demonstrating a willingness to “protect
the people” by “practicing the doctrine of highest compassion by
sacrificing [themselves] in order to call the attention of, and to
seek help from, the people of the world.” Hanh reminded King of the
most recent self-immolation which occurred on April 20, 1965, when “a
Buddhist monk named Thich Giac Thanh burned himself.” These were
acts of a desperate people continually victimized by the atrocities
of war with no end in sight. As Hanh emphatically stated to King,
“Nobody here wants the war. What is the war for, then? And whose is
the war?”
The Buddhist monk
compared the long struggle for an independent Vietnam to the Civil
Rights Movement identifying the enemies of man as not man himself,
but rather in the case of Vietnam “intolerance, fanaticism,
dictatorship, cupidity, hatred, and discrimination; and in the case
of the U.S., “intolerance, hatred, and discrimination.” All were
a condition of the heart which could only be cured by divine
intervention via the activism of communities of faith. Hence, Hanh
implored the prominent Black southern preacher stating, “You cannot
be silent since you have already been in action and you are in action
because, in you, God is in action—too.”
Hanh’s letter
awakened King’s conscience and he began immediately to speak out
against the war; however, due to strong opposition from those within
his inner circle who feared that his anti-war stance was
counterproductive to the Civil Rights Movement, King tempered his
public criticism and agreed that it would be more prudent, at least
for the moment, to use his influence behind the scenes.
King and Hanh met in
Chicago in 1966 during the monk's three-month international tour as
a representative of the people, particularly the Vietnamese village
peasant population, who were pleading for an end to the war. At a June
press conference in Washington, D.C., Hanh stated "that the war
kills far more innocent peasants than it does Viet Cong." He
reiterated this point in his book,
Vietnam: Lotus in a Sea of Fire—A Buddhist Proposal for Peace,
stating that the masses cared little about ideology and did not
support the South Vietnamese-U.S. coalition or the National Liberation
Front. "The first problem of the Vietnamese peasant," Hanh
asserted, was "How to survive in the midst of all the forces that
threaten him; how to cling to life itself."
King was so
impressed with Hanh that in January 1967, he wrote a letter
to the Nobel Institute nominating the peace activist for that
year’s Nobel Peace Prize stating, “I do not personally know of
any one more worthy of the Nobel Peace Prize than this gentle
Buddhist monk …. Here is an apostle of peace and non-violence.”
King also echoed the views of his Buddhist colleague and friend in
his Riverside speech, quoting directly from his peace proposal:
This
is the message of the great Buddhist leaders of Vietnam. Recently one
of them wrote these words, and I quote: Each
day the war goes on the hatred increases in the heart of the
Vietnamese and in the hearts of those of humanitarian instinct. The
Americans are forcing even their friends into becoming their enemies.
It is curious that the Americans, who calculate so carefully on the
possibilities of military victory, do not realize that in the process
they are incurring deep psychological and political defeat. The image
of America will never again be the image of revolution, freedom, and
democracy, but the image of violence and militarism (unquote).
In
other words, the U.S. presence in Vietnam guaranteed a Viet Cong
victory.
The
following month King and Hanh met again in Geneva, Switzerland, at
the Pacem
in Terris
(“Peace in the World”) Conference
sponsored by the World Council of Churches; they would meet for the
final time in Atlanta, Georgia, in late February of 1968. By this
time Hanh—having been assured that he faced certain assassination
if he returned to Vietnam for his peace activism—was living in
exile in the South of France. As was customary, the two men spoke
about their common goal for peace and their longing to create Beloved
Communities built on compassion and mutual respect for all of
humanity.
On April 4, 1968, on
the anniversary of his historic speech, King was struck down by an
assassin's bullet. The long night of war in Vietnam persisted for
an additional seven years.
In a poem “Our
Green Garden” from his collection of poems on Vietnam, Hanh
reflected on the tragedy of war stating,
Here is my breast.
Aim your gun at it, brother, shoot!
Destroy me if you
will
And build from my
carrion whatever it is you are dreaming of.
Who will be left to
celebrate a victory made of blood and fire?
Indeed. On April 30,
1975, King and Hanh's prophetic words were fulfilled: the red and
blue flag of the Viet Cong was raised over the presidential palace in
Saigon as the last of the troops headed home to an America boiling
over with disdain.
*King
was the lone Black and southern member of the CALCAV. According to
Hanh in his book Vietnam:
A Proposal (see
footnote 36) “The word ‘Thich’ has been widely but erroneously
interpreted as meaning ‘Venerable’
or ‘Reverend.’ Its actual purpose is to replace, for the monks
and nuns of Vietnamese Buddhism, the family names to which they were
born . . . and represents the family name of the Lord Buddha, Sakya
(in Vietnamese, Thich-Ca; abbreviation, Thich), of whose spiritual
‘family’ they have become a part. The appropriate title in
Vietnamese, which is the equivalent of ‘Venerable’
or ‘Reverend,’ is
Dai Duc." After 39 years in exile, Hanh made his first visit to
Vietnam in 2005. He is now 92 years old and continues to live in the
South of France at his Plum Village Monastery.
|
fdaf8f38fc07cd30fdc1875ab9597162 | https://historynewsnetwork.org/article/165805 | What’s Happened to Historians? | What’s Happened to Historians?
History
isn’t being kind
to historians these days. Shrinking university history
departments are turning over teaching responsibilities to poorly
paid, part time adjuncts. Underfunded museums and cultural
institutions are asking volunteers and interns to do work once
handled by paid professional staff. And the federal government is
once again targeting humanities and education programs for radical
downsizing, perhaps even elimination.
● ● ●
What
happened? With the passage of the GI Bill and the growth of the Baby
Boom generation, post-war Americans overwhelmingly supported public
education—and history was always an essential, if contested,
component of the curriculum. People respected historians, even if
they didn’t always read or understand the narrow scholarship that
university professors were turning out for their peers. Such work
was scientific in its methodologies and free from undue commercial
and political bias, and lots of people trusted professional
historians—like scientists—to uncover, verify, and interpret
factual truths. Government and private enterprise were creating and
expanding museums and historic sites everywhere, meanwhile, and state
and local historians, teachers, librarians, and others were serving
growing and appreciative public audiences. And then with the
expansion of federal aid to education through Pell Grants and the
creation of national endowments for both the arts and humanities, the
National Trust for Historic Preservation, and other Great Society
programs, the good times kept on rolling through the 1960s.
But
that would all change during the economic crises of the 1970s;
enrollments in university history programs dropped ominously and
history graduates found themselves waiting on tables or standing on
unemployment lines. A few professors tried, with mixed success, to
place their graduates in non-university jobs—but budget cuts were
decimating and sometimes even eliminating programs suddenly
considered non-essential (New York’s Office of State History, for
example).
To
be sure, institutions can be agents for democratic change, as they
were during the New Deal and afterwards, but as employment
opportunities declined during the 70s, internal rules constricted,
and bureaucratic hardening of the arteries set in everywhere. Far
too many professionals lost sight of the public interest and became
preoccupied with getting and keeping their jobs. Indeed, the days
when Franklin Roosevelt was motivating people to identify his goals
as theirs, getting them to take ownership of their work, and
empowering them to become leaders themselves were coming to a close.
More than a few archivists and museum curators, for example, prioritized their
collections care responsibilities over their larger educational
mission; historic preservationists often remained blind
to the social purposes their internal policies were intended to
serve; and many academic historians convinced themselves that their
often obscure and inaccessible work was an indication of their
objectivity.
It
seems that far too many history workers were forgetting something
Carl Becker had said in 1940:
educational institutions employed history professionals—not because
they deserved, more than others, “to have their way of life made
easy”—but to maintain and promote “the humane and rational
values which are essential to the preservation of democratic
society.” In other words, Becker believed, like Jefferson, that an
educated electorate was “the cornerstone of democracy.”
That
cornerstone erodes, however, whenever rationally minded people
seclude themselves in their privileged institutional fortresses. At
that point, democratic discussion deteriorates into the kind of
ill-informed and opinionated squabbling that the wealthiest and most
powerful among us will almost certainly misuse to their advantage—as
they did in 1980 with Ronald Reagan as their champion. This was
indeed the right moment to sell a historical
fantasy of visionary risk takers selflessly transforming the
United States from a small agricultural nation into a great economic
power (never mind corporation and patent laws, tariffs, land grants,
state-supported transportation improvements, and other public efforts
to promote economic growth) and of European immigrants achieving the
American Dream, not with taxpayer support, but strictly through their
own initiatives (forget about social security, public works programs,
public schools, labor and Civil Rights laws, etc.).
Many
everyday Americans facing an uncertain future considered this a more
optimistic and emotionally appealing vision of the past than the one
being offered by stale politicians and critically minded eggheads.
So lots of working and middle class people, emboldened by talk radio
and a changing news media, set forth on a campaign against subversive
textbooks, self-serving teachers’ unions, and elitist professors.
Billionaires, meanwhile, used
their enormous wealth to undermine the credibility of historians
and other professionals by employing their own well-educated,
well-paid experts at the Heritage Foundation, the Cato Institute, and
scores of other tax-deductible, non-profit organizations (modeled
after think tanks, such as the Brookings Institution and the Ford
Foundation). These mercenaries helped rationalize the decidedly
unscientific theory of supply side economics and fed the public
predetermined and unchanging notions about American history through
pseudo-science, old time religion, and an originalist interpretation
of the Constitution.
As
memories of child labor, Robber Barons, and the Great Depression
faded, so did FDR’s contention that “true individual freedom
cannot exist without economic security and independence.” More and
more Americans were persuaded that the concept of freedom meant only
the unrestricted pursuit of self-interest. And in Reagan’s
America, government didn’t guarantee that kind of anarchic freedom.
It threatened it.
Regulations,
red tape, and discretionary appropriations thus had to go, and
government agencies were faced with oversight by budget-cutters at
war with the democratic principles of the New Deal and Great Society
(not to mention the Enlightenment and centuries of increasingly
rational thought). These new authorities claimed that corporations
and private foundations would support arts and humanities causes more
generously and efficiently than the government had ever done. But as
companies began running their philanthropic enterprises through
marketing offices rather than community service operations,
government and non-profit groups soon found that corporate support
was less than promised and anything but disinterested.
Tobacco
giant Philip
Morris, for example, helped the National Archives get the
Bicentennial of the Bill of Rights off the ground—but its promotion
of First Amendment rights came conveniently at a time when the
company’s advertising campaigns were being challenged by
anti-smoking advocates.
For
its part, the underfunded National Park Service got into the heritage
park business: partnership arrangements in which the feds provided
local cooperators with grants and technical assistance. These new
enterprises were promising in theory—and certainly less costly than
turning land and operations over to the government—but preservation
and educational considerations often took
a back seat to overstated assurances that the parks would revive
local and regional economies by building lucrative tourist
attractions.
By
the twenty-first century, the tourism industry and for-profit
businesses were selling their services to strapped-for-cash state and
local governments. Governor Edward Rendell,
for example, crippled Pennsylvania’s state history agency with a
forty-three percent budget cut and substantially increased his
support for over-hyped tourism schemes, and New York’s Andrew Cuomo
offered little more than highway signs and happy thoughts to private
museums and historic sites—while pumping millions of dollars into
market driven economic development initiatives (full disclosure: I
was an advisor to that effort).
Some
states facing budgetary shortfalls even shut the doors of time
honored historical institutions: the Georgia
State Archives in 2012, for example, and the Illinois
State Museum in 2015 (both reopened within a year but with
reduced state support). The ground under university
historians—increasingly scorned for what was termed their political
correctness—also shifted. State after state cut its education
budget. Scott
Walker proudly undermined university professors’ tenure and
academic freedom in Wisconsin. And Florida’s Rick
Scott and Kentucky’s Matt
Bevin even advocated higher tuition costs for liberal arts
students whose coursework failed to meet the market demands of their
states’ business interests (e.g., science, technology, engineering,
and math).
*
With
facts mattering less and less, ideas all becoming political, and
truth subject to crass manipulation, political guru Karl
Rove famously boasted that, by 2004, the success of the so-called
revolution that began in the 1980s was all but assured. Scholarly
experts, according to Rove, had become irrelevant. While they were
studying “discernible reality” and talking among themselves, Rove
claimed that he and his patrons (described as “history’s actors”)
were making reality as they chose.
That reality,
of course, turned out to be one in which the earth suffered from irresponsible
development, the middle class shrank, the poor grew poorer, and the fortunate
few enriched themselves beyond even their
own wildest dreams. And by denigrating
professionally sanctioned scientific and historical analysis, political
extremists such as Karl Rove have now enabled the presidency of Donald Trump:
from Trump's professed love for the poorly educated to his distracting people with alternative
facts, far too many of today's Americans seem to be reveling in a Caligari-like
world of amnesia and hallucination.
Still, the
future is uncertain, and there is much history remaining to be written. Many historians, too, know that they can help
write it by taking responsibility for making
history as well as studying it. That is,
they realize that they can help put the nation back on its more rational,
democratic course by experiencing life outside institutional sanctuaries, moving
beyond what Stanley Fish describes as “discipline-specific
activities,” and working responsibly and respectfully with people who often
understand the usable past through personal experiences and local or family
traditions. Maintaining one's
professional values and identity while, in effect, becoming one with
one's fellow citizens is no easy trick, of course, and mastering it will require anyone
who still subscribes to a narrow definition of professionalism to do some
serious soul-searching. What, after all,
is the alternative? As their own history
suggests, the option of staying the course will lead professional historians further
and further into self-satisfied obscurity—and the nation into an increasingly unhappy
and undemocratic reality.
|
3bcc2ad75e6f5a0e8128e05119e81417 | https://historynewsnetwork.org/article/166348 | At the Last Minute the ISIS Captor Had Lowered His Demands and We Had Rescued Fayza from Certain Death in Mosul | At the Last Minute the ISIS Captor Had Lowered His Demands and We Had Rescued Fayza from Certain Death in Mosul
Fayza being driven to freedom after her rescue.
It
began with an electrifying text message from a Yazidi member of a
network dedicated to freeing Yazidis (an ancient race from Northern
Iraq that adheres to a pre-Abrahamic faith that can be traced back to
Babylonia, Persia and Mesopotamia). The message was stark and simple:
"We have a fourteen-year-old girl whose ISIS captor is willing to
sell her for 17,000 dollars. Her name is Fayza Murad from the
northern Iraqi town of Siber." If we could get to Iraq with the
required sum, we could save one of the thousands of Yazidi girls who
had been dragged off and sold into slavery by the ISIS fanatics who
conquered their remote homeland in Northern Iraq in August of 2014.
If we did not obtain the money, there was a high probability that
Fayza would never be seen again, as the ISIS "Caliphate" was
beginning to collapse under the assault of the Kurds, the Iraqi Army, and
U.S. coalition bombing.
Thus
began a frantic search for money that led me and a brave group of
multinational volunteers led by a fiery English woman named Anne
Norona from the safety of our homes to the sprawling refugee camps in
the burning deserts of Northern Iraq. For me it was to be the
culmination of a long journey to explore the history of a dying race
whose origins lay in the mists of time.
Lailish:
An Entry into the World of a Dying Race
My
journey to comprehend this fascinating race, which had endured jihad,
extermination, conquest and enslavement for centuries at the hands of Muslim
neighbors who defined them as "Devil worshippers," began while
researching a history of America's wars in Afghanistan, Iraq and
Syria (Counter
Jihad: The American Military Experience in Afghanistan, Iraq and
Syria). In the
winter of 2016 I was invited to the frontlines by two prominent Kurdish generals
leading the assault against ISIS, whose territory in Mosul and
Northern Iraq lay perilously close to their own capital, Erbil, and
their mountainous homeland known as Kurdistan. There I was hosted by a platoon of brave
Kurdish Peshmerga fighters ("those who face death"), the
volunteer fighting force that defended the Kurdish sanctuary in the
mountains of northeastern Iraq. As we looked across the valley at the
ISIS positions facing them, I was introduced to my first Yazidi.
This
source enthralled me with stories of the ancient rituals of his
people, whom the world gravely misunderstood as "pagans," and
brought to life the epic story of his long-persecuted race. It was
this fascinating narrative that inspired me to travel northward from
our fire base at the newly recaptured Mosul Dam to the ancient heart
of the Yazidis, their remote mountain temple located dangerously
close to ISIS’s frontlines in Iraq. There I was provided with a
rare opportunity to access a stone temple built in a bygone era and
see Yazidis solemnly praying, dipping their hands in a sacred pool of
Azrael, the Death Angel, and even given the extraordinary opportunity
to meet their second highest priest, Baba Chavush. As I sat with this
holy man, who blessed me and my fellow companion, Adam Sulkowski, he
spoke of centuries of genocide as well as his hopes for peace for his
people and all of humanity. With a gold peacock next to him, the
figure that represents the Yazidis' primary god,
Malak Tawus (the Peacock Angel), he lamented the fate of thousands of
Yazidi girls who had been dragged against their will from their
families into ISIS captivity and forgotten by an uncaring world.
I
flew back to my own safe home in Boston feeling both blessed for
having been given such a rare entry into the mystical world of one of
the most ancient races in existence, but at the same time troubled by
the pain in Baba Chavush’s voice as he described the unimaginable
and horrific fate of Yazidi slave girls living in the clutches of
their fanatical ISIS captors. Their story moved me to write articles
about the Yazidi
plight, but there was not much more I could do, after all I was
just one man living far away from the warzones of the Middle East.
Little
did I know, however, that there was another person on the planet, far
braver than myself, who had decided that she would make that
difference. It was my discovery of Anne Norona that was to take me
from Boston and once again launch me into the maelstrom of the Middle
East just as ISIS’s greatest triumph, Mosul, collapsed under the
assault of a vast array of armies and militias bent on revenge.
Anne
Norona: Single Mother, Nurse, and "Angel of Sinjar"
Following
my field research in the embattled mountains of Iraqi Kurdistan, I
began to connect with a growing network of Yazidis whom I had met on
Facebook. They spoke of their dreams for the liberation of their
homeland, a return from their refugee camps to their holy mountain
haven, Mount Sinjar, and most painfully of the plight of thousands of
daughters and sisters living as Sabbiya
(Koran endorsed slave girls). I was even shown a horrific video some
Yazidis had acquired of black clad, heavily armed, bearded ISIS
fighters waving the black banner of Jihad and shouting “Allah u
Akbar!” (God is Greatest) as they triumphantly dragged screaming
girls as young as eleven from the pleading hands of their terrified
mothers. I was nauseated when I heard that ISIS members considered
raping “Pagan infidels,” to be an act of worship. I was moved by
online interviews of members of this peaceful race who spoke of the
horrors of enslavement by the men who had ritualistically slit the
throats of their fathers and brothers, gunned down women over the age
of 40 in trenches and blown up their ancient temples with their
priests still inside of them.
Some
of the most impactful images I had ever seen in my life were of a
Yazidi girl
named Nadia
Murad
who had escaped captivity and told the world of the horrors she had
endured during her time as an ISIS slave.
It
was as I burned with a sense of helplessness, fury and desire to help
that I received an unusual Facebook message from someone identifying
herself as Anne Norona. Her initial messages were guarded and she
wanted to know where my interests in the Yazidis came from. When I
explained I was a Welshman/American who had dedicated his life to
performing fieldwork amongst various persecuted ethnic minorities
ranging from the isolated Kalash
Pagan’s on the Afghan Pakistan border,
to the embattled Chechen
highlanders of the Russian Caucasus,
to
the dying Crimean Tatars of Ukraine/Russia,
to the Kosovo Albanians and Bosnians she began to open up to me. In
the process I got to know someone whose life dream was to “grow
flowers in my garden and save Yazidis.”
It
soon became apparent that Anne was a fascinating English globe roamer
of the sort that had marched out and conquered much of the world and
provided us such names as Lawrence of Arabia, Gertrude Bell
(wonderfully played in a recent movie starring Nicole Kidman), and
Dr. David Livingstone (who disappeared in the depths of Africa in the
19th century). Anne similarly burned with the desire to get out into
the world and help others, but instead of writing books and articles,
as I did, she put boots on the ground and worked as a volunteer nurse
in places ranging from Haiti to the Greek island of Lesbos, located
just off the Turkish coast. Having spent thirteen summers in
Turkey living south of Lesbos with my ex-wife Feyza's family in the
beautiful coastal village of Cesme, I had myself witnessed the flow of
desperate Iraqi and Syrian refugees fleeing through Turkey in a
desperate attempt to reach the Greek Isles and obtain asylum in the
European Union. Lesbos was the frontline on the largest immigration
of humans since World War II and tens of thousands of refugees were
living in squalor in makeshift refugee camps on the island.
It
was while Anne and a team of volunteers were working with the Health Point
Foundation under Dr. Hadia Aslam in the medical tent in
Moria, on Lesbos, that she came across her first Yazidis. Much of the
volunteers' work consisted of online communication among an
amazing group of core humanitarians from around the world who worked
tirelessly and remotely to ensure the refugees received help in every
way, from legal to medical to boat rescue to basic assistance and
supplies of food and clothes. These volunteers were doing the job
that the big NGOs were so negligently failing to do.
For
Anne, a single mother who had run away from home as a rebellious
teenager and explored much of the world from Africa to the Orient,
her meeting with the Yazidis was in many ways a fulfillment of what
the Arabs call kismet,
“Fate.”
The
Yazidis Anne encountered were different from all of the other Muslim
Arab refugees in the Lesbos camps. They were physically smaller,
more shy, often embarrassed to receive assistance, and sadly
faced continued persecution from Arab/Muslim refugees who mocked
them by chanting "Allah u Akbar" or even attacked them. In many ways
they had been deprived of much of the assistance going to the
Arab/Muslim refugees as a result of their shyness and continued
persecution. It was while working there that it became obvious to Anne and
her medical team, whom she dubbed "The Mosquitos," that the
Yazidis needed special care, and that is how Anne's life was changed
forever.
Anne
and her then Yazidi counterpart and friend Shaker Jeffery became
involved in the personal cases of Yazidis, realizing that they had
the best of both worlds, Anne having all the contacts in Greece and
Shaker all the Yazidi contacts. It was the perfect match. With this
combination they were able to help with cases ranging from a young woman who
urgently needed an eye operation to save her from certain blindness,
to finding emergency rescuers for Yazidis petrified and
surrounded by violent smugglers in Macedonia, to alerting the Greek
coastguards when Yazidi boats were crossing the Mediterranean Sea and
encountering difficulties.
Anne’s
instinct to side with the underdog and to fight in their corner
propelled her determination to defend these much-persecuted people.
Ultimately, this burning sense of mission drove her to Iraq itself,
where she assembled a trusted team of Yazidi key workers and doctors who
joined with her to provide emergency support to the most vulnerable
in any given situation. She soon became known throughout the Yazidi
community as someone to be contacted in moments of need and remained
available 24 hours a day online. She would utilize 'crowd funders'
on Facebook to raise money for desperate cases, providing emergency
assistance for ISIS survivors, orphans, and medical cases.
When
Anne made her way back home to Britain, to her self-constructed home
which she calls “The Shed” situated in a flower covered field
near the cliffs of Penzance in remote Cornwall, England, she
continued her work to assist Yazidis in obtaining passports,
supporting survivors and orphans, providing access to medical
treatments, and on occasion even helping to free one of the poor
Yazidi girls trapped in ISIS slavery.
While
Anne would make desperate pleas for help online and among her local
community, her mission to provide multifaceted assistance to a race
that found itself scattered in refugee camps far from home and
facing extinction went largely unnoticed by an uncaring world that
was more interested in things like Donald Trump’s latest Twitter
storm or Kim Kardashian’s weight gain.
Fayza with her father after her rescue
Operation
Fayza: A Mission to Free One Slave
The
mission to free Fayza actually began in May of this year when I was
carrying out fieldwork in Bosnia for the defense in a Federal
terrorism case. It was at this time that Christopher Natola, one of
my brightest students who had assisted me in writing my book Counter
Jihad,
suggested that I actually go to scenic Cornwall, England to meet Anne
while I was in Europe. Spurred on by his words, I took a flight from
Sarajevo to London (sadly arriving on the night of the terrorist
attack on the pop concert in Manchester) and took a wonderful
five-hour train ride across England, down to the cliff side town of
Penzance to meet the woman who so fascinated me.
I
was welcomed at the train station overlooking a scenic bay and was
driven by Anne to the famous “Shed” in her amazing garden. For a
few days I did an “embed” with Anne and got to see her in action.
Living with Anne was like being in the center of a one-person global
enterprise that saw her communicating via Facebook with Yazidis who
had found asylum in Germany, members of her network in Iraqi
Kurdistan trying to free a sex slave, hosting fundraisers in her
local community, and in between taking time to tell me personal
stories and showing me pictures of all of the Yazidis she and her
network of “Mosquitos” (as her team was called in their secret
Facebook group) had helped.
Anne
did all of this while single-handedly raising a wonderful son and
working as a nurse. I was in awe of her. Anne, a single English
mother, was making a difference in a world dominated by war,
fanaticism, cynicism and apathy. Her story was almost Hollywoodesque
in its beauty. Anne demonstrated that nothing is impossible, that one
person can make a tremendous difference.
I
flew home back to Boston inspired to tell her story and it was at
this time that the now famous text message arrived, “We have a girl
named Fayza, her ISIS captor is asking 17,000 dollars for her
release or she will disappear into the burning black hole that is
Mosul, Iraq.”
We
needed money and we needed to get it to a smuggler, who would take
most of the profits for going into the heart of darkness, to evacuate
Fayza out of besieged Mosul. I was deeply touched by the fact that a
young Yazidi girl who had the chance to be liberated had the same
name as my former wife Feyza. I lost no time in contacting Feyza and
she instantly offered her support to our cause. Together we collected
funds to assist and with Feyza’s blessing and prayers for
protection, I decided to join Anne and her team which included: K.P.,
a Canadian Sikh optometrist; Juliet, an English woman from Devon;
and Baderkhan and Khairi, Yazidi friends and members of The
Mosquitos.
In
early June I flew from Boston to Erbil in Iraqi Kurdistan with the
raised funds. There I reunited with Anne, who by now I had dubbed the
“Angel of Sinjar” (Sinjar being the Yazidi’s sacred mountain).
With Baderkhan and Khairi as our local guides, we drove northward
parallel to the frontlines of the ongoing war with ISIS towards the
northern town of Duhok. As Anne’s contact kept us updated by the
hour, we waited anxiously to see if the money we delivered would
actually free Fayza and reunite her with her family.
While
we waited for news in 110 degree heat we visited various Yazidi
refugee camps where we met with girls who had been recently liberated
from slavery. There I watched as Anne, Juliet, and K.P. gave each
girl several hundred dollars (a small fortune for these, the poorest
and most traumatized of refugees who had returned from slavery with
only the clothes on their backs).
Apart
from those who had literally just escaped captivity, Anne knew all of
her cases and their families intimately and was greeted with hugs and
tears as she met with one Yazidi woman who had the sad fate of having
lost her husband to ISIS and had suffered for 3 years with a
prolapsed/herniated disc in her back, with 11 children to care for
and no way of making a living while stuck in a tented camp in
Kurdistan. Anne and her team went from tent to tent reuniting with
people who had become well known to them. In the process, money was
given to a woman who needed surgery, toys were given to children of a
former ISIS slave, and Anne met with UN High Commission for Refugees
representatives to discuss a Canadian resettlement program. We all
waited anxiously for word on Fayza.
Then
came the news we had been waiting for: the smuggler sent a triumphant
cell phone photo of himself driving Fayza, whom we had only seen in
ISIS photographs nervously wearing a headscarf, from
Mosul to her parents in the refugee camp. At the last minute, the
ISIS captor had lowered his demands and we had rescued Fayza from
certain death in Mosul and reunited her with her family.
The
images of Fayza being embraced by her weeping father and her mother
were for me in many ways a rare image of joy in a land defined by
death, misery, fanaticism and slavery. Our team did not probe Fayza
on her personal details or the horrors she experienced; it was not
our place to do so. Sadly, there is rarely consistent psychological
counselling for Yazidi girls or child soldiers freed from ISIS.
Depression and post-traumatic stress disorder are sadly extremely
common. We knew that their lives had been shattered and picking up
the pieces would take many years, but we all took consolation in the
fact that our small group had made a difference. One beautiful young
Yazidi girl now had something that so many other sex slaves did not
have: freedom and a chance to live her life. Although Fayza is now
out of the reach of her ISIS tormentors, her future is still
uncertain, as she is living in tent 16 of the Chem Misko refugee camp
amongst tens of thousands of fellow refugees in the town of Zacho.
While it is difficult to know what sort of demons, nightmares or PTSD
Fayza is suffering from, I took some consolation from the last
image I saw of Anne enveloping Fayza in her loving arms and saving
one more of her “Children”.
I
am now safely back in Boston once more, and I guess some of my own
demons and sense of guilt that long haunted me have been exorcised by
the freeing of just one fourteen year old girl from the horrors of
slavery at the hands of brutal terrorists. But I, like Anne, have
been touched to my soul by the plight of the Yazidis, and
particularly of those young girls still languishing in captivity. I
cannot help but wonder how much more we Americans or Europeans would
care if we had saved one American or British girl from slavery.
It
is the image of Fayza sitting in Anne’s arms smiling at the camera,
still in shock, that inspires me now to make this plea. If you have
long felt that you cannot make a difference in the world, overcome
your apathy and doubt in order to believe that you can. And you can
start by reaching out to Anne and assisting her in her mission
through funds, online activism, or, who knows, perhaps traveling to the
windswept deserts of sun-blasted Northern Iraq to help one
determined English woman save Yazidis … one person at a time.
To
assist Anne please be sure to visit the following Facebook group page
–Y.E.S
– Yazidi Emergency Support group.
|
a29ee48bd1f5255ca01e9669bc503430 | https://historynewsnetwork.org/article/166640 | The Origins of American Imperialism: An Interview with Stephen Kinzer | The Origins of American Imperialism: An Interview with Stephen Kinzer
In
1898, the United States won a quick victory in the Spanish American
War and liberated Cuba, the Philippines, Puerto Rico and Guam from
Spanish colonial rule. But the war sparked the greatest foreign policy
debate in American history as the best minds of the age considered
whether the United States should grab, “civilize,” and dominate
foreign lands or leave the people of those countries to rule
themselves.
Expansionists
led by Theodore Roosevelt and Henry Cabot Lodge with the help of news
baron William Randolph Hearst ultimately won the argument then, but a
closely divided nation questioned the new imperialism as influential
thinkers including Mark Twain, Booker T. Washington, Jane Addams,
Samuel Gompers, and Andrew Carnegie warned against foreign
intervention and cited the terrible consequences of European empire,
including the brutalizing of colonial subjects.
And
it was a time when United States forces evolved from liberators
to occupiers who crushed the independence movement in the horrific
Philippine American War (1899-1902), leaving over one hundred
thousand Filipinos dead—mostly civilians—in a conflict fueled by
a sense of American superiority and divine exceptionalism that
presaged our future wars of intervention in Vietnam, Iraq, and
Afghanistan.
Award-winning
foreign correspondent and expert on foreign policy Stephen Kinzer
chronicles this overlooked history in his new book
The True Flag: Theodore Roosevelt, Mark Twain, and the Birth of
American Empire (Henry Holt & Company).
He covers the raging debate in detail over intervention based on
extensive research of official documents, letters, diaries, and other
resources. He stresses how this debate erupted over the role of the
U.S. in the world and dominated news and discussions at the turn of
the twentieth century.
Mr.
Kinzer’s book appears at a time when America is again examining its
role in the world, and the issues argued in this forgotten history
are still relevant today—although these concerns likely will not
garner anywhere near the wide attention they received almost 120
years ago.
The
title of the book, The True Flag,
comes from a speech by prominent anti-imperialist Carl Schurz, a
German immigrant who served as a Union general, U.S. Senator, and
Secretary of the Interior:
Let
us raise the flag of our country—not as an emblem of reckless
adventure and greedy conquest, of betrayed professions and broken
pledges, of criminal aggressions and arbitrary rule over subject
populations—but the old, the true flag, the flag of George
Washington and Abraham Lincoln, the flag of government of, for, and
by the people, the flag of national faith held sacred and of national
honor unsullied, the flag of human rights and of good example to all
nations, the flag of true civilization, peace, and good will to all
men.
In
his study of this period, Mr. Kinzer demonstrates the dangers and
folly of a foreign policy of violent intervention and domination.
Mr.
Kinzer, an award-winning
journalist, worked
as The New
York Times’s
bureau chief in Turkey, Germany, and
Nicaragua and as The Boston
Globe’s Latin
America correspondent. His other
books include The
Brothers: John Foster Dulles, Allen Dulles, and Their Secret World
War; Reset: Iran, Turkey, and America's Future;
A Thousand Hills: Rwanda's Rebirth and the Man Who Dreamed It;
Blood of
Brothers: Life and War in Nicaragua; Overthrow:
America's Century of Regime Change from Hawaii to Iraq; All
the Shah's Men: An American Coup and the Roots of Middle East Terror;
Crescent and Star: Turkey Between Two Worlds;
and Bitter
Fruit: The Story of the American Coup in Guatemala,
with Stephen Schlesinger. Mr. Kinzer
also serves as a senior fellow at the Watson Institute for
International and Public Affairs at Brown University and writes a
column on world affairs for The
Boston Globe.
Mr. Kinzer talked
about The True
Flag by
telephone from his office in Boston.
Robin
Lindley: You’ve written widely on American foreign policy and
diplomatic history. Now, in The True
Flag, you examine the period of the
Spanish-American War and the Philippine-American War. Your book could be
entitled The Origins of American
Imperialism, and you describe the
tremendous debate over expansionist policies then. What sparked this
book now?
Stephen
Kinzer: All American foreign policy questions
can be narrowed down to one sentence and, in fact you could narrow
them down to one word, which is intervention. All of our major
questions in the world now are about where we intervene and for what
purposes and with what means.
We
are the country that intervenes more frequently, in more countries,
and farther from our own borders than any other
country. Why are we like this? How did we get this way? Where did
it begin? I’ve always been intrigued by these questions. Often we
look for the answers to these questions in the period after World War
II when the U.S. truly became a global empire.
Actually,
when I looked more deeply into the background of those questions, I
saw that the crucial decision was made earlier, in the period around
1898 to 1900. Looking back at that time made it very clear to me how
aware everybody involved in the debate was that it would shape the
future of the United States. Everybody debating the issue in 1899 in
the U.S. Senate, for example, understood that he was not debating
only one issue such as whether the U.S. could take the Philippines.
Those senators and other opinion makers across the country, as one
senator called it, were debating the greatest question that had ever
been put before the American people.
In the
history of American foreign policy, I realized this was the formative
debate, the mother of all debates.
Robin
Lindley: Didn’t the imperialist sentiment of this period, in a way,
grow out of the westward continental expansion and the idea of
Manifest Destiny?
Stephen
Kinzer: Yes. I think you can see a continuity
in the history of American expansionism. You could argue that the
United States has been expanding since the Pilgrims landed at
Plymouth Rock.
Perhaps
the history could best be understood as coming in three phases.
First, the United States created a continental empire in North
America by clearing native people and seizing a large part of Mexico.
Then, in the period after 1898, we became an overseas empire. And
then finally, after World War II, a global empire.
When
the Census Bureau in 1890 declared that the American frontier was
closed, that posed a dilemma for the United States. We had been
expanding for so long and, in the 1890’s. there was a sense that we
needed foreign markets for our goods and foreign raw materials. We
had to face this question: What do we do after reaching California?
Once we conquer North America, do we turn inward and do we do
something different and stop trying to conquer other lands? Or do we
continue overseas? That was the essence of this debate.
Robin
Lindley: Your book illuminates this basically overlooked period in
history. Most of us in school probably learned little of the Spanish
American War except for the sinking of the U.S. battleship Maine
in Cuba and Teddy Roosevelt’s charge up San Juan Hill. And I think
most Americans probably learn nothing of the brutal war waged by the
U.S. in the Philippines that followed the victory over Spain
there.
Stephen
Kinzer: I think you’re right, and it’s
another example of how not just Americans but all people like to
remember things that they did or their country did that put them in a
good light.
We
tend to forget episodes that don’t show us in the way that we like
to think that we are. The Philippine War falls in that category. We
left hundreds of thousands of Filipinos dead in a horrifically brutal
campaign. We had our first torture scandal. We had serious war crimes
committed as a matter of official military policy. And yet very few
Americans are even aware that this war ever happened. Actually, it’s
been a huge scar on the minds of Filipinos and it’s well known in
East Asia, but because it doesn’t fit into our narrative of what we
do in the world, we’ve allowed it to fall out of our history books
and our consciousness.
Robin
Lindley: As I recall, this period was sanitized and glorious in our
old schoolbooks. There was the Great White Fleet and a glorious new
American Empire. We didn’t learn that there was an anti-imperialist
movement. Your book is a corrective.
Stephen
Kinzer: I recently photographed a monument in
San Francisco to the veterans of the Spanish American War who were
described in the plaque as having “extended the hand of friendship
to alien people.” That is the narrative that Americans are told
about this period. Our ignorance of what really happened feeds our
puzzlement as to why we are not so beloved in the world. We are invested
in this view of our own history, and therefore Americans are surprised
when people with more direct experience as victims of our foreign
policy don’t look at us the way we look at ourselves.
Robin
Lindley: And that seems to hold true for the general view of Theodore
Roosevelt. He’s remembered as an energetic genius who wrote dozens
of books and was devoted to the environment and progressive domestic
policies. As you point out in your book, however, he was also a
bloodthirsty militarist, a rabid imperialist and a racist when it
came to non-white people in other lands. He was seen as insane by
some detractors, including Mark Twain. I don’t think we usually get
that view of Roosevelt.
Stephen
Kinzer: I had a great deal of fun learning
about the main characters in my book, Mark Twain and Theodore
Roosevelt. In some ways, they're very different. Theodore Roosevelt
was a spoiled rich kid. He grew up looking at ships from his estate
on Oyster Bay. He became fascinated with navies, as young boys
sometimes do. He traveled to other countries as an aristocrat who got
to know European capitals much more than he got to know anything
about the way most Americans live. He liked to shoot animals. He had
tremendous contempt for people in non-white countries and had no
belief that they could rule themselves.
Mark
Twain was very different. He also traveled widely but not to shoot
animals. He really got to meet people. He had been in places like
India and South Africa where the state of European imperialism was
quite brutally clear. He had great sympathy for the native people
that Roosevelt held in such contempt.
On the
other hand, in some ways they were similar. Roosevelt and Twain were
both prima donnas. They both created an image for themselves and
invented themselves in a way. They were people that could never turn
away from an interview or a mirror. In a sense they epitomized the
divide in the American soul during this period. Mark Twain believed
that every human being was as good as every other human being and if
the United States could produce people that could rule their country,
then the Philippines and other countries could rule themselves too.
Theodore Roosevelt thought this was nonsense: that people who were
non-white had no way of ruling themselves and needed to be ruled by
others.
We’re
still debating that in our own minds. What do we want to do in the
world? Americans want to guide the world, but we also want every
country to guide itself. These are opposite impulses and we can’t
do both of them. But we still hold them both in our minds and, in a
way, Roosevelt and Twain represent that dichotomy.
Robin
Lindley: They are two complex personalities. I believe that after his
presidency, Roosevelt didn’t even mention the Philippine-American
War in his memoirs. Do you think he was displaying some remorse?
Stephen
Kinzer: Roosevelt had an interesting turn of
mind in the period after he became president. As a vice president and
as governor of New York, he was a forceful advocate of nation
grabbing. He wanted the United States to annex possibly the entire
world. When he became president, it was presumed that this impulse
would guide him. There was speculation that he might take colonies in
Africa or that he might try to join the race for slices of China.
There was the possibility that he would try to take Mexico or
Nicaragua or even Canada.
He
didn’t do any of those things. I think the shock of what happened
in the Philippines must have affected him. I never found an actual
phrase where he and his friend Henry Cabot Lodge said that Americans
would be greeted with flowers in the Philippines, but that was more
or less the opinion that they transmitted to the American people—that
the Philippine people would welcome us. Instead, we had to wage a
horrifically brutal war to subjugate them.
This
sobered Roosevelt. He began to understand the sorrows of empire. When
he became president, he ordered one operation in which he seized land
for the Panama Canal. After that, however, he turned his interest to
other issues. He focused on controlling corporate power and
protecting the natural environment.
I
think he actually fit the pattern for an American president. They
tend to start off with great enthusiasm for using American military
and coercive power around the world. After that, they see the
limitations, they see the blowback, they see the trouble it brings,
so at the end of their terms they’re less likely to intervene than
at the beginning. You see this in presidents from Roosevelt up to
Bush and Obama.
Robin
Lindley: You certainly see that pattern in recent administrations.
And you look back at Roosevelt before the Spanish American war and he
was eager to fight and wanted to see combat, which he did in Cuba. He
was bloodthirsty. He said it was “a great day” when he killed a
Spanish soldier who was apparently running away at San Juan Hill.
Stephen
Kinzer: Roosevelt was a war lover. He had a
fascination with war and believed that war was the only noble pursuit
for a man or for a nation. I found a letter in which he speculated on
the possibility that perhaps Germany could be baited into burning a
few cities on the American East Coast because then we’d finally
have an enemy that would rouse Americans to the necessity of creating
a large military establishment. He wrote about wanting to participate
in fighting against the Tatars in Russia or against the Aborigines in
Australia. He was always looking for enemies and that certainly is a
pattern in American history.
Robin
Lindley: It seems that Roosevelt and his friend Senator Henry Cabot
Lodge were drivers of this imperialist sentiment. And newspaper
magnate William Randolph Hearst supported expansionism and promoted
the war against Spain. That press role strikes a chord today too.
Stephen
Kinzer: The imperialist triumvirate that
drove the United States to succumb to the imperial temptation in 1898
was comprised of three interesting figures.
Teddy
Roosevelt was the public face of the expansionist project. Henry
Cabot Lodge was the Mephistopheles in Washington that organized the
project politically. William Randolph Hearst was the megaphone who
sold Americans a diet of super-patriotic bunkum that drove them
crazy. He understood something that editors understand to this day:
If you want to have people buying newspapers or clicking on your
story, you need a running story that unfolds every day, not just on
one day. War is the best running story of all.
Hearst
set out quite consciously to set the United States off to war to sell
more newspapers. That he did splendidly. Hearst also understood
something that is still true today about how to get Americans to go
to war. He understood that Americans are a very compassionate people who
hate the idea of anybody suffering anywhere. Our leaders, therefore,
use our people’s sympathy for the suffering of others. Whenever
they want to go to war for any reason, they start feeding us images
of poor, suffering people being brutalized by some evil tyrant.
That’s enough to move Americans into thinking we need to go to war
in some country.
We
don’t stop to think usually whether we’re going to be able to
improve the situation or what the long-term plan might be, but we’re
very impulsive. And Hearst understood this. He filled his paper with
articles about the brutalization of womanhood and other evils
perpetrated in Cuba, and that created a public climate that allowed
us to go to war. That’s like stories about Khaddaffi and Saddam and
Assad that were heavy news in later years.
Robin
Lindley: How do you see the role of Republican President McKinley at
this time? It seems he was ambivalent about aggressive expansionism
in foreign lands, but he eventually embraced a policy he called
“benevolent assimilation.”
Stephen
Kinzer: McKinley was known as a person who
followed public opinion rather than trying to lead it. The Speaker of
the House, Thomas Reed, famously said that McKinley “kept his ear
so close to the ground, it was full of grasshoppers.”
McKinley
sensed that Americans were caught up in the fever of expansionism and
that to try to put a stop to it or to try to stand in its way would
hurt him and his party politically. He saw that the popular thing to
do would be to latch onto this bandwagon, and he did so. His
explanation was that he was guided by God in a visitation in the
White House one night in October 1898, but that night sounded a lot
like Henry Cabot Lodge and Teddy Roosevelt.
Robin
Lindley: Was it mainly commercial interests that propelled this
imperialist policy? It seems that greed, profit, and the desire of
businesses for new markets played a large role.
Stephen
Kinzer: A confluence of factors drove the
United States to make this epochal decision at the end of the
nineteenth century.
Economics
played a large role. When you read newspapers of that period, as I
did while researching this book, you see that there is much written
about what was then called glut. The argument was that American farms
and factories were becoming so productive that they were producing
more than Americans could consume. This was producing social rifts
with strikes and labor conflict. People began to sense that there was
a need to export some of these social problems, and the way to do this
would be to find foreign markets. In those days, that meant you had
to take over foreign territories. That’s the way Europeans did it.
You then would prevent other countries from trading with those
colonies.
The
United States saw the Philippines partly as a source of great raw
materials and as a potential market for goods, but even more
tantalizingly, as a potential springboard to the China market. In
those days, the China market was held up as a great phantasm of
tremendous prospects for wealth. Articles were appearing about how
much cotton the Chinese would buy if they could be induced to make
their clothes of cotton, or how many nails they could buy, or how
much beef they could buy if they converted to American habits.
No
doubt Lodge, in weaving the imperial project together, used the
ambition of commercial interests as an important thread.
Robin
Lindley: It may surprise some readers that so many great minds were
on the anti-imperialist side of the debate: Booker T. Washington,
Jane Addams, Carl Schurz, Mark Twain, and even the richest man in
America, industrialist Andrew Carnegie. The debate was by no means
one-sided and the imperialist impulse was not overwhelming.
Stephen
Kinzer: Actually, the power of the
anti-imperialist movement and the earnestness with which many Americans
took its arguments was something that I hadn't realized. This
episode has essentially fallen out of American history. There was a
great debate that seized America. It was on the front page of every
newspaper day after day, week after week, month after month. Every major political and
intellectual figure in America took sides and it shaped the entire
subsequent history of the United States.
All of
my books are voyages of discovery and, in this book, my main
discovery was that this debate ever happened. It’s a vitally
important episode of American history that shaped who we are today
but has fallen out of our history books. So the greatest satisfaction
for me in writing this book is being able to recover this debate and
hoping to make clear to Americans today who question, as I do,
aspects of American policy. The idea that the United States should
allow other nations to rule themselves and not try to project our
military and coercive power around the world is very deeply rooted in
American history.
Those
of us who are trying to push America to a more prudent and restrained
foreign policy are standing on the shoulders of titans—great
figures of American history who first enunciated this view. To
continue to make their argument is something quintessentially
American.
Robin
Lindley: To go back, our brutal Philippines campaign is shocking
today. Apparently, the leader of the independence movement there,
Emilio Aguinaldo, had a promise from the U.S. that, if his forces
fought with the U.S. against the Spanish, the U.S. would assure
Philippine independence. Instead, after liberating the Philippines
from the Spanish with the help of Aguinaldo's forces, the U.S.
turned on Aguinaldo and his “insurgents” in a horrific war.
Stephen
Kinzer: The Americans were told that
Filipinos had every reason to rebel against Spanish rule. After all,
portrayed as living under a cruel Spanish master, Filipinos in
rebellion seemed to us the equivalent of George Washington and the
Continental Army fighting to overthrow the British. Then, after we
changed our ideas about what we wanted to do with the Philippines and
decided we wanted to take the Philippines rather than grant them
independence, we began to tell ourselves that we were a very
different master from the Spanish.
You
certainly can understand why the Filipinos wouldn’t want to be
ruled by the Spanish because they were brutal and oppressive and far
away and had evil intentions. We were told Filipinos would love to be
ruled by Americans. They would realize Americans are benevolent and
only want to help.
Americans
were never able to grasp the idea that, for many Filipinos, being
ruled by a foreign power was [anathema] no matter what power it was.
These Filipino rebels were not willing to accept the exchange of one
distant master for another. They wanted full independence. Americans
were never able to see this. We deluded ourselves into believing
that, although they hated being ruled by the Spanish, they would love
being ruled by the Americans. This is the kind of self-delusion that
characterized much of our approach to the world.
Robin
Lindley: Racism also played a role in these interventions.
Imperialists not only saw the U.S. mission as liberating Cuba and the
Philippines, but they saw non-white people as inferior and primitive
creatures who needed us to “civilize” them. Roosevelt called
Filipinos “wild beasts.”
Stephen
Kinzer: It was particularly vivid in Cuba. We
were told when we entered the war there that the Cuban patriots were
great heroes and the equivalent of the leaders of our American
Revolution. They were lionized in the American press. That’s why we
felt they should have the independence they were fighting for.
Then,
after the war ended, our commanders in Cuba reported back the
horrible realization that many of these leaders we had been
taught to admire were black. That suddenly changed
American opinion. We began to think that there might be a government
in Cuba that would be partly black, and that certainly would have
happened if we had allowed Cuba to become independent.
Our
racial attitudes at that time made it absolutely impossible for us to
accept that result. That’s one reason that the United States
refused to permit Cuba to become independent after 1898.
Robin
Lindley: It’s interesting that some white supremacists were
anti-imperialist because they were worried that we would bring more
non-white, less-than-civilized immigrants into the United States.
Stephen
Kinzer: You’re right. Racism was used on
both sides of this argument. It’s easy to understand how
imperialists viewed it because they believed that non-white people
couldn’t govern themselves and needed white people’s help. But
some anti-imperialists also were racist. They came from the south and
they didn’t want the United States taking in people who were not
white.
I do
think that racial attitudes played a big role in this debate. Another
example is the experience of Hawaii. Hawaii, with the connivance of
the United States government, had a change of regime in 1893. A group
of white American planters and their friends overthrew the Hawaiian
government so that they could come into the United States and sell
their sugar at a cheaper rate. But there was a change of
administration in Washington. Grover Cleveland became president and
he didn’t want to take in Hawaii under these conditions.
Hawaii
had to become an independent nation—something these white settlers
had never imagined. Their challenge was to find themselves a
constitution which would look good to Americans in case they ever
became a part of the U.S., but also would disenfranchise most of the
population. They couldn’t have native people voting; otherwise
they’d be voted out of office. They chose as their model the
constitution of the state of Mississippi, which was ingeniously drawn
up with all sorts of qualifications for voting so that it looked
democratic while denying most people the vote. So you can say that
the racism that permeated the United States definitely shaped our
foreign policy.
Robin
Lindley: Thanks for your insights on the role of race. I appreciate
your comment on intervention in the book: “Violent intervention in
other countries always produces unintended consequences.” That is
writ large in the period you examine and in our foreign policy in the
past two decades that has produced terrible blowback.
Stephen
Kinzer: I think you’re right that our
interventions have produced terrible unintended consequences. What I
find even more puzzling is that we don’t seem to learn from these
experiences. There doesn’t seem to be any limit to the number of
times we can crash into another country violently and have it come
out terribly until we begin to reassess whether this is a good idea
or not.
One
reason I was so interested to write this book The
True Flag is that I envy the debate they had
in those days, when the U.S. Senate convened for an epochal 32-day
debate on this vital question of expansionism in the winter of 1899.
Senators debated this great question: Should the United States try to
push its power onto other people and other countries, or should we
leave them alone and build up our own country?
We
don’t have that debate today. We’re debating whether to send four
thousand troops to Afghanistan for the new surge or whether it should be six
thousand. We never pull back to have this larger debate and, if we
ever did, it would probably sound a lot like debates that I write
about in my book.
Robin
Lindley: You have a wealth of good advice for the new administration.
In a speech on July 6 in Warsaw, Trump asked whether the West has the will
to survive. What do you think of that remark from our new president?
Stephen
Kinzer: The West has the will to survive, but
could we survive without trying to impose our will on others? The
more we crash into other countries, the more we weaken ourselves.
This is the lesson our interventions teach us. We can survive and
thrive but we should pay more attention to building our own nation
than trying to use our thousand-mile screwdriver to fix others. How’s
that for a coda?
Robin
Lindley: That’s quite fitting. At the close of The
True Flag you go back to the words of
George Washington. Your book reveals the wisdom of Washington’s
warning to Americans: to avoid the “mischiefs of foreign
intrigues.” Thank you for your thoughtful comments and your
illuminating new book.
|
6cdac1ff755bed2098bc21c0422c911a | https://historynewsnetwork.org/article/166864 | Lessons from German History after Charlottesville | Lessons from German History after Charlottesville
The
first eight months of Trump create a dilemma for a historian of
modern German history. If you raise the specter of Hitler, ask what
Trump has in common with fascism in the past and make comparisons to
the emergence of the German dictatorship from the collapse of Weimar
democracy, a chorus erupts about the misuse of historical analogies.
If you focus on the differences between the United States in 2017 and
Germany in 1933 and offer reassurances about American checks and
balances, another chorus bemoans your complacency and facile
optimism. In reality both choruses are speaking up within me, keeping
me up at night and asking how I can best be true to my vocation as
a scholar and my responsibilities as a citizen.
The
first point to be made is that I and others were right in spring 2016
when we spoke of
Trump’s authoritarianism, demagoguery, appeals to racism and the
taint of fascism that surrounded his candidacy. We have learned
nothing about Trump since January 20th that we did not know before.
The yearning for a strongman, attacks on minorities and thinly veiled
appeals to racial hatreds, a song-and-dance with Nazis and white
supremacists, an embrace of the identity politics of the
“alt-right,” a penchant for conspiracy-laden rally speeches
full of rhetorical violence, and the juxtaposition of an idealized past to
a miserable present do recall the themes of the anti-democratic right
in Europe’s twentieth century. Trump’s reference to “the very
fine people” that included Nazis and white supremacists in
Charlottesville continues his flirtation with the extreme right, a
tactic that was apparent when he adopted one of the far right’s
most famous slogans: “America First.” His gut reaction to
Charlottesville, equating Nazis armed with assault rifles to liberal
counter-protesters, is in the tradition of the America firsters of
the 1930s, including Charles Lindbergh, who opposed intervention in
the war against the Nazis.
Yet
the contrast to Hitler’s first eight months is stark, in terms of
both what Hitler did and the way the established elites responded. In
March 1933, following the Reichstag Fire, the Reichstag (Germany’s
parliament) passed the Enabling Act, thus giving Hitler the power to
pass laws without consent of the parliament. With that action, it
ceased to exercise any restraint on his powers and lost its raison
d’etre. It became a rubber stamp that provided Hitler with a fig
leaf of legality that facilitated his destruction of the rule of law.
That spring, the German government and the police under its control
were using force and violence to arrest, intern or drive underground
or into exile political opponents, especially Communists and Social
Democrats. Independent trade unions were destroyed. On April 7, 1933,
the “Law for the Restoration of the Professional Civil Service”
purged Jews and political opponents from the universities. Soon
other laws implemented similar purges in the legal and medical
professions. In March, Heinrich Himmler, leader of the SS, announced
the opening of the first concentration camp in Dachau. That same
month Joseph Goebbels, the minister of the newly created Ministry for
People’s Enlightenment and Propaganda, announced at his first press
conference that “there is no such thing as absolute objectivity.”
The destruction of Weimar’s free press and the establishment of
government press censorship followed. On July 14, 1933, Germany was
officially declared a one-party state. All non-Nazi political parties
ceased to exist.
In
the following eight months, Hitler became more popular, his power
grew and the conservative elites continued to support him. Relieved
that he had destroyed the political left and the power of German
trade unions, big business shook off its initial puzzlement about the
“socialism” in National Socialism and fell into line. Prominent
German jurists offered justifications for dictatorship in the face of
a national emergency. The military leadership, eager to break the
arms restrictions of the Versailles Treaty and the budgetary
restraints on rearmament created by parliamentary democracy, lent
their support as well. Leading academics, such as the philosopher
Martin Heidegger, expressed relief at the end of supposedly outmoded
ideas of academic freedom and urged students and faculty to join a
new national people’s community led by the Führer.
The
Nazis claimed that they were able to “seize power” in January
1933. Historians, however, have punctured that self-serving myth and
shown what actually happened: Hitler did not seize power,
but rather was invited into
it by conservatives who underestimated the man who, in short order,
was able to outmaneuver them while amassing unchecked control. The
political and economic establishment tolerated the use of force and
violence to suppress dissent as well as the codification of
antisemitism in government legislation. In the months after passage
of the Enabling Act, Hitler’s power grew. It increased dramatically
and quickly as one decision to go along after another deepened the
complicity of established elites who ignored Hitler’s obvious
contempt for the rule of law while giving themselves false
reassurance that he was merely continuing the Presidential emergency
rule of the Weimar Republic.
In
eight months as president, Trump has used his power to do much damage
to American economic relations around the world. He has attacked the
free press and, in firing FBI director James Comey and pardoning
former Sheriff Arpaio, manifested his lack of respect for the rule of
law. His very public record of continual lying has undermined his
power in foreign policy, for that power rests in part on the idea
that the United States President speaks the truth about events in the
world. His lack of knowledge of complex policy issues and
demonstrated uninterest in learning about them, combined with his
inability to communicate his arguments in any remotely intelligent
way—all qualities that were fully evident in
spring 2016 – has led to an absence of any legislative
accomplishments for which he can claim credit. He owes his only
legislative victory, the appointment of the conservative Neil Gorsuch
to the Supreme Court, to Senate Majority Leader Mitch McConnell, whom
he has now publicly belittled. Trump has been stymied by the FBI and
by intelligence agencies which he first insulted and compared to the
Nazis, then has been unable to control, and which continue to examine
Russia’s efforts to aid him in the election. Having failed in his
effort to block the appointment of an independent counsel with the
power of subpoena that is tasked with examining ties between his
campaign and the Russian government, he is powerless to stop an
investigation that may bring down his presidency. Yet he has
not expanded his
power or base of support in the public.
In
sharp contrast to the rapprochement between big business and Hitler
after January 1933, Trump’s lies about the “very fine people”
on both sides in Charlottesville led corporate CEOs to resign from a
council on manufacturing that he had called into existence. A number
of four-star generals have publicly reasserted that racism and
antisemitism are at odds with the values of American democracy.
Trump’s threats of trade wars and rejection of the Paris Climate
Accord and the TPP trade accord with Asian nations have opened the
possibility for greater Chinese influence in Asia, reduced American
leverage in trade talks and antagonized close American allies in
Europe, thus alarming American business leaders, who fear the loss of
American influence in the Pacific.
Trump’s
constant attacks on “fake news” in the media delight his core
supporters but, again in contrast to Hitler and Mussolini, he has been
unable to destroy the country’s leading newspapers, news networks
or multiple websites. Indeed, ironically thanks to him, these media
outlets are enjoying increased subscriptions and giving employment to
a legion of energetic and determined journalists. Where Hitler’s
popularity grew in the first year, opinion polls indicate that
Trump’s has been steadily declining.
By August 2017, an anxious public was looking to former generals to
prevent Trump from making dangerous and rash decisions and, in the case of
Defense Secretary James Mattis, to offset the obsequious worship
emanating from Trump’s civilian cabinet members, and to former
Marine General John Kelly to bring order to the West Wing.
The
expansion of Hitler’s power after 1933 was due to his willingness
to use terror to crush his opponents, to the timing of an economic
recovery he had nothing to do with, and to the support he received
from the established political and economic elites. Trump has been a
failure in multiple ways, but he remains President of the United
States and continues to enjoy strong support among many Republican
voters. He has created a base that can threaten those Republican
politicians who challenge him (especially those up for re-election in
2018), but his support is not limited to this bloc; a remarkable
percentage of Republican voters on the whole say they still approve
of his performance. Moreover, the great majority of Republican
Senators and Representatives align with him because they too want to
abolish the Affordable Care Act, reduce taxes on upper-income groups
and increase military spending. Without the support of the Republican
Party and its leadership in Congress, Trump’s power would be even
less than it is now. That he attacks the
very party that enabled his election is another sign of his political
incompetence and opens the possibility of a Republican revolt against
him.
Though
Trump is dangerous, he has become president of a liberal democracy
that is 241 years old, one whose rules and norms are vastly better
established than they were in post-World War I Germany and Italy. In
this very important sense, the differences between the United States
in 2017 and Hitler’s Germany in 1933 and Mussolini’s Italy in
1922 are greater than the similarities. Trump does not have the power
to end the investigations of Russian meddling in the 2016 election.
He has not turned the FBI into an instrument of presidential will. He
cannot control the actions of state governors or attorneys general.
His threats to Senators have, in some cases, backfired. The military
leadership has been a brake on, not an accelerator or cheering
section for, his outbursts. He is resorting to insult and lies about
his opponents but not to actual violence and terror. He faces the
prospect of criminal indictment and impeachment, something that would
have been inconceivable in a dictatorship. In short, as dangerous as
he is, Trump remains hemmed in by American democratic norms and
institutions.
Moreover,
the fact that Nazism emerged, and that it led to World War II and the
Holocaust, has had a profound impact on subsequent history and on
American politics. That is the case because now, in contrast to 1933,
it is obvious what can happen if a democracy falls into the hands of
a dictator who thrives on conspiracy theories and shows no respect
for the rule of law. We now know how a slippery slope of agreement
with a leader with authoritarian inclinations can lead to
catastrophe.
On
Antifascism
In
our discourse, the noun “antifascism” recalls the history of the
Communist Party and the popular fronts of the 1930s. Yet it properly
refers to a far broader segment of opinion of that era. Americans are
not accustomed to describing Franklin Roosevelt and Winston Churchill
as “antifascists.” Yet they were the two most consequential
leaders of an antifascism inspired by liberal democratic values.
Trump’s effort to foster a moral equivalence between Nazis, white
supremacists and those protesting against them was one of the most
revolting moments in American political history; he will never get
over it. With his comments after Charlottesville, he placed himself
outside of a tradition of democratic antifascism blazed by Roosevelt
and Churchill and the Anglo-American alliance that fought the Nazis.
World War II was a war against fascism, one that Americans before
Trump were used to calling “the good war.” The antifascism that
bound together the bipartisan nature of FDR’s wartime cabinet stood
in contrast to the antifascism of the Soviet Union, which was linked
to illiberal ideas about Communist dictatorship. It is important to
keep in mind these distinctions between liberal and illiberal
antifascism.
When
self-styled “antifa” groups shout down speakers, support boycotts
of Israel and engage in violence at rallies, they evoke the
illiberalism and radicalism of Communist antifascism and repress
the memory of American and British governmental antifascism during
World War II. Yet during those same years, for many Communists and
all leftists, antifascism meant fighting against Nazism and
antisemitism. But the meaning of Communist antifascism changed in
the early years of the Cold War when Stalin and the Communist regimes in
Eastern Europe conducted numerous antisemitic “anti-cosmopolitan”
purges. The Communists applied the
label “fascist” to the government of West Germany, to the United
States and, especially following the Six Day War of 1967, to the
state of Israel. In the Soviet bloc, a tradition that began in
opposition to Nazi Germany degenerated into a set of slogans that
incited diplomatic and military attacks on the Jewish state in the
context of the anti-Israeli terrorism carried out in the 1970s and 1980s
by the PLO, with help from its European supporters.
Today, “antifa” is not a slogan of the children of light against
the forces of darkness. It now has a far more checkered and ambiguous
history because it was also a slogan of the anti-cosmopolitan purges
of the early 1950s and then of the Communist and radical leftist
“undeclared wars” against Israel of the 1960s to 1989, and
recently of the efforts to boycott, divest from, and sanction the
state of Israel.
Yet
in Charlottesville it was not antifa groups who carried torches and
assault weapons and bellowed slogans like “blood and soil,” and
“the Jews will not replace us.” It was a young generation of
Nazis and white supremacists who did so. As a historian I have
documented and interpreted both the
ideological core of the Nazi regime during the Holocaust as well as
the antisemitism of the radical left during the Cold War. At the
moment, the greater danger comes from right-wing extremists and the
wink and nod they have received from the President of the United
States. Clearly, the events in Charlottesville were very much an
expression of racism against people of color and a celebration of the
racist legacy of the Confederacy. That said, it is striking that most
of the media and political discussion about the events in
Charlottesville notes but then fails to reflect on the centrality of
antisemitism, that is, hatred of Jews and Judaism, to the marches and
rallies of the neo-Nazis there. We must not forget the obvious:
Nazism was first and foremost about hatred of the Jews. American
neo-Nazism fuses that hatred with racism toward African-Americans,
but the antisemitic dimension of the slogan “the Jews will not
replace us” received insufficient attention in much of the
subsequent commentary.
Germany
after 1945 and the South after the Civil War
The
events in Charlottesville and the controversy over monuments to
defenders of the Confederacy and slavery raise another comparison
between the history of the United States and that of Germany, namely
that between the South after the Civil War and the Federal Republic
of Germany (West Germany) after World War II and the Holocaust. As
the Yale historian David Blight has recently reminded us in his
important work, Race
and Reunion: The Civil War in American Memory, a
desire for whites in the North and the South to reconcile displaced
the voices of those such as the African American abolitionist
Frederick Douglass, who stressed that the cause of the war lay in
racism and the defense of slavery. In place of a truthful reckoning
with the history of slavery and the postwar reconstruction
governments, Southerners resorted to the terror of the Ku Klux Klan,
imposed legalized apartheid against blacks, and fostered comforting
but false myths about the nobility of the South’s “lost cause.”
The monuments that Trump described as “beautiful” were built many
decades after the Civil War to honor defenders of slavery and white
supremacy. The South lost the war but the ideology of white supremacy
that had emerged during slavery survived and thrived.
On
May 8, 1945, the day Nazi Germany surrendered to the Allies, there
were eight million members of the Nazi Party. The American, British,
French and Soviet allies were well aware that Nazism had found deep
and broad support not only among these stalwarts but also within the
German army, which had fought to the bitter end. In the year
following the war’s end, the Western Allies arrested 100,000 former
officials of the Nazi regime, many on suspicion of participation in
horrific acts. During the four years of Allied occupation, from 1945
to 1949, the Western allies convicted six thousand former officials,
including the surviving leaders of the Nazi regime, of serious
crimes. In Nuremberg, the United States conducted separate trials of
military officers, leaders of big business, physicians, judges,
propagandists and members of the Reich Security Main Office, who
carried out the Holocaust. In the process, the core facts of the
regime’s criminality became public knowledge based in large part on
the massive documentary evidence found in the regime’s own files.
Yet after 1949, when West Germany regained some of its sovereignty
and instituted democratic politics, a majority of West Germans opposed
judicial reckoning with the Nazi past and even called for amnesty for
those already convicted of war crimes and crimes against humanity.
The more democratic West Germany became, the more opposition there
was to continuing the reckoning of the occupation years--years that
bore some comparisons to Reconstruction after the Civil War in the
South. Even so, however, lies about Nazi crimes and romance about a
kind of lost cause did not come to dominate West German politics. Why
not?
Despite
a flood of criticism at the time and historical accounts since about
the West German era of amnesia and avoidance, there was a crucial
difference between post-Nazi West Germany and the US South after
slavery. Nazism did not revive as an important political force in
postwar Germany, a fact that was one of the most remarkable and
taken-for-granted developments of recent European history. The
International Military Tribunal in Nuremberg and successor trials
revealed its leaders to be criminals, and genocide to be the logical
outcome of their antisemitic and racist ideology. Moreover, the
Allied military authorities prevented the emergence of armed Nazi
groups. Neo-Nazi groups formed but remained on the fringes of West
German politics. While former supporters of the Nazi Party joined the
major conservative parties, it was made clear to them that any effort
to revive Nazism would be crushed. The cynicism of rapid changes of
heart was obvious to Allied occupiers, but cynical opportunism was
preferable to glorifications of Hitler and denial of the facts of
Nazi criminality. In contrast to the decision to withdraw Federal
troops from the South in 1877, thus bringing Reconstruction to an
end, the United States military remained in West Germany, both to
deter a possible Soviet attack and to prevent a Nazi revival.
West
German political leaders learned important lessons from the past.
Konrad Adenauer, the leader of the Christian Democratic Union, the
major conservative party, was aware that conservatives had invited
Hitler into power in January 1933. He was determined not to make the
same mistake again while also seeking the votes of Germans who had
supported the Nazis but who, he hoped, had abandoned Nazi
convictions. Though largely silent about the crimes of the Nazi
regime, Adenauer supported financial restitution for the Jewish
survivors of the Holocaust and offered political and financial
support for the fledgling state of Israel. He adopted a policy of
amnesty and integration of former Nazis and eschewed timely trials
for war crimes. Integration and amnesty were there for those ex-Nazis
who, whether out of cynical opportunism or genuine change of heart,
accepted the new democracy and made no effort to revive Nazism.
Establishing a bulwark of the democratic right against the
undemocratic, extreme right was part of what the West Germans called
“militant democracy” and the “anti-totalitarian consensus.”
For
Adenauer, such policies precluded fostering a mythic history
of the Nazi era that glossed over its crimes. Speaking in Cologne in
March 1946, he acknowledged that “National Socialism could not have
come to power if it had not found in broad layers of the population
soil well prepared for its poison…. The German people suffered for
decades in all of their strata from a false conception of the state,
of power, [and] of the position of the individual person.” A change
in mentality and renewed respect for individual human rights was
essential. He and other conservatives did not propose building
statues of Nazi personalities in West German towns and cities. In
foreign policy, he rejected the antagonism to “the West” of
pre-1945 German conservatism and replaced it with appeals for
European integration, reconciliation with France and Atlanticist
links to the United States.
Popular
memory followed suit: former SS officers met quietly, not in public;
veterans told war stories that left out the Nazi race war on the
Eastern Front; and generals wrote bestsellers but did not lionize
Hitler (indeed, they blamed him completely for the defeat). In so
doing they obscured their own responsibility for the catastrophe.
There were those who sought to reinterpret World War II as a defense
of the West against the Soviet threat but Adenauer rejected such
retrospective justification. Cities built memorials to the German
citizens killed in Allied bombing attacks. As anticommunism came to
dominate West German politics, timely justice gave way to the desire
to “draw a line under the past.”
In
the midst of this era of integration and amnesty, which one German
historian has called “politics toward the past” and another that
of “a certain silence” about the crimes of the Nazi regime,
another new and unique political tradition emerged.1 It
was one that sought to found a democracy not on the forgetting of
past crimes and injustice but on their vivid memory. The German word
for it was Vergangenheitsbewältigung, roughly,
“coming to terms with the Nazi past.”2 It
rested on the novel idea that such vivid memory and truthful
reckoning were indispensable for the establishment of a liberal
democracy. The tradition had left-of-center beginnings on May 6,
1945, two days before the formal surrender to the Allied forces, when
Kurt Schumacher, the leader of the German Social Democratic Party
(SPD), spoke to his party colleagues in the city of Hannover. He
stated flatly that the Germans “saw with their own eyes with what
common bestiality the Nazis tortured, robbed and hunted the Jews,”
yet they remained silent and hoped for a Nazi victory in World War
II. In May 1946, again in Hannover, Schumacher addressed the First
National Party Congress of the SPD and made clear that his party
would remember Nazism’s victims:
Our
first thoughts concern the dead. The victims of fascism among our own
people. The dead from the freedom struggles of oppressed peoples. The
army of victims of the war among all nations. The women and children
who were swept away by bombs, hunger and illness. The Jews, who fell
victim to the bestial madness of the Hitler dictatorship…. The
victims are not forgotten. We will erect a memorial to those who died
in the German freedom struggle, who gave their blood for the
existence of an “other” and better Germany. They will live on in
our hearts, in our thoughts, and in our work.
The
Germans would build memorials not to the Nazis but to their victims
who, he said, would “live on in our hearts.” The memory of the
victims who lived on in the hearts of West German Social Democrats
became one of the foundations on which a democratic and a better
Germany would be built. In response to an invitation from the
American Federation of Labor, Schumacher became the first German
political leader to be invited to the United States after World War
II.
The
third leading figure of “coming to terms with the Nazi past” was
Theodor Heuss, president of the Federal Republic from 1949 to 1959
and a leader of the liberal Free Democratic Party. He established
that office as a center of rhetorical reflection on the moral issues
facing the country and was a counterpart to Adenauer who, as
Chancellor, held the reins of political power. Like Adenauer, Heuss
refrained from seeking timely justice and even sought amnesty for
some prisoners previously convicted by the Allies. Yet, as a
newspaper editor in Stuttgart in November 1945, he wrote that “the
German political victims within Germany, and on their side the
hundreds of thousands, yes millions of foreigners who were tortured
to death, speak to the heaviest and costliest sacrifice of National
Socialism: the honor of the German name, which has sunk in filth.”
He felt a “duty once again to clear our name and the name of the
German people. The memory of those who suffered yet were innocent,
and who died bravely, will be a quiet, calm light illuminating our
path in the dark years through which we are going.” As with
Schumacher, this founding father of the West German democracy placed
memory of Nazism’s victims at the core of an emergent political
culture.
At
the inauguration of a memorial to Nazism’s victims at the former
concentration camp in Bergen-Belsen in December 1952, President Heuss
delivered a speech entitled “No One Will Lift This Shame from Us.”
He said that “whoever speaks here as a German must have the inner
freedom to face the full horror of the crimes which Germans committed
here.” He rejected arguments that pointed to acts of “the others,”
that is the Allies during the war or the Soviets in the postwar
years. Such balancing of accounts “endangers the clear, honorable
feeling for the fatherland of everyone who consciously knows our
history” and faces up to it. Efforts to forget were pointless. “The
Jews will never forget, they cannot ever forget what was done to
them. The Germans must not and cannot ever forget what human beings
from their own people did in these years so rich in shame.” In
1954, in a speech at the Free University of Berlin, he praised the
German resistance to Hitler of July 20, 1944. He called them,
paraphrasing Martin Luther, the “Christian nobility of the German
nation,” who, in attempting to assassinate Hitler and overthrow the
Nazi regime, redefined the meaning of patriotism and love of the
“fatherland” for the German generations coming of age. Patriotism
did not mean spreading comforting myths about the glories of the
past. Rather, it meant staring the truth straight in the eye with an
unflinching gaze. His words inspired several generations of West
German historians, lawyers, journalists and politicians, who
documented the crimes of the Nazi regime. They resonated in President
Richard von Weizsäcker’s speech to the Bundestag on May 8, 1985 on
the uniqueness of the Holocaust, and again in Chancellor Angela
Merkel’s remarkable speech to the Israeli parliament, the Knesset,
in 2008 when she declared that Israel’s survival (in face of
Iranian threats) was a matter of unified Germany’s reason of state.
The
tradition of “coming to terms with the Nazi past” began among
political and intellectual elites and only over time became a
consensus in the German political establishment and among broad
sections of the public. From its origins until today, it has faced
resistance and repeated appeals to “finally draw a line under the
past.” Yet despite shortcomings and justice delayed, antisemitism
never again became government policy, and a democracy that guaranteed
rights to all of its citizens was established and persists. This
history of the connection between the truthful public memory of past
crime and injustice, on the one hand, and the establishment of a
viable liberal democracy, on the other, is one that Americans, in the
South and elsewhere, would do well to ponder. Financial restitution
to the immediate survivors of the Holocaust (but not their
descendants) was important in aiding them in the terrible years
after they had lost so much. For the establishment of a liberal
democracy in West Germany, it was important as a public acceptance of
responsibility to assist them. Yet telling the truth about the past
and the reestablishment of the rule of law and respect for the rights
of all citizens were even more important for the successful
transition to democracy.
It
was a tragedy and a disgrace that myth and amnesia about the reality
of racism and slavery persisted for so many decades in the American
South.3 Had
Reconstruction continued and Jim Crow never become established,
racism might no longer be a factor in American politics and Donald
Trump might not now be the President of the United States. West
Germany and unified Germany are not utopias, but the history of the
links between memory and democratization and, yes, between
antifascism and a commitment to liberal democratic values, is one
that should be instructive to Americans of all political persuasions.
When Nazis marched with torches and bellowed “the Jews will not
replace us,” a President of the United States who understood the
history of this country would have said that he, as President, was
proud to stand in a now long and honorable tradition of American
antifascism linked to support for the values of liberal democracy and
opposed to racism and antisemitism. Such a President would have
placed the power and prestige of the office on the side of those
brave people who came to Charlottesville to oppose those evils.
1 On
“politics toward the past” see Norbert Frei, Adenauer’s
Germany and the Nazi Past (New
York: Columbia University Press, 2001).
2 On
coming to terms with the past see Jeffrey Herf, Divided
Memory: The Nazi Past in the Two Germanys (Cambridge,
MA: Harvard University Press, 1997).
3 On
the mythic memory of the Civil War, see David Blight, Race
and Reunion: The Civil War in American Memory (Cambridge,
MA: Harvard University Press, 2002).
|
f7f4b3d2528cc83691d7ae55449deb1f | https://historynewsnetwork.org/article/167070 | #CouscousGate? | #CouscousGate?
We
haven’t heard a lot in the U.S. about the far-right anti-immigrant
Front National since their leader Marine Le Pen lost the French
presidential election to Emmanuel Macron back in May. But the Front
National is still in the news in France, mostly for growing divisions
within the party over tone and substance: should the party distance
itself from its anti-Semitic, anti-Muslim, and anti-immigrant image?
Enter #CouscousGate.
On September 13,
Kelly Betesh, an official in the French right-wing anti-immigrant
Front National (FN) political party, posted
a photo on Twitter of
herself dining with FN vice president Florian Philippot, one of the
most well-known leaders of the FN at the time, at a couscous
restaurant in Strasbourg, tagging Mr. Philippot’s political
association “Les Patriotes.”
The
photo at the couscous restaurant, though it didn’t even have
couscous in the picture, launched a backlash online. Couscous is a
North African dish, which is very popular in France, but some
commenters decried couscous as inappropriate for patriotic French
politicians. A good dinner in Strasbourg, these internet critics
claimed, should consist of traditional French fare, especially the Alsatian
regional specialty choucroute garnie:
sauerkraut served with sausages and other meats.
#CouscousGate
highlights the powerful role of food in marking the boundaries of
French identity, a recurring theme in French politics over the last
decade: In 2009, the Le
Quick restaurant
chain attracted criticism for serving all halal meat at eight of its
fast-food restaurants. Pundits and politicians criticized
Quick for “forcing” its customers to eat Muslim meat,
as if halal meat was somehow inappropriate for non-Muslims. During
the 2012 Presidential campaign, a television documentary made big
news by “exposing”
that all of the abattoirs in Paris used halal slaughtering methods
and didn’t label all their meat halal. In 2015, the mayor of
Chalon-sur-Saône attracted national right-wing political support
when he ended
the town’s 30-year-old practice of
offering an alternative main dish to Jewish and Muslim students on
days when the cafeteria’s main dish was pork. Eating “French”
food, including pork, was portrayed as an essential part of learning
to be French in French schools. With all of this political hype over
Muslim diets, perhaps we should not be surprised that, in January of
2016, socialist French President François Hollande never shared a
meal with Iranian President Hassan Rouhani when Rouhani visited
Paris, because Hollande would
not agree to host a halal meal without wine on
the table. These seemingly silly political scandals point to broader
truths about how we identify ourselves as members of nations and
communities through food.
Who
is really considered French? Whose voice, culture, and customs can be
included in the nation? It’s not a new question; it was being
debated through food in the first half of the twentieth century as
the French debated the role of the overseas French empire in
contributing to and shaping France. Some government and business
leaders attempted to convince the French to accept and integrate
foods from the colonies into French diets. While some ingredients
were accepted around the margins, such as tropical fruits
incorporated into desserts, other colonial foods couldn’t cross the
boundaries of French identity. Attempts to promote rice from
Indochina, for example, failed, in part, because many French people
thought rice an inappropriate food for Europeans as bread-eating
people.
Algeria—home
of couscous—was part of the French empire for over a century, and
the two countries continue today with an important exchange of goods
and people that has shaped both cultures for 200 years. Why does its
cuisine still occasionally invoke such visceral political
controversy? In part, because food is so meaningful. What we eat and
with whom we eat is an important way in which we construct group
identity, to define who we are and with whom we associate—and
therefore is a way to delineate difference and place others outside
our identity group. In the current context of anti-immigrant
sentiment and Islamophobia in France, these questions of boundaries
and identity keep coming up, and so food choices carry political
meaning.
Food
traditions can be wonderful and powerful ways to celebrate who we
are—our identities as members of families, communities, and
nations. Yet food also has the power to mark the boundaries of those
identities in ways that can keep us from creatively expanding and
redefining them. Despite America’s long history of immigration, we
aren’t immune to marking the boundaries of acceptable American
culture through food. Remember #tacotrucksoneverycorner? When you see
headlines and hashtags about political food choices, remember the
power of food to include or divide.
|
ab1f43572de4ed698e7a87ac00497f49 | https://historynewsnetwork.org/article/167075 | Hemingway’s First Short Story Found in Key West | Hemingway’s First Short Story Found in Key West
When Hurricane Irma smashed into Key West, Fla., author Shel Silverstein’s historic home was nearly leveled. But it was another home in his neighborhood that had Brewster Chamberlin, a writer and historian, worried.
His friend Sandra Spanier was also nervous. Not only because she feared for Mr. Chamberlin, who weathered the hurricane at home with his wife, but also because they shared a discovery that few people knew about.
In May, she and Mr. Chamberlin found Ernest Hemingway’s first short story — an untitled, previously unknown work that he wrote at the age of 10 — in the archives of the Bruce family, longtime friends of the Hemingways. It was only a few months after she’d first touched the stained brown notebook containing the story, and now this rare artifact could be blown off the island entirely. “I was really terrified,” said Ms. Spanier, the general editor of the Hemingway Letters Project and an English professor at Penn State University.
|
5b4d85688e269eda78a61e987fb29641 | https://historynewsnetwork.org/article/167290 | Trump Is the New ________ | Trump Is the New ________
Every historian worries over presentism — the tendency for contemporary sentiment to distort the study of the past. Some call it projection. In graduate school, it’s teleology, or what the French historian Marc Bloch dubbed "the most unpardonable of sins: anachronism." And so, lightly we tread, tippy-toed, when formulating a historical analogy: the likening of something then to something now.
The historian Arthur Schlesinger Jr. censured such allusive fare. Analogy rips historical example free of root, context, idiosyncrasy, and counterexample. Such evidence plucked from the past suffers from "confirmation bias," speciously corroborating contemporary-minded hypotheses for the already predisposed. "History by rationalization," Schlesinger damned.
Nonetheless, the historical analogy persists. And for it, Moshik Temkin, an associate professor of history and public policy at the Harvard Kennedy School, took a great many of his fellow historians to the woodshed. "Historians Shouldn’t Be Pundits," Temkin proclaimed in an op-ed in The New York Times. The peddling of historical analogy to understand current events might earn TV spots, but such spotty practice belied the historian’s process. It was "useless," even falsely "reassur[ing]," not just bad scholarship but possibly "dangerous."
As the kind of historian criticized by Temkin and the anti-allusionists, I was taken aback by his harsh column. So charged were Temkin’s charges that mere hours later, in The Atlantic, Julian Zelizer and Morton Keller, historians at Princeton and Brandeis respectively, hit back with “Why (Some) Historians Should Be Pundits,” coyly puzzling over the contradiction of Temkin’s “argument about avoiding punditry” appearing on the Times op-ed page.
By some historical coincidence, that same day, The Washington Post unveiled a new section, Made by History. The Post editors promised, "in an era seemingly defined by the word unprecedented," to deliver a steady diet of exactly the kind of historical analysis — "grappling with parallels between the past and present" — that Temkin had just rejected so vociferously. The game, it seemed, was afoot. ...
|
15b09fd5a39554e21c6d560aebecefeb | https://historynewsnetwork.org/article/167596 | The Damage Trump Has Done | The Damage Trump Has Done
President Reagan believed deeply that the United States had a mystical, even providential mission to stand before the world as a beacon of democracy and opportunity. He spoke often of America as the "shining city upon a hill," its eminence enlarged by the generations of immigrants who had arrived in the wake of the Pilgrims. Reagan upheld traditional virtues of discipline, restraint, hard work and gracious self-effacement as essential to the nation's spiritual foundations. Although he repeatedly asserted that America's best days lay ahead of it, he never doubted America's greatness in the present as well as the past.
Trump, by contrast, sees no American mission in the world, only a brutal contest for domination in which the United States must be the winner through him alone. He describes America not as a shining city but as a carnage-filled jungle, beset by crime and drugs and overwhelmed by vicious illegal immigrants. Although Reagan couldn't recall that he skirted the Constitution and violated it in the Iran-Contra affair, he never showed open, truculent disdain for the rule of law as Trump has done with his pardoning of the racist Arizona sheriff Joe Arpaio, convicted of criminal contempt for failing to obey the law. Trump denounces Washington as a thoroughly corrupted swamp, even as he cashes in by turning the White House into a money funnel for his far-flung business operations in violation of the Constitution's foreign-emoluments clause. He has set himself up as the commander of a great movement "the likes of which the world has never seen before," then incited that movement to trash proud Reagan conservatives like Sen. John McCain and effectively endorse the likes of Roy Moore, twice ousted from the Alabama court on which he sat for refusing to abide by the law, to say nothing of the sexual-assault allegations against him. Trump has no use for self-effacement, let alone graciousness or restraint, and instead conducts official business, including international diplomacy, with impulsive, unfiltered outbursts of insults. He has introduced to the presidency something once described by the late Sen. Daniel Patrick Moynihan as "defining deviancy down."
At the core of Reagan's politics was his stern anti-communism, which centered on the "Evil Empire" of the Soviet Union. It was Reagan's disenchantment with what he saw as liberal appeasement of communists, at home and abroad (as well as high income taxes), that first led him to cut his old New Deal ties. Drifting to the right, as a spokesman for the anti-union General Electric, Reagan came to regard the welfare state as a stalking horse for Soviet-style domination. He hailed slashing taxes and reducing regulations as assertions of individual freedom against statist tyranny but also as springboards for a booming economy that could sustain an indomitable military and subdue the Soviets.
Trump has certainly picked up and supercharged the Reaganite anti-government agenda but with no discernible ideology, only a compulsion to demolish established policies and programs, above all, anything associated with Barack Obama. Trump's political success, meanwhile, owes at least something to the supportive machinations of a former KGB colonel, Vladimir Putin, whose authoritarian regime has wreaked political havoc across the entire Western alliance. Putin is the one major world leader above all whom Trump has most conspicuously defended and singled out for praise, describing Putin during the 2016 campaign as a strong leader, "far more than our president." The xenophobic nationalism with which Trump stirs his political base – less an ideology than a ganglia of resentments – closely resembles the insular appeals of other Russia-friendly right-wing extremists, including Britain's Nigel Farage and France's Marine Le Pen. (Farage has been a particularly keen supporter of Trump's and, with his ties to Julian Assange and WikiLeaks, is reportedly a "person of interest" in the FBI's investigation of connections between the Trump campaign and Russian intelligence.) Under Trump, post-Reagan conservatism has come to this: assailing the federal government not in order to check Russian tyranny but, grotesquely, to mimic it. ...
|
488e0d98e6975024ce3bedf364d1fe19 | https://historynewsnetwork.org/article/167654 | Tax Plan Aims to Slay a Reagan Target: The Government Beast | Tax Plan Aims to Slay a Reagan Target: The Government Beast
It was the spring of 1985 when President Ronald Reagan first proposed to put an end to the state and local tax deduction. The idea was, to be sure, politically tricky. The provision had been around since the creation of the federal income tax in 1913, the budgetary expression of America’s celebrated federalism. As Justice Louis Brandeis might have put it, it was the federal government’s way to help pay for policy experimentation in the nation’s “laboratories of democracy.”
And yet to a Republican Party embroiled in a fundamental debate on how to shrink the government, it was an idea hard to resist: a direct shot at states’ capacity to spend. Bruce Bartlett, then a conservative tax expert who would go on to serve under Reagan and his successor, George Bush, estimated that without federal deductibility, state and local spending would fall 14 percent.
Nixing deductibility “threatens the political livelihood of spendthrift lawmakers across the nation,” Mr. Bartlett exulted at the time in an article for the Heritage Foundation. And it “would become more difficult for states to finance programs of doubtful benefit to their taxpayers by ‘hiding’ the full cost within the federal tax system.”
Reagan ultimately failed to kill the deduction. Mr. Bartlett, who often contributes to The New York Times, has come full circle to reject the Republican project to shrink the government at all costs. Still, his words from over 30 years ago provide an apt description of what drives Republican thinking in Congress today.
|
581b62e4441836b44243be65231a4726 | https://historynewsnetwork.org/article/167713 | No one should be surprised by journalism’s sexual harassment problem | No one should be surprised by journalism’s sexual harassment problem
Remember when it was just Roger Ailes and Bill O’Reilly? Over the past month, the news media has been central to the country’s weekly installments of Men Behaving Badly. We’ve seen the fall of the New Yorker’s Ryan Lizza, political commentator Mark Halperin, Charlie Rose of PBS and CBS, the New Republic’s Leon Wieseltier, the New York Times’s Glenn Thrush, NBC’s Matt Lauer and more.
Women in media — as in all industries in which the power structure is predominantly male — are not surprised. Almost half of women journalists globally have experienced work-related sexual harassment, and two-thirds have experienced “intimidation, threats or abuse,” according to a 2014 survey done by the International Women’s Media Foundation. (Yes, 2014 — three years ago.)
How did we get here?
The news media — an industry in which, especially in Washington and New York City, the social and professional lives of powerful people are inseparable — has a storied history of men belittling women and excluding them from access to power. Well into the 1970s, women operated at a disadvantage, excluded from key events and spaces and condescended to by their peers.
The Washington press corps played an essential part in silencing women’s voices and perpetuating misogyny, two ideas we still see deeply intertwined in society. As one female Washington reporter wrote to a colleague in 1954, “It is an unfortunate truth that the chief discrimination against women reporters in Washington today is practiced by men reporters. Our male colleagues remain the most resistant to accepting us as equals.” ...
|
ff287b2a76abd6cdbb1fa8f8de898470 | https://historynewsnetwork.org/article/167740 | Re-watching Joe Biden’s disastrous Anita Hill hearing: A sexual harassment inquisition | Re-watching Joe Biden’s disastrous Anita Hill hearing: A sexual harassment inquisition
Sen. Joseph R. Biden, then chairman of the Senate Judiciary Committee, lost control of the biggest moment yet in his political career — the Clarence Thomas confirmation hearings — moments after Anita Hill finished describing what the Supreme Court nominee said about his penis.
With Americans glued to their TVs, chaos broke out in the Caucus Room of the Russell Office Building as Hill’s family, which had somehow not made it into the packed room for her opening statement, began trickling in — one after another after another.
“It’s a very large family,” Hill said.
Biden, a 48-year-old Democrat from Delaware who had already made one unsuccessful run for the presidency, watched, growing increasingly helpless.
|
f3efc0ff3827a9beb3ffb09b77f474ce | https://historynewsnetwork.org/article/167759 | Why We Shouldn’t Let the #MeToo Movement Change History | Why We Shouldn’t Let the #MeToo Movement Change History
As if by unspoken assent, we’ve adopted a name for our overdue reckoning with sexual harassment and assault: It’s the “Me Too Moment.” “Me too,” everyone knows, comes from the Twitter hashtag by which women are sharing their experiences with sexual predators of various sorts after the New York Times’ exposé of Harvey Weinstein. But the word “moment” is significant as well. It reminds us that there’s a time factor at work, a historical element. Starting now, it promises, we’re taking a harder line against these offenses than we did in the past.
Most of us take some satisfaction in seeing women emboldened to speak out where they had once been intimidated and seeing justice finally delivered to rank offenders. But for those concerned about history, there’s also a danger in some of the arguments being tossed about. If we’re not attentive to the history implicit in the “Me Too Moment” phrase—the reality that people and the press viewed aberrant sexual behavior differently in other eras—we risk misinterpreting the past. If we expect historical actors to have abided by codes of behavior we set out in 2017, we betray the historical project of understanding why people acted as they did. This concern comes to mind with the deeply confused suggestion, now touted as a form of virtue-signaling, that Bill Clinton should have been removed from office during independent counsel Ken Starr’s jihad-like investigation of his sexual behavior in 1998.
There are a lot of reasons why feminists and other liberals were in fact correct to defend Clinton during the impeachment saga. One is that the charges against him—lying about a consensual if still wildly inappropriate affair—just didn’t rise to the level of impeachment, the way President Richard Nixon’s constitutional crimes in Watergate had. To have countenanced Clinton’s impeachment or resignation would have dramatically lowered the bar for cashiering a president and legitimated the already rampant process of using scandal-mongering as a proxy for electoral politics. A second reason, newly hard to recall in this feverish moment, is that the claims of assault that a few people now regret downplaying were never established as true, and not even Starr saw fit to include them in his referral to Congress.
But perhaps the most profound if subtle reason for rejecting the retrospective support for impeachment or resignation is that it substitutes the norms of 2017 for those of another time. It’s one thing to wish that society overall had taken sexual harassment more seriously in the past (though it was hardly ignored in the 1990s, as some seem to think)—an innocuous though historically meaningless assertion. But it’s another to selectively readjudicate one specific political crisis by the standards of a different historical era—an act that risks distorting our understanding of how and why people acted as they did.
History requires reconstructing the thought processes of historical actors, which are invariably different from our own. The way people acted in the past was shaped by assumptions, conditions and norms, some deeply embedded in their culture. Those norms shift over time, and it can be surprising to see how differently people in other ages thought about any number of problems, including the intersection of sex and politics. The Gilded Age, for example, boasted a rowdy political culture in which the yellow press splashed tales of politicians’ philandering on its front pages, with respectable newspapers like the New York Times often hard on its heels. Today we remember little more from that era than the taunt that greeted presidential candidate Grover Cleveland in 1884—“Ma, Ma, Where’s my Pa?”—when he copped to fathering a child with an unwed woman. (He weathered the story, prompting the riposte from his supporters, “Gone to the White House, ha, ha, ha.” Two years later, at age 49, he married 21-year-old Frances Folsom in the White House.) Yet debate swirled over how much politicians’ sexual transgressions should be publicly aired and censured. In a famous 1890 law article, Louis Brandeis and his law partner, Samuel Warren, called for a “right to privacy” to shield individuals from having their personal lives unduly vetted. ...
|
b18d525c8329400330cfa12d0bcc06ca | https://historynewsnetwork.org/article/167764 | Columbia Professor Retires in Settlement of Sexual Harassment Lawsuit | Columbia Professor Retires in Settlement of Sexual Harassment Lawsuit
Dr. William V. Harris, a renowned Greco-Roman historian and longtime professor at Columbia University, retired on Monday as part of the settlement of a sexual harassment lawsuit.
The retirement, which Columbia announced in an email to students and faculty members on Monday afternoon, came nearly three months after an anonymous graduate student filed a lawsuit against Dr. Harris alleging that he had kissed and groped her repeatedly while he was her academic mentor, and then disparaged her to colleagues when she rebuffed his advances. The student, identified only as Jane Doe, also sued the university for what she called its “deliberate indifference” to her complaints about him.
After the lawsuit was filed on Oct. 2, Dr. Harris wrote in an email to The New York Times that he had no comment and referred a reporter to Columbia’s lawyers. In an email late Monday night Dr. Harris referred The Times to his lawyer and a lawyer for Columbia, neither of whom could immediately be reached for comment.
The terms of the settlement were not immediately known.
In a statement, David Sanford, Jane Doe’s lawyer, called the settlement “excellent.” ...
|
10409feb59330fa238c955099c6873dc | https://historynewsnetwork.org/article/167769 | We Have to Go Beyond Identifying and Punishing Individual Men | We Have to Go Beyond Identifying and Punishing Individual Men
Harvey Weinstein, Roy Moore, Charlie Rose
As controversy swirls in the wake of revelations about the abuses of women by powerful men in the arts, politics, media, academia, restaurants and elsewhere, it is important to remember that we are dealing not with exceptional cases, but—as #MeToo demonstrates—with an enduring culture of masculinity. Women have the vote, but they are under-represented in legislatures; they have access to contraception and abortion, but those rights are under attack from mean-spirited evangelicals and the Republican Party; they have been admitted to universities and to various professions, but they are consistently paid less than their male counterparts; even when they climb the ladders of corporate management, they hit glass ceilings again and again. Domestic violence plagues wives and mothers; impoverished single parents are most often female. And now we learn that workplace sexual harassment is a condition of employment for more women than we had ever imagined, women across the class and racial divides.
How can we reconcile this sorry state of affairs with the deeply held belief that women in the U.S. and Europe—the secular West—are the most emancipated in the world? How can we reconcile it with the polemic we often hear that it is the women of the non-Western world, particularly Muslim women, who are in need of liberation? “They,” we are told, are sexually repressed and so lack equality, whereas “gender equality,” and so sexual liberation, is a primordial value of the (Christian) secular West.
In Sex and Secularism I argue that, in fact, gender inequality is the story of modern western nation states. It is an inequality that has persisted, despite genuine reforms and real improvements in the situation and status of women. As Vivian Gornick writes in the New York Times Magazine (December 17, 2017), there has been “insufficient progress” on the question of gender equality. “As the decades wore on, I began to feel on my skin the shock of realizing how slowly—how grudgingly!—American culture had actually moved, over these past hundred years to include us in the much-vaunted devotion to egalitarianism.” In conversation with me, Gornick attributed this delay to the hold of religion on our society, but I think that is to misunderstand the ways in which power and gender have been associated in our democratic, secular worlds.
The intertwining of power and gender is the product of history, but the two are so deeply naturalized that they have been hard to disentangle. Religious justifications for the inequality of women and men gave way from the eighteenth century on to natural justifications; increasingly, biology explained why women couldn’t be citizens. When women were barred from attending political meetings during the French Revolution in 1793, the reason given had nothing to do with God. One legislator asked rhetorically: “Has Nature given us breasts to feed our children?” In a similar but kinder vein, Thomas Jefferson found that “our good ladies … have been too wise to wrinkle their heads with politics. They are content to soothe and calm the minds of their husbands returning from political debate.”
These ideas have continued to the present day, albeit in new forms. Women have the vote, but as the testimony of the #MeToo victims makes clear, there is a psychological level that still underwrites inequality. Men assume that their masculinity is about the assertion of power (in politics, at work, in relationships), while women have internalized the sense that they can exercise only diminished agency, that their femininity is defined by their passivity. If second wave feminism sought to change that outlook, it seems not to have entirely succeeded.
That is because in our culture masculinity is synonymous, at least symbolically, with power; femininity with its lack. The source of that symbolic association is an old one—the body and the office of the king were one; his masculinity was at once assured and confirmed by his possession of the phallus (the symbol of power). The advent of democracy dispersed political power, creating great uncertainty about who could claim legitimacy. Men claimed it on the basis of the association of their masculinity with the king’s. In their thinking the phallus and the penis became one—a male body (however socially, economically or politically deprived of real agency) became the sign of a certain power—power signified by the domination of women. Indeed, in some cases, as described in the New York Times by Shanita Hubbard, an aggressive masculinity compensates for social and economic deprivation.
The so-called natural differences between women and men (those declared immutable by scientists and social scientists in the nineteenth century and that are once again being insisted upon by evolutionary psychologists) underwrite this kind of thinking. So it was that the Scottish biologist Patrick Geddes opposed giving women the vote on the grounds of their physical difference: “The hungry, active cell becomes flagellate sperm, while the quiescent well-fed one becomes an ovum.” It followed that women belonged in the private, domestic sphere, men in the public/political realm. “What was decided among the primitive protozoa,” he concluded, “can not be annulled by an act of parliament.” According to this logic, if the differences are natural, then the inequalities that follow from them cannot be rectified; indeed, they are not inequalities at all.
The arguments have changed since the nineteenth century, and today sexual emancipation is often heralded as a sign of equality. But, as the unfolding revelations of the last weeks have shown us, this is not at all the case. The myth of women’s sexual liberation is belied by the demeaning treatment they receive in the workplace, but also in sexual encounters (as reports from college campuses demonstrate). There seems to be a persistent belief—deeply rooted in our psyches—that men achieve recognition of their masculinity, and so of their political and social power, by exercising domination over women.
If we are to seriously address the current crisis beyond identifying and punishing individual men as bad actors, we have to attend to this history and make apparent how deeply rooted it is in our culture and our psyches. Without that critical intervention—without attention to what might be called “the lessons of history”—the flurry of revelations about longstanding and long-tolerated exercises of men’s power, however horrifying in their details, will not suffice to achieve what is required to permanently change the gendered power dynamics of our culture.
|
0ad15dead9c48d4358285a31d626df53 | https://historynewsnetwork.org/article/167773 | The Father of Modern Libraries Was a Serial Sexual Harasser | The Father of Modern Libraries Was a Serial Sexual Harasser
Adelaide Hasse was used to professional challenges. As a young woman, she struggled to be taken seriously by mostly male executive boards. She created a groundbreaking new way to classify government documents—and was disappointed when a male colleague claimed the credit. But armed with a new job at the New York Public Library, a better salary, and an ambitious new project, she finally felt optimistic about her career.
To pull off her newest plan, she’d need support, so she approached the leading voice in her field, Melvil Dewey, a man whose innovations made him a household name. He suggested they meet privately about her new project. Encouraged, she made her way to Albany, New York—only to find that he had arranged what amounted to a weekend-long date. It’s unclear what happened next, but Hasse departed hastily after being taken for a long drive by Dewey, and later spoke to colleagues about how offensive his behavior had been.
The story sounds like it could involve a Harvey Weinstein or Matt Lauer, but it didn’t. It took place in 1905, more than a century before the #metoo movement that exposed the sexual misconduct of America’s most powerful men. And the man in question was Melvil Dewey, the library pioneer whose decimal system of classification is still used in libraries today—a “protean genius” who raised himself from a poor farmer’s son to an icon during his lifetime.
Dewey is remembered today as an innovator who ushered American librarianship into the modern age. He helped invent the modern library, shaping everything from its organizational methods to its look to the roles of the librarians who were its stewards. But his pattern of sexual harassment was so egregious that women like Hasse dared to speak out against it, at a time when women were harshly judged for reporting sexual harassment. So many came forward that he was kicked out of the profession’s most prestigious association after an industry cruise in Alaska turned dangerous for women.
|
35d25ce3f3adaa88a1f7a5410fa607ad | https://historynewsnetwork.org/article/167779 | UC Berkeley settles sex harassment claim against Middle East scholar for $80,000 | UC Berkeley settles sex harassment claim against Middle East scholar for $80,000
Nezar AlSayyad, a tenured architecture professor and an internationally recognized Middle East scholar, remains employed at UC Berkeley more than a year after an independent investigator determined that he sexually harassed his former student, Eva Hagberg Fisher, from 2012 to 2014.
The university has given AlSayyad no classes to teach since fall 2016, but he continues to receive $211,000 a year. He has taught at UC Berkeley since 1985.
Student protests erupted against AlSayyad and the campus administration in November 2016, after The Chronicle first reported the investigator’s findings. Dozens of graduate students also signed a petition demanding that the administration revoke AlSayyad’s tenure if a separate investigation by the Faculty Senate determined that the professor violated the Faculty Code of Conduct. ...
|
db9ce3a811271796dc595fe89b39aaa3 | https://historynewsnetwork.org/article/167810 | Liberals, make some distinctions | Liberals, make some distinctions
When you hear “zero tolerance,” what do you think?
Until very recently, good liberals gave the term a bad rap. It conjured draconian school-discipline rules, which led to a disproportionate number of suspensions for students of color. And it resonated with the broken-windows philosophy of Rudy Giuliani and other tough-on-crime politicians of the 1990s and early 2000s, who sought the maximum penalties for even the most minor violations.
But that was then, and this is #MeToo. In the struggle against sexual harassment, zero tolerance has suddenly become the accepted liberal dogma. Miscreant students deserve second and third chances, just like turnstile jumpers and squeegee men do. But for sexual misconduct, it’s one strike and you’re out.
That’s a huge mistake. School discipline and public safety were not served by suspending or locking up as many people as possible, regardless of their infractions. Likewise, the campaign against sexual misconduct will suffer if we fail to distinguish between different kinds of it.
Don’t tell that to New York Sen. Kirsten Gillibrand, who spearheaded the effort to persuade her fellow senator Al Franken to resign. “I think when we start having to talk about the differences between sexual assault and sexual harassment and unwanted groping, you are having the wrong conversation,” Ms. Gillibrand said, defending her call for Mr. Franken to quit. “You need to draw a line in the sand and say none of it is OK. None of it is acceptable.”
She’s right, of course. But some forms of it are worse than others and deserve greater penalties; some aren’t as bad, and they should be punished less. That’s called the principle of proportionality, which lies at the heart of any reasonable system of justice. And once you do away with it, you can do anything to anyone. ...
|
a1c3b7f8a5bee8a75a90a8bb9e15236f | https://historynewsnetwork.org/article/167823 | The Austrian Scientist Time Forgot Because He Was a Jew | The Austrian Scientist Time Forgot Because He Was a Jew
The Austrian chemist Ferdinand Münz (1888-1969) invented and synthesized EDTA (ethylenediaminetetraacetic acid) in 1935 and was also the author of many patents in the textile field, whose trademarks still exist today.
So, what exactly is EDTA? Long story short: EDTA is one of the most important compounds in analytical chemistry, and it is used in everyday life in products such as shampoos, bactericides, mercury poisoning treatments, food preservatives, and tissue dyeing.
Yet Dr. Münz has remained virtually unknown in the history of science. Why? Because he was persecuted for his Jewish origins.
Ferdinand Münz was born in Krakow on June 23, 1888. His parents were Michael and Bertha Münz. The city was then part of the Kingdom of Galicia and Lodomeria, one of the most populated provinces of the Austro-Hungarian Empire. We have little information about his childhood and his family. He had three siblings: Stefan, Ernest, and Amelie; the latter died in a concentration camp. When Ferdinand was 10 years old the Münz family moved to Vienna. After the fall of the Empire, he opted for Austrian citizenship in 1919. He attended the K.K. Staatsrealschule in the 5th district of Vienna. In 1906 he began to study chemistry at what is today the Technical University of Vienna, still one of the most prestigious universities in the world. In the years following 1934-35 he apparently tried to emigrate to New York, but from November 25 to December 1, 1938 he was interned in the Buchenwald concentration camp. Some documents listed him as a "Jew."
In marriage documents, specifically in the period around the 1930s when he lived in Frankfurt, he was listed as "Aryan." It is probably thanks to his wife that he managed not to fall prey to the Nazi government and therefore not to perish in a concentration camp. Or maybe he survived because – like the Italian Primo Levi (1919-1987) – he was a chemist. The Nazis had a great need for chemists.
The years 1930-1943 were "the golden period" of this inventor. In this time frame he learned how to synthesize numerous commercial compounds whose trademarks still exist today. Suffice it to mention EDTA, whose patent he filed in Germany in 1935; it was patented anonymously in 1942 because of his Jewish origin and the Nazi persecution. In 1936 he patented the same discovery in the United States, in an attempt to give greater visibility to his discovery, this time with his name on it (patent US 2130505).
He was interned in the Theresienstadt concentration camp (in today’s Czech Republic), which was used as a prison camp and as a place of transit to the eastern death camps. He was released on April 9, 1945, just weeks before the camp’s prisoners were liberated by the Red Army on May 8.
In 1949 he collaborated with the Nobel laureate Kurt Alder (1902-1958), when they published together a paper on diene synthesis and additions. Alder needed him because EDTA is used in the polymerization process to make a kind of rubber (SBR).
On January 1, 1956 Münz retired to private life and died on August 16, 1969 in the town of Glashütten, aged 81. He was buried in the Hauptfriedhof (main cemetery) in Frankfurt.
Dr. Münz worked during the darkest years of the 20th century, and that contributed enormously to his eclipse as a man of science. Unfortunately, in Germany during those years, a Jew could not carry out research freely, and many of his publications and patents appeared in the literature without his name.
Now at last he can get the credit he deserves.
|
c52fa9072a1ec6386d8269d3798b13a8 | https://historynewsnetwork.org/article/167871 | Do Historians Have a Sexual Harassment Problem? | Do Historians Have a Sexual Harassment Problem?
Related Link Sexual Harassment: AHA to Survey Its Members By Rick Shenkman
After the Harvey Weinstein scandal broke, a columnist for the Chronicle of Higher Education wondered if academia has a sexual harassment problem. She asked readers to relate their personal stories. She quickly received more than 1,800 replies. Now the total exceeds 1,900. You can see them here laid out neatly in a Google spreadsheet.
While the self-reports may or may not be representative – the columnist herself acknowledges that her readership is skewed given her general appeal to people in the humanities – the large number of complaints involving historians is stunning: 182. (The number includes scholars in history, art history, medieval history, and American Studies.) And that's just a rough estimate. The number may well be higher. Some contributors to the survey don't identify the discipline of the accused.
The accounts run the gamut from rape to sexual harassment. A freshman in college related how she had been "repeatedly told by one professor that I should use my 'hotness' to get ahead in class and my future work. As a 17 year old (I started college a year early), this was uncomfortable and scary."
A graduate student at the University of North Carolina at Chapel Hill working on her Masters told how her advisor had increasingly become "touchy feely." "He would place his hands on my hips, my waist, my shoulders when we were in his office. I did my best to dance away. In celebration of my MA defense, he took me to a local restaurant, where he kissed me on the lips. That evening, I emailed him to say that I had not consented to the kiss and did not want it to happen again. He admitted to it and replied that it was an 'unintended gesture.' But it was clearly planned, as he knew that I had left my husband mere weeks before and that I was in a seriously vulnerable position."
A PhD student in religious history at Drew University says she was manipulated into a sexual relationship by the second reader of her dissertation. "My primary adviser, and hence first reader, was on family leave, so this second reader became the person I was dependent upon for guidance, mentorship, and affirmation of my work. (also, my marriage was going downhill at the time, so I was pretty vulnerable) This second reader invited me out for drinks on a weekly basis, began to 'dis' my primary adviser and my third reader, and shared with me his marital problems and frustrations. I was flattered that he thought 'highly' enough of me to share these confidences and to be so honest with me - and as a clergy person, I was easily drawn in to his need to disclose and share his 'troubles.' This ultimately led to a sexual relationship while at a conference in England - after my dissertation was turned in, but before my defense. He could have waited to screw me until after the defense - the fact that he didn't demonstrates the power play involved. I wound up utterly 'smitten' (think Monica Lewinsky, here) and also became physically ill - developing ulcerative colitis."
And on and on it goes, one dreary, horrifying story after another. No story was more disturbing than this one by a lecturer at the University of Tennessee, Knoxville: "I strongly suspect that a colleague placed some kind of 'date rape' drug in my drink during a social gathering. While I am now in recovery, I am a long-time substance abuser and alcoholic. I could tell from the effects of the drink that I did not only have alcohol in my system. I went home immediately. This colleague had constantly subjected me to crude, sexual, and degrading comments, as well as multiple sexual propositions."
Until now stories like this have mostly been hidden from public view. No more. But how big is the problem? And what will be done about it? The American Historical Association is addressing the issue at this week's annual convention. The session, "Historians and Sexual Harassment: The Challenge for the AHA," will be held Saturday morning from 10:30 a.m. to noon in the Roosevelt Room 5 (Marriott Wardman Park, Exhibition Level). Cornell's Mary Beth Norton, the incoming president of the AHA, will chair. Members of the panel include: Catherine Clinton (University of Texas at San Antonio), Marcy Norton (University of Pennsylvania), Katrin Schultheiss (George Washington University), and Tyler E. Stovall (University of California, Santa Cruz), the outgoing president of the AHA.
|
08b95c538a088f12d7c86b65a66a9d82 | https://historynewsnetwork.org/article/167940 | Until 1975, ‘Sexual Harassment’ Was the Menace With No Name | Until 1975, ‘Sexual Harassment’ Was the Menace With No Name
Rain poured down in Ithaca, New York, but the women who streamed into the Greater Ithaca Activities Center on May 4, 1975, weren’t daunted by a bit of weather. Hundreds of women packed into the modest room. Then they began to speak about their experiences being groped and sexually exploited at work.
For journalist-turned-activist Lin Farley, the event was life-changing. “The solidarity that women felt for one another was contagious,” she later wrote. “No longer did they have to explain to their friends and family that ‘he hit on me and wouldn’t take no for an answer, so I had to quit.’ What he did had a name.”
Attendees spoke of professors, restaurant guests, factory workers, executives—men who turned their workplaces into private hells. They talked about how their bosses pinched them, groped them, and how their coworkers looked the other way when they were harassed. Humiliated, intimidated and bullied, many of these women had lost jobs when they turned down their bosses’ sexual advances. And they were fed up.
As they spoke, these women used a new term: sexual harassment. Until just a few weeks before, the term didn’t even exist. But thanks to Farley and the consciousness-raising efforts of the 1970s women’s movement, the newly coined term would not just help women give voice to their experiences: It would change U.S. law and life in the workplace.
|
d8184278e3ee21d4b6907439e93fb892 | https://historynewsnetwork.org/article/168020 | What a medieval love saga says about modern-day sexual harassment | What a medieval love saga says about modern-day sexual harassment
The tomb of Abelard and Héloise.
Alexandre Lenoir, via Wikimedia Commons, CC BY-SA
Suddenly, popular media is saturated with stories of powerful men outed by women for behavior in the workplace. These alleged harassers seem to assume that power in the workplace grants them sexual access to anyone.
In medieval Europe, most people assumed the same thing, although they didn’t call it “harassment.”
As a historian of gender in the European Middle Ages, I am all too familiar with well-documented cases of sexual harassment, abuse and rape. Such behavior was not considered unlawful or wrong in the medieval period unless one powerful man harassed a woman who belonged to another powerful man.
One famous 12th-century saga involved a young philosopher, Abelard, and his teenage student Héloise. The story has many similarities with news of modern-day aggressors, with one major exception: None of today’s harassers has suffered medieval punishment.
The case of Abelard and Héloise
Abelard and his pupil Héloise. Edmund Leighton, via Wikimedia Commons
In 1115, Abelard was the star of the budding university scene in medieval Paris. Famous for his quick mind and infallible memory, Abelard supposedly never lost an argument. One day he encountered Héloise, who also studied classics and philosophy (rare for a medieval girl). Abelard later wrote of that first glance, “In looks she did not rank lowest while in the extent of her learning she stood supreme.”
Knowing himself to be handsome and brilliant, Abelard stalked the girl and persuaded her uncle, Fulbert, a church official and Héloise’s guardian, to hire him as her personal tutor. Fulbert was delighted to employ the famous Abelard. Fulbert gave Abelard room and board, so that he might tutor Héloise day and night.
Abelard taught Héloise more than philosophy. “My hands strayed oftener to her bosom than to the pages,” he admitted. “To avert suspicion I sometimes struck her.” Eventually, as he wrote, their “desires left no stage of lovemaking untried, and if love devised something new, we welcomed it.”
The affair became the subject of student ballads sung in the streets of Paris.
The wages of sin
Abelard was alarmed at the gossip and sent Héloise off to her old convent school outside of town. Their affair remained torrid, though, and he visited when he could. They once had sex in a corner of the refectory where nuns took their meals.
Their troubles became worse when Héloise became pregnant. Abelard sent her away – this time to his sister in Brittany, where Héloise gave birth to their son Astrolabe, whom she left behind when returning to Paris.
'Les Amours d'Héloïse et d'Abeilard' (1819), by Jean Vignaud via Wikimedia Commons.
When Uncle Fulbert learned of Astrolabe’s birth he “went almost out of his mind,” as Abelard put it, even though Abelard reminded him that “since the beginning of the human race women had brought the noblest men to ruin.” Eventually, to appease Fulbert, Abelard agreed to marry Héloise, but only if Fulbert would keep it secret. Héloise objected but submitted.
As things were, the stalking and beating of Héloise posed no danger to Abelard’s reputation, nor did fathering an illegitimate son. News of a marriage, though, would ruin him – for only celibate churchmen could find permanent employment as teachers.
Fulbert, however, spread word of the marriage. Héloise and her uncle argued fiercely until Abelard once more hid Héloise in a convent. Against her wishes, he made her wear nun’s clothing.
Uncle Fulbert believed that Abelard had abandoned Héloise. One terrible night, Abelard awoke to find himself under attack by a gang of ruffians who took shocking vengeance for Fulbert. As Abelard put it starkly, “They cut off the parts of my body whereby I had committed the wrong of which they complained.”
A eunuch, like a married man, was barred from high church offices and teaching positions. Abelard became a monk and Héloise an unwilling nun.
Whose calamity?
We know this sad story from Abelard’s “History of My Troubles” (“Historia Calamitatum”) written about 15 years after his marriage to Héloise. By then, she had become an abbess in charge of a small community of nuns at The Paraclete – a monastery founded by Abelard and named after one of his famous philosophical arguments. The two began to exchange letters in the 1130s. Héloise had never been happy in the convent. She wrote to her husband:
“The pleasures of lovers which we have shared have been too sweet … wherever I turn they are always there before my eyes, bringing with them awakened longings and fantasies which will not even let me sleep.”
Abelard suggested that she give all her love to Christ instead. He sent her handy tips for running a monastery. He refused to visit, though.
“My agony is less for the mutilation of my body than for the damage to my reputation.”
His career was paramount; her grief, less so. “His” reputation, “his” calamity. What about “hers”?
Bad love
Something about the history of Abelard and Héloise endured the centuries until 18th- and 19th-century intellectuals embraced the tale of these star-crossed lovers. Several poets and artists depicted Héloise unhappily entering the convent or dreaming of lost love. Parisians erected an ornate monument to the couple in the cemetery of Père-Lachaise, where today’s lovers still leave fresh roses.
Angelica Kauffman, via Wikimedia Commons
However, despite the discovery of more letters exchanged between Abelard and Héloise, today’s medievalist scholars tend to accept Abelard’s version of the relationship – that Héloise was complicit.
Abelard said Héloise loved him. But did the teenage girl actually consent to sex with the teacher who beat her? Did she agree to have the child? Did she prefer “love to wedlock and freedom to chains,” as Abelard claimed?
We know from her letters to him that she resisted the convent.
“Of all the wretched women, I am the most wretched,” Héloise complained, long after the affair.
Romancing harassment
No one has labeled Abelard a rapist, the seducer of a minor or a sexual harasser. His philosophical works remain crucial to the history of Christian theology and philosophy. Héloise is celebrated mostly for being a female intellectual in a period when there were few.
Such historical “romances” still play out in gender relations today, particularly in the university. A recent survey of graduate students and professors, for example, revealed the extent to which male professors prey on young minds and bodies under their guidance.
And, like Héloise, many such victims still find it hard to voice resistance, although they no longer cower in the cloister. Instead of writing letters to their harassers or singing ballads in the streets, they reveal their secrets in digital media – too often anonymously.
“Plus ça change,” or “the more it changes, the more it’s the same thing,” as Abelard might say. One thing we have learned since the Middle Ages is that sexual harassment is a destructive crime, no matter how romantic the backstory.
|
25f36bbd790a1d47ec4c2569717b8be2 | https://historynewsnetwork.org/article/168060 | 20 years since America’s shock over Clinton-Lewinsky affair, public discussions on sexual harassment are changing | 20 years since America’s shock over Clinton-Lewinsky affair, public discussions on sexual harassment are changing
Twenty years ago, major news outlets reported allegations that then-President Bill Clinton had a sexual relationship with a 22-year-old White House intern.
Looking back, the Clinton-Lewinsky affair heralded a sea change in political discourse by normalizing public discussion of sex acts. Today, it is hard to believe that esteemed presidents, from Thomas Jefferson to John F. Kennedy, were sheltered from public judgment by a code of decorum that conveniently regarded the subject of sex as beneath the dignity of political discussion. That all changed in the Clinton days when terms like “oral sex” and “semen stain” were catapulted from the domain of hushed whispers to front-page news.
Fast forward to today, and once again the man sitting in the Oval Office is dogged by allegations of sexual misconduct. As a scholar who has examined public reaction to political sex scandals since the Clinton days, this is hardly where I expected we’d find ourselves in 2018. Twenty years ago, it seemed plausible that difficult conversations spurred by revelation of the Clinton-Lewinsky affair – about issues ranging from sexual harassment to the nature of sexual consent – would lead to lasting changes in the way women and men conducted themselves in the workplace, and well beyond.
But how far have we really come?
|
54d0e17ac1f42bb710343f99e29e8f62 | https://historynewsnetwork.org/article/168216 | They considered themselves white, but DNA tests told a more complex story | They considered themselves white, but DNA tests told a more complex story
As more Americans take advantage of genetic testing to pinpoint the makeup of their DNA, the technology is coming head to head with the country’s deep-rooted obsession with race and racial myths. This is perhaps no more true than for the growing number of self-identified European Americans who learn they are actually part African.
For those who are surprised by their genetic heritage, the new information can often set into motion a complicated recalibration of how they view their identity.
Nicole Persley, who grew up in Nokesville, Va., was stunned to learn that she is part African. Her youth could not have been whiter. In the 1970s and ’80s in her rural home town, she went to school with farmers’ kids who listened to country music and sometimes made racist jokes. She was, as she recalls, “basically raised a Southern white girl.”
|
02deb1713ce9c1c87647d082ae30e366 | https://historynewsnetwork.org/article/168218 | DNA Tests on an Ancient Skeleton Reveal the First Briton Was Black, Not White | DNA Tests on an Ancient Skeleton Reveal the First Briton Was Black, Not White
The first person known to have lived in Britain had dark skin, according to cutting-edge scientific analysis from London’s Natural History Museum.
In research that may raise eyebrows among modern-day white nationalists, scientists used DNA analysis from Britain’s oldest nearly complete skeleton to reveal he had dark skin and blue eyes.
The skeleton was discovered in 1903 and is known as Cheddar Man, after the area where he was found, which is also where the cheese originated. He’s believed to have lived more than 10,000 years ago and is the oldest Briton to have ever had their DNA tested—with some surprising results.
The research suggests that light skin developed in ancient Britons much later than previously thought, with experts commenting that it flies in the face of modern perceptions of Britain, Europe, and race.
|
98bbeefec31492dadc7517df75f8120e | https://historynewsnetwork.org/article/168298 | Writer Makes the Case for Impeaching Clarence Thomas | Writer Makes the Case for Impeaching Clarence Thomas
... Thomas’s inappropriate behavior — talking about porn in the office, commenting on the bodies of the women he worked with — was more wide-ranging than was apparent during the sensational Senate hearings, with their strange Coke-can details.
But, most of all, because Thomas, as a crucial vote on the Supreme Court, holds incredible power over women’s rights, workplace, reproductive, and otherwise. His worldview, with its consistent objectification of women, is the one that’s shaping the contours of what’s possible for women in America today, more than that of just about any man alive, save for his fellow justices.
And given the evidence that’s come out in the years since, it’s also time to raise the possibility of impeachment. Not because he watched porn on his own time, of course. Not because he talked about it with a female colleague — although our understanding of the real workplace harm that kind of sexual harassment does to women has evolved dramatically in the years since, thanks in no small part to those very hearings. Nor is it even because he routinely violated the norms of good workplace behavior, in a way that seemed especially at odds with the elevated office he was seeking. It’s because of the lies he told, repeatedly and under oath, saying he had never talked to Hill about porn or to other women who worked with him about risqué subject matter.
|
8a500f35f6274792584d94bdc24a2e4d | https://historynewsnetwork.org/article/168306 | Was Pirate Black Sam Bellamy Found? DNA Test Could Tell | Was Pirate Black Sam Bellamy Found? DNA Test Could Tell
Researchers are working to use DNA to identify whether a human bone recovered from a Cape Cod shipwreck belongs to the infamous pirate Samuel "Black Sam" Bellamy.
The Whydah (WIH'-duh) Pirate Museum in Yarmouth, Massachusetts, publicly displayed the bone Monday. It was found near what's believed to be Bellamy's pistol.
The objects were pulled from the Whydah Gally (GAH'-lee) shipwreck several years ago.
|
d307602e12d37946146f43aa14cda2ea | https://historynewsnetwork.org/article/168340 | AHA President Mary Beth Norton says ending sexual harassment is a high priority | AHA President Mary Beth Norton says ending sexual harassment is a high priority
Related Link Background to the AHA’s decision
In fall 2017, as allegations of sexual harassment flooded into public view from women (and some men) employed in movies, television, journalism, and politics, behavior in academe and the historical profession at large almost inevitably became part of the conversation. Stories of the harassment of graduate students by their mentors and of junior faculty by more senior colleagues surfaced on and off social media. Many women who work as historians in a variety of settings, including myself, have similar tales to tell. In the past, such incidents have been treated in isolation and as individual experiences. The AHA has long been on record as decrying sexual harassment in employment, but that statement clearly needs expanding and updating.
The AHA Committee on Gender Equity (CGE, formerly the Committee on Women Historians) and the AHA Professional Division began the Association’s discussions during their regular fall teleconferences in October. The topic was also on the agenda for the November meeting of American Council of Learned Societies executive directors, attended by the AHA’s Jim Grossman.
Shortly thereafter, on November 14, a large number of historians and others submitted a comprehensive Letter to the American Historical Association Concerning Sexual Harassment and Violence in the Profession. (About 45 percent of the eventual 868 signers were AHA members.) The letter asked historians “to take stock of our own professional culture, and the ways in which it may contribute to environments in which sexual harassment and assault are tolerated.” It pointed out that in addition to the harm such behavior has caused victims, the discipline as a whole has suffered when talented individuals have abandoned history for other fields of study. It lamented that scholars and colleagues have often counseled victims to keep silent in response to harassment, largely because of potential long-term negative effects on a victim’s career. And it noted that harassment incidents could occur in settings other than particular workplaces, including at AHA annual meetings and similar professional gatherings involving people from different institutions.
In this context, the Professional Division, led by its vice president, Kevin Boyle, continued discussions by e-mail about how best to address the problem of sexual harassment as it relates to the work of historians. Grossman reported on how other professional associations—among others, the American Philosophical Association and the Society of Biblical Literature—have handled these issues and on what they have learned about relevant legal aspects. After consultations with Grossman, AHA president Tyler Stovall, CGE chair Katrin Schultheiss, and myself, Boyle drafted a memo for the Council, summarizing other associations’ sexual harassment policies and sketching options the AHA might take, but not committing the Association to any specific course of action.
We collectively decided on the following strategy. Sexual harassment was already on the AHA Council’s agenda for Thursday, January 4. To enable the Council to benefit from members’ experience and wisdom, we decided to make sexual harassment the subject of a late-breaking session for Saturday, January 6, chaired by me as president-elect. We also scheduled another discussion for the Council’s meeting on Sunday, January 7, with the goal of setting forth guidelines for an ad hoc committee that could draft a new AHA statement to be presented to the Council (with comments from committees representing the AHA’s various constituencies) at its next meeting, in June.
The initial Council discussions on January 4 were wide-ranging. Councillors concurred that the AHA should adopt a new statement on sexual harassment. Our general counsel advised us that the AHA should focus specifically on the spaces it controls—that is, its own office, the annual meeting, and any other committee meetings or conferences sponsored by the Association. Just prior to the meeting, we learned that the American Political Science Association (APSA) had recently conducted a survey of its members about experiences of harassment at its conventions for the past five years, and we were given advance copies of its analysis of the findings. The Council quickly agreed to submit essentially the same survey to the AHA membership, with the goal of obtaining comparative data. We decided to wait to make other decisions until after the late-breaking session.
On Saturday, January 6, between 100 and 120 people, primarily women but perhaps 10 percent men, attended the late-breaking session. Panelists—Stovall, Schultheiss, Marcy Norton (the spokesperson for the group that composed the letter), and Catherine Clinton (who as president of the Southern Historical Association had focused on sexual harassment in that organization)—each made brief presentations. Then I opened the floor for comments. Many audience members spoke, some movingly recounting episodes of sexual harassment or even assault they had experienced either at conventions or in other professional settings. They offered many thoughtful suggestions about policies the AHA could adopt, calling for statements of what we might term “best practices” to guide historians and their employers. That request for guidance was repeated by department chairs at a meeting Jim Grossman and I attended immediately after the session.
It therefore became clear that, rather than one statement, the AHA needed to adopt several: one on sexual harassment, setting forth principles and complaint procedures for our conventions and other meetings we organized, and others on such topics as hiring and mentoring, outlining principles and best practices in contexts over which we have no direct control.
Accordingly, at the meeting on Sunday, January 7, the Council made a series of decisions. It delegated two tasks to small groups of councillors: making final decisions about the details involved before distributing the APSA survey and reviewing the language in the staff handbook concerning sexual harassment to ensure it was adequate and up-to-date. Councillors—some of whom had attended the session the previous day—concurred that the AHA should issue new or expanded statements summarizing the practices required to create safe environments for historians and their work. The specifics of such statements remain to be developed but will rest on commonly accepted ethical norms.
Significantly, councillors agreed on the basic outlines of a new procedure, which will implement a restated and expanded set of principles and definitions of prohibited behavior at annual meetings and other AHA events. All registrants for AHA-sponsored meetings should be required to indicate that they are aware of these policies as a part of the registration process. Drawing on processes adopted by other professional associations but duplicating none of them exactly, we decided to name an ombuds team consisting of designated members of the Council and representatives from the AHA’s relevant constituencies to receive complaints about harassment at our meetings. Team members’ names and contact information will be publicized, and complainants may choose which individual to contact. That team member would acquaint the complainant with her or his options. If the complaint involves a possible crime, the team member could recommend that the individual report the event to appropriate authorities. In the event the complainant wished to pursue the matter further within the AHA, the ombuds team member would, after further inquiry into the circumstances, turn the information over to the executive director, who would consult the AHA president and general counsel before proceeding. Expulsion from the meeting is a possible sanction for an offender.
The statements and the new procedure will be drafted by a Council committee headed by Teaching Division vice president Elizabeth Lehfeldt and including among its members Tyler Stovall and Kevin Boyle, along with a representative of CGE. We anticipate approval by the Council in June and full implementation at the 2019 AHA annual meeting in Chicago.
|
9f027d343732464f250245d36e90b0cb | https://historynewsnetwork.org/article/168346 | Seven Books Named as Finalists for the 2018 George Washington Prize | Seven Books Named as Finalists for the 2018 George Washington Prize
CHESTERTOWN, MD – Seven books published in 2017 by the country’s most prominent historians have been named finalists for the George Washington Prize. The annual award recognizes the past year’s best-written works on the nation’s founding era, especially those that have the potential to advance broad public understanding of early American history.
“Understanding the first chapter of our national story is more essential today than ever,” said Adam Goodheart, director of Washington College’s Starr Center for the Study of the American Experience, one of the prize’s three cosponsors. “These books reconnect us with ideas that made the United States a beacon for democratic movements around the world.”
Created in 2005 by the Gilder Lehrman Institute of American History, George Washington’s Mount Vernon, and Washington College, the $50,000 George Washington Prize is one of the nation’s largest and most notable literary awards.
The finalists’ books combine depth of scholarship and broad expanse of inquiry with vivid prose that exposes the complexities of our founding narrative. Written to engage a wide public audience, the books provide a “go-to” reading list for anyone interested in learning more about George Washington, his contemporaries, and the founding of the United States of America.
The 2018 George Washington Prize finalists are:
● S. Max Edelson, The New Map of Empire: How Britain Imagined America before Independence (Harvard University Press)
● Kevin J. Hayes, George Washington: A Life in Books (Oxford University Press)
● Eric Hinderaker, Boston’s Massacre (Harvard University Press)
● Jon Kukla, Patrick Henry: Champion of Liberty (Simon & Schuster)
● James E. Lewis, Jr., The Burr Conspiracy: Uncovering the Story of an Early American Crisis (Princeton University Press)
● Jennifer Van Horn, The Power of Objects in Eighteenth-Century America (University of North Carolina Press for the Omohundro Institute of Early American History and Culture)
● Douglas L. Winiarski, Darkness Falls on the Land of Light: Experiencing Religious Awakenings in Eighteenth-Century New England (University of North Carolina Press for the Omohundro Institute of Early American History and Culture)
The winner of the 2018 prize will be announced, and all finalists recognized, at a black-tie gala on May 23, 2018 at George Washington’s Mount Vernon.
AUTHOR BIOGRAPHIES
S. MAX EDELSON is associate professor of history at the University of Virginia and the author of Plantation Enterprise in Colonial South Carolina (Harvard University Press). He was the recipient of the National Endowment for the Humanities Digital Implementation Grant to develop MapScholar, a dynamic visualization tool for historic maps.
KEVIN J. HAYES, Professor Emeritus at the University of Central Oklahoma, is the author of several books including The Road to Monticello: The Life and Mind of Thomas Jefferson (Oxford University Press) and A Journey through American Literature (Oxford University Press). He is the recipient of the Virginia Library History Award presented by the Library of Virginia and the Virginia Center for the Book.
ERIC HINDERAKER is professor of history at the University of Utah and author of The Two Hendricks: Unraveling a Mohawk Mystery, which won the Dixon Ryan Fox Prize from the New York State Historical Society and the Herbert H. Lehman Prize from the New York Academy of History.
JON KUKLA is the author of Mr. Jefferson’s Women and A Wilderness So Immense: The Louisiana Purchase and the Destiny of America, as well as many scholarly articles and reviews. He has served as the executive director of the Historic New Orleans Collection and of Red Hill-The Patrick Henry National Memorial in Charlotte County, Virginia.
JAMES E. LEWIS, JR., is professor of history at Kalamazoo College. His books include The Louisiana Purchase: Jefferson’s Noble Bargain? and John Quincy Adams: Policymaker for the Union.
JENNIFER VAN HORN is assistant professor of art history and history at the University of Delaware and specializes in early American visual and material culture.
DOUGLAS L. WINIARSKI is an associate professor of Religious Studies and American Studies at the University of Richmond, where he teaches a wide range of courses on the history of religion in early America.
###
ABOUT THE SPONSORS OF THE GEORGE WASHINGTON PRIZE
The Gilder Lehrman Institute of American History
Founded in 1994 by visionaries and lifelong proponents of American History education Richard Gilder and Lewis E. Lehrman, the Gilder Lehrman Institute of American History is the leading American history nonprofit organization dedicated to K-12 education. With a focus on primary sources, the Gilder Lehrman Institute illuminates the stories, people and moments that inspire students of all ages and backgrounds to learn and understand more about history. Through a diverse portfolio of education programs, including the acclaimed Hamilton Education Program, the Gilder Lehrman Institute provides opportunities for nearly two million students, 30,000 teachers and 16,000 schools worldwide. Learn more at gilderlehrman.org
George Washington’s Mount Vernon
Since 1860, more than 85 million visitors have made George Washington’s Mount Vernon the most popular historic home in America. Through thought-provoking tours, entertaining events, and stimulating educational programs on the estate and in classrooms across the nation, Mount Vernon strives to preserve George Washington’s place in history as “First in War, First in Peace, and First in the Hearts of His Countrymen.” Mount Vernon is owned and operated by the Mount Vernon Ladies’ Association, America’s oldest national preservation organization, founded in 1853. In 2013, Mount Vernon Ladies’ Association opened the Fred W. Smith National Library for the Study of George Washington, which safeguards original books and manuscripts and serves as a center for research, scholarship, and leadership development. www.mountvernon.org
Washington College was founded in 1782, the first institution of higher learning established in the new republic. George Washington was not only a principal donor to the college, but also a member of its original governing board. He received an honorary degree from the college in June 1789, two months after assuming the presidency. The college’s Starr Center for the Study of the American Experience, which administers the George Washington Prize, explores the American experience in all its diversity and complexity, seeks creative approaches to illuminating the past, and inspires thoughtful conversation informed by history.
For more information: www.washcoll.edu.
|
84552f1a71c50d7f5864c21341e1c98e | https://historynewsnetwork.org/article/168384 | Review of Elaine Weiss’s “The Woman’s Hour: The Great Fight to Win the Vote” | Review of Elaine Weiss’s “The Woman’s Hour: The Great Fight to Win the Vote”
Just in time for Women’s History Month, The Woman’s Hour by Elaine Weiss tells the story of how Tennessee became the thirty-sixth state to ratify woman’s suffrage and made the Nineteenth Amendment the supreme law of the land. Although we know the outcome of the ratification struggle, Weiss is a skilled journalist and writer who knows how to build suspense into her historical account. While her work is based upon solid archival research, Weiss is primarily a storyteller, and the tale of Tennessee’s ratification debate is not generally well known, offering the author an opportunity to broaden the knowledge base of her readers. Some academic historians may criticize Weiss for her narrative rather than analytical history of suffrage, but this somewhat old-fashioned approach to history will be appealing to more general readers and provides an example of how the past may be employed to reach a more diverse audience beyond the narrow confines of the academy.
The Woman’s Hour concentrates upon three major protagonists who arrived in Nashville during the summer of 1920 to lobby the Tennessee state legislature on woman’s suffrage: Carrie Catt, the legendary president of the National American Woman Suffrage Association (NAWSA), who was seeking to complete the work of Susan B. Anthony and Elizabeth Cady Stanton; Tennessee suffragette Sue Shelton White, who represented Alice Paul’s more militant National Woman’s Party that had split from the NAWSA; and Tennessee school teacher Josephine Pearson, who served as president of the Tennessee State Association Opposed to Woman Suffrage. Weiss also presents a strong supporting cast of women, including Anne Dallas Dudley, Catherine Talty Kenny, and Abby Crawford Milton of the NAWSA; Charlotte Rowe, whose acid tongue and pen scoured the suffragettes and depicted the struggle against ratification as a crusade to save Western Civilization; and Anita Pollitzer and Betty Gram of the National Woman’s Party—health considerations and a lack of funds made it impossible for the party’s founder Alice Paul to reach Nashville.
Men, of course, occupy a key place in Weiss’s narrative, for they were the ones casting the votes on whether to expand the suffrage. Although many suffragettes distrusted Tennessee Governor Albert Roberts, Weiss concludes that his support for the Nineteenth Amendment likely cost him re-election. Weiss also offers praise for state legislators Harry Burn and Banks Turner, who resisted tremendous pressure to provide the margin of victory for the amendment in the Tennessee House. She is less kind toward Speaker of the House Seth Walker, who abandoned the suffrage cause to lead the movement against ratification. Weiss suggests that Walker may have been influenced by railroad and liquor interests who feared the crusading zeal of female voters. Weiss also credits President Woodrow Wilson and the 1920 Democratic Presidential candidate James Cox with supporting the suffrage effort in Tennessee, but she concludes that the comments of Republican nominee Warren Harding were ambiguous at best.
Weiss also places the Tennessee ratification debate within historical context by examining the history of the woman’s suffrage struggle in the United States, noting that suffrage pioneers Stanton and Anthony did not live to see the fruits of their labor realized with Tennessee’s ratification of the Nineteenth Amendment. Political change does not come easily and requires considerable resiliency, as the work of Catt, Stanton, Anthony, and Paul indicates. It will be interesting to see whether the contemporary children’s crusade for gun control legislation will demonstrate an appetite for the long haul as exhibited by such reform efforts as abolitionism, woman’s suffrage, the Civil Rights Movement, and the struggles of the LGBTQ community. And the extreme passions unleashed by the suffrage debate in Tennessee suggest that the intense partisanship of contemporary political debate is hardly anything new. Some opponents of gun control insist that regulation of firearms and ammunition is, similar to government involvement with health care, the entering wedge of socialism into the fabric of American liberty and individualism. Weiss documents that anti-suffrage leaders portrayed the vote for women as evidence of Bolshevism seeking to destroy the American family and pave the way for communism and collectivization. Similar arguments were made in the 1970s against ratification of the Equal Rights Amendment. What was somewhat different in the Tennessee debate was the degree to which opponents of suffrage openly appealed to racism.
Raising
the specter of Reconstruction and passage of the Fourteenth and
Fifteenth Amendments, Tennessee legislators asserted that they would
protect state rights and the Southern Lost Cause against the
imposition of federal authority and racial equality. Opponents of
woman’s suffrage painted a dire portrait of black women attempting
to vote alongside white women, with the threat of the federal
government employing force to protect black suffrage. Chronicling
the uneasy alliance between white female reformers and advocates for
abolition and black rights, Weiss notes that the suffragettes in
Tennessee failed to challenge the champions of white supremacy and
sought to separate themselves from black suffragettes such as Juno
Frankie Pierce. Of the suffragette compromise on the race issue,
Weiss writes, “They might not have the vote, but they had familial
ties, social standing, and elite education and connections. Most
important, unlike the black men and women petitioning for fundamental
dignity and rights, the Suffs were not despised for their skin color
or dismissed as lesser beings, they did not bear the perpetual scars
of slavery” (137).
While
critical of how the suffragettes handled the race question, Weiss
still commends the courage and commitment the suffragettes
demonstrated in confronting the male political patriarchy in a
Southern state like Tennessee. The suffragettes were subject to
social ostracism, arrest, and even death threats. And their
sacrifices have certainly contributed to a nation where young women
have an opportunity to reach their full potential, but Weiss also
recognizes that there is much to be done before gender equality is
achieved in the United States. While the NAWSA evolved into the
nonpartisan and education-focused League of Women Voters and Catt
turned to issues of world peace, the Woman’s Party continued to
battle political and economic discrimination and fight for the Equal
Rights Amendment, providing a legacy for activists seeking equal pay,
challenging sexual harassment, and working to expand the number of
women candidates and office holders.
In
this study of how Tennessee ratified the Nineteenth Amendment, Weiss
reminds readers how difficult it was for women to win the
vote, a right that should never be taken for granted. She
concludes, “The lobbying, public relations, and grassroots
organizing techniques developed by the suffragettes, as well as their
use of nonviolent protests and civil disobedience, stood as a model
for midcentury African American civil rights campaigns, anti-Vietnam
War protest groups, and gay rights activists. No doubt the future
will bring more causes, more necessary repairs to American democracy,
and more need for passionate civic activism” (335). Weiss’s
well-written historical narrative of woman’s suffrage is both
instructive and inspiring.
|
4471ea2e8f9ea16b68bee922929e4dc3 | https://historynewsnetwork.org/article/168492 | “Black Panther” Rewrites the Moment of the Original Colonial Encounter with Africa | “Black Panther” Rewrites the Moment of the Original Colonial Encounter with Africa
The Marvel Comics movie, Black
Panther, is more than a feel-good movie moment for Africans and
people of African descent. It is a bold counterpoint to the
intellectual and psychic violence of Afro-pessimism and
Afro-defeatism.
Afro-pessimism
advances a consistent bromide of Africa’s dysfunction, one that
does not acknowledge, let alone fetishize, the socioeconomic and
political prospects of the continent.
Afro-defeatism is my coinage and is a
lexical cousin to Afro-pessimism. The Afro-defeatist perspective on
Africa refers to the view that it is futile and even
counterproductive to resist invidious external forces. Better to
cooperate with such external actors, even if they invade the
continent with malicious intent, than to resist and risk losing it all.
Afro-defeatism is about accepting defeat and then trying to extract
concessions from powerful external forces in the interest of
preserving the lives and dignities of African peoples.
Afro-defeatists can point to Africa’s
history of unsuccessful attempts to shake off the yoke of
colonialism, neocolonialism, external dependency, and other negative
phenomena unleashed on the continent by external actors. The weight
of historical evidence is on the side of Afro-defeatists, especially
if one wants to cherry-pick African history to make the case.
African colonial history is a history
of lost causes and defeated resistance movements. There are numerous
examples to invoke.
In 1856, a 15-year-old Xhosa girl by
the name of Nongqawuse
led a prophetic millenarian movement aimed at dislodging British
colonial invaders who had, since at least the 1770s, violently seized
the land, cattle, and independence of the Xhosa
people while killing them and undermining their institutions.
Influenced by a mix of Xhosa prophetic traditions, ancestral
religious devotion, and Christianity, Nongqawuse proclaimed that the
Xhosa dead would rise again, that the British colonizers would be
defeated, and that Xhosa sovereignty would be restored if the people did
what was revealed to her in a dream when the spirit of one of her
ancestors appeared to her. The Xhosa, she instructed, had to cease
the cultivation of the land, slaughter their cattle, and build new
houses, among other specific directives. Beleaguered and desperate
for a reprieve from British oppression, the Xhosa people complied and
engaged in the Xhosa
cattle killing as it is now known in South African history.
The exercise failed spectacularly. Not
only did the observance of these instructions fail to produce the
promised deliverance from the British; it led to an acute famine and
destitution, which made it easier for the British to decimate the
Xhosa, seize the rest of their lands, and enslave the starving
survivors of the catastrophe who threw themselves at the mercy of the
colonizers as a last gambit of survival.
In 1903, after the British defeated the
Sokoto
Caliphate in what is today Northern Nigeria, diehard anti-British
citizens of the caliphate, most of them motivated by an apocalyptic
anti-colonial Islamic ideology known as Mahdism,
reassembled in a small village called Satiru. There, they planned to
launch a revolution to reclaim their land from the British and usher
in the apocalypse and the arrival of the Mahdi.
According to Islamic theology, the Mahdi is a messianic figure who
would arrive at the end of time to replace the reign of evil,
understood in that context to be British rule, with a just and
righteous rule. In February 1906, the revolutionaries struck but the
overwhelming firepower of the British occupation force took a deadly
toll. The Satiru
revolt was brutally crushed and thousands of the resisters were
slaughtered in the West African Savannah.
Earlier, in the Sudan, a more intense,
longer-lasting Mahdist revolution led by Muhammad Ahmad, who
proclaimed himself the Mahdi in 1881, was crushed by the British. The
Sudan
Mahdist revolution lasted from 1881 to 1898 and showed flickers
of promise, only to be brutally put down at the cost of thousands of
African lives.
In the 1950s, many Kikuyu people, their
lands confiscated and their livelihoods destroyed by British
settlers’ land and labor hunger, launched a guerrilla war against
the British, seeking to recover both their land and their sovereignty. Led
by Dedan
Kimathi and anchored in Kikuyu traditions of
religious oath-taking, the Mau
Mau uprising was crushed through a devastating combination of
scorched earth military tactics, exemplary communal punishments, and
concentration camp internments.
The list of lost causes and futile
resistance struggles in Africa is long, tragic, and depressing. These
tragedies caused humiliation to African peoples and rendered
colonialism a fait accompli. Where African nationalists and
Afrocentric thinkers see heroic resistance against colonialism and
external machinations, a resistance undone by the ferocity of
colonial greed and bloodlust, Afro-defeatism sees the futility of
resistance and the wisdom of pragmatic compromise.
In post-colonial times, Africa’s long
history of defeats has engendered a narrative of dystopia and
fatalism, which has been countered by a discourse of nationalist
restoration, pan-African reclamation, and historical re-imagining in
which defeats and conquests are rewritten
as heroic nationalist resistance.
Black Panther sidesteps this
debate altogether. It remakes and rewrites the moment of the original
colonial encounter. Wakanda, we are told, preserved its independence
by repelling a foreign invasion. Wakanda represents an Africa in
which the colonial project came to naught or the resistance measures
of Africans succeeded in defeating the colonial invasion, preserving
African sovereignty. This interpretation does not correspond to much
of African history, but its obvious Afro-futurism is powerful and
relies on a fantastical retelling of the past.
Black Panther is African history
imagined radically differently. It is a counterfactual history that
ignores the tendency of historians to write the past backwards from
the present rather than begin from the moment that the story began
and then imagine what could have happened—the various roads not
taken and the possible trajectories that fizzled out. Black
Panther asks: what if the aforementioned anti-colonial resistance
movements had succeeded? What kind of present and future would Africa
have? Wakanda is the emphatic answer to this question. In other
words, Black Panther answers its own question.
Wakanda is also a rejection of the
Afro-pessimist notion of Africa as a charity case — as an
aid-dependent, war-ravaged, diseased, and destitute backwater.
The state of Wakanda is not only
self-reliant; it exports aid and is capable of making investments and
acquisitions in the world’s best-known bastion of capitalism —
the United States.
Wakanda also rebuts the Afropessimist
discourse of despair, which perhaps finds its most strident
manifestation in dystopian portraits of the conditions of African
Americans and other African diasporas. As a synecdochical reference
to Africa, Wakanda is not the ancestral source that offers no hope to
its suffering, oppressed diaspora, or that itself needs redemption.
It is the opposite of that—a developed, confident, and capable
cradle. A source of pride. A place to return to.
There is a long tradition in the
African diaspora of return movements. In the nineteenth century,
several African
American emigration movements flourished, and in the twentieth,
Marcus Garvey’s Universal Negro Improvement Association captured the
imagination of African Americans and West Indians with its
back-to-Africa project. These movements saw themselves as both return
and civilizing missions.
In addition to seeking individual and
group freedom in Africa, the emigrationists wanted to leverage their
exposure and education to uplift a “backward,” “heathen”
continent incapable of developing itself without external help from
those possessing the civilizational aptitudes of Christianity and
post-Enlightenment modernity. They echoed the writings of diaspora
intellectuals such as Edward
Wilmot Blyden and Alexander
Crummell, who saw themselves as black civilizers of Africa.
In the twentieth
and twenty-first centuries, as African Americans became suffocated by
racism, segregation, discrimination, and racialized violence, the
civilizing mission to Africa morphed into a longing for a new kind of
Africa, an Africa that would not only be a refuge from the anti-black
racism of America but would also assist African Americans in
defeating this racism. In Black Panther, Erik Killmonger
laments what he sees as the unwillingness of Wakanda, an African
superpower, to come to the aid of African Americans despite enjoying
peace and prosperity and possessing the capacity to intervene in the
predicament of African Americans. This is an inversion of the
Afro-pessimist template, which casts Africa as needing the assistance
of its diasporas but not the other way around.
As if to respond to Killmonger, in the
last scene of the movie, T’Challa goes to America, specifically to
black America, to try to rejuvenate it. Wakanda becomes a giver of
foreign aid, a dispenser of Foreign Direct Investment (FDI), not a
recipient of it.
There is ambivalence and debate in
Black Panther on whether and how Wakanda should relate with
the West — with fellow world powers — partly because it fears
that extensive and unguarded interactions with peers would expose its
most valuable resource, vibranium, to rapacious schemes. Unmediated
openness would render Wakanda vulnerable to the envious, aggrandizing
maneuvers of other countries — a familiar gesture toward the rape
of Africa’s natural resources by outside forces pretending to be
friends.
In Black Panther, the question
of how Wakanda should manage and preserve its greatness against the
malicious intentions of other countries remains up in the air. The
question is unresolved partly because it is not an easy issue of
engaging with or disengaging from the world but rather one of finding
a beneficial but safe equilibrium between engagement and
self-insulation. Nonetheless, Wakanda seems capable of controlling
its destiny, as well as the terms upon which it would engage with the
world. It is in no hurry to open itself up to marauding external
forces.
This Wakandan posture is premised on
leverage and self-sufficiency and it negates the Afro-pessimist
paradigm of an Africa incapable of literally and figuratively
protecting itself against the ravages of malevolent external —
mostly Western — forces.
|
9990ebde9d24f6097c7f03cb8aab7984 | https://historynewsnetwork.org/article/168499 | UCLA historian Gabriel Piterberg is out for sexual harassment | UCLA historian Gabriel Piterberg is out for sexual harassment
In 2017, after an extensive investigation, the UCLA Title IX Office found that Prof. Gabriel Piterberg committed sexual harassment in violation of University sexual harassment policy by making unwelcome comments of a sexual nature and unwelcome physical conduct of a sexual nature (in the form of an open mouth kiss). According to the investigation report, these comments and conduct occurred in 2008 and during the time period of 2009 to 2013. The investigation report was submitted to the appropriate Administration and Academic Senate units for further handling. At the urging of the Privilege and Tenure Committee of the Academic Senate, the Administration and Prof. Piterberg engaged in settlement negotiations. Prof. Piterberg disputes and denies the findings of the Title IX investigation report. The parties have reached a settlement, which includes separation from employment, denial of emeritus status, denial of future employment with the University of California, denial of permanent or temporary office space or support, and denial of parking privileges and campus access beyond that which is afforded to general members of the public. The Administration remains firmly committed to increasing transparency
|
dbdc1f31c293abaa227ca544d75b570b | https://historynewsnetwork.org/article/168517 | Ancient DNA Is Rewriting Human (and Neanderthal) History | Ancient DNA Is Rewriting Human (and Neanderthal) History
Geneticist David Reich used to study the living, but now he studies the dead.
The precipitating event came in the form of 40,000-year-old Neanderthal bones found in a Croatian cave. So well-preserved were the bones that they yielded enough DNA for sequencing, and it became Reich’s job in 2007 to analyze the DNA for signs that Neanderthals interbred with humans—an idea he was “deeply suspicious” of at the time.
To his surprise, the DNA revealed that humans and Neanderthals did interbreed in their time together in Europe. Possibly even more than once. Today, surprisingly, the people carrying the most Neanderthal DNA are not in Europe but in East Asia—likely due to the patterns of ancient human migration in Eurasia in the thousands of years after Neanderthals died out. All this painted a complicated but dynamic picture of human prehistory. Since the very beginning of our species, humans have been on the move; at times they replaced and at other times they mixed with the local population, first hominids like Neanderthals and later other humans.
|
42c1711d2d58793aafcec075d42b8132 | https://historynewsnetwork.org/article/168521 | ‘Colorado College’s Own Harvey Weinstein’ | ‘Colorado College’s Own Harvey Weinstein’
As William F. Slocum left the presidency of Colorado College a century ago, he was awarded an honorary degree. Two buildings were later named for him, in recognition of the nearly 30 years he served as president, years in which he is widely credited with saving the college from financial distress and making it sustainable.
What wasn't revealed publicly in 1917 was that the board had pressured him to leave after an investigation -- unusual for the time -- produced detailed accounts from numerous women about how he had harassed and in some cases assaulted them. Those investigations came to light last year when Jessy Randall, the college's archivist, found the surviving documents from the era and shared them with others on campus. She had been looking into this history for some time, but made the effort a priority after the Me Too movement drew attention to powerful men using their authority to abuse women.
A blog post she wrote on a non-college website, "Colorado College's Own Harvey Weinstein," introduces selections from her findings. The documents are extraordinary in that women -- in an era when they might well not have been believed -- outlined what Slocum did to them. So many eventually came forward that the college investigated and then forced him out. The testimonies show how trapped and humiliated the women felt, and how Slocum attacked women who were employees or had other ties to the college.
|
c328a6a26a13369a6e3a24d798f9a659 | https://historynewsnetwork.org/article/168544 | Why We Think of Babe Ruth as an Overgrown Boy and Why We’re Wrong | Why We Think of Babe Ruth as an Overgrown Boy and Why We’re Wrong
Baseball
season soon will be upon us again, and all elements appear in place
for a resurgence of the national game. In the past, baseball’s
popularity flourished in times of national division, the game
providing comforting connection, tradition, and diversion. If
political strife works in favor of baseball, expect a banner season.
Meanwhile, football—late claimant to the mantle of “national
pastime”—appears mired in political squabbles and burdened by an
existential crisis over brain injury. This may be the year when
baseball reclaims its past glories.
A
game rich in tradition, baseball—America’s dominant sport from
the mid-nineteenth to the mid-twentieth century—has attracted the
serious attention of some fine historians: Steven Riess, G. Edward
White, and Charles Alexander among others. Harold Seymour and David
Q. Voigt set a standard for serious research and insightful writing,
each penning multi-volume histories of the game (although these works
are now decades old). Of course, there’s also the stirring Ken
Burns documentary series.
Yet
a gap remains between public and scholarly attention. Each year,
publishers churn out dozens of books on baseball history, most
focusing on the minutiae of the game and written by amateur
historians. At a time when cultural history reigns supreme, it is odd
that baseball, a ritualistic game loaded with cultural cues and
signifiers, hasn’t attracted more scholarship, especially from a
younger generation of historians. For those versed in how the
complexities of race, class, and gender intersected with economics
and culture to forge modern America, baseball would seem a natural
for closer inspection.
Indeed,
historians have applied, for instance, the rubrics of race and gender
to the national game. Fine histories of the Negro League have been
prepared by Robert Peterson, Don Spivey, Michael Lomax and others.
Yet, we still need to know more about how segregation shaped white
baseball. In the same way historians now view segregation as
affecting every element and institution of the white Jim Crow South,
so might race have defined baseball, even in its pre-Jackie Robinson,
segregated days. Likewise, the game emerged and remained, as
sociologist Anne Roschelle suggests, “a man’s domain”—with a
few prominent exceptions. Gender operates, argue cultural historians,
even in the absence of women. The national game, then, has much to
tell us about gender dynamics.
When
friends and colleagues heard I was writing a book about Babe Ruth,
the response was hardly encouraging: “Can anything really new be
written about a figure so studied and so familiar?” While a
neophyte to sports history (my previous work dealt with US
international relations in Southeast Asia), I felt certain that
dogged research and an alertness for how race, class, gender, and
economic power operate would yield new insights. Indeed, my research
revealed a very different Babe Ruth from the Falstaffian,
sinner/saint so often depicted. I found that a player dismissed by
perhaps his best biographer as having the mind of an adolescent was
actually an early proponent of trade unionism for professional ball
players. Throughout his career, in fact, Ruth remained an outspoken
critic of the harsh reserve clause system and the often shoddy
treatment of his fellow players.
By
1920, his debut year with the Yankees, Ruth emerged simultaneously as a
mammoth blessing and a potential curse for baseball. His popularity
and importance to baseball were like nothing seen before or since. At
times, he inspired the sort of hysteria later sparked by a Sinatra,
Elvis, or Michael Jackson at their heights. After two decades of
labor wars, battles against rival upstart leagues, and surly,
violence-prone stars like Ty Cobb, Ruth offered a charismatic and
unifying front—completely diverting attention from the game’s
troubles.
Yet
the Babe’s outspokenness remained a problem. As his fame soared,
baseball officials fretted he might strike out on his own and form a
rival league—potentially dividing and destroying major league ball.
Such a challenge in the form of the Federal League had been beaten
back by baseball magnates only recently. Meanwhile, interested
parties sought to revive a players’ union—also dreaded by the
paternalistic baseball establishment. Ruth was a regular at horse tracks
around the country, and his propensity for gambling (although he
never bet on baseball) also proved worrisome. Late in the 1920
season, as Ruth smashed the single-season home run record, news broke
of the Black Sox scandal. Controlling its workforce became a priority
for owners and league officials.
So
began a campaign to rein in baseball’s biggest star. In 1921,
newly appointed baseball Commissioner Kenesaw Mountain Landis
suspended Ruth on charges of launching a postseason barnstorming tour
without permission. A miserable season for the Babe followed.
Newspaper coverage turned decidedly negative. Ruth recovered with a
well-publicized comeback in 1923, but in 1925, he found himself
struggling in spring training with a serious illness. The source of
his ailment remains unclear, but writers mocked his collapse as “the
stomach ache heard around the world.” The Yankees’ business
manager peddled rumors that venereal disease was the real issue. A
plot to humble the opinionated star was clearly afoot. Even when he
returned to the lineup, Ruth struggled. At one point in the 1925
season when Ruth clashed with manager Miller Huggins, newspapers
“outed” Ruth’s live-in girlfriend.
Sportswriters,
in fact, proved particularly useful in the battle against Ruth. Most
were subsidized by the teams they covered; first-class travel came
gratis, but writers were expected to toe management’s line—and
they did. When it came to Ruth, the strategy was to depict him as
an overgrown child. His nickname, the Babe, served this campaign
well—as did rumors about his racial background. From his earliest
days, Ruth’s dark complexion and facial features drove gossip about
his racial lineage. Ruth’s generally liberal attitudes about race
fed the whispers. On barnstorming tours, Ruth-led squads would
frequently play Negro League teams. The Babe would conspicuously hang
with black players talking baseball, and he supported black
charitable causes (alongside all manner of charities). As a result,
the Babe had a strong following among African Americans. Opposing
players, however, wielded racist barbs throughout Ruth’s career
when riding the Babe during games. Blatant references rarely found
their way into print, but sportswriters frequently used what appears
to be coded language, such as labeling Ruth “the big baboon.” Stories of
Ruth’s prodigious appetite—whether for food, sex, or just plain
fun—also appear rooted in the racial stereotyping of the times.
Baseball’s campaign against Ruth bore more than just a passing
whiff of the racism so prevalent in early 20th-century America.
Later
in his career, when Ruth’s personal life settled, and he adopted a
more diplomatic tone toward those in power (although, especially
during contract negotiations, Ruth continued to lambaste baseball’s
lopsided labor power structure), another weapon was used against him:
gender. Unlike today’s sports figures, Ruth had no real agent
representing his interests. Christy Walsh, an enterprising journalist
who worked with Ruth starting in 1921, offered business advice and
helped make promotional deals. But Walsh had numerous other clients
and ran an extensive newspaper syndicate. In negotiating his
contracts and in many of his professional decisions, Ruth was left to
himself. He had one strong voice in his corner: his second wife,
Claire Ruth, whom he married in 1929. Claire quickly took the job of
Ruth’s self-proclaimed “manager and secretary.” She was
outspoken about her knowledge of the game. When it came to baseball,
she announced, she knew “about as much about it as anyone off the
diamond… Sometimes by the way they perform, I think I know as much
as the players.” Baseball’s leadership class appreciated neither
Claire’s candor nor her aggressive promotion of her husband. In a
male-centered world, Mrs. Ruth stood out and was resented. She
provided vital support for the Babe and allowed him to concentrate on
baseball, but her strong presence rubbed many the wrong way and made
Ruth appear the junior partner in a relationship that flipped gender
norms. In the early 1930s, as Ruth sought an opportunity to
transition to a manager position, several owners bypassed the Babe,
fearing Mrs. Ruth. “If
I hired Babe,” fretted Athletics owner and manager Connie Mack, “she
[Claire] would be managing the team in a month.”
In
the end, Ruth never got his chance to manage a team, an undertaking
he hoped would help him evolve from boy wonder to responsible adult
without separating him from the game he loved. Ruth’s independent
streak and the power he briefly held over baseball were never
forgiven. Today, he is remembered much as his critics among
baseball’s leadership ranks would have it: as an overgrown boy not
fully to be trusted. Tropes related to gender and race helped
reinforce this image. Significant truths about a national hero appear
still distorted by a smoke screen set off by baseball’s
establishment.
We
may well be on the cusp of a second golden era for baseball. All
pieces seem in place for a great revival. Let’s hope that
historians join in, embracing the complexities and fullness of baseball, a
game offering an invaluable window into American culture.
|
5ea55a0e125985811ec98f0b38a18678 | https://historynewsnetwork.org/article/168559 | Geneticist at Harvard Medical School has retrieved DNA from more than 900 ancient people. | Geneticist at Harvard Medical School has retrieved DNA from more than 900 ancient people.
In less than three years, Dr. [David] Reich’s laboratory has published DNA from the genomes of 938 ancient humans — more than all other research teams working in this field combined. The work in his lab [at Harvard] has reshaped our understanding of human prehistory.
“They often answer age-old questions and sometimes provide astonishing unanticipated insights,” said Svante Paabo, the director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany.
Dr. Reich, Dr. Paabo and other experts in ancient DNA are putting together a new history of humanity, one that runs in parallel with the narratives gleaned from fossils and written records. In Dr. Reich’s research, he and his colleagues have shed light on the peopling of the planet and the spread of agriculture, among other momentous events.
|
d7338daba6ddb5a45a03d7f6822745cd | https://historynewsnetwork.org/article/168568 | What You May Not Realize About the Supreme Court Ruling that Backed Affirmative Action | What You May Not Realize About the Supreme Court Ruling that Backed Affirmative Action
Regents
v. Bakke turns 40 this year, no small feat given the numerous
assaults levelled at it by students rejected from their colleges of
choice based on their race (or so they believe). Even the Supreme
Court itself has cast a disapproving eye on the case, arguing in 2003
that Bakke would no longer be relevant by 2028. Yet, here it
is – entering middle age – and with it the compelling interest of
diversity, a concept that has transformed much of American life,
including universities, corporations, even the arts, but still
confounds constitutional experts. How, for example, could diversity
rival national security as a constitutional interest, a position
taken by the opinion’s author, Justice Lewis F. Powell, Jr.?
The answer, surprisingly, might lie in
Jim Crow. Born and raised in Richmond, Virginia, Powell grew up in a
world of profound inequality, but also rich diversity: a place where
race was presumed to be a core factor of one’s identity, and where
separate racial identities were reinforced by law.
In a manner that few understood at the
time, Powell’s insistence on diversity had little to do with
affirmative action, a program that he rejected on the grounds that
most of America was made up of minorities, and that no single
minority had suffered demonstrably more than anyone else – a view
that really only made sense if one belonged to a patrician family
from Richmond devastated by the Civil War. Unsympathetic to the idea
that blacks deserved reparations for slavery, Powell saw diversity as
a value of a different order, a core attribute of American society
that was on full display in the Jim Crow South.
Others agreed, mainly educated white
southerners who overlooked Jim Crow’s discriminatory aspect and
argued instead that racial segregation promoted pluralism: separate
institutions, separate traditions, even separate racial cultures.
Perhaps the most eloquent exponent of this view was Robert Penn
Warren (two years Powell’s senior, and the author of All the
King’s Men), who wrote an essay in 1930 that cast racial
segregation as a “Briar Patch,” a place where African Americans
could develop their own “art” free from white interference and
control. Though he believed that African Americans warranted better
treatment, Warren remained unconvinced that forcing integration on
the region would benefit it culturally. As he explained to Ralph
Ellison in 1956, the Supreme Court’s order that the region
integrate in Brown v. Board of Education threatened the
region’s rich diversity, or what he called its “pluralism.”
William Faulkner advanced a similar
view, portraying black society in the South as a haven free from the
degenerating influences of whites, a theme that ran through his novel
The Sound and the Fury, which contrasted the suicidal,
intellectually disabled, lily-white Compsons with their heroic,
stalwart black servant, Dilsey. Black southerners’ capacity to
“endure,” argued Faulkner, held out hope not just for the South,
but the human race, a point he made during his Nobel Prize speech in
1950. Yet, like Warren, Faulkner also expressed doubts about the
blurring of racial lines, casting integration as an inevitable step
towards “miscegenation,” a phenomenon that he depicted in tragic
terms via interracial figures like Charles Bon in Absalom,
Absalom! and Joe Christmas in Light in August.
Even some African Americans joined this chorus. Black
author Zora Neale Hurston complained to the Orlando Sentinel
in 1955 that Brown v. Board of Education was “insulting”
to her race, in part because it presumed that white institutions were
intrinsically superior to black, a position that ignored the richness
of black life and culture in the South. Hurston celebrated that life
and culture as early as the 1920s, leading her to become a luminary of
the Harlem Renaissance and a critic of white southern society, which
she portrayed as violent and uncreative in her 1948 novel Seraph
on the Suwanee.
Arguably no southerner celebrated the
diversity of the American South more endearingly, however, than
Harper Lee, who wrote into life a southern lawyer, Atticus Finch,
committed to providing African Americans with better treatment but
also dedicated to the sacred “code” of segregation, a code that
allowed the races to live and work together in harmony, free from
tension and strife. Atticus made this point clear during a rape
trial in which he defended Tom Robinson, an African American, against
false charges leveled by a poor white woman named Mayella Ewell.
Only when Ewell tried to seduce Robinson, argued Atticus, did she
disrupt the peaceful idyll of Maycomb. So long as the races kept to
their respective sexual spheres, he maintained, peace reigned.
Whites could even learn from blacks, like Atticus’s children Scout
and Jem, who benefitted when their maid Calpurnia took them to her
black church.
Though Lee modeled Atticus after her
father, she could just as easily have modeled him after Lewis F.
Powell, Jr. – who shared many of the same views that she did. The
irony, of course, was that Powell actually was a lawyer, who ended up
on the Supreme Court. That his view lives on in American law is
worth noting, a legacy of the American South more inspiring than
Confederate monuments or the KKK, and worth commemorating.
|
d5e45d272a88b3de00d4c6e2498a0f8a | https://historynewsnetwork.org/article/168599 | What Americans Think About Presidential Scandals Like the Stormy Daniels Story Has Changed | What Americans Think About Presidential Scandals Like the Stormy Daniels Story Has Changed
For about as long as the country has existed, the public and the press have imposed few consequences on Presidents for what they do behind closed doors, even when those actions become public, as long as those actions don’t affect the rest of the government. What has changed is the perception of when that line gets crossed.
In the nation’s earliest years, newspapers were associated with political parties, so accusations of infidelity were often brought up to slam political opponents but dismissed by loyalists. “The golden age of America’s founding was also the gutter age of American reporting,” as pundit Eric Burns put it in his book Infamous Scribblers: The Founding Fathers and the Rowdy Beginnings of American Journalism.
The most notorious scandalmonger of that period was James Callender, a Federalist newspaper editor, who, for example, spread stories of Thomas Jefferson’s fathering children with Sally Hemings, a woman enslaved at his estate, and also making a move on the wife of his good friend from college. Of the latter accusation, Jefferson wrote in a July 1, 1805, letter to his Secretary of the Navy Robert Smith, “I plead guilty to one of their charges, that when young & single I offered love to a handsome lady. I acknolege it’s incorrectness; it is the only one, founded in truth among all their allegations against me.” (However, Monticello, the museum at the site of his former home, now acknowledges that the charge about Hemings is true too.)
But such claims about Jefferson didn’t seriously damage his career. The times when personal stories like those did make a difference were when there was concern over whether public figures’ personal lives affected their jobs.
|
2e9d743a886366bd30efd77813bcd2ac | https://historynewsnetwork.org/article/168637 | His Life-Changing Moment Came at Age 5: An Interview with Historian Peter H. Wood | His Life-Changing Moment Came at Age 5: An Interview with Historian Peter H. Wood
Peter H. Wood is professor emeritus of history at Duke University. He was educated at Oxford and Harvard, where he earned a PhD in 1972. He is the author of many books including Black Majority: Negroes in Colonial South Carolina from 1670 through the Stono Rebellion, which Edmund Morgan said had "gone beyond any previous study of the history of slavery in the colonial period."
Why did you choose history as your career?
Both my parents were scientists, but I faint at the sight of blood, so medicine was out. They
nurtured a love of fact over fiction, so even though I wrote lots of poems, I was not going to be a novelist.
Also, I was a lefthander who could never hit curve balls very well, so I gave up my dreams of being the
next Stan Musial for the St. Louis Cardinals. I guess that was fact triumphing over fiction!
I fell in love with history early, because it allowed me to roam widely. Most careers address
some slice of life, while history allows you to go anywhere. Not just any place or time, but bringing any
tools you wish and can manage. If you are fascinated by economics or astronomy, feminism or religion,
literature or cooking, you can probably bring that interest to bear. Our own strengths and weaknesses,
personal interests and blind spots tend to shape our work as much as any “availability of sources.”
Who was your favorite history teacher?
Reaching college, I found that many classmates had endured scores of mediocre teachers and had
encountered few, if any, inspiring ones. I was amazed, since my own experience had been the opposite.
From the start, I was lucky—maybe privileged is a better word—to have a huge range of terrific teachers
at every stage, and certainly half a dozen stand out when it comes to being inspired to study history.
In St. Louis, a legendary fifth-grade teacher named Ruth Ferris made life on the Mississippi River
central to everything we did. I moved to Baltimore after the seventh grade, and the next year Ludlow
Baldwin, a would-be archaeologist, infused me with his enthusiasm for the ancient Mediterranean world.
Another teacher and coach, Nick Schloeder, went out of his way to help me see beyond the narrowness of
our small private school, still thoroughly segregated in the 1950s by class, race and gender.
In my first college semester, a lecture class on American social history with Professor Oscar
Handlin introduced me to a broad interdisciplinary world that I hardly knew existed. After graduation,
time at Oxford’s Merton College kept my string of marvelous mentors intact. When I returned to Harvard
for graduate school, J. H. Parry stirred my interest in oceanic and intercultural history, and Bernard Bailyn
showed me the endless vitality of the early American field. He taught me, and many others, that there’s
no limit to difficult and rewarding primary research, and no substitute for clear and engaging prose.
Which history museums are your favorites? Why?
Ten years ago I had the good luck to visit the Corinium Museum in Cirencester, a market town in
east Gloucestershire that was the site of a Roman town. I love archaeology, so I was fascinated by their
artifacts from ancient Britain. But what intrigued me most was seeing how creatively they had made this
history museum accessible and exciting for young people. Below the wall copy for adults, they had
simple large-print explanations for children, posted at their eye level but not dumbed down.
For me, art museums are also history museums, and I grew up with the idea that good history and
good art belong together. That’s certainly the conviction behind three books that I have written over the
years about the powerful images of African Americans created by the great artist Winslow Homer. One
museum in particular nurtured this conviction, and I had a wonderful guide. Seventy years ago, I would
visit the St. Louis Art Museum regularly with my mother. I still recall the imposing equestrian statue of
King Louis IX of France (Saint Louis) that stands out front. Inside, the changing displays of objects
made art and history come alive in ways that have shaped my life. What a gift.
Early in 1949, an exhibition touring the U.S. to raise funds for the Marshall Plan came to St.
Louis. It was called “Masterpieces from the Berlin Museums.” I had the chance to stand in front of
Rembrandt’s Man with a Golden Helmet. It seemed both awesome and accessible. If you can have life-changing moments at age five, that was one for me. Another came that fall, when the museum staged a
show about the Mississippi River, full of captivating 19th-century images by George Catlin, George Caleb
Bingham, Seth Eastman and others. (I had already ridden one of the river’s last stern wheelers, when our
family took an excursion from St. Louis to Mark Twain’s Hannibal on the steamboat Gordon C. Greene.)
The exhibit’s centerpiece was a century-old panorama, painted in 1850 on a huge strip of muslin
sheeting, seven and a half feet high and nearly 350 feet long. This giant scroll unwound between two
bobbins, conveying a chronological history of the river. I recall seeing “DeSoto’s Nighttime Burial” as
the painted scenes spooled by in a darkened room. Generations earlier, other children had watched this
primitive “moving picture” flow past them at the 1876 Centennial Exhibition in Philadelphia.
Ever since, I’ve loved the history of rivers, and of that river in particular. Indeed, I am sure this
helps to explain why I wrote an essay about the French explorer La Salle for the American Historical
Review in 1984, and why in retirement I have been researching dugout canoes on the ancient Mississippi.
After lots of paddling, my piece on dugouts will appear in the journal Early American Studies this spring.
What was your favorite historic site trip? Why?
Starting with that childhood trip to Hannibal, I have done my share of historical tourism over the
years. I still enjoy seeing historic locations that I have read about, or even written about. But now I am
also trying to give back a bit. Last June I had a chance to accompany half a dozen Boulder public school
teachers on their first trip to the South Carolina Lowcountry. They are eager to teach more African
American history in their school district, so we formed a group called AT LAST: Alliance for Teaching
the Legacies of the Atlantic Slave Trade. Lucky for me, they chose to focus on South Carolina.
Our visit started at Sullivan’s Island overlooking Charleston harbor, where tens of thousands of
captured Africans caught their first glimpse of the “strange new land” where they would be forced to
work on huge plantations. Thanks to expertise brought from Africa, they would grow rice and indigo in
these swampy slave labor camps, contributing generations of unpaid labor. At the basketry pavilion in
nearby Mount Pleasant, we met with Gullah basket makers, still carrying on another tradition tied to
Africa, and at Charleston’s Old Slave Mart Museum we gathered information to take back to Colorado.
|
2d03a8a3806083870285ae1fa85f0456 | https://historynewsnetwork.org/article/168757 | Highlights of the Annual Meeting of the Organization of American Historians: 2018 | Highlights of the Annual Meeting of the Organization of American Historians: 2018
If you come across a social media post or have news that you think we should highlight, please send it to the editor of HNN: editor@hnn.us.
Related Link
● HNN's coverage of past history conventions
Key Links
● Twitter @The_OAH OAH Official Tweets
● Twitter #OAH18 Hashtag for Tweets About the Convention (used by some: #OAH2018)
Day to Day Coverage
● Day 1
● Day 2 – history prizes awarded
● Day 2 – everything else that happened
● Day 3
Blogs & Tweets
● Ed Ayers OAH Presidential Address: "Everyone Our Own Historian"
● Confederate Monuments: What to Do? (Plenary)
● Don’t Blame Us... Again: Historical Perspectives on the Democratic Party and the Rise of Trump
● Why Puerto Rico Matters to Historians of the United States
● Bridging Race, Ideology, and Strategy: Coalitions from the Long 1960s to the Reagan Years
● She Persisted: A New Assistant Professor Tells Her Story of Her First OAH
● Historians in the Twittersphere
● Historians on "Hamilton" (The Play) Here and Here
● Constructions of Citizenship and Belonging in the Repatriation Era
● Taking Control of Capitalism in 20th-Century Chicago
● Doing digital history
|
13b8e72c0b420059972d28586dca0f61 | https://historynewsnetwork.org/article/168762 | Video of the Week: Why we need government regulation | Video of the Week: Why we need government regulation
During the early part of the 20th century, the growing scientific knowledge that certain diseases were caused by vitamin and mineral deficiencies sparked public interest in products that touted these substances. But the public had little understanding of this emerging health care field and, as a result, was often easy prey for unscrupulous marketers who used phony claims that their products had therapeutic value.
One such charlatan was a man named E. Virgil Neal, whose past schemes included palm-reading and hypnotism performed under the name Xenophon LaMotte Sage; a mail-order health and self-improvement program, which earned him a conviction for mail fraud; and a French cosmetics company that marketed false hair regenerators and bust enhancers.
|
ee4ccb01e319c6a359148b43858db031 | https://historynewsnetwork.org/article/168765 | Historian discovers he’s related to the people he’s written a book about | Historian discovers he’s related to the people he’s written a book about
At a time when history has never been so widely and blissfully ignored, and not just by our president, millions of Americans are busy spitting into DNA-collection tubes, scrutinizing old newspapers and tracing their family history back as far as they can via the website Ancestry and other services. Historians like me tend to scoff at these attempts. Who cares if you’ve just found out you’re related to George Washington’s aunt? So what?
But that was before I learned of a relation of my own, a Connecticut woman from the early 19th century named Harriet Gold, and I’ve gotten fairly obsessed with her.
In my defense, she is a figure of genuine historical interest. I included her in a book of history I was writing without even realizing she was, as the genealogists say, “one of ours.” She turns out to be the grandniece of the man who lies under an obelisk at the center of my family graveyard — the “founder” of our clan.
Now you’re the one going — so? Here’s the thing: Suddenly that book was no longer just by me. It was also about me. Two different books. History and genealogy, after all, are two radically divergent takes on the past. The first says, “This matters.” The second says, “This matters to me.”...
As a historian, I couldn’t take the story past the facts. But as Gold’s relative, I felt I could hear her brother’s shrieks and imagine what she must have felt while fleeing Cornwall and entering a strange new land full of rising tensions. The whole lot of it.
For a historian, such a leap of imagination amounts to malpractice. But it delivered a more felt connection to the story than straight historiography had been able to provide. Obviously, history can’t depend on genealogy. But history shouldn’t scorn it, either. History can make use of the genealogical perspective and its transporting empathic power.
But let’s broaden it out — not just to identify with one character selected by family lineage, but with all the characters by virtue of our common heritage. To be Harriet Gold, but to be Elias Boudinot, too. And Ross and the Ridge, besides. Try to see and feel life as each of them did. We’ll never fully succeed, but the effort can help collapse time and make for a history we can all relate to. This is the lesson of America: We are all family here.
|
8ac99441a9467bb26da2e18f9a68602b | https://historynewsnetwork.org/article/168788 | The Founders Worried too, About Foreign Meddling in Our Elections | The Founders Worried too, About Foreign Meddling in Our Elections
In December of 1787, leading American statesmen Thomas Jefferson and John Adams were both stationed in Europe on behalf of their fledgling country. Adams had been named the first U.S. minister to Great Britain in February 1785, and Thomas Jefferson served as minister to France from 1785 to 1789. Comrades during the Revolutionary War, the two men remained good friends during their time in Europe, although when they later returned to the United States their relationship fractured over radically different political outlooks. But when the Adamses left France after a residence of only eight months for another three years in England, John’s wife, future First Lady Abigail Adams, called Jefferson “one of the choicest ones of the earth.”
In 1787, the new United States Constitution was being debated in Philadelphia, and both Jefferson and Adams followed developments closely from afar. In an oft-quoted letter written by Adams to Jefferson on December 6, 1787, Adams referred to the “Project of the new Constitution,” and the various objections both men had to the evolving document. Adams famously declared “You are afraid of the one – I, of the few.” Jefferson detested the institution of monarchy and was concerned that the installation of a powerful executive would overturn the principles of the American Revolution and create a quasi-monarchy. Adams, on the other hand, feared the creation of an elite aristocracy in the form of senators. Because of his concern about such a possible oligarchy, Adams therefore maintained “I would have given more power to the President and less to the Senate,” and he advocated for a strong executive.
What is more surprising, and for the most part overlooked, about Adams’s letter is his discussion of the potential danger of foreign meddling in American elections, a subject that is especially timely today. “You are apprehensive of foreign Interference, Intrigue, and Influence,” Adams wrote. “So am I, - But, as often as Elections happen, the danger of foreign Influence recurs.” To counteract that danger, Adams maintained that the less frequently elections occurred, “the danger of foreign influence will be less.” Of course, Adams’s view did not prevail and regular elections and the peaceful transfer of power are still regarded as hallmarks of American democracy.
Once the two men returned in the early national period, radical divisions developed in the United States between the two nascent political parties, the Federalists headed by George Washington and John Adams, and the Republicans led by Thomas Jefferson and James Madison. Old alliances were fractured, and the subject of the French Revolution became a pivotal flashpoint. The French Revolution had left a memory of radicalism and bloody terror in the minds of many Americans, heightening disagreements between political factions. Jefferson viewed the event in a largely benign manner as a step toward ousting monarchy and bringing progressive republican ideals of freedom and liberty to the oppressed common people. John and Abigail Adams, on the other hand, were appalled at the violence and chaos that the French Revolution had unleashed, which they viewed as the excesses of unfettered democracy. Later as First Lady, in 1798 Abigail confided to her sister about her fear of French subversion and the specter of French revolutionary “Jacobins,” supported by Jefferson, infiltrating American politics and government. The early camaraderie of the Washingtons and Jefferson also dissolved over political disagreements. A few years after George Washington’s death, the first First Lady, Martha Washington, had come to regard Jefferson’s election to the presidency in 1800 as a tragedy. She told a visitor to Mt. Vernon that “the election of Mr. Jefferson, whom she considered one of the most detestable of mankind, as the greatest misfortune our country has ever experienced.” Certainly not a very tactful way to publicly describe the leader of the country.
The deep schism between the Federalists and Republicans and fear of foreign meddling in elections in the early United States go back to the dawn of the American republic and make today’s political divisions pale by comparison. Indeed, politics were every bit as divisive and fractious then as they are now. America’s first presidents and even their First Ladies all vocally entered the fray. We should take comfort in the fact that despite that rancorous rhetoric and division, the country survived and thrived. America’s founders exhibited all too human flaws, but they were united by principled convictions and a sincere underlying devotion to the public good. We can only hope a contemporary cadre of American leaders will rise to the occasion and demonstrate a similar measure of dedication.
|
63af1cb2af46d928082a21758d82c6a9 | https://historynewsnetwork.org/article/168791 | Why No One Should Be Shocked at Paul Ryan’s Retirement | Why No One Should Be Shocked at Paul Ryan’s Retirement
Related Link The End of the Strong Speaker By Julian Zelizer
House Speaker Paul Ryan’s (R-Wis.) public announcement that he will not seek reelection in 2018 is important, but not entirely surprising. His campaign war chest is substantial, as always, and he labors at his House duties.
The stated reason is that he wants to spend more time with his family. Observers rightly regard Ryan as a committed family man. He and wife Janna have three teenage children, a particularly challenging period in life. Yet the relentless pressures of the post of House Speaker were also clearly a factor in his decision, as he faced at least the possibility of election defeat in November.
Beyond personal considerations, structural changes in Congress make life tough for any House Speaker. Since the turmoil of the 1968 election, which included the assassination of Democratic presidential contender Sen. Robert Kennedy (D-N.Y.), both parties have embraced state primary elections to nominate their candidates.
The wider context that year included the assassination of Civil Rights leader Dr. Martin Luther King, and the surprise decision by President Lyndon B. Johnson not to seek renomination and reelection. The Tet Offensive in Vietnam destroyed strong public support for the war, and proved perfectly timed to undercut Johnson’s vote in the important early New Hampshire primary.
In theory, the reform was supposed to make the whole process fairer and more transparent. In 1968, Kennedy and rival Sen. Eugene McCarthy (D-Minn.) slugged out a bitter battle in the relatively few primaries, while Vice President Hubert Humphrey sewed up the nomination through the route of party caucuses and party bosses.
In practice, relatively few voters participate in primaries. Those who do are often intense, dedicated left-wing Democrats and right-wing Republicans. Reconciling the rigid zealots now populating Congress steadily gets harder. By nature, they are averse to compromise. Ryan’s predecessor Speaker John Boehner (R-Ohio) stunned everyone, including friends, by announcing in September 2015 he was retiring from Congress. His tour of service in the top leadership post had been painfully difficult. Republican right-wing House members expressed glee that Boehner would soon be gone. Their outlook is essentially narrow, shortsighted, and ultimately destructive given the view that compromise is a form of sin. In 2013, Republicans managed to shut down the government for 16 days as part of the effort to derail the Affordable Care Act. Democrats led by President Barack Obama used the Republican effort to their own political advantage. Boehner’s move headed off another shutdown.
The practice of holding the federal budget hostage to controversial partisan maneuvers has now gone on for some years. In 1994, Republicans took control of the U.S. House of Representatives after 40 years in minority status. New House Speaker Newt Gingrich (R-Ga.) dramatically accelerated the trend of shifting that office from a relatively nonpartisan to a highly partisan pulpit, a marked departure.
Then and later, White House Democrats and Congressional Republicans played an escalating game of budgetary chicken. The federal government did shut down briefly. In the political and public media maneuvering, President Bill Clinton—a brilliant political operator—was able to put the onus squarely on the Gingrich Republicans.
Publicly cool and politically cunning, Clinton moved ahead in the public opinion polls. He was helped by emphasizing fiscal restraint. In the 1996 presidential election, he defeated Republican nominee Senator Bob Dole of Kansas.
Sam Rayburn (D-TX) remains the longest-serving Speaker of the House. From the 1940s into the 1960s, he successfully practiced bipartisanship, despite the difficult politics of that era. Rayburn possessed exceptional political skills, but he had the advantage that both parties then were politically diverse and pragmatic. Additionally, we expected Presidents to be executives, not pure celebrities.
|
cb62e73d3b6a51c27e60a401a5af39dc | https://historynewsnetwork.org/article/168821 | Lost in Battle, Found by Amateur Sleuths: An ‘Unknown’ Marine | Lost in Battle, Found by Amateur Sleuths: An ‘Unknown’ Marine
A mystery that went unsolved for 73 years began when Herman Mulligan threw a grenade.
In the thick of some of the most vicious fighting of World War II, on the island of Okinawa, Private First Class Mulligan’s grenade clattered into the dark maw of a Japanese bunker and blew up a cache of ammunition. The huge explosion obliterated most of the hillside, and blasted the 21-year-old Marine beyond recognition.
Amid the chaos, his unidentified body was buried in a hasty battlefield grave, while the Marine Corps listed Private Mulligan as missing in action. In the years after the war, he was reclassified as “unrecoverable,” and the family that knew him gradually died off, until his memory was almost as lost as his bones.
The private’s story could have ended there, among the roughly 72,000 American troops from World War II who have not been accounted for. But the ending has been rewritten by a black-and-white snapshot found in a Marine veteran’s trunk.
|
80021b70868aa5bccd689e85baf10b4e | https://historynewsnetwork.org/article/168847 | Review of Tony Fels’s “Switching Sides: How a Generation of Historians Lost Sympathy for the Victims of the Salem Witch Hunt” | Review of Tony Fels’s “Switching Sides: How a Generation of Historians Lost Sympathy for the Victims of the Salem Witch Hunt”
The Salem witchcraft trials of the 1690s continue to resonate in American political and popular culture, as is evident in the almost daily tweets of President Donald Trump that investigations into collusion between the Russian government and the 2016 Trump Presidential campaign are a hoax and part of a contemporary witch hunt. Trump seeks to discredit investigators by identifying himself with the innocent victims of the Salem witchcraft accusations and executions who are celebrated in such literary works as Arthur Miller’s play The Crucible (1953). Yet, Tony Fels, an associate professor of history at the University of San Francisco, suggests in Switching Sides that leading historians trained during the tumultuous 1960s have ignored the courageous victims of Salem in favor of concentrating upon the accusers. Rather than another history of the witchcraft trials in Salem, Switching Sides is a historiographical study of how prominent scholars have treated this fascinating episode of American history. Thus, the book may be of less interest to general readers, but it seems sure to create a storm within the historical profession.
Fels challenges the scholarly findings of four books that are considered to be among the most prestigious academic studies available on the events in Salem: Paul Boyer and Stephen Nissenbaum’s Salem Possessed: The Social Origins of Witchcraft (1974), John Putnam Demos’s Entertaining Satan: Witchcraft and the Culture of Early New England (1982), Carol F. Karlsen’s The Devil in the Shape of a Woman: Witchcraft in Colonial New England (1987), and Mary Beth Norton’s In the Devil’s Snare: The Salem Witchcraft Crisis of 1692 (2002). This is not to suggest that Fels fails to discuss other scholarly investigations of Salem, as Switching Sides demonstrates considerable familiarity with both primary and secondary sources on the topic and includes extensive explanatory endnotes occupying approximately half of the book. Nevertheless, Fels focuses upon the four academic texts mentioned above as he considers them to be the most influential post-1960s studies of Salem, and they provide the inspiration for his allegation that academic historians, influenced by the political events of the 1960s, abandoned the victims of the witch hunts in favor of the accusers.
While criticizing such esteemed scholars as Boyer, Nissenbaum, Demos, Karlsen, and Norton, Fels lauds Marion Starkey’s The Devil in Massachusetts: A Modern Enquiry into the Salem Witch Trials (1949) for endorsing universal principles of the post-World War II liberal consensus that attempted to understand Salem within the historical context of how mass hysteria and fear may drive a community to abandon rational judgment and engage in the scapegoating of innocent victims. Fels argues that postwar liberalism was influenced by the examples of the Soviet and Nazi terrors of the 1930s, but he also acknowledges that the extreme anti-communism of the 1950s found in McCarthyism contained similar dangerous populist elements. He laments that liberalism was abandoned in the 1960s for radicalism by a younger generation of scholars influenced by the Vietnam War, Civil Rights Movement, feminism, counterculture, and social unrest of the era. The new generation of scholars, Fels argues, emphasized conflict over consensus and universal humanistic values.
Thus, Fels insists that Boyer and Nissenbaum overstated the economic conflict in the Salem community and identified with the accusers, who represented a less affluent segment of the population that resented the growing capitalist economy benefitting more prosperous elements of the community. Fels is somewhat less critical of Demos for his emphasis upon psychological and anthropological explanations, but he concludes that in a more widespread investigation of witchcraft accusations throughout New England, Demos distracted attention from the violence perpetrated upon innocent victims in Salem. Karlsen is credited by Fels with focusing upon women who were overrepresented among the accused, but he chastises Karlsen for grounding her criticism of the Puritan patriarchy in a general reading of Christianity rather than concentrating upon Puritanism. In fact, one of Fels’s central complaints is that the post-1960s scholars neglected to pay sufficient attention to the extreme religious views of the Puritans as an explanation for social unrest and the scapegoating of the witchcraft trials.
Fels devotes most of his argument—two full chapters—to the scholarship of Mary Beth Norton, who currently serves as president of the American Historical Association. Fels concedes that Norton may be on target with her thesis that the violence of the warfare with the Wabanaki people on the New England frontier made a major contribution to the social unrest and fear perpetuating the hysteria of the witchcraft allegations. However, he parts company with Norton over the role of the colonial elite in the witchcraft outbreaks. Fels accuses Norton of taking sides with the accusers as a populist manifestation of class conflict against the governing and commercial elements who exploited and failed to adequately protect the common people of the frontier.
While acknowledging their work in the archives and primary sources, Fels nevertheless accuses Norton and her scholarly colleagues of being biased and projecting their political prejudices into their scholarship. Labeling these scholars as members of the New Left, Fels writes: “These works mostly ignore the victims of the witch hunt, occasionally even expressing hostility toward them, and either write sympathetically about the accusers or else shift the reader’s attention away from the 1692 panic itself. In these accounts the role of courageous individuals standing up to mass prejudice moves out of view. Instead, social conflict among groups takes center stage, even to the point of seeming to justify the witch hunt” (2). This almost conspiratorial perspective on Salem scholarship seems to suggest that the work of an entire generation of historians is suspect. Reflecting upon what he perceives as the New Left period in American historiography, Fels concludes: “One hopes this era will be superseded by a period in which new works will rise to dominance, influenced again by the liberal universalist values that fell out of favor during the 1960s-inspired radicalism” (134).
Fels condemns a generation of scholars for introducing concepts of race, gender, and class while concentrating upon marginalized groups and attempting to tell history from the bottom up. He fails to consider how this broader perspective has increased our appreciation for the complexity of the past as well as contemporary events. Historically marginalized groups such as the LGBTQ community have been able to reclaim their history and place within the American story. And this search for a more inclusive past with multiple perspectives does not mean that historians have switched sides. The tragic execution of innocent victims in Salem is an important narrative that has been incorporated into American culture, but historians Boyer, Nissenbaum, Demos, Karlsen, and Norton have broadened our understanding of the witchcraft trials by asking different and more complex questions. Seeking to comprehend the motivation of those who made the witchcraft accusations does not mean that scholars have abandoned the victims. Rather, these historians have succeeded in producing a multi-faceted portrait of events that enhances our appreciation for the complexity of historical causation. In fact, Karlsen certainly expresses admiration for the “uppity” women of Puritan society who defied gender norms and became the targets of witchcraft accusations for their nonconformity. Expanding one’s perspective and understanding by taking a walk in someone else’s shoes does not necessarily constitute switching sides.
The detailed critique of Salem scholarship provided by Fels certainly raises some interesting questions regarding the evaluation of historical evidence on the witchcraft trials. His argument that scholars should pay more attention to the religious ideas of Puritanism also appears to have merit. Debates regarding sources and interpretation enhance our appreciation for the complexity of the past. However, dismissing an entire generation of historians for their alleged radicalism seems a simplistic exercise.
In his evaluation of evidence regarding the witchcraft trials, Fels is a stickler for detail; however, when it comes to evaluating the motivation of historians from the 1960s and 1970s, he is quick to paint with a broad brush, assuming, while offering little in the way of actual evidence, that scholars such as Boyer, Nissenbaum, Demos, Karlsen, and Norton were politically biased in their research and writing. In the eyes of Fels, the consideration of race, gender, and class connotes radical politics while liberalism is equated with universal humanitarian values above politics. It is a little like those who criticize kneeling football players for injecting politics into sports, while giant flags, marching soldiers, and military jets flying above the stadium are all apolitical. As the tweets of Donald Trump suggest, the politics of witch hunting is indeed complicated and conflicted.
|
9b5969f0454da6fb37d379d20aff6c22 | https://historynewsnetwork.org/article/168871 | Are Historians Still Ambivalent About Getting Published Online? | Are Historians Still Ambivalent About Getting Published Online?
Cambridge University Digital History Seminar graphic
As earlier reports on historians’ use of technology demonstrated, most historians are gathering materials, analyzing their findings, and writing their scholarship in digital form. Curiously, however, a national survey in fall 2015 found that much of the profession remains skeptical about the value of disseminating their scholarship electronically (aside from digital versions of their print publications).
As of 2015, 26% of historians had reportedly published their work online (which was up substantially from the 20% among respondents in a similar 2010 survey), but the share with publications was less than half the share (58%) of historians who reported they had considered publishing their work online. (The latter was essentially the same share as in 2010.)
Among those who had not published something online, this ambivalence appeared to arise from two principal sources—personal doubts about the value of this form of work, and a larger sense that there is little professional appreciation or credit for this form of work. Attitudes on these questions were largely unchanged from 2010 to 2015; this despite efforts by the American Historical Association to promote online publication extending back to the 1990s, and culminating in the AHA’s “Guidelines for the Professional Evaluation of Digital Scholarship by Historians,” which was published shortly before the survey responses were collected.
Since the 1990s proponents of online publication in the discipline (including the present author) have assumed that generational change would bring about a shift in attitude on these questions, as “digital natives” entered the profession. The 2015 survey offers little support for that assumption. Younger historians (those below the age of 46) were no more likely than their older counterparts to have considered publishing their work online, and they were less likely than their older colleagues to have published their work electronically.
The survey did offer some evidence that receiving tenure freed faculty to both consider and publish their work online. Mid-career historians (those ages 46 to 55) were the most likely to have considered publishing their work online (65%), and also had the highest reported share of those who had published their work online (29%).
Ongoing Ambivalence about Digital Publication
Among the three-quarters of historians who had never published their work online, almost seven in 10 cited the lack of scholarly recognition and prestige for that form of publication—26 percentage points higher than the share who expressed a preference for print (fig. 1). (Respondents could select more than one reason for their preference.) Around 20 percent of respondents cited doubts that the work would improve the article or book, or noted that it seemed difficult enough to write for print publication.
There was a notable difference—not evident in the 2010 survey—in the reasons for not publishing online between those who had given serious consideration to online publication and those who had not. Among those who said they had never considered online publication, almost 80% cited the lack of scholarly prestige, and 58% noted a preference for print. Among those who had given online publication serious consideration, the lack of scholarly prestige was less of a factor (though still cited by 58%), while only 25% cited a preference for print. Among the latter group, comparatively large shares cited a lack of support from their institutions (13%), and almost 20% cited other factors, most often the opportunity or outlet for this type of work. As one respondent noted, “I have not sought out the opportunity and it has not presented itself to me.”
Finding an Audience
Among the quarter of historians reporting that they had published some of their scholarship online, the opportunity to reach a wider audience in the public and among their peers was the most important consideration (fig. 2).
Two-thirds of those who had published their work electronically cited the opportunity to reach a wider public audience as their reason for publishing online. Slightly smaller shares cited the opportunity to reach a wider audience of historians (60%) and the opportunity to publish their work more quickly (54%).
The appeal of reaching a wider audience increased modestly from 2010, when 58% of respondents cited it as a reason for publishing online (the same share, in that survey, as those citing a faster time to publication). The opportunity to reach an audience of other historians had the largest increase from 2010, jumping 12 percentage points (from 48% of respondents). The share of historians citing the opportunities provided by the medium—to link to other materials, or tell the narrative in a new or enhanced way—was essentially unchanged from 2010 to 2015.
A third of the respondents with a digital publication reported they had published their work in more than one venue as of 2015—up substantially from 2010. A near majority (48%) of respondents pointed to a publisher’s website as the venue for publication of their work electronically, but there was substantial diversity in the publication outlets. A third indicated some of their work was published on another college or university’s website, 20% said they had published their work on a blog, and 15% had published their scholarship in their own institutional repository.
The Tenure Difference
To explore the concerns about perceptions in the field, a second set of questions in the survey asked about the perceived value of particular activities or work for tenure. These findings seem to validate the concerns of those who had not published their work online. As in past surveys, this study found historians feel print publication is the key to tenure in most academic settings. More than 90% of the historians indicated that an article or monograph in print was valuable for tenure, while less than a quarter said the same about publishing either type of publication in electronic form.
Viewed by the type of department in which the faculty member was employed, substantial differences appeared in the perceived value of particular activities for tenure. Among faculty at both PhD- and non-PhD-granting programs, most historians characterized print publications as valuable for tenure. Barely one-in-five historians felt digital monographs or digital history projects would be valued for tenure, though a slightly larger share of historians employed at programs that do not grant a history PhD thought they would have some value for tenure decisions. There was a 5 percentage point gap on the question about scholarly articles, and a 13 percentage point gap on the question about print monographs.
Alongside the questions about publications, teaching was cited as very or highly valued at their institution by almost three out of four of the respondents. While 61% of the respondents at PhD-granting programs felt this way about teaching, 85% of the respondents at non-PhD-granting programs felt the same.
The differences on the teaching question were starker when the responses were broken out by the status of the institution. Teaching was viewed highly valued for tenure by 45% of the respondents at elite research universities, as compared to 100% of the respondents at elite liberal arts colleges.
The 2010 survey did not ask similar questions about tenure, though a survey of departments by the Humanities Indicators suggests this concern may be overstated. Nevertheless, the overall findings—particularly the personal preference for print by a majority in the discipline—point to an ongoing challenge for those advocating for forms of scholarship that take advantage of new media.
|
28cb9832c34f3ed2f8117fc194157c18 | https://historynewsnetwork.org/article/168882 | Raw Fish and Tapeworms: Ancient Latrines Reveal the Diets of Our Ancestors | Raw Fish and Tapeworms: Ancient Latrines Reveal the Diets of Our Ancestors
It's Denmark, 1020 AD and you’re putting your feet up after a long day of pillaging with your Viking friends. What’s on the table for dinner, you ask? Beer, buckwheat and undercooked fish—all sprinkled with a heady seasoning of parasites.
Scientists have performed DNA analysis on ancient stool samples from Northern Europe and the Middle East to get a glimpse of what our ancestors were eating. Research about the diets of people from Denmark, the Netherlands, Lithuania, Jordan and Bahrain was published Wednesday in the journal PLOS ONE.
Researchers examined archaeological stool samples from medieval Europe and later, as well as much earlier samples from the Middle East. The earliest sample was produced in Bahrain sometime from 500-400 BC. Although it’s uncertain exactly which sample is—shall we say—freshest, one stool from the Netherlands could date as late as 1850 AD.
|
503deea14105c4d730977ead0f39d126 | https://historynewsnetwork.org/article/168890 | Unraveling the Genetic History of a First Nations People | Unraveling the Genetic History of a First Nations People
... From an evolutionary standpoint, it was not long ago that the Tsimshian people of modern-day Alaska and British Columbia were first confronted with European settlers—roughly 175 years, a mere handful of generations out of the Tsimshian’s 6,000-year American history. But that fateful encounter, which introduced smallpox and other alien ailments into their population, decimated the Tsimshian and threatened to compromise their genetic diversity in the years ahead.
This landmark moment in Native American history captured the imagination of John Lindo, a genetic anthropologist at Emory University who delved deep into Tsimshian DNA as lead author on a just-published paper in the American Journal of Human Genetics. Lindo focused his research on the Tsimshian in an effort to understand the genetic dynamics surrounding their population collapse, which could shed light on the experience of many other Native American groups upon first contact with Europeans.
Employing cutting-edge genomic analysis, Lindo and his team compared modern Tsimshian DNA (obtained with consent from Tsimshian residents of Prince Rupert Harbour, Canada) against DNA found in millennia-old ancestral specimens (exhumed under community supervision and housed in the Canadian Museum of History), correcting for the degradation of the ancient DNA over time.
What the researchers learned about the Tsimshian—on both sides of the fateful 19th-century population collapse—adds considerable nuance to the genetic and social history of a prominent First Nations people.
What most surprised researchers was that the population of the ancient Tsimshian people was in decline long before the arrival of Europeans. Slowly and steadily, since their first settlement in modern Canada, the Tsimshian had been decreasing in number, not expanding as one might presume. ...
|
114f2350280a296f138a526970122839 | https://historynewsnetwork.org/article/168935 | Maya Little, PhD student at UNC, arrested for defacing Confederate monument with red ink and blood | Maya Little, PhD student at UNC, arrested for defacing Confederate monument with red ink and blood
Maya Little a UNC history PhD student put her blood and red ink on silent Sam 5-10 minutes ago @Move_Silent_Sam @ABC11_WTVD @MicahAHughes @WNCN @WRAL pic.twitter.com/eLiC3zjXrg— Samee Siddiqui (@ssiddiqui83) April 30, 2018
The scene on Monday was striking. On a picturesque day at the University of North Carolina at Chapel Hill’s lush McCorkle Place, the Confederate monument known as Silent Sam was doused in blood and red ink.
Videos posted online show Maya Little, a second-year doctoral student in history who is part of a core group of activists calling for the bronze statue’s removal, circling it and coating it in the liquids. A campus police officer detained Little, whose black shirt and white sneakers were stained with the mixture, including her own blood. Meanwhile, protesters chanted, "No cops! No Klan! Get rid of Silent Sam!"
Little’s arrest was the latest chapter in the saga of the monument, which has become a public-relations nightmare for a university that has struggled to reckon with its racial history. While students have protested the monument sporadically for decades, the push to remove Silent Sam took on more urgency following a deadly white-supremacist rally last August in Charlottesville, Va. The activists have kept the pressure on administrators, who argue that their hands are tied because of a 2015 state law that protects "objects of remembrance."
Maya Little not only resists, she’s a good historian. As she covered the Silent Sam statue with blood (hers) & red paint, she was accompanied by the reading of a primary source: Julian Carr’s speech from the unveiling. #WhiteSupremacy— Karen L. Cox ✍
|
190e609a6e9327bd540450e2ccf7e311 | https://historynewsnetwork.org/article/168959 | With Emmett Till Reference, Camille Cosby Invokes Oft-Used Cultural Touchstone | With Emmett Till Reference, Camille Cosby Invokes Oft-Used Cultural Touchstone
When Camille Cosby, the wife of Bill Cosby, issued a scathing statement on Thursday denouncing the verdict that found her husband guilty of sexual assault, she painted him as a casualty of an unfair, racist system. She blamed the media, the prosecution and the victim, and then made a comparison to one of the nation’s most infamous crimes.
“Since when are all accusers truthful?” Mrs. Cosby said. “History disproves that. For example, Emmett Till’s accuser immediately comes to mind.”
It was the second time in a week that a member of Bill Cosby’s inner circle linked him to Till, and while the comment drew immediate backlash, it was merely the most recent invocation of Till’s name as analogue. Beyond its impact on the civil rights movement, Till’s memory has inspired protesters, historians and documentarians to the present day; endured in popular culture as a synonym for injustice; and occasionally rekindled anger when used in unexpected ways.
|
c4d0a485e9696b19d223b0033cd7f037 | https://historynewsnetwork.org/article/169054 | Who Really Wrote Shakespeare’s Plays? | Who Really Wrote Shakespeare’s Plays?
Sir John Gilbert's 1849 painting: The Plays of Shakespeare, containing scenes and characters from several of William Shakespeare's plays.
Who wrote Romeo and Juliet? It was not William Shakespeare. Macbeth? Not Shakespeare. Hamlet? Guess again. A group of conspiracy theorists starting back in the 1840s have argued vociferously that any one of a half dozen 17th century writers could have written all of Shakespeare’s plays. Some even suggest that a few of them got together as a committee and wrote his plays.
The champion real Shakespearean playwright, the King of the Elizabethan hill, was Christopher Marlowe. The arguments put forth by the debunkers of the Bard of Avon pushing for Marlowe spew forth in a delightful new play, Marlowe’s Fate, by Peter B. Hodges, that just opened at the Studio Theater on Theatre Row, West 42d Street, New York.
Most historians believe that Marlowe, a brilliant scholar, fabled poet and famed playwright, worked as a spy for the Queen and got involved in a political skirmish for which he was going to be put on trial. Wouldn’t it be nice, the Queen’s strategists reasoned, if young Marlowe somehow vanished? Then, all of a sudden, he was stabbed to death in a bar fight and buried. That was in 1593, right at the start of William Shakespeare’s career. Historians say Shakespeare then went on to write 37 plays, all of them memorable and some, such as Henry V, The Tempest, A Midsummer Night’s Dream, Julius Caesar and Two Gentlemen of Verona, legendary.
No, say the naysayers, and playwright Hodges, that’s not what really happened. The truth is that Marlowe was not killed. His invented “stabbing” gave counselors loyal to the Queen a chance to bury a man said to be Marlowe, faking his death (oh, how many film noir movies used that one?) and then spirit Marlowe out of the country to protect the spy ring. Marlowe had to keep writing, though, so, in hiding, he then wrote his plays abroad – no one knows where – and put the name of William Shakespeare, an oafish Stratford-on-Avon farmer – on them to protect his identity. His plays, er, Shakespeare’s plays, became world famous.
How do we know the Bard did not write his plays? Hodges hugs the time-honored views of the Marlowe conspiracy ringleaders: 1) Shakespeare was uneducated, 2) he was an illiterate, 3) he had no books or any other documents in his home, 4) he knew nothing about the Elizabethan court or politics, so how could he have written those titanic works of political intrigue? And how could anyone, pushing his quill as fast as possible, finish 37 plays?
There were others who supposedly wrote Shakespeare's plays, including Sir Francis Bacon, Sir Walter Raleigh, Sir Edward de Vere, the Earl of Oxford, and William Stanley, the Earl of Derby. There are so many “writers” and conspiracies that an entire society was established, the Shakespeare Oxford Fellowship, to champion their authenticity. Marlowe was their shining knight.
In his play, Hodges says that the friends of Marlowe who wanted to see his work published picked Shakespeare as the “front” because he could be bought off to go along with it and keep his mouth shut. So, over the years he was paid money and given the scripts, which he put his name on and turned in to theaters. The friends of Marlowe convinced the highly respected writer Ben Jonson to vouch for his close friend, Shakespeare.
These arguments are the lynchpins of Hodges’s play, Marlowe’s Fate, which he also directed with great skill. His story begins with the plan to fake Marlowe’s death and get him out of the country. Then Hodges tells a neatly concocted tale of escape, life in hiding and the biggest hoax in world history. He gets wonderful performances from a truly lovable cast. They almost convince you that Marlowe actually did it. The small ensemble of actors includes Brady Adair, Tim Dowd, Thomas Grube, Sarah Kiefer and Len Rella. They are dramatic at times and howlingly funny at others. Even though they work on a small stage and with a sparse set, they put you in the middle of London at the end of the sixteenth century.
The highlight of the play is the mirthful Punch and Judy puppet show in which different Shakespeare wannabe puppets take on the Bard in boxing matches (there is even a “round girl” puppet that draws laughter from the audience). It is a part of delicious staging that works well.
If there was a conspiracy to dupe the public, Hodges’s tale, based on many other stories, works well. Fake death. Hideout far away from the big city. Believable stand-in playwright. Gullible public.
It’s hard to believe, though. Shakespeare was a country boy, uneducated dimwit? Abe Lincoln was a country bumpkin who only went to school for nine months as a kid. Nobody could have written 37 plays? Tennessee Williams wrote 43. No one who did not know the inner workings of the courts and palace could have written all of those political potboiler plays? Mario Puzo, who knew absolutely nothing about the Mafia, wrote The Godfather. These things happen.
One criticism of the theory has never been answered. If Marlowe wrote all those plays and became so famous, why did he never step forward out of hiding to accept the applause? How come no one ever saw him in hiding?
Besides, Marlowe was not even around London in 1600. He was in Dallas, Texas in 1963 and was, everybody knows, the second gunman in the Kennedy assassination, comfortably perched on the grassy knoll with his rifle. You remember that photo – Amelia Earhart was standing next to him.
PRODUCTION: The play is produced by the Caravan Theatre Company. Set Design: Valeria Haedo, Lights: Ben Young III, Costumes: Elizabeth Bove, Fight Director: Ben Young III. The play is directed by Peter B. Hodges. It runs through May 26.
|
e0c8cebed2d15f0045a96b275fba9507 | https://historynewsnetwork.org/article/169115 | Historians Debate Which President Leonardo DiCaprio Should Play | Historians Debate Which President Leonardo DiCaprio Should Play
"Of all the American lives that need telling on screen in 2018, Grant and TR wouldn’t make my top 500." -- Jill Lepore https://t.co/a9gr4OAYvK— Kevin M. Levin (@KevinLevin) May 22, 2018
The Oscar winner has Teddy Roosevelt and Ulysses S. Grant biopics lined up, and scholars are using everything from 'Hamilton' to toxic masculinity to make their pitches to the actor.
Leonardo DiCaprio really wants to play a president. The question is which president? Ulysses S. Grant or Teddy Roosevelt?
The 2016 best actor Oscar winner for The Revenant (and overall six-time nominee) is attached to play Roosevelt, the 26th U.S. president, in a film to be directed by Martin Scorsese, with whom he’s already done five movies (The Wolf of Wall Street, The Departed, Shutter Island, Gangs of New York and The Aviator). DiCaprio also is in talks to star in a biopic about Grant, the Civil War hero and the 18th president, to be directed by Steven Spielberg, with whom the actor collaborated on Catch Me If You Can.
|
5635f64d081fe97385ea302fed0e874d | https://historynewsnetwork.org/article/169156 | Unlike Obama, Trump Has No Moral Compass | Unlike Obama, Trump Has No Moral Compass
Suggesting that President Trump lacks a “moral compass” is not a new criticism. But this charge requires further exploration. The fault is twofold. The first is a personal failing, the second a societal one. We shall examine both of these dimensions in some detail, but first several paragraphs about the moral compass of President Obama and the values of some of our outstanding previous presidents.
Such a compass is based on proper values. Before being elected to his first presidential term, Obama wrote in The Audacity of Hope, “I think that Democrats are wrong to run away from a debate about values,” and that the question of values should be at “the heart of our politics, the cornerstone of any meaningful debate about budgets and projects, regulations and policies.” He also wrote that empathy “is at the heart of my moral code, and it is how I understand the Golden Rule—not simply as a call to sympathy or charity, but as something more demanding, a call to stand in somebody else's shoes and see through their eyes.”
Besides empathy, Obama often mentioned the importance of such values as “honesty, fairness, humility, kindness, courtesy, and compassion,” as well as wisdom, which implies the ability to prioritize such values in order to best work for the common good. In Small Is Beautiful, E. F. Schumacher compared wisdom to the sun: “All subjects, no matter how specialised, are connected with a centre; they are like rays emanating from a sun.” Scholar Copthorne Macdonald noted in “The Centrality of Wisdom,” “values are at the heart of the matter.” He also wrote, “Wise values express themselves in wise attitudes and wise ways of being and functioning.” Among the wise values he mentions are humility, humor, creativity, love, compassion, empathy, courage, passion, patience, positivity, openness, self-awareness, self-discipline, tolerance, and truth.
After being elected president, Obama often stressed similar values, for example, the Golden Rule of doing “unto others as we would have them do unto us” (Cairo, 2009), tolerance (University of Michigan Commencement Speech, 2010), self-discipline, and empathy.
In stressing values, Obama followed the example of some of our outstanding presidents. George Washington in his Farewell Address of 1796 expressed the belief that it was “substantially true that virtue or morality is a necessary spring of popular government” and hoped that future government actions “be stamped with wisdom and virtue.” As Ron Chernow emphasizes in his Washington: A Life (2010), “Washington in victory [after the battle of Yorktown, 1781] was the picture of humility. In reporting to Congress, he deflected attention from himself.” And later, “It speaks to Washington’s humility that the greatest man of his age was laid to rest in a communal tomb where nobody could single out his grave or honor him separately.” Thomas Jefferson knew that “the wise know their weakness too well to assume infallibility; and he who knows most, knows best how little he knows.” In Lincoln: the Biography of a Writer (2010), Fred Kaplan emphasizes Lincoln’s great appreciation of “wisdom literature” and writers like Shakespeare and the poet Robert Burns. In addition to Lincoln’s compassion, two of the most prominent wisdom qualities were his humility and self-deprecating humor.
In his book on Leadership, presidential scholar and FDR biographer James MacGregor Burns notes that “hierarchies of values . . . undergird the dynamics of leadership.” He also quotes the abolitionist Senator Charles Sumner of Massachusetts who stated that “true politics” was simply “morals applied to public affairs.” In discussing FDR, Burns writes, “It was because Roosevelt’s fundamental values were deeply humane and democratic that he was able, despite his earlier compromises and evasions, to act [against Hitler] when action was imperative.” Such leadership reflects “considerations of purpose or value that may lie beyond calculations of personal advancement.”
Other scholars have noted other values that motivated FDR. Historian Douglas Brinkley, for example, has written of his strong love of nature, and Robert Dallek’s recent biography indicates that in addition to his ample political skills, FDR adhered to a variety of progressive values.
All of this stress on values is not meant to suggest that ethical considerations are always in the forefront of the mentioned presidents’ thinking, but as Burns points out, “Divorced from ethics, leadership is reduced to management and politics to mere technique.” And “the capacity of presidents to transcend their everyday role as bargainers and coalition builders and to confront the overriding moral and social issues facing the country gives rise not only to questions of principle, purpose, and ethics but to considerations of sheer presidential effectiveness.”
But enough about previous presidents. How about Trump’s moral deficiencies, his seeming lack of concern with moral values? The useful biography Trump Revealed: The Definitive Biography of the 45th President, by Washington Post journalists Michael Kranish and Marc Fisher, tells us much about Trump’s moral failings. It mentions his cheating on a wife, admitting to trying to “seduce a married woman,” and bragging about how he could grab women by the crotch—“when you’re a star, they let you do it. You can do anything.” His biographers also point out his lack of any sense of business ethics and comment on his addiction “to publicity and recognition,” his “focus on getting his name onto products, buildings, and news stories.” The authors also indicate that neither in college nor later in his life did Trump evidence any interest in literature, history, philosophy, the arts, or culture, subjects that might have awakened some concern about values.
Trump’s appreciation of religion was mainly of the self-help type as preached by Protestant minister Norman Vincent Peale. Such preaching, in the words of historian Richard Hofstadter, reflected a “confusion of religion and self-advancement.” Trump considered Peale an important mentor, who taught him “to win by thinking only of the best outcomes.” (See here for more on Trump and religion.)
Since becoming president, Trump’s moral impoverishment has become more evident than ever, as demonstrated by a recent New York Times list of his “more egregious transgressions” or Politifact’s reproduction of his “false statements.”
Whereas scholars such as Schumacher and Macdonald (see above) thought that wisdom should coordinate and direct our other values and actions, and Obama stated that values should be at “the heart of our politics,” what directs Trump’s actions is his narcissism. A narcissism that leaves no room for wisdom or other moral values such as humility or empathy. As conservative columnist David Brooks has written of Trump: “He has no . . . capacity to learn. His vast narcissism makes him a closed fortress. He doesn’t know what he doesn’t know and he’s uninterested in finding out.”
Trump’s lack of a moral center certainly reflects his own personal failings, but the fact that we now have such a president also reflects our own societal failure. As a nation, we picked him. How and why did we select such a morally bankrupt person? And what does that selection say about our culture and us as a nation?
Yes, we have had great presidents like Washington, Lincoln, and FDR. And yes we have been a land of opportunity for many. And yes we honor Martin Luther King Jr. on his holiday, and twice elected Barack Obama, the son of a black African and white mother as our previous president.
But we also have a history of slavery, of racism, of killing and subjecting Native Americans, of McCarthyism, of imperialism in places like the Philippines, of overemphasizing the getting and spending of money, and of a fondness for glitzy entertainment.
In 1963 anthropologist Jules Henry declared that the two main “commandments” of our culture were “Create More Desire” and “Thou Shalt Consume.” In 1985, Neil Postman wrote in Amusing Ourselves to Death, “Our politics, religion, news, athletics, education and commerce have been transformed into congenial adjuncts of show business, largely without protest or even much popular notice. The result is that we are a people on the verge of amusing ourselves to death.”
Trump reflects these more negative strains of our national life. Trump Revealed mentions some of Trump’s ethnic prejudice and racism—for example, complaining of “too many Italian and Irish students” while in college. In 1973 the Justice Department filed “one of the most significant racial bias cases of the era” against him and his father for their real estate dealings. It also mentions Trump’s closeness to lawyer Roy Cohn, one of Sen. Joe McCarthy’s chief aides. (Trump’s crassness and frequent anti-intellectual outbursts remind us of McCarthy.)
Shortly after Trump’s election in late 2016, I wrote of “the pursuit of monetary gain and pleasure which is often linked with our capitalism and consumer culture,” and added that “if you were pressed to name one individual who most personifies” that spirit, it would be Trump. He also reflects Postman’s fear that “our politics, religion, news . . . have been transformed into congenial adjuncts of show business.” Our fondness for glitzy entertainment has helped produce an entertainer-in-chief rather than an honorable and decent president.
Trump is fond of the rags-to-riches American myth of “Algerism” (after the novels of Horatio Alger, who depicted it), which historian Christopher Lasch called “the dominant ideology of American politics” during the late nineteenth century’s Gilded Age. (Lasch added: “Failure to advance, according to the [Alger] mythology of opportunity, argues moral incapacity on the part of individuals or, in a version even more implausible, on the part of disadvantaged ethnic and racial minorities.”) Trump’s simplistic view of society’s “winners” and “losers” has been strongly affected by this myth.
Trump’s America is like that of his business interests, a land of real estate wheeling and dealing, of casinos, of beauty pageants, of pro wrestling, of reality TV (like his The Apprentice or the earlier show on which he appeared, Lifestyles of the Rich and Famous). It is an America that has bestowed fame on big-mouth media personalities like talk-radio’s Rush Limbaugh and Fox’s Bill O’Reilly.
Trump’s America is also one with little respect for scientific or other truth (consider all the falsehoods spewed by Trump and his EPA). Moreover, in its unfettering of restraints meant to protect average Americans from greed and environmental damage, it harkens back to the Gilded Age, a time before twentieth century Progressivism attempted to constrain and supplement capitalism so that our government and laws paid more attention to the public good.
If we are ever to rid ourselves of Trumpism and whatever noxious odors it leaves behind, we need not only a more honorable president, but an emphasis on more honorable values such as wisdom, humility, compassion, empathy, tolerance, and truth. We need to cherish them in our education, in our culture, and in our politics. Only then can we dream of making America great.
|
4285e050cfc53d1a35b0a5c619cf284f | https://historynewsnetwork.org/article/169230 | What Ruined Frederick Douglass as a Slave? | What Ruined Frederick Douglass as a Slave?
It was reading, said his master. That’s worth remembering as we debate the importance of a liberal education.
In the past few years, alarm about the decline of the liberal arts, especially the humanities, has become a constant refrain. At the university level, they are often seen as being in competition with job skills and employment prospects. But as Marilynne Robinson has reminded us, the liberal arts are not just part of the heritage of American higher education, they have long been considered an “education appropriate to free people.” One of the clearest practical examples of that can be found in the life of Frederick Douglass.
Frederick Douglass was born into slavery in Maryland in 1818 and would never know his exact birth date. He was separated from his mother at an early age and had little knowledge of his father. He witnessed and experienced the horrors of slavery. But he would escape that slavery, become a prominent abolitionist, and even one of the best orators of his time. What did Frederick Douglass identify as the turning point in his own life? It was learning to read.
In his autobiography, Narrative of the Life of Frederick Douglass, Douglass thanked Providence for the early lessons he received, as a child, on the alphabet and a few basic words from his mistress, Sophia Auld. The lessons soon stopped when Mr. Auld learned what his wife had done. He warned her that Douglass would be “ruined” in this way and that if he learned to read, “there will be no keeping him. It would forever unfit him to be a slave.” Douglass overheard these comments. As he later wrote, “from that moment, I understood the pathway from slavery to freedom.” He continued on that pathway even as Sophia Auld turned against him, becoming cruel and striving to keep him from reading. Douglass began to bribe other boys his age to help him with his lessons and eventually acquired a copy of The Columbian Orator when he was around twelve. “Every opportunity I got, I used to read this book,” he later wrote. Together with the Bible, it was his window on the world beyond slavery. He began to dream of freedom, and eventually, to plan his escape.
A copy of The Columbian Orator at the National Civil Rights Museum in Memphis, Tennessee - By LittleT889 - Own work, CC BY-SA 4.0
What did reading do for Douglass? It made him hate slavery more than ever. He writes, “that very discontentment which Master Hugh had predicted would follow my learning to read had already come, to torment and sting my soul to unutterable anguish.” The ability to see his own condition from other perspectives made him alive to other possibilities. Like any child with a book, his mental horizon expanded. Specifically, reading The Columbian Orator also taught him rhetoric. Douglass read speeches and debates and became accomplished himself at debating slavery. His victories in debate strengthened his own convictions. His facility with language also made him a leader, one who taught others to read and attempted to help others escape. And when he did escape, those same rhetorical skills made him an accomplished public figure and a leader in the abolitionist cause.
Were we to look at young Frederick Douglass and provide assistance for him on his road to freedom, most of us today would not suggest The Columbian Orator. We might suggest learning to read, but speeches, dialogues and dramas would not be the recommended handbook for a would-be runaway slave. Instead, we might give him a book on navigation by the stars, instructions on foraging and tracking, or a good map. We might offer food. We might think about job training. And Douglass did need supplies and practical skills, for the escape itself and to support himself afterward in the north. But his turning point with reading and his love of The Columbian Orator show us the significance of the seemingly intangible and “unnecessary” skills offered by the humanities.
The slaveholding world saw Frederick Douglass solely in terms of his market value and his productivity as a worker. Slaveholders were never hesitant to ensure that slaves had job skills. Working was essential to being a slave. Douglass worked in childcare, labored on a farm, and learned to work caulking in the shipyards so well that he was bringing in six to seven dollars a week for his master. These skills would again be useful to Douglass after he had run away. But every slave had job skills. Many slaves even worked the same jobs as paid laborers. Many of these skills were valuable and much in life is learned through work. But there was no necessary connection between job skills and freedom. Slaveholders did not fear teaching slaves how to work. They feared teaching them to read.
For Frederick Douglass, reading ignited a fire within, broadened his horizons, strengthened his sense of self and his determination to find freedom, and equipped him to share his narrative with the world and contest the powerful pro-slavery narratives of his era. If Douglass did not have his reading, writing, and speaking ability, the world would not know his story, which described the reality of slavery and testified to the dignity of his humanity. As abolitionist Wendell Phillips wrote about Douglass’ autobiography, this was finally the lion writing history rather than the hunter.
The slavery that Frederick Douglass knew was not a metaphor. It would be wrong to suggest an equivalency between his condition and that of American workers or university students today. And there was much more than The Columbian Orator on Douglass’s road to freedom. But the power of the humanities in his life speaks to their significance. He was born into adversity but learned the value of reading at a young age. He was a boy who could not put a book down, even when owning that book might cost him dearly. He grew into a man who could hold and defend his convictions. As a master of oratory, he became a powerful and influential voice for the truth, distinguished both nationally and internationally. The liberal arts alone did not liberate Frederick Douglass from slavery but they gave him mental access to the world even while he was enslaved and, after he escaped from slavery, they propelled him to a speaking role on the world stage.
|
75c927fd63a8f039e7184d6d5b79cc0f | https://historynewsnetwork.org/article/169325 | Trump Is Terrorizing Children | Trump Is Terrorizing Children
Thousands of immigrant children held in U.S. shelters
In recent days, millions of Americans have seen the shocking, heartbreaking images of children—crying, screaming, hysterical children—being ripped from the arms of their parents and sent away to a newly constructed camp in the Texas desert. So many of us have been justifiably outraged by the news. Here are children, arriving with their parents after a perilous journey, seeking asylum from the life-threatening conditions in their home countries, being terrorized by agents of United States Immigration and Customs Enforcement (ICE). And all of it part of a deliberate policy implemented by Donald Trump, Jeff Sessions, and Stephen Miller in their ongoing effort to rebuild a white, Christian America through the exclusion (in this case through terror) of all those they deem to be outsiders.
Unfortunately, after the bloody twentieth century, it’s no surprise that man’s capacity for evil extends all the way to the violent assault on children—the most innocent and vulnerable among us. What does it mean, then, for a society to sink so low as to permit the forced separation of families whose only “crime” is seeking peace and freedom in America? What does it mean for a society to allow for a policy of violence against children? How could a country punish children for the decisions of their parents—decisions that, by any standard, are unimaginably difficult?
They are the kinds of impossible decisions Holocaust scholar Berel Lang has called “choiceless choices.” And looking back to the Holocaust and the decisions that Jews faced then, perhaps we can gain further insights as to why the current policy of the United States government is so unimaginably cruel.
When Jewish families arrived at Auschwitz, they walked past a group of doctors who, in the blink of an eye, determined their fate. Men and boys strong enough to work they waved one way, while women and small children they sent in a different direction—to the gas chambers. Family separation preceded family destruction.
In The Drowned and the Saved, Auschwitz survivor Primo Levi tells the story of a young girl who miraculously survived the gas chamber and, at least for a brief time, was kept alive by the Jewish workers who had found her, before ultimately being shot by a German guard. The story is a tragic reminder of all those children who were murdered by the Germans in the Holocaust.
The number six million is indelibly linked to the Final Solution. But there’s another number that is also staggering in its horror: one million. That is the number of children murdered in the Holocaust. The stories of those who perished and of those who survived and the choiceless choices involved are all—like the stories of the parents who fled and the children taken from them and placed in America’s own concentration camp—extraordinarily heartbreaking. And it began well before the start of the Final Solution.
Only days after Kristallnacht in November 1938, the British government approved a plan for taking in unaccompanied children under the age of seventeen from Germany and Austria—what became known as the Kindertransport. Now thousands of parents found themselves in the unimaginable position of having to decide whether or not to send their children to a foreign country alone, possibly never to see them again. As we now know, they turned out to be truly life and death decisions.
Consider, for example, the case of Lory Cahn. Her parents took her to the station, watched her board the train, and stood under her window. As the train started, her father would not let go of her hand. He held it and ran alongside, finally pulling her from the train. As a result of that decision, Lory Cahn did not go on the Kindertransport. Instead, she and her family were sent to Theresienstadt—the “model” concentration camp in the former Czechoslovakia, which the Nazis used for propaganda purposes.
Once in the camp, she was called to the railroad station where an SS man would call out a list of names. Those on the list got on the train. The first time she went, her name was not called. She returned to the barracks. This happened several more times over the next two weeks. Soon, the agony of repeatedly saying goodbye to her parents became too much. The next time she was called to the station, she asked the SS man to let her on the train. She didn’t know where it was going. And even if she did, the name Auschwitz would have meant nothing to her anyway. Remarkably, her decision did not prove fatal. Before her liberation, however, she had experienced eight different camps and emerged weighing just fifty pounds. What if her father had not made that decision back on that train platform seven years earlier?
Of course, the vast majority of Holocaust victims were from Eastern Europe. They, too, faced impossible decisions. In Lithuania, newly acquired in 1940, Stalin worked to ethnically cleanse the land, deporting millions of Poles and Jews into the Soviet interior. The final deportation occurred in June 1941. It was then that an officer of Stalin’s secret police—the NKVD—visited the apartment of a young Jewish woman and her child who were on the list for deportation. The woman begged him to allow her to stay, but he refused. She asked whether he would take just her and leave her child behind. He agreed and she left her child with a neighbor while she went off on what she expected to be a fatal journey. But at least her child would survive. Instead, only days later Hitler began his invasion of the Soviet Union. Within a week his troops were in Lithuania and the child’s fate was sealed. What seemed at the time to be the best choice turned out to have been horribly wrong as she was deported to safety, beyond Hitler’s grasp, while her child fell into the clutches of Nazi murderers. She survived. Her child perished.
Then there is the case of Romanian Jews in the Czernowitz ghetto. In early 1942 some had the opportunity to flee by ship to Palestine. A man named D’Andrea gave them this chance. Many only had enough money for one ticket. Then came the decision. Don’t buy the ticket and risk the likely death of the entire family? Or buy one ticket and split up the family? But then, who should go—the younger child; the stronger child? Perhaps the smarter one who had a better chance to survive?
How can a parent possibly make such a decision? And what if that one child you selected doesn’t make it? What if—as was the case with the Struma in February 1942—the ship was struck by a Soviet torpedo and sunk in the Black Sea, killing all but two of the 769 people on board? Writing more than seventy years later, the grandson of one of the people in that room with D’Andrea described his father’s thinking. “You assume that you face certain death…. So, you pick one so that he will survive. This is an act of reason. It is an act of sick reason. But this act of reason gets undone, reversed, made ridiculous by a simple fact: The boat sinks, the chosen survivor dies and those who had been condemned to death, we live. Or do we?”
It’s hard to put into words the enormity of what Europe’s Jews faced during the Holocaust. The decisions they had to make were unimaginably difficult. Leave one’s home, the only home one has ever known; send your child away to another country alone; be deported deep into the Soviet Union—with your child or without; enable one family member to leave the ghetto, but which one? These are only a few of the decisions that Jews had to make in the extraordinary environment of the Holocaust. Decisions they were forced to make because the United States and others had decided that Jews did not belong in their country; they were not a part of America; they could never be part of America. And this was a conscious choice.
The fundamental problem with nationalism is that entry into the country is open to a limited group of people. Deciding who is a member of the nation, who is in and who is out, is a decision to limit the boundaries of one’s moral universe to encompass only those of your group. All those who are within that moral boundary are of concern to you; they are to be protected. All those who are outside that boundary—outside the national or racial community—are of little or no concern. If anything, they represent a potential threat that must be eliminated. But the farther the boundary of your moral universe extends, the more people who fall within it, the more people whose fate is of concern to you. In the 1930s, Jews found themselves largely outside the moral universe of too many people, and that helped seal their fate.
The question we need to ask ourselves today is, where are the children, those innocent children brought to this country by parents desperate to escape horrific conditions at home, desperate to risk their lives to find a better life in America? Where are they located in relation to our moral universe? Is ours extensive enough to include them and therefore compel us to help them? Or will we close our doors again—and in the process terrorize these children, tearing them from the arms of their parents, perhaps never to see them again—as this new generation of refugees grapples with its own choiceless choices?
|
545049a3aba800c21fd7d56cb8cc4eb5 | https://historynewsnetwork.org/article/169368 | The Tragic, Too-Soon Death of Quentin Roosevelt | The Tragic, Too-Soon Death of Quentin Roosevelt
Sunday morning, July 14th, 1918.
It’s Bastille Day—and, somewhere in France, a fledgling, twenty-year-old American aviator named Quentin Roosevelt is scampering into his single-seat, French-made, wooden-and-canvas Nieuport-28 aeroplane for … nothing less than a rendezvous with death.
Quentin Roosevelt was, of course, Theodore Roosevelt’s youngest child. TR had argued mightily for American military preparedness in the lead-up to war in April 1917. Following our declaration of war, he virtually begged Woodrow Wilson to let him command his own combat regiment on the Western Front.
Wilson said no. And Wilson was right. TR was too old for combat, too ex-presidential for such matters—and certainly much-too-much TR to be much, if any, good as a subordinate. But if TR did not go overseas, his four sons soon would. TR worried about them all, but most of all about Quentin, his youngest—so young that on his last night at Sagamore Hill before shipping off for Europe, his mother literally tucked her baby in for the night.
“It was hard when Quentin went,” his mother, Edith Kermit Carow Roosevelt, conceded. “But you can’t bring up boys to be eagles, and expect them to turn out sparrows.”
Quentin was a mere Harvard sophomore—and a rather undistinguished one at that. But beyond such matters was Quentin’s questionable physical fitness. He longed to be an aviator—a dashing, devil-may-care knight of the air. But he suffered from poor eyesight—and a bad back incurred in July 1913, when a pack horse rolled over him during a Grand Canyon rock slide. Memorizing the eye chart advanced him past the former barrier. But not the latter.
Quentin left behind not a wife but rather a fiancée—the barely twenty-year-old Flora Payne “Foufie” Whitney. Roosevelt-Whitney family relations needed work. Quentin’s father famously excoriated the “malefactors of great wealth”; Flora’s father, Harry Payne Whitney, was richer than Croesus—or at least wealthier than all the many Roosevelts put together.
In any case, love finds its own way, and when Quentin proposed in May 1917, Flora quickly accepted—but she did take more time to inform her family. TR soon resolved his own uncertainties (“After some hesitation and misgiving, Mother and I have become much pleased”) and found Flora to be “a dear.”
Quentin was soon in France but not nearly ready for combat. He was still, as his second cousin Nicholas Roosevelt assessed him, “the gayest and most whimsical and most promising” of the Sagamore Hill brood. In France, he was well liked and even admired by his colleagues. He was “gay, hearty and absolutely square in everything he said or did,” remembered the famed American ace Captain Eddie Rickenbacker. “We loved him purely for his own natural self.”
Yet Quentin shared with his father not only a taste for heroic glory and the military life but a nearly suicidal obsession with death. Beyond that lay a far stranger side. “At one time,” noted Kermit, “he was greatly interested in demonology and witchcraft, and combed the second-hand bookstores for grimy tomes on this subject.” Roosevelt family historian Edward J. Renehan noted how Quentin “tended to churn out macabre tales of madness, desperation and suicide that he did not dare to show his parents. . . . Every hero was a tragic, thoughtful, existential intellectual: brave but doomed.”
Quentin tried bringing Flora to France for matrimony. But Flora did not go; the War Department would not allow it.
Quentin fitfully continued learning how to fly—a harder process than one might think. His father was unfortunately right about preparedness. American industry seemed incapable of turning out the aircraft the new American Air Service needed. Two thousand American pilots shuffled around their French training base, grounded not by enemy air superiority but by a simple lack of planes. “They are not going to send any more pilots over here from the states for the present,” Quentin wrote home in April 1918, “which is about the first sensible decision that they have made as regards the Air Service. As it is they must have two thousand pilots over here, and Heavens knows it will be ages before we have enough machines for even half that number.”
In late March 1918, while still in training, Quentin cracked up his plane. “I smashed [it] up beautifully,” he wrote back home. “It was really a very neat job, for I landed with a drift, touched one wing, and then, as there was a high wind, did three complete summersaults (spelling?) ending up on my back. I crawled out . . . with nothing more than a couple of scratches.”
He attracted some antiaircraft fire in late June (“I had a hole through my wing”) but didn’t see actual aerial combat until July 11, 1918, when, after cavalierly veering off from his fifteen-plane squadron and “returning” to its rear, he discovered he was accidentally trailing three enemy aircraft.
“Quentin,” reminisced Eddie Rickenbacker, “fired one long burst. . . . The aeroplane immediately preceding him dropped at once and within a second or two burst into flames. Quentin put down his nose and streaked it for home before the astonished Huns had time to notice what had happened. He was not even pursued!”
The world moved very fast now. The dance of death whirled wildly, particularly for pilots. Eighty percent died in combat. Once aloft in their “flaming coffins,” their life expectancy was a mere eleven days. Such unpleasant facts counted for little to Quentin Roosevelt. “He was [so] reckless,” reminisced Rickenbacker, “that his commanding officers had to caution him repeatedly about the senselessness of his lack of caution. His bravery was so notorious that we all knew that he would either achieve some great, spectacular success or be killed in the attempt. Even the pilots in his own flight would beg him to conserve himself and wait for a fair opportunity for a victory. But Quentin would merely laugh away all serious advice.”
Sunday, July 14, was Bastille Day, and French airmen serving alongside Quentin’s unit hoped to celebrate their national holiday with some authentic American entertainment. Quentin helped by rounding up what talent he could, even participating in his troupe’s rehearsal on the evening of July 13. He was, a superior remembered, “the life of the party, inspiring everybody with his enthusiasm.”
At 8:20 the next morning, Quentin’s squadron headed toward Château-Thierry. At forty-three hundred meters, seven Fokker triplanes appeared. Quentin pursued one of them, but it was a trap. Three other higher-flying German pilots descended upon him. Two machine-gun bullets ripped through his skull, killing him instantly. Spiraling downward, he crashed in enemy territory near the village of Chamery, ten kilometers north of the Marne.
Quentin’s German adversaries buried him with full military honors—but also photographed his remains lying alongside his wrecked craft. In Germany, the photo was issued as a postcard. It sold like hotcakes.
At Oyster Bay, the afternoon following Quentin’s crash, TR was dictating to his secretary, the forty-year-old Miss Josephine M. Stricker, when Associated Press correspondent Philip Thompson rapped upon his door conveying puzzling news: the New York Sun had received a cable from France reading, “WATCH SAGAMORE HILL FOR—” with the remainder of the message censored. TR feared trouble. Some family member was wounded—or, worse, slain. “It can not be Ted and it can not be Archie,” he speculated, “for both are recovering from wounds. It is not Kermit, for he is not in the danger zone at just this moment. So it must be Quentin. However, we must say nothing of this to his mother to-night.”
At 7:30 the following morning, Thompson interrupted the Colonel’s breakfast with word that Quentin had been shot down over enemy lines. But was he wounded? Dead? Alive? He had, after all, survived one crash, his Nieuport going down without catching fire. Might he walk away from another?
TR knew enough of war not to deceive himself. And he knew Edith could not be deceived either.
For the longest time, he said nothing. Finally, he exclaimed, “But—Mrs. Roosevelt! How am I going to break it to her?”
Just after 1:00 p.m., TR issued a manly, stiff-upper-lip statement (“Quentin’s mother and I are very glad that he got to the front and had the chance to render some service to his country and to show the stuff that was in him before his fate befell him”). Later, he walked alone to Sagamore Hill’s stables. There, wrote the Sun, he “stopped before the stall of a fat, old and rheumatic Shetland pony, ‘Algonquin,’ which ‘was breathing laboriously under the strain of his twenty years.’ ” Years before, this tiny, old Shetland had been Archie and Quentin’s pony. It was Algonquin that five-year-old Quentin so famously transported up the White House elevator back in 1903 to cheer the ailing eight-year-old Archie.
“In the seclusion of the stable,” the Sun continued, “the iron of a Spartan father’s soul gave way . . . and with tears in his eyes he threw his arms around the old pony’s neck.”
Reports filtered in, providing false hope of Quentin’s survival. It was not until Saturday, July 20, that German pilots dropped a note confirming his death—and his identification bracelet—behind American lines. Two days later, German authorities at Geneva notified their American Red Cross counterparts. Eventually, Quentin’s last letters home (“We lost another fellow from our squadron three days ago. However, you get lots of excitement to make up for it”) finally reached Oyster Bay. And thus, day after day, a son died again and again in a mother and father’s broken hearts.
On July 17, TR informed Edith of the terrible news about Quentin. He phoned Flora. And he kept on working. Choking back tears, he continued, as if all were normal, with his dictation. His only concession to tragedy was to cancel an afternoon business appointment in Manhattan.
On the following afternoon, he was to address the Republican State Convention at Saratoga Springs. No one would dare reprove him for his absence. He went anyway.
The next morning, reporters badgered him about his plans. “There is only one thought in my heart and you know what that is,” he responded. The New York World thought he laid “his hand over his heart for an instant.”
“There is no use pretending that we do not bitterly mourn,” TR soon wrote, “but [Quentin] had his crowded hour, of a life that was not only glorious but very happy; he had got his man; he has rendered service; he had a fortnight or three weeks when he stood on a crest of life which cannot even be seen by sordid and torpid souls who know neither strife in our honor and our love, and who live forever in a gray fog at the lowest level.”
“It is rather awful to know,” TR would write in late July to an acquaintance of Quentin’s, a Miss Mary L. Brown, who wrote Theodore and Edith with information about him, “that he paid with his life, and that my other sons may pay with their lives, to try to put in practice what I preached. Of course I would not have it otherwise.”
Three weeks afterward, he confided to Edith Wharton, “There is no use of my writing about Quentin; for I should break down if I tried. His death is heart breaking, but it would have been far worse if he had lived at the cost of the slightest failure to perform his duty.”
Yet even then, TR had committed to writing about Quentin—or at least about death and dying and sacrifice, and everyone knew of what and whom he actually wrote. In the October 1918 Metropolitan Magazine, he gritted his teeth and spat in death’s eye:
Only those are fit to live who do not fear to die; and none are fit to die who have shrunk from the joy of life and the duty of life. Both life and death are parts of the same Great Adventure. Never yet was worthy adventure worthily carried through by the man who put his personal safety first. Never yet was a country worth living in unless its sons and daughters were of that stern stuff which bade them die for it at need; and never yet was a country worth dying for unless its sons and daughters thought of life, not as something concerned only with the selfish evanescence of the individual, but as a link in the great chain of creation and causation, so that each person is seen in his true relations as an essential part of the whole, whose life must be made to serve the larger and continuing life of the whole. Therefore it is, that the man who is not willing to die, and the woman who is not willing to send her man to die, in a war for a great cause, are not worthy to live. . . .

Woe to those who invite a sterile death; a death not for them only, but for the race; the death which is ensured by a life of sterile selfishness. But honor, highest honor, to those who fearlessly face death for a good cause; no life is so honorable or so fruitful as such a death. Unless men are willing to fight and die for great ideals, including love of country, ideals will vanish, and the world will become one huge sty of materialism. And unless the women of ideals bring forth the men who are ready thus to live and die, the world of the future will be filled by the spawn of the unfit. Alone of human beings the good and wise mother stands on a plane of equal honor with the bravest soldier; for she has gladly gone down to the brink of the chasm of darkness to bring back the children in whose hands rests the future of the years. But the mother and, far more, the father, who flinch from the vital task earn the scorn visited on the soldier who flinches in battle. . . .

In America to-day all our people are summoned to service and sacrifice. Pride is the portion only of those who know bitter sorrow or the foreboding of bitter sorrow. But all of us who give service, and stand ready for sacrifice, are the torch-bearers. We run with the torches until we fall, content if we can then pass them to the hands of other runners. The torches whose flame is brightest are borne by the gallant men at the front, and by the gallant women whose husbands and lovers, whose sons and brothers, are at the front. . . .
These are the torch-bearers; these are they who have dared the Great Adventure.
Some adventures cost more than others.
Copyright David Pietrusza “All Rights Reserved”
|
7cce8b5b7ed49c1a5b57173b5288711b | https://historynewsnetwork.org/article/169404 | Supreme Court’s Travel-Ban Includes Surprise Ruling: Japanese Internment Was Wrong | Supreme Court’s Travel-Ban Includes Surprise Ruling: Japanese Internment Was Wrong
In Tuesday’s majority opinion upholding President Donald Trump’s travel ban, the Supreme Court also overturned a long-criticized decision that had upheld the constitutionality of Japanese-American internment during World War II.
Justice Sonia Sotomayor had mentioned the 1944 case, Korematsu v. United States, in her dissent, arguing that the rationale behind the majority decision had “stark parallels” to Korematsu; in both cases, she argued, the government “invoked an ill-defined national security threat to justify an exclusionary policy of sweeping proportion.”
Writing for the majority, Chief Justice John Roberts argued that Korematsu was not relevant to the travel ban, but took the occasion to declare that the decision is now overruled.
|
4700adb62e56bdcdbe33fdf3b26601ff | https://historynewsnetwork.org/article/169464 | Mexicans Made America—Why Do We Treat Them Like Alien Invaders? | Mexicans Made America—Why Do We Treat Them Like Alien Invaders?
Mexicans have contributed to making the United States in pivotal and enduring ways. In 1776, more of the territory of the current United States lay under Spanish sovereignty than within the 13 colonies that rejected British rule. Florida, the Gulf coast to New Orleans, the Mississippi to St. Louis, and the lands from Texas through New Mexico and California all lay under Spanish rule, leaving Hispanic-Mexican legacies. Millions of pesos minted in Mexico City, then the American center of global finance, funded the war for U.S. independence, leading the new nation to adopt the peso (renamed the dollar) as its currency.
The U.S. repaid the debt by claiming Spanish/Mexican lands—buying vast Louisiana territories (via France) in 1803; gaining Florida by treaty in 1819; sending settlers into Texas (many undocumented) to expand cotton and slavery in the 1820s; enabling Texas secession in 1836; provoking war in 1846 to incorporate Texas’s cotton and slave economy—and California’s gold fields, too. The U.S. took in land and peoples long Spanish and recently Mexican—often mixing European, indigenous, and African ancestries. The 1848 Treaty of Guadalupe Hidalgo recognized those who remained in the U.S. as citizens. And the U.S. incorporated the dynamic mining-grazing-irrigation economy that had marked Spanish North America for centuries and would long define the U.S. West.
Debates over slavery and freedom in lands taken from Mexico led to the U.S. Civil War while Mexicans locked in shrunken territories fought over liberal reforms and then faced a French occupation—all in the 1860s. With Union victory, the U.S. drove to continental hegemony. Simultaneously, Mexican liberals led by Benito Juárez consolidated power and welcomed U.S. capital. U.S. investors built Mexican railroads, developed mines, and promoted export industries—including petroleum. The U.S. and Mexican economies merged; U.S. capital and technology shaped Mexico while Mexican workers built the U.S. west. The economies were so integrated that a U.S. downturn, the panic of 1907, was pivotal to setting off Mexico’s 1910 revolution, a sociopolitical conflagration that focused Mexicans while the U.S. joined World War I.
Afterwards, the U.S. roared in the 20s while Mexicans faced reconstruction. The U.S. blocked immigration from Europe, and still welcomed Mexicans to cross a little-patrolled border to build dams and irrigation systems, cities and farms across the west. When depression hit in 1929 (it began in New York, spread across the U.S., and was exported to Mexico), Mexicans became expendable. Denied relief, they got one-way tickets to the border, forcing thousands south—including children born as U.S. citizens.
Mexico absorbed the refugees thanks to new industries and land distributions—reforms culminating in the nationalization of the oil industry in 1938. U.S. corporations screamed foul and FDR arranged a settlement (access to Mexican oil mattered as World War II loomed). When war came, the U.S. needed more than oil. It needed cloth and copper, livestock and leather—and workers, too. Remembering the expulsions of the early 30s, many resisted going north. So the governments negotiated a labor program, recruiting braceros in Mexico, paying for travel, promising decent wages and treatment. Some 500,000 Mexican citizens fought in the U.S. military. Sent to deadly fronts, they suffered high casualty rates.
To support the war, Mexican exporters accepted promises of post-war payment. With peace, accumulated credits allowed Mexico to import machinery for national development. But when credits ran out, the U.S. subsidized the reconstruction of Europe and Japan, leaving Mexico to compete for scarce and expensive bank credit. Life came in cycles of boom and bust, debt crises and devaluations. Meanwhile, U.S. pharmaceutical sellers delivered the antibiotics that had saved soldiers in World War II to families across Mexico. Children lived—and Mexico’s population soared: from 20 million in 1940, to 50 million by 1970, 100 million in 2000. To feed growing numbers, Mexico turned to U.S. funding and scientists to pioneer a “green revolution.” Harvests of wheat and maize rose to feed growing cities. Reliance on machinery and chemical fertilizers, pesticides, and herbicides, however, cut rural employment. National industries also adopted labor saving ways, keeping employment scarce everywhere. So people trekked north, some to labor seasonally in a bracero program that lasted to 1964; others to settle families in once Mexican regions like Texas and California and places north and east.
Documentation and legality were uncertain. U.S. employers’ readiness to hire Mexicans for low wages was not. People kept coming. U.S. financing, corporations, and models of production shaped lives across the border; Mexican workers labored everywhere, too. With integrated economies, the nations faced linked challenges. In the 1980s the U.S. suffered from “stagflation” while Mexico faced a collapse called the “lost decade.” In 1986, Republican President Ronald Reagan authorized a path to legality for thousands of Mexicans in the U.S. tied to sanctions on employers aimed to end new arrivals. Legal status kept workers here; failed sanctions enabled employers to keep hiring Mexicans—who kept coming. They provided cheap and insecure workers to U.S. producers—subsidizing profits in times of challenge.
The 1980s also saw the demise of the Soviet Union, the end of the Cold War, and the presumed triumph of capitalism. What would that mean for people in Mexico and the U.S.? Reagan corroded union rights, leading to declining incomes, disappearing pensions, and enduring insecurities among U.S. workers. President Carlos Salinas of Mexico’s dominant PRI attacked union power—and in 1992 ended rural Mexicans’ right to land. A transnational political consensus saw the erosion of popular rights as key to post-Cold War times.
Salinas proposed NAFTA to Reagan’s Republican successor, George H.W. Bush. The goal was to liberate capital and goods to move freely across borders—while holding people within nations. U.S. business would profit; Mexicans would continue to labor as a reservoir of low wage workers—at home. The treaty was ratified in Mexico by Salinas and the PRI, in the U.S. by Democratic President Clinton and an allied Congress.
As NAFTA took effect in 1994, Mexico faced the Zapatista rising in the south, then a financial collapse—before NAFTA could bring investment and jobs. With recovery, the Clinton era hi-tech boom saw production flow to China. Mexico gained where transport costs mattered—as in auto assembly. But old textiles and new electronics went to Asia. Mexico returned to growth in the late 1990s, with jobs still scarce for a population nearing 100 million. Meanwhile, much of Mexican agriculture collapsed. NAFTA ended tariffs on goods crossing borders. The U.S. subsidizes corporate farmers--internal payments enabling agribusiness to sell below cost. NAFTA left Mexican producers to face U.S. subsidized staples. Mexican growers could not compete and migration to the U.S. accelerated.
NAFTA brought new concentrations of wealth and power across North America. In Mexico, cities grew as a powerful few and favored middle sectors prospered; millions more struggled with informality and marginality. The vacuum created by agricultural collapse and urban marginality made space for a dynamic violent drug economy. Historically, cocaine was an Andean specialty, heroin an Asian product. But as the U.S. pressed against drug economies elsewhere, Mexicans—some enticed by profit; many searching for sustenance—turned to supply U.S. consumers.
U.S. politicians and ideologues blame Mexico for the “drug problem”—a noisy “supply side” understanding that is historically untenable. U.S. demand drives the drug economy. The U.S. has done nothing effective to curtail consumption—or to limit the flow of weapons to drug cartels in Mexico. Laying blame helps block any national discussion of the underlying social insecurities brought by globalization—deindustrialization, scarce employment, low wages, lowered benefits, vanishing pensions—insecurities that close observers know fuel drug dependency. Drug consumption in the U.S. has expanded as migration from Mexico now slows (mostly due to slowing population growth)—a conversation steadfastly avoided.
People across North America struggle with shared challenges—common insecurities spread by globalizing capitalism. Too many U.S. politicians see advantages in polarization, blaming Mexicans for all that ails life north of the border. Better that we work to understand our inseparable histories. Then we might work toward a prosperity shared by diverse peoples facing common challenges in an integrated North America.
|
a86edd27de9d71da523f42cd6156c69a | https://historynewsnetwork.org/article/169528 | Obama Tops Public’s List of Best President in Their Lifetime, Followed by Clinton, Reagan | Obama Tops Public’s List of Best President in Their Lifetime, Followed by Clinton, Reagan
When asked which president has done the best job in their lifetimes, more Americans name Barack Obama than any other president. More than four-in-ten (44%) say Obama is the best or second best president of their lifetimes, compared with about a third who mention Bill Clinton (33%) or Ronald Reagan (32%).
Though Donald Trump is not yet halfway through his term, 19% say he has done the best or second best job of any president of their lifetimes. That is comparable with the share who viewed Obama as one of the best presidents in 2011 (20%).
The survey by Pew Research Center, conducted June 5-12 among 2,002 adults, asks people in an open-ended format which president has done the best job in their lifetimes. The analysis is based on their first and second choices.
|
2ce21fa7684eaed0a4c4122acee54a46 | https://historynewsnetwork.org/article/169536 | How European colonization completely wiped out dogs in America | How European colonization completely wiped out dogs in America
Before Europeans started colonising the Americas around 500 years ago, the land was populated by native human inhabitants and the animals living alongside them.
Now, exhaustive DNA studies have unearthed the troubling history of dogs in North America – and their eventual extinction at the hands of European colonists.
The colonial devastation inflicted upon the native peoples of the Americas is well documented, but evidence about the lives of their dogs is far less so.
|
cf84b5649f82940b80112e2607938343 | https://historynewsnetwork.org/article/169577 | DNA Analysis Confirms Authenticity of Romanovs’ Remains | DNA Analysis Confirms Authenticity of Romanovs’ Remains
Today marks the 100th anniversary of the execution of Nicholas II and his family, an event that toppled Russia’s Romanov dynasty. Yesterday, as the country was preparing to commemorate their deaths, Russian investigators announced that new DNA testing had confirmed that remains attributed to the last tsar and his family are in fact authentic—a finding that may pave the way for the deceased royals to be buried with full rites by the Orthodox Church, according to Agence France-Presse.
The Investigative Committee of the Russian Federation, which is responsible for probing serious crimes, said DNA analysis “confirmed the remains found belonged to the former Emperor Nicholas II, his family members and members of their entourage.” As part of the new tests, investigators exhumed the body of Nicholas’ father, Alexander III, to prove that the two are related, and also took DNA samples from living members of the Romanov family, according to the Moscow Times.
The new findings are the latest development in a tangled dispute over the remains of the Romanovs, whose downfall came when Nicholas II was forced to abdicate the throne in the midst of the Russian Revolution of 1917. A provisional government briefly took power before the radical Bolsheviks seized control, and the tsar, his wife, Alexandra, and their five children were imprisoned in the city of Yekaterinburg. In 1918, civil war broke out between the communist government’s Red Army and the anti-Bolshevik White Army. As the White Army advanced on Yekaterinburg, local authorities were ordered to prevent the rescue of the Romanovs, and in the early hours of July 17, the family was executed by firing squad. Those who remained alive after the bullets stopped flying were stabbed to death.
|
e53b88aebf35819126d171ba1963cd4b | https://historynewsnetwork.org/article/169694 | Trump says he doesn’t take vacations | Trump says he doesn’t take vacations
President Trump is expected to take a vacation from Washington this month to his Bedminster, N.J., golf club. Last year, he took a 17-day trip to the club. But don’t call these retreats “vacations.” Trump hates vacations.
“Don’t take vacations. What’s the point? If you’re not enjoying your work, you’re in the wrong job,” he wrote in his 2004 book “Trump: Think Like a Billionaire.”
He was also repeatedly critical of his predecessor’s annual holiday trip to Hawaii and summer visits to Martha’s Vineyard.
|
93eab1c44532020ac4ef4aea5a976af7 | https://historynewsnetwork.org/article/169705 | Jim Loewen says history classes helped create a "Post-Truth” America | Jim Loewen says history classes helped create a "Post-Truth” America
Related Link: “What James Loewen Needs to Learn About History Education” by John Fea
… [With] the release this summer of a new paperback version of Lies My Teacher Told Me, [James] Loewen contends that his bestselling book has “new significance … owing to detrimental developments in America’s recent public discourse.” By providing students an inadequate history education, Loewen argues, America’s schools breed adults who tend to conflate empirical fact and opinion, and who lack the media literacy necessary to navigate conflicting information. I recently spoke to Loewen about how the quality of Americans’ history education could affect the country’s civic health. An edited and condensed transcript of our conversation is below.
Alia Wong: What’s changed with regard to your thinking on history education since the first edition came out in 1995? What about since the second edition in 2007?
James W. Loewen: Not much has changed in my thinking, and that’s because I think I was right in the first place. What has changed has to do with our current intellectual era. History and social studies, as taught in school, make us less good at thinking critically about our past. For one, textbooks don’t teach us to challenge, to read critically—they are just supposed to provide exercises in stuff to learn. Secondly, the textbooks (and the people who teach from those textbooks) don’t teach causality. They aren’t designed to have students memorize anything about causality—what causes racism, for example, what causes a decrease in racism. That means that those of us who are more than 18 years old and are out of high school and voting may have never had anybody teach us anything about what causes what in society.
Wong: How do you think inadequate history education plays into what some describe as the country’s current “post-truth” moment?
Loewen: History is by far our worst-taught subject in high school; I think we’re stupider in thinking about the past than we are, say, in thinking about Shakespeare, or algebra, or other subjects. We historians tend to make everything so nuanced that the idea of truth almost disappears. People in graduate history programs have said things to me like: Why should we privilege one narrative above others with the term “true”? That kind of implies that all narratives are equal—or, at least, that all narratives have some merit, that no narrative has all the merit. But maybe there is such thing as a bedrock of fact. Take the way we talk about the Civil War, for example. A lot of people will say that the war grew out of a pay dispute; many others say it had to do with states’ rights. Well, it’s quite the contrary—the southern states seceded so they could uphold slavery. Sometimes we don’t need nuance….
|