https://www.smithsonianmag.com/history/terror-in-ad-1000-55965537/
Terror in A.D. 1000?
Popular accounts of the turn of the last millennium paint a world gone mad. Churches crammed with penitents, soldiers departed from the battlefield, farmers gone from their fields; and the church offering solace to all in exchange for property and gold. While easy to embrace, this vision of medieval Europeans, gripped by paralyzing fear in anticipation of the end of the world, is more legend than fact. So say most medieval historians, who long ago dubbed it a myth and named it the "Terrors of the year 1000." However, history professor Richard Landes, who is also director of the Center for Millennial Studies at Boston University, recently challenged old-guard academe with a new theory arguing that, in fact, millennium-related activity did occur on a larger scale in the years surrounding the first millennium and that the Terrors myth may actually contain elements of truth. Sources are few and subject to wide interpretation, making the resulting academic debate lively and contentious. But how did the tale of the Terrors get its start? The short answer is that historians from the 16th century on found the entire notion to be territory too rich to resist. Add to that the fact that interpretations of past events are often colored by the historian's own biases. Over the centuries, these interpretations flowered into the Terrors myth. Journey through the last ten centuries and learn how history is written down, interpreted, reinterpreted and analyzed. Why do year 1000 events so fascinate historians of the 16th, 17th and 19th centuries? And how will our own modern interpretations of the year 1000 be seen 1,000 years from now?
https://www.smithsonianmag.com/history/the-age-of-peace-820068/
The Age of Peace
One overlooked benefit of aging populations may be the prospect of a more peaceful world. Demographers have found that developing nations with “youth bulges”—more than 40 percent of people between the ages of 15 and 29—are 2.5 times more prone to internal conflict, including terrorism, than countries with fewer young people, largely because of high unemployment combined with youthful exuberance and vulnerability to peers. “The more young people you have, the more violence you have,” says Mark Haas, a political scientist at Duquesne University who has spent the past three years studying how aging patterns among major world powers will affect U.S. security. Between 1970 and 1999, he says, 80 percent of the world’s civil conflicts erupted in nations with substantial youth bulges. Today, those bulges are clustered in the Middle East and sub-Saharan Africa, including Nigeria, Saudi Arabia, Uganda, Yemen and Somalia. But as youth bulges approach middle age, political stability often increases, researchers say. Richard Cincotta, a demographer who consults for the U.S. National Intelligence Council, cites Indonesia: “Political violence has declined in the westward islands,” which tend to be older, “while islands to the east, where the age structure is more youthful, remain politically unstable.” Cincotta also cites a decrease in political violence in Japan and South Korea—both rocked by student protests in the 1960s and ’70s—as their youth bulges dropped below 40 percent. Likewise, waning fertility rates, which have produced a decline in the youthful population in southern India, may have created an environment less supportive of Maoist insurgency groups that are active in the country’s northern and eastern states. “If we know that youth bulges are a big source of violence, including terrorism, it’s good news if these youth bulges are receding,” Haas says. Still, older is not always mellower. 
Not even a maturing population will settle down if accompanying economic gains aren’t shared, or if declining fertility rates don’t occur uniformly among different groups within a society. Ethnic divisions, in particular, can trump demography. The former Yugoslav republics, note Cincotta and Haas, experienced years of brutal conflict between relatively mature populations. In Pakistan and Iraq, the youth bulge won’t drop below 40 percent until 2023 and 2030, respectively. Afghanistan is another story. It has one of the world’s fastest-growing populations, with more than 50 percent of the population currently 15 to 29 years old. The United Nations does not project that age group to dip below 40 percent before 2050. “The demographic pyramid of Afghanistan right now,” Haas says, “is really frightening from a stability point of view.” Carolyn O’Hara lives in Washington, D.C.
https://www.smithsonianmag.com/history/the-ambush-that-changed-history-72636736/?all
The Ambush That Changed History
“This is the soil of 2,000 years ago, where we are standing now,” Susanne Wilbers-Rost was saying as a young volunteer pried a small, dark clod out of it. Wilbers-Rost, a specialist in early German archaeology, peered through wire-rimmed glasses, brushed away some earth, and handed an object to me. “You’re holding a nail from a Roman soldier’s sandal,” she said. A trim, short-haired woman, Wilbers-Rost has worked at the site, which is ten miles north of the manufacturing city of Osnabrück, Germany, since 1990. Inch by inch, several young archaeologists under her direction are bringing to light a battlefield that was lost for almost 2,000 years, until an off-duty British Army officer stumbled across it in 1987. The sandal nail was a minor discovery, extracted from the soil beneath an overgrown pasture at the base of Kalkriese (the word may derive from Old High German for limestone), a 350-foot-high hill in an area where uplands slope down to the north German plain. But it was further proof that one of the pivotal events in European history took place here: in A.D. 9, three crack legions of Rome’s army were caught in an ambush and annihilated. Ongoing finds—ranging from simple nails to fragments of armor and the remains of fortifications—have verified the innovative guerrilla tactics that, according to accounts from the period, neutralized the Romans’ superior weaponry and discipline. It was a defeat so catastrophic that it threatened the survival of Rome itself and halted the empire’s conquest of Germany. “This was a battle that changed the course of history,” says Peter S. Wells, a specialist in Iron Age European archaeology at the University of Minnesota and the author of The Battle That Stopped Rome. “It was one of the most devastating defeats ever suffered by the Roman Army, and its consequences were the most far-reaching. 
The battle led to the creation of a militarized frontier in the middle of Europe that endured for 400 years, and it created a boundary between Germanic and Latin cultures that lasted 2,000 years.” Had Rome not been defeated, says historian Herbert W. Benario, emeritus professor of classics at Emory University, a very different Europe would have emerged. “Almost all of modern Germany as well as much of the present-day Czech Republic would have come under Roman rule. All Europe west of the Elbe might well have remained Roman Catholic; Germans would be speaking a Romance language; the Thirty Years’ War might never have occurred, and the long, bitter conflict between the French and the Germans might never have taken place.” Founded (at least according to legend) in 753 B.C., Rome spent its formative decades as little more than an overgrown village. But within a few hundred years, Rome had conquered much of the Italian peninsula, and by 146 B.C., had leapt into the ranks of major powers by defeating Carthage, which controlled much of the western Mediterranean. By the beginning of the Christian Era, Rome’s sway extended from Spain to Asia Minor, and from the North Sea to the Sahara. The imperial navy had turned the Mediterranean into a Roman lake, and everywhere around the rim of the empire, Rome’s defeated enemies feared her legions—or so it seemed to optimistic Romans. “Germania” (the name referred originally to a particular tribe along the Rhine), meanwhile, did not exist as a nation at all. Various Teutonic tribes lay scattered across a vast wilderness that reached from present-day Holland to Poland. The Romans knew little of this densely forested territory governed by fiercely independent chieftains. They would pay dearly for their ignorance. There are many reasons, according to ancient historians, that the imperial Roman legate Publius Quinctilius Varus set out so confidently that September in A.D. 9. 
He led an estimated 15,000 seasoned legionnaires from their summer quarters on the Weser River, in what is now northwestern Germany, west toward permanent bases near the Rhine. They were planning to investigate reports of an uprising among local tribes. Varus, 55, was linked by marriage to the imperial family and had served as Emperor Augustus’ representative in the province of Syria (which included modern Lebanon and Israel), where he had quelled ethnic disturbances. To Augustus, he must have seemed just the man to bring Roman civilization to the “barbarous” tribes of Germany. Like his patrons in Rome, Varus thought occupying Germany would be easy. “Varus was a very good administrator, but he was not a soldier,” says Benario. “To send him out into an unconquered land and tell him to make a province of it was a huge blunder on Augustus’ part.” Rome’s imperial future was by no means foreordained. At age 35, Augustus, the first emperor, still styled himself “first citizen” in deference to lingering democratic sensibilities of the fallen Roman Republic, whose demise—after the assassination of Caesar—had brought him to power in 27 B.C., following a century of bloody civil wars. During Augustus’ rule, Rome had grown into the largest city in the world, with a population that may have approached one million. The German frontier held a deep allure for Augustus, who regarded the warring tribes east of the Rhine as little more than savages ripe for conquest. Between 6 B.C. and A.D. 4, Roman legions had mounted repeated incursions into the tribal lands, eventually establishing a chain of bases on the Lippe and Weser rivers. In time, despite growing resentment of the Roman presence, the tribes exchanged iron, cattle, slaves and foodstuffs for Roman gold and silver coins and luxury goods. Some tribes even pledged allegiance to Rome; German mercenaries served with Roman armies as far away as the present-day Czech Republic. 
One such German soldier of fortune, a 25-year-old prince of the Cherusci tribe, was known to the Romans as Arminius. (His tribal name has been lost to history.) He spoke Latin and was familiar with Roman tactics, the kind of man the Romans relied on to help their armies penetrate the lands of the barbarians. For his valor on the field of battle, he had been awarded the rank of knight and the honor of Roman citizenship. On that September day, he and his mounted auxiliaries were deputized to march ahead and rally some of his own tribesmen to help in putting down the rebellion. Arminius’ motives are obscure, but most historians believe he had long harbored dreams of becoming king of his tribe. To achieve his goal, he concocted a brilliant deception: he would report a fictitious “uprising” in territory unfamiliar to the Romans, then lead them into a deadly trap. A rival chieftain, Segestes, repeatedly warned Varus that Arminius was a traitor, but Varus ignored him. “The Romans,” says Wells, “thought they were invincible.” Arminius had instructed the Romans to make what he had described as a short detour, a one- or two-day march, into the territory of the rebels. The legionnaires followed along rudimentary trails that meandered among the Germans’ farmsteads, scattered fields, pastures, bogs and oak forests. As they progressed, the line of Roman troops—already seven or eight miles long, including local auxiliaries, camp followers and a train of baggage carts pulled by mules—became dangerously extended. The legionnaires, wrote third-century historian Cassius Dio, “were having a hard time of it, felling trees, building roads, and bridging places that required it. . . . Meanwhile, a violent rain and wind came up that separated them still further, while the ground, that had become slippery around the roots and logs, made walking very treacherous for them, and the tops of the trees kept breaking off and falling down, causing much confusion. 
While the Romans were in such difficulties, the barbarians suddenly surrounded them on all sides at once,” Dio writes of the preliminary German skirmishes. “At first they hurled their volleys from a distance; then, as no one defended himself and many were wounded, they approached closer to them.” Somehow, the command to attack had gone out to the German tribes. “This is pure conjecture,” says Benario, “but Arminius must have delivered a message that the Germans should begin their assault.” The nearest Roman base lay at Haltern, 60 miles to the southwest. So Varus, on the second day, pressed on doggedly in that direction. On the third day, he and his troops were entering a passage between a hill and a huge swamp known as the Great Bog that, in places, was no more than 60 feet wide. As the increasingly chaotic and panicky mass of legionnaires, cavalrymen, mules and carts inched forward, Germans appeared from behind trees and sand-mound barriers, cutting off all possibility of retreat. “In open country, the superbly drilled and disciplined Romans would surely have prevailed,” says Wells. “But here, with no room to maneuver, exhausted after days of hit-and-run attacks, unnerved, they were at a crippling disadvantage.” Varus understood that there was no escape. Rather than face certain torture at the hands of the Germans, he chose suicide, falling on his sword as Roman tradition prescribed. Most of his commanders followed suit, leaving their troops leaderless in what had become a killing field. “An army unexcelled in bravery, the first of Roman armies in discipline, in energy, and in experience in the field, through the negligence of its general, the perfidy of the enemy, and the unkindness of fortune. . . . was exterminated almost to a man by the very enemy whom it had always slaughtered like cattle,” according to the A.D. 30 account of Velleius Paterculus, a retired military officer who may have known both Varus and Arminius. 
Only a handful of survivors managed somehow to escape into the forest and make their way to safety. The news they brought home so shocked the Romans that many ascribed it to supernatural causes, claiming a statue of the goddess Victory had ominously reversed direction. The historian Suetonius, writing a century after the battle, asserted that the defeat “nearly wrecked the empire.” Roman writers, says Wells, “were baffled by the disaster.” Though they blamed the hapless Varus, or the treachery of Arminius, or the wild landscape, in reality, says Wells, “the local societies were much more complex than the Romans thought. They were an informed, dynamic, rapidly changing people, who practiced complex farming, fought in organized military units, and communicated with each other across very great distances.” More than 10 percent of the entire imperial army had been wiped out—the myth of its invincibility shattered. In the wake of the debacle, Roman bases in Germany were hastily abandoned. Augustus, dreading that Arminius would march on Rome, expelled all Germans and Gauls from the city and put security forces on alert against insurrections. Six years would pass before a Roman army would return to the battle site. The scene the soldiers found was horrific. Heaped across the field at Kalkriese lay the whitening bones of dead men and animals, amid fragments of their shattered weapons. In nearby groves they found “barbarous altars” upon which the Germans had sacrificed the legionnaires who surrendered. Human heads were nailed everywhere to trees. 
In grief and anger, the aptly named Germanicus, the Roman general leading the expedition, ordered his men to bury the remains, in the words of Tacitus, “not a soldier knowing whether he was interring the relics of a relative or a stranger, but looking on all as kinsfolk and of their own blood, while their wrath rose higher than ever against the foe.” Germanicus, ordered to campaign against the Cherusci, still under the command of Arminius, pursued the tribe deep into Germany. But the wily chieftain retreated into the forests, until, after a series of bloody but indecisive clashes, Germanicus fell back to the Rhine, defeated. Arminius was “the liberator of Germany,” Tacitus wrote, “a man who, . . . threw down the challenge to the Roman nation.” For a time, tribes flocked to join Arminius’ growing coalition. But as his power grew, jealous rivals began to defect from his cause. He “fell by the treachery of his relatives,” Tacitus records, in A.D. 21. With the withdrawal of the Romans from Germany, the Kalkriese battlefield was gradually forgotten. Even the Roman histories that recorded the debacle were lost, sometime after the fifth century, during the collapse of the empire under the onslaught of barbarian invasions. But in the 1400s, humanist scholars in Germany rediscovered the works of Tacitus, including his account of Varus’ defeat. As a consequence, Arminius was hailed as the first national hero of Germany. “The myth of Arminius,” says Benario, “helped give Germans their first sense that there had been a German people that transcended the hundreds of small duchies that filled the political landscape of the time.” By 1530, even Martin Luther praised the ancient German chieftain as a “war leader” (and updated his name to “Hermann”). Three centuries later, Heinrich von Kleist’s 1809 play, Hermann’s Battle, invoked the hero’s exploits to encourage his countrymen to fight Napoleon and his invading armies. 
By 1875, as German militarism surged, Hermann had been embraced as the nation’s paramount historical symbol; a titanic copper statue of the ancient warrior, crowned with a winged helmet and brandishing his sword menacingly toward France, was erected on a mountaintop 20 miles south of Kalkriese, near Detmold, where many scholars then believed the battle took place. At 87 feet high, and mounted on an 88-foot stone base, it was the largest statue in the world until the Statue of Liberty was dedicated in 1886. Not surprisingly, the monument became a popular destination for Nazi pilgrimages during the 1930s. But the actual location of the battle remained a mystery. More than 700 sites, ranging from the Netherlands to eastern Germany, were proposed. Amateur archaeologist Tony Clunn of Britain’s Royal Tank Regiment was hoping for a chance to indulge his interest when he arrived at his new posting in Osnabrück in the spring of 1987. (He had previously assisted archaeologists in England during his spare time, using a metal detector to search for traces of Roman roads.) Captain Clunn introduced himself to the director of the Osnabrück museum, Wolfgang Schlüter, and asked him for guidance. The British officer promised to turn over to the museum anything he found. “In the beginning, all I had ever hoped to find was the odd Roman coin or artifact,” Clunn, who retired from the army with the rank of major in 1996, told me, as we sat drinking tea in a café next to the Varusschlacht (Varus Battle) Museum and Park Kalkriese, which opened in 2002. Schlüter had suggested that he try the rural Kalkriese area, where a few coins had already been found. Clunn planned his assault with a soldier’s eye to detail. He pored over old maps, studied regional topography and read extensively about the battle, including a treatise by 19th-century historian Theodor Mommsen, who had speculated that it took place somewhere near Kalkriese, although few agreed with him. 
As Clunn drove around Kalkriese in his black Ford Scorpio, introducing himself to local farmers, he saw a landscape that had changed significantly since Roman times. Forests of oak, alder and beech had long since given way to cultivated fields and copses of pine. Stolid modern farm buildings with red-tile roofs stood in place of the huts of the ancient tribesmen. The Great Bog itself had disappeared, drained in the 19th century; it was now bucolic pastureland. Using an old hand-drawn map he got from a local landowner, Clunn noted the locations of earlier coin finds. “The secret is to look for the easy route that people would have taken in ancient times,” he says. “No one wants to dig a lot of unnecessary holes in the ground. So you look for the most logical spot to start searching—for example, a pass where a trail might narrow, a bottleneck.” Clunn focused on the area between where the Great Bog had been and Kalkriese Hill. As he walked, sweeping his metal detector from side to side, he noticed a slight elevation. “I sensed it was an old trackway, perhaps a path across the bog,” he says. He began following the elevation, working backward toward the hills. Before long, a ringing in his earphones indicated metal in the earth. He bent over, carefully cut away a small square of turf with a trowel, and began to dig, sifting the peaty soil through his fingers. He dug down about eight inches. “Then I saw it!” Clunn exclaims. In his hand lay a small, round silver coin, blackened with age—a Roman denarius, stamped on one side with the aquiline features of Augustus, and on the other, with two warriors armed with battle shields and spears. “I could scarcely believe it,” he says. “I was transfixed.” Soon he found a second denarius, then a third. Who lost these? he asked himself, and what had the coin carrier been doing—running, riding, walking? 
Before Clunn left the area for the day, he carefully logged the location of the coins on his grid map, sealed them in plastic pouches and restored the clods of dirt. The next time Clunn returned to Kalkriese, his metal detector signaled another find: at a depth of about a foot, he discovered another denarius. This one, too, bore a likeness of Augustus on one side, and on the other, a bull with head lowered, as if about to charge. By the end of the day, Clunn had unearthed no fewer than 89 coins. The following weekend, he found still more, for a total of 105, none minted later than the reign of Augustus. The vast majority were in pristine condition, as if they had been little circulated when they were lost. In the months that followed, Clunn continued his explorations, always turning over his finds to Schlüter. Along with coins, he discovered shards of lead and bronze, nails, fragments of a groma (a distinctive Roman road-surveying device) and three curious ovoid pieces of lead that German scholars identified as sling shot. “Slowly but surely a cohesive pattern began to emerge,” says Clunn. “There was every indication that a large contingent of people had splayed out from the area at the apex to the field, fleeing from an unknown horror.” Clunn began to suspect that he had found what was left of Varus’ lost legions. Thanks to Schlüter’s contacts in German academia, the site was recognized, almost immediately, as a major discovery. Professional archaeologists under the direction of Schlüter and, later, Wilbers-Rost undertook systematic excavations. They were fortunate: sometime in the past, local farmers had covered the poor sandy subsoil with a thick layer of sod that had protected the undiscovered artifacts below. 
Since the early 1990s, excavations have located battle debris along a corridor almost 15 miles long from east to west, and a little more than 1 mile from north to south, offering additional proof that the battle unfolded over many miles, before reaching its dreadful climax at Kalkriese. Perhaps the most important single discovery was evidence of a wall 4 feet high and 12 feet thick, built of sand and reinforced by chunks of sod. “Arminius learned much from his service with the Romans,” says Wilbers-Rost. “He knew their tactics and their weak points. The wall zigzagged so that the Germans on top of it could attack the Romans from two angles. They could stand on the wall, or rush out through gaps in it to attack the Roman flank, and then run back behind it for safety.” Concentrations of artifacts were found in front of the wall, suggesting that the Romans had tried to scale it. The dearth of objects behind it testifies to their failure to do so. The more the archaeologists excavated, the more they appreciated the immensity of the massacre. Clearly, Arminius and his men had scoured the battlefield after the slaughter and carried off everything of value, including Roman armor, helmets, gold and silver, utensils and weapons. Most of what archaeologists have unearthed consists of items the victors failed to notice, or dropped as they looted. Still, there have been some spectacular finds, including the remnants of a Roman officer’s scabbard and, most notably, a Roman standard-bearer’s magnificent silver face mask. They also uncovered coins stamped with the letters “VAR,” for Varus, which the ill-fated commander had awarded his troops for meritorious service. In all, Wilbers-Rost’s team has found more than 5,000 objects: human bones (including several skulls gruesomely split by swords), spearheads, bits of iron, harness rings, metal studs, pieces of armor, iron nails, tent pegs, scissors, bells that once hung from the necks of Roman mules, a wine strainer and medical instruments. 
Many of these objects, cleaned and restored, are on display in the museum at the site. (Archaeologists also found fragments of bombs that Allied planes dropped on the area during World War II.) Clunn, now 59, still works, as a staff officer, for the British military in Osnabrück. One recent afternoon, amid intermittent cloudbursts, he and I drove east from Kalkriese along the route that Varus’ army most likely followed on the last day of its harrowing march. We stopped at a low hill on the outskirts of the village of Schwagstorf. From the car, I could barely detect the rise in the ground, but Clunn assured me that this was the highest spot in the vicinity. “It’s the only place that offers any natural defense,” he said. Here, he has found the same types of coins and artifacts that have been unearthed at Kalkriese; he hopes that future excavations will determine that the battered Roman forces attempted to regroup here shortly before they met their doom. As we stood at the edge of a traffic circle and gazed across a cornfield, he added: “I’m convinced that this is the site of Varus’ last camp.”
https://www.smithsonianmag.com/history/the-art-and-science-of-embarrassing-art-162978657/
The Art and Science of Embarrassing Art
German Expressionist art is not easy to appreciate. It can be embarrassing, which is probably the point. Three artists living in Vienna at the turn of the century (~1880-1920), Gustav Klimt, Oskar Kokoschka and Egon Schiele, were instrumental in moving art away from the goal of producing something beautiful towards the goal of expressing and evoking thoughts and emotions that were (and usually still are) considered inappropriate for public display. Not coincidentally, the same milieu also produced Sigmund Freud. How Western art gradually approached realistic representationalism is not inherently interesting, but why and how a group of artists living at the same time and in the same city as Freud undertook to portray unconscious emotions is. To comprehend this movement in art, it helps to appreciate the intellectual climate of fin de siècle Vienna, understand the neurobiology of emotion, and know how we perceive both art and emotion. This is a huge challenge, but Eric Kandel, in The Age of Insight, has undertaken this task, with very satisfying and enlightening results. Kandel’s expertise in the realm of neuroscience is unsurpassed: he wrote an excellent textbook on neuroscience and won a Nobel Prize for his neuroscience research. He was trained as a psychiatrist. He is a professor of neuroscience, not art history, but his personal connection to Vienna inspired him to explore the cultural and artistic ideas out of which Viennese Modernism emerged. He was born in Vienna in 1923 to a Jewish intellectual family: “I was forced to leave Vienna as a child, but the intellectual life of turn-of-the-century Vienna is in my blood,” he writes. “My heart beats in three-quarter time.” This book is thus a synergy between the passion and the intellect of a great mind. To whet your appetite: Berta Zuckerkandl’s salon regularly brought together artists, scientists and writers. 
She was a writer and an art critic, married to Emil Zuckerkandl, the Chairman of Anatomy at the Vienna School of Medicine. Klimt invited Emil to give a series of lectures on biology and anatomy to a group of his artist friends, in which he was reported to have wowed his audience by projecting lantern slides of microscopic sections of tissues and cells. So those decorative things in Klimt’s portraits that look like cells, sperm and embryological forms really are just that. Kandel traces the cross-fertilization of ideas among the intellectual circles of Vienna around 1900. Richard von Krafft-Ebing, Chair of Psychiatry at the Vienna School of Medicine, put forward the idea that sexuality influences everyday behavior. Later Freud developed his theory that powerful forces of aggression and sexuality can influence behavior without entering conscious awareness. Freud himself tried, somewhat unsuccessfully, to understand the art of both Michelangelo and Leonardo da Vinci in terms of their relationships with their mothers and their adult erotic attachments; his attempts nevertheless encouraged others at the Vienna School of Art History to formally develop a cognitive psychology of art. Simultaneously with Freud’s publication of The Interpretation of Dreams, the Viennese writer Arthur Schnitzler introduced the interior monologue, or stream of consciousness, by which a protagonist’s innermost thoughts and feelings are exposed. Margaret S. Livingstone, PhD, is a Professor of Neurobiology at Harvard Medical School.
https://www.smithsonianmag.com/history/the-ax-murderer-who-got-away-117037374/
The Ax Murderer Who Got Away
Shortly after midnight on June 10, 1912—one hundred years ago this week—a stranger hefting an ax lifted the latch on the back door of a two-story timber house in the little Iowa town of Villisca. The door was not locked—crime was not the sort of thing you worried about in a modestly prosperous Midwest settlement of no more than 2,000 people, all known to one another by sight—and the visitor was able to slip inside silently and close the door behind him. Then, according to a reconstruction attempted by the town coroner the next day, he took an oil lamp from a dresser, removed the chimney and placed it out of the way under a chair, bent the wick in two to minimize the flame, lit the lamp, and turned it down so low it cast only the faintest glimmer in the sleeping house. Still carrying the ax, the stranger walked past one room in which two girls, ages 12 and 9, lay sleeping, and slipped up the narrow wooden stairs that led to two other bedrooms. He ignored one, in which four more young children were sleeping, and crept into the room in which 43-year-old Joe Moore lay next to his wife, Sarah. Raising the ax high above his head—so high it gouged the ceiling—the man brought the flat of the blade down on the back of Joe Moore’s head, crushing his skull and probably killing him instantly. Then he struck Sarah a blow before she had time to wake or register his presence. The Moore house in Villisca, one of the town’s larger and better-appointed properties, still stands today and has been turned into Villisca’s premier tourist attraction. For a price, visitors can stay in the house overnight; there is no shortage of interested parties. Leaving the couple dead or dying, the killer went next door and used the ax—Joe’s own, probably taken from where it had been left in the coal shed—to kill the four Moore children as they slept. 
Once again, there is no evidence that Herman, 11; Katherine, 10; Boyd, 7; or Paul, 5, woke before they died. Nor did the assailant or any of the four children make sufficient noise to disturb Katherine’s two friends, Lena and Ina Stillinger, as they slept downstairs. The killer then descended the stairs and took his ax to the Stillinger girls, the elder of whom may finally have awakened an instant before she, too, was murdered. What happened next marked the Villisca killings as truly peculiar and still sends shivers down the spine a century after the fact. The ax man went back upstairs and systematically reduced the heads of all six Moores to bloody pulp, striking Joe alone an estimated 30 times and leaving the faces of all six members of the family unrecognizable. He then drew up the bedclothes to cover Joe and Sarah’s shattered heads, placed a gauze undershirt over Herman’s face and a dress over Katherine’s, covered Boyd and Paul as well, and finally administered the same terrible postmortem punishment to the girls downstairs before touring the house and ritually hanging cloths over every mirror and piece of glass in it. At some point the killer also took a two-pound slab of uncooked bacon from the icebox, wrapped it in a towel, and left it on the floor of the downstairs bedroom close to a short piece of key chain that did not, apparently, belong to the Moores. He seems to have stayed inside the house for quite some time, filling a bowl with water and–some later reports said–washing his bloody hands in it. Some time before 5 a.m., he abandoned the lamp at the top of the stairs and left as silently as he had come, locking the doors behind him. Taking the house keys, the murderer vanished as the Sunday sun rose red in the sky. Lena and Ina Stillinger. Lena, the elder of the girls, was the only one who may have awoken before she died. 
The Moores were not discovered until several hours later, when a neighbor, worried by the absence of any sign of life in the normally boisterous household, telephoned Joe’s brother, Ross, and asked him to investigate. Ross found a key on his chain that opened the front door, but barely entered the house before he came rushing out again, calling for Villisca’s marshal, Hank Horton. That set in train a sequence of events that destroyed what little hope there may have been of gathering useful evidence from the crime scene. Horton brought along Drs. J. Clark Cooper and Edgar Hough and Wesley Ewing, the minister of the Moores’ Presbyterian congregation. They were followed by the county coroner, L.A. Linquist, and a third doctor, F.S. Williams (who became the first to examine the bodies and estimate a time of death). When a shaken Dr. Williams emerged, he cautioned members of the growing crowd outside: “Don’t go in there, boys; you’ll regret it until the last day of your life.” Many ignored the advice; as many as 100 curious neighbors and townspeople tramped as they pleased through the house, scattering fingerprints, and in one case even removing fragments of Joe Moore’s skull as a macabre keepsake. The murders convulsed Villisca, particularly after a few clumsy and futile attempts to search the surrounding countryside for a transient killer failed to unearth a likely suspect. The simple truth was that there was no sign of the murderer’s whereabouts. He might have vanished back into his own home nearby; equally, given a head start of up to five hours in a town at which nearly 30 trains called every day, he might easily have made good his escape. Bloodhounds were tried without success; after that there was little for the townspeople to do but gossip, swap theories–and strengthen their locks. By sundown there was not a dog to be bought in Villisca at any price. 
Dona Jones, daughter-in-law of Iowa state senator Frank Jones, was widely rumored in Villisca to have had an affair with Joe Moore. The most obvious suspect may have been Frank Jones, a tough local businessman and state senator who was also a prominent member of Villisca’s Methodist church. Edgar Epperly, the leading authority on the murders, reports that the town quickly split along religious lines, the Methodists insisting on Jones’s innocence and the Moores’ Presbyterian congregation convinced of his guilt. Though never formally charged with any involvement in the murders, Jones became the subject of a grand jury investigation and a prolonged campaign to prove his guilt, which destroyed his political career. Many townspeople were certain he used his considerable influence to have the case against him quashed. There were at least two compelling reasons to believe that Jones had nursed a hatred of Joe Moore. First, the dead man had worked for him for seven years, becoming the star salesman of Jones’s farm-equipment business. But Moore had left in 1907–dismayed, perhaps, by his boss’s insistence on hours of 7 a.m. to 11 p.m., six days a week—and set himself up as a head-to-head rival, taking the valuable John Deere account with him. Worse, he was also believed to have slept with Jones’s vivacious daughter-in-law, a local beauty whose numerous affairs were well known in town thanks to her astonishingly indiscreet habit of arranging trysts over the telephone at a time when all calls in Villisca had to be placed through an operator. By 1912 relations between Jones and Moore had grown so cold that they began to cross the street to avoid each other, an ostentatious sign of hatred in such a minuscule community. The Reverend Lyn Kelly, a markedly peculiar Presbyterian preacher, attended the Children’s Day service in Villisca at which the Moore children gave recitations, and later confessed to murdering the family—only to recant and claim police brutality. 
Few people in Villisca believed that a man of Jones’s age and eminence—he was 57 in 1912—would have swung the ax himself, but in some minds he was certainly capable of paying someone else to wipe out Moore and his family. That was the theory of James Wilkerson, an agent of the renowned Burns Detective Agency, who in 1916 announced that Jones had hired a killer by the name of William Mansfield to murder the man who had humiliated him. Wilkerson—who made enough of a nuisance of himself to derail Jones’s attempts to secure re-election to the state senate, and who eventually succeeded in having a grand jury convened to consider the evidence he had gathered–was able to show that Mansfield had the right sort of background for the job: In 1914 he was the chief suspect in the ax murders of his wife, her parents and his own child in Blue Island, Illinois. Unfortunately for Wilkerson, Mansfield turned out to have a cast-iron alibi for the Villisca killings. Payroll records showed that he had been working several hundred miles away in Illinois at the time of the murders, and he was released for lack of evidence. That did not stop many locals—including Ross Moore and Joe Stillinger, father of the two Stillinger girls—from believing in Jones’s guilt. The rancor caused by Wilkerson lingered on in the town for years. The advert that Lyn Kelly placed in the Omaha World-Herald. One respondent received a “lascivious” multi-page reply which told her she would be required to type in the nude. For others, though, there was a far stronger–and far stranger–candidate for the ax man. His name was Lyn George Jacklin Kelly, and he was an English immigrant, a preacher and a known sexual deviant with well-recorded mental problems. He had been in the town on the night of the murders and freely admitted that he had left on a dawn train just before the bodies were discovered. 
There were things about Kelly that made him seem an implausible suspect—not least that he stood only 5-foot-2 and weighed 119 pounds—but in other ways he fit the bill. He was left-handed, and Coroner Linquist had determined from an examination of blood spatters in the murder house that the killer probably swung his ax that way. Kelly was obsessed with sex, and had been caught peering into windows in Villisca two days before the murders. In 1914, living in Winner, South Dakota, he would advertise for a “girl stenographer” to do “confidential work,” and that ad, placed in the Omaha World-Herald, would also specify that the successful candidate “must be willing to pose as model.” When a young woman named Jessamine Hodgson responded, she received in return a letter, described by a judge as “so obscene, lewd, lascivious and filthy as to be offensive to this honorable court and improper to be spread upon the record thereof.” Amongst his milder instructions, Kelly told Hodgson that she would be required to type in the nude. Convicted ax murderer Henry Lee Moore was the suspect favored by Department of Justice Special Agent Matthew McClaughry–who believed he committed a total of nearly 30 similar murders across the Midwest in 1911-12. Investigation soon made plain that there were links between Lyn Kelly and the Moore family. Most sinister, for those who believed in the little preacher’s guilt, was the fact that Kelly had attended the Children’s Day service held at Villisca’s Presbyterian church on the evening of the murders. The service had been organized by Sarah Moore, and her children, together with Lena and Ina Stillinger, had played prominent parts, dressed up in their Sunday best. Many in Villisca were willing to believe that Kelly had spotted the family in the church and become obsessed with them, and that he had spied on the Moore household as it went to bed that evening. 
The idea that the killer had lain in wait for the Moores to go to sleep was supported by some evidence; Linquist’s investigation had revealed a depression in some bales of hay stored in the family barn, and a knot hole through which the murderer could have watched the house while reclining in comfort. That Lena Stillinger had been found wearing no underwear and with her nightdress drawn up past her waist did suggest a sexual motive, but doctors found no evidence of that sort of assault. It took time for the case against Kelly to get anywhere, but in 1917 another grand jury finally assembled to hear the evidence linking him with Lena’s murder. At first glance, the case against Kelly seemed compelling; he had sent bloody clothing to the laundry in nearby Macedonia, and an elderly couple recalled meeting the preacher when he alighted from a 5.19 a.m. train from Villisca that June 10 and being told that gruesome murders had been committed in the town—a hugely incriminating statement, since the preacher had left Villisca three hours before the killings were discovered. It also emerged that Kelly had returned to Villisca a week later and shown great interest in the murders, even posing as a Scotland Yard detective to obtain a tour of the Moore house. Arrested in 1917, the Englishman was repeatedly interrogated and eventually signed a confession to the murder in which he stated: “I killed the children upstairs first and the children downstairs last. I knew God wanted me to do it this way. ‘Slay utterly’ came to my mind, and I picked up the axe, went into the house and killed them.” This he later recanted, and the couple who claimed to have spoken to him on the morning after the murders changed their story. With little left to tie him firmly to the killings, the first grand jury to hear Kelly’s case hung 11-1 in favor of refusing to indict him, and a second panel freed him. 
Rollin and Anna Hudson were the victims of an ax murderer in Paola, Kansas, just five days before the Villisca killings. Perhaps the strongest evidence that both Jones and Kelly were most likely innocent came not from Villisca itself but from other communities in the Midwest, where, in 1911 and 1912, a bizarre chain of ax murders seemed to suggest that a transient serial killer was at work. The researcher Beth Klingensmith has suggested that as many as 10 incidents that occurred close to railway tracks but in locations as far apart as Rainier, Washington, and Monmouth, Illinois, might form part of this chain, and in several cases there are striking similarities to the Villisca crime. The pattern, first pointed out in 1913 by Special Agent Matthew McClaughry of the Justice Department’s Bureau of Investigation (forerunner of the FBI), began with the murder of a family of six in Colorado Springs in September 1911 and continued with two further incidents in Monmouth (where the murder weapon was actually a pipe) and in Ellsworth, Kansas. Three and five people died in those attacks, and two more in Paola, Kansas, where someone murdered Rollin Hudson and his unfaithful wife just five days before the killings in Villisca. As far as McClaughry was concerned, the slaughter culminated in December 1912 with the brutal murders of Mary Wilson and her daughter Georgia Moore in Columbia, Missouri. His theory was that Henry Lee Moore, Georgia’s son and a convict with a history of violence, was responsible for the whole series. It is not necessary to believe that Henry Lee Moore was a serial killer to consider that the string of Midwest ax murders has intriguing similarities that may tie the Villisca massacre to other crimes. 
Moore is now rarely considered a good suspect; he was certainly an unsavory character—released from a reformatory in Kansas shortly before the ax murders began, arrested in Jefferson City, Missouri, shortly after they ended, and eventually convicted of the Columbia murders. But his motive in that case was greed–he planned to obtain the deeds to his family house–and it is rare for a wandering serial killer to return home and kill his own family. Nonetheless, analysis of the sequence of murders—and several others that McClaughry did not consider—yields some striking comparisons. Blanche Wayne, of Colorado Springs, may have been the first victim of a Midwest serial murderer. She was killed in her bed in September 1911 by an ax man who heaped bedclothes on her head and stopped to wash his hands, leaving the weapon at the scene. The use of an ax in almost every case was perhaps not so remarkable in itself; while there certainly was an unusual concentration of ax killings in the Midwest at this time, almost every family in rural districts owned such an implement, and often left it lying in their yard; as such, it might be considered a weapon of convenience. Similarly, the fact that the victims died asleep in their beds was likely a consequence of the choice of weapon; an ax is nearly useless against a mobile target. Yet other similarities among the crimes are much harder to explain away. In eight of the 10 cases, the murder weapon was found abandoned at the scene of the crime; in as many as seven, there was a railway line nearby; in three, including Villisca, the murders took place on a Sunday night. Just as significant, perhaps, four of the cases—Paola, Villisca, Rainier and a solitary murder that took place in Mount Pleasant, Iowa—featured killers who covered their victims’ faces, three murderers had washed at the scene, and at least five of the killers had lingered in the murder house. 
Perhaps most striking of all, two other homes (those of the victims of the Ellsworth and Paola murders) had been lit by lamps in which the chimney had been laid aside and the wick bent down, just as it had been at Villisca. Whether or not all these murders really were connected remains a considerable puzzle. Some pieces of evidence fit patterns, but others do not. How, for example, might a stranger to Villisca have so unerringly located Joe and Sarah Moore’s bedroom by low lamp light, ignoring the children’s rooms until the adults were safely dead? On the other hand, the use of the flat of the ax blade to strike the fatal initial blows does suggest the murderer had previous experience–any deep cut made with the sharp edge of the blade was more likely to result in the ax becoming lodged in the wound, making it far riskier to attack a sleeping couple. And the Paola murders have striking similarities with Villisca aside from the killer’s use of a carefully adapted lamp; in both cases, for example, odd incidents occurred the same night that suggest the killer may have attempted to strike twice. In Villisca, at 2.10 a.m. on the night of the murder, telephone operator Xenia Delaney heard strange footsteps approaching up the stairs, and an unknown hand tried her locked door, while in Paola, a second family was awakened in the dead of night by a sound that turned out to be a lamp chimney falling to the floor. Rising hurriedly, the occupants of that house were in time to see an unknown man escaping through a window. Perhaps the spookiest of all such similarities, however, was the strange behavior of the unknown murderer of William Showman, his wife, Pauline, and their three children in Ellsworth, Kansas in October 1911. In the Ellsworth case, not only was a chimneyless lamp used to illuminate the murder scene, but a little heap of clothing had been placed over the Showmans’ telephone. 
A Western Electric Model 317 telephone, one of the most popular on sale in the Midwest in 1911-12. Note the phone’s startlingly “human” features. Why bother to muffle a phone that was highly unlikely to ring at one in the morning? Perhaps, as one student of the murders posits, for the same reason that the Villisca murderer took such great pains to cover the faces of his victims, and then went around the murder house carefully draping torn clothing and cloths over all the mirrors and all the windows: because he feared that his dead victims were somehow conscious of his presence. Might the Ellsworth killer have covered the telephone out of the same desperate desire to ensure that, nowhere in the murder house, was there a pair of eyes still watching him? Sources Beth H. Klingensmith. “The 1910s Ax Murders: An Overview of the McClaughry Theory.” Emporia State University Research Seminar, July 2006; Nick Kowalczyk. “Blood, Gore, Tourism: The Ax Murderer Who Saved a Small Town.” Salon.com, April 29, 2012; Roy Marshall. Villisca: The True Account of the Unsolved Mass Murder That Stunned The Nation. Chula Vista: Aventine Press, 2003; Omaha World-Herald, June 11, 12, 13, 14, 15, 16, 17, 1912; December 27, 1913; June 10, 2012. Several bloggers offer thoughtful insights into the Midwest ax murders. For the Villisca case, The 1912 Villisca Axe Murders Blog is a good place to start, and there was also occasional coverage at CLEWS. Meanwhile, Getting the Axe covers the whole apparent sequence of 1911-12 ax killings, with only a minor focus on the Villisca case itself. Mike Dash is a contributing writer in history for Smithsonian.com. Before Smithsonian.com, Dash authored the award-winning blog A Blast From the Past.
ca450e28aa4556570684c4fc454c6383
https://historynewsnetwork.org/article/179718
The Roundup Top Ten for March 26, 2021
The Roundup Top Ten for March 26, 2021 I Don’t Want My Role Models Erased by Elizabeth Becker The work of women journalists covering the war in Vietnam has been obscured in remembrance of the war and its place in American history and culture. The author seeks to recover the stories of Frances FitzGerald, Kate Webb and Catherine Leroy. Can a Grand Bargain Empower Amazon’s Workers and Limit Corporate Power? by Nelson Lichtenstein "Unions are weaker today than they were in the 1930s, but the idea that wages have to rise and democracy has to be revitalized, in the workplace and beyond, is returning in an echo of that era." Letters From an American: March 23, 2021 by Heather Cox Richardson Beginning in the 1970s, the National Rifle Association evolved into a political lobbying organization increasingly enmeshed with the conservative movement. Two recent mass shootings are a tribute to the organization's success. Congratulations. The Battle Against D.C. Statehood is Rooted in Anti-Black Racism by Kyla Sommers "The continued power of Congress over the District’s affairs is rooted in this same fear of Black power and racist belief that a majority-non-White populace is incapable of independently governing itself." The Immovable AMLO by Humberto Beck, Carlos Bravo Regidor and Patrick Iber "AMLO continues to decry the faults of neoliberalism, but his government is, for the most part, failing to build an effective alternative to it." How the U.S. Tax Code Privileges White Families by Dorothy A. Brown The history of the married-filing-jointly tax return is one of affluent white families securing advantages through the tax code that working class families, including most Black taxpayers, were unable to realize. After the expansion of income taxation during World War II, this disparity became a significant source of inequality. 
We Need Social Science, Not Just Medical Science, to Beat the Pandemic by Nicholas Dirks "In order to ensure that scientific advances work not just to create new medicines but to help lead to a healthier and more just world, we need to ensure that science and social science work hand in hand as well." The Nazi-Fighting Women of the Jewish Resistance by Judy Batalion "I was raised in a community of Holocaust survivors and had earned a doctorate in women’s history. Why had I never heard these stories?" Medical Racism has Shaped U.S. Policies for Centuries by Deirdre Cooper Owens Medical racism is as old as America, and the COVID-19 pandemic has been no exception in terms of unequal vulnerability to disease. The Triangle Fire and the Fight for $15 by Christopher C. Gorham The Triangle Shirtwaist fire inspired workplace safety regulation and advanced the cause of organized labor. It's time to remember the victims with a commitment to a federal living wage law.
741178390a88f760f009567dde658d40
https://historynewsnetwork.org/article/179720
Will the Supreme Court Uphold the NCAA's Version of Amateurism?
Will the Supreme Court Uphold the NCAA's Version of Amateurism? EA Sports has, to the chagrin of many gamers, not produced an NCAA-licensed football video game since 2013. The video game market is just one area of dispute over the right of collegiate athletes to compensation for the commercial use of their names, images, or likenesses (NIL). Image: Sports Gamers Online. On March 31, 2021, the U. S. Supreme Court will hear the case of NCAA v. Alston.  It is an antitrust case in which the NCAA argues that the property rights of Division I basketball and FBS football athletes should be dismissed because college athletes are amateurs.  If the NCAA wins, college athletes will continue to be deprived of financial benefit from the commercial use of their names, images, and likenesses (NILs).  The NCAA argues that it must control what players are paid to protect their amateurism.  Shawn Alston, as part of a class action suit, argues that the NCAA is violating antitrust law, and that property rights belong to the athletes as they do for all other college students, and they should be able to profit from their use. As I have just completed a book, The Myth of the Amateur:  A History of College Athletic Scholarships (University of Texas Press, 2021), I decided to initiate an amicus brief for the U. S. Supreme Court, challenging the NCAA’s perpetuation of the myth of college amateurism.  I asked five other historians who have written on the history of intercollegiate athletics to join me in the amicus brief. Taylor Branch wrote a piece in the Atlantic Monthly (October 2011) about the exploitation of college athletes under NCAA rules.  He is better known for the Pulitzer Prize he won for Parting the Waters, the first book in his trilogy on Martin Luther King.  Richard Crepeau is a historian at the University of Central Florida, who has written on Roman Catholic athletes as well as a recent history of the National Football League.  
Sarah Fields, a lawyer and historian at the University of Colorado, Denver, has written a book about female competitors in contact sports and one on sports celebrity and the law.  Jay Smith is a French history scholar at the University of North Carolina, Chapel Hill, but has written on the decades-long disgrace of the notorious athletic-academic scandal at his institution.  John Thelin is a prominent educational historian at the University of Kentucky, who wrote a history of college athletic reform attempts. I was a history major and a baseball and basketball player at Northwestern University decades ago and was given a scholastic scholarship, but I was told if I did not keep a “B” average it would turn into an athletic scholarship.  I was not good enough to profit from my NIL, nor did anyone on my teams know that it was possible.  Later I did my doctoral work at the University of Wisconsin, writing my dissertation on an athletic conference.  I then joined the Penn State University faculty shortly after Joe Paterno became head football coach.  I have been interested in how athletes have been paid, not only when I was an undergraduate, but when I began researching intercollegiate athletics.  That took me back to the original college contest in America. It took place in 1852, nine years before the Civil War.  It occurred because a railroad magnate wanted to make money from sponsoring a crew meet between Harvard and Yale.  He told the Yale captain that if he would get Harvard athletes to agree, he would pay all expenses for an eight-day vacation for the crews at New Hampshire’s largest lake, Lake Winnipesaukee.  From that time to the present, athletes have often been paid in one form or another. Today, a major question is the paying of athletes for their property rights to use their names, images, and likenesses.  The U. S. 
Supreme Court will soon be pondering the question of whether the NCAA denying such property rights to athletes is a horizontal antitrust violation under the Sherman Antitrust Act of 1890.  The issue should have come up first in the early twentieth century, more than a century ago.  Probably the most prominent player among the big-time football schools of Harvard, Yale, and Princeton was James Hogan, a 29-year-old who was paid in a variety of ways.  He had his tuition paid, lived in the most luxurious of Yale’s dormitories, ate free meals at the University Club, profited on scorecard sales at Yale baseball games, and was given a vacation in Cuba when the season was over.  But more important to the Supreme Court case, Hogan profited from his name and image from every American Tobacco Company pack of “Hogan” cigarettes sold in New Haven.  It was legal then to profit from his NIL. The U. S. Supreme Court will hear of the many ways that big-time college athletes are paid in 2021.  That is, there are more than a dozen methods in which athletes are paid legally beyond a full athletic scholarship, but which don’t violate “amateurism” under NCAA rules.  For instance, Olympic swimming gold medalist Katie Ledecky was awarded more than $300,000 in the last Olympics, and she remained eligible to swim for Stanford University.  An international gold medal swimmer, Joseph Schooling of Singapore, was given $700,000 for beating Michael Phelps in the 100-meter butterfly.  He then swam for the University of Texas as an amateur. The NCAA also allows athletes to draw thousands of dollars from two multi-million dollar funds, the Student Assistance Fund and the Academic Enhancement Fund.  In addition, federal Pell Grants for needy athletes increase athletes’ funding.  The NCAA also allows money to go to the conference athlete of the year, a team’s most-improved or most valuable player, and for bowl-bound or March Madness player freebies worth hundreds of dollars.  
This is not the amateurism that the NCAA claims it is protecting.  Yet, the NCAA opposes a player gaining a portion of the revenue made from game jerseys sold which display his or her name and number (or from video games licensed by the NCAA). That property right is off limits, and a player who seeks to capitalize on it will be classified as a professional and lose eligibility. Our amicus brief points out the NCAA’s hypocrisy by quoting Taylor Branch: “no legal definition of amateur exists, and any attempt to create one in enforceable law would expose its repulsive and unconstitutional nature.”  According to Branch, “without logic or practicality or fairness to support amateurism, the NCAA’s final retreat is to sentiment.”  Historically, we point out the false NCAA claim that amateurism is central to college sport.  NCAA amateurism, originally opposed to any athlete being paid in any form, has been modified so drastically that athletes can now be paid in a variety of ways.  What sets college athletic participation apart from “professional” sports is not that intercollegiate sports are amateur, but that they are part of institutions of higher education.  College sports are, at least nominally, intended to be educational, while professional sports are not. Before a decision is made in the NCAA v. Alston case, the Supreme Court justices should read a brilliant article in the Harvard Law Review published three decades ago.  “Judicial invalidation of the amateurism principle,” the legal experts stated, “may actually allow the NCAA to place more emphasis on academic values in its members’ sports program.”  Six knowledgeable historians agree.
48342b6f7f268100407cd9ee10566f22
https://historynewsnetwork.org/article/179721
"What the Black Man Wants": Frederick Douglass's Answers Still Resonate
"What the Black Man Wants": Frederick Douglass's Answers Still Resonate In April of 1865, Frederick Douglass addressed the Annual Meeting of the Massachusetts Anti-Slavery Society. At forty-four years of age, six feet one inch tall, streaks of gray emerging in his hair, Douglass still radiated strength. He stood resolute before an audience of abolitionists with whom he was popular and respected. With his intense gaze afire in triumph and alertness to an hour of opportunity, trimly bearded but still rakishly handsome, his fierce countenance attracted admirers of many stripes. As always, in this venue, he was interrupted often by applause, laughter, and shouts of approval, as he presented his powerful arguments with clever word play steeped in American popular culture: the Bible, Shakespeare, and the already sacred rhetoric of the Founders. In his prime, Douglass presented perhaps the most striking public figure ever to stride across the American political stage, every bit as compelling as iconic politicians of the television age like JFK, Ronald Reagan, Bill Clinton, or Barack Obama. Douglass advanced his case with a series of questions. WHAT IS FREEDOM? Speaking out against a discriminatory labor policy instituted by the Union Army ostensibly to "prepare ex-slaves to better handle freedom," Douglass called the right to choose one's own employment essential. But the bedrock of true freedom for the freedman would be "immediate, unconditional, and universal enfranchisement." Why suffrage? Some will ask WHY DO YOU WANT IT? Invoking the language of the Declaration, Douglass simply demanded what was his by right. Any deprivation of a natural right reduced the "nature" of men. Voting represented a symbol of equality. As a result of the American founding and democratic evolution, the idea of universal suffrage defined American citizenship. In other political cultures, the denial of the "elective franchise" might do no great violence to a man. 
But, in our system, Douglass argued, disfranchisement equaled inferiority. And there were practical reasons. Beyond the principle of equal rights, it was in the interest of the Federal Government to empower and enlist their Black allies in the ongoing fight to stamp out treason and perpetuate unified constitutional government. Presciently, Douglass predicted the reluctance of the South to accept the verdict of the war, foreseeing that the United States government would find itself an occupying force in a "strange land" surrounded by a "hostile spirit" struggling to maintain order and authority. HOW WILL YOU WIN THE PEACE WITHOUT THE BLACK MAN? Where will you find the strength to overcome the persistent spirit of the Southern rebellion? The North would need its wise and faithful Black allies, who had understood the war and its ultimate aim clearly from the beginning, far better than the North itself. African Americans had voted with their bodies, had been impervious to danger, and supported the cause of Union and freedom stubbornly and courageously as the war hung in the balance--and, truly, Douglass asserted, going forward, they represented "our only friends in the South." To the question of INFERIORITY, Douglass acknowledged the disadvantaged political condition of Black people in America, but, asserting once again a natural right claim, he denied inferiority in any original, natural, or practical sense--pronouncing African Americans equal to "anybody on this globe." Douglass reminded his audience that slavery and oppression, historically, did not equal a racial condition but rather a function of circumstance. Were not the "blue-eyed and fair-haired Anglo-Saxons considered inferior by the haughty Normans"? "You were down then," Douglass reminded his fellow abolitionists to howls of laughter and applause. "You are up now. I am glad you are up, and I want you to be glad to help us up also." "The story of our inferiority is an old dodge," Douglass continued. 
A rationale to explain political interest. When our "Manifest Destiny demanded a slice of Mexico," we hinted the Mexicans were an inferior race. When Russia coveted parts of the Ottoman Empire, or the British wanted more authority in Ireland, the people in their way were an inferior race. "You say we are ignorant; I admit it." But if African Americans knew enough to be hung, they knew enough to vote. If they knew enough to fight for the flag, they knew enough to vote. If they knew enough to pay taxes, they knew enough to vote. With another call back to the American Revolution, Douglass proclaimed to his Boston audience, "taxation and representation should go together." And, of course, never one to pass up a swipe at the immigrants from the Emerald Isle, "if he knows as much when he is sober as an Irishman knows when he is drunk, he knows enough to vote, on good American principles."

WHAT DOTH IT PROFIT A NATION IF IT GAIN THE WHOLE WORLD, AND LOSE ITS SOUL? In addition to a practical need for African Americans to accomplish a successful reconstruction of the South, what about HONOR? What about JUSTICE? Douglass: You asked African Americans to "incur the enmity of their masters." You induced us to "turn against the South in favor of the North; to shoot down the Confederacy and uphold the American flag." The white people of the South will hate us for generations. "You have called upon us to expose ourselves to all the subtle machinations of their malignity for all time."

DO YOU NOW INTEND TO SACRIFICE YOUR FRIENDS IN FAVOR OF YOUR ENEMIES? WILL YOU GIVE YOUR ENEMIES THE RIGHT TO VOTE AND TAKE IT AWAY FROM YOUR FRIENDS? We responded to your call to arms (like we did in 1776 and 1812). "In time of trouble we are citizens. Shall we be citizens in war, and aliens in peace?" 
Noting a proliferation of benevolence societies to aid African Americans, Douglass observed, "the American people are disposed to be more generous than just." But, once again asserting a natural right claim, Douglass wondered, now that you are inarguably aware "we are men," will you deny us the "possession of all our rights?" Repudiating the poor substitutes of benevolence, pity, or sympathy, Douglass simply demanded justice.

WHAT SHALL WE DO WITH THE NEGRO? "Do nothing with us," Douglass suggested. Leave African Americans alone. Give them a chance to be men. "If you see him on his way to school, leave him alone; don't disturb him," Douglass entreated. Similarly, if you saw a Black man having dinner at a hotel, or casting a ballot, or practicing his craft, just let him be. Allow him to pursue his inalienable rights in peace. If the Black man failed, surely it would be the fault of his Maker and perhaps give the lie to the universal principle of the American founding. But, Douglass was certain, if given a chance, if unbound, if allowed to succeed on his own, the Black man would prove himself equipped for citizenship and success just as much as the white man. The war [in which 200,000 African Americans served with distinction in the Union Army] swept away a "great many delusions," Douglass reminded them.

What does Douglass have to say to 2021? Should we just leave the Black man alone? Do nothing? Conscientious historians fault modern conservatives for misusing the above paragraph to distort or even troll the African American civil rights cause over the last four decades. It is a fair point. When Douglass advised "do nothing," he envisioned an American government that permitted African Americans to join the body politic and be subject to equal protection and due process under the law as first-class citizens. 
When Douglass declared, "just let him alone," he clearly imagined and advocated a passive partnership between the government of the United States and African American citizens in which access to education, the right to vote, equal employment opportunity, and public accommodations were open and equal to all. But, instead, what Douglass feared most came to pass. After an attempt to honor their Black allies and establish justice for all, the North ultimately chose to make common cause with the white South, reneging on the promises of Reconstruction and the rhetoric of equality. The failed revolution succeeded in amending the Constitution. African Americans gained full citizenship and suffrage on paper, but, over the course of the next three decades, Douglass watched his victorious coalition of 1865 slowly but surely betray their "faithful friends." By the time of his death in 1895, the United States had abandoned reform, left the South to the vagaries of white rule, and was fully engaged in the long nightmare of Jim Crow segregation that would last into the 1960s.

Almost a century after Douglass's speech in Boston, the March on Washington and Martin Luther King speaking before the Lincoln Memorial symbolize for us a rededication to our founding principles. African American leaders once again called upon the American people to honor the Creed. After a century of discrimination, oppression, and intimidation, in a very different moment, Congress passed the Civil Rights Act of 1964 and the 1965 Voting Rights Act. Unlike the failures of Reconstruction, the twentieth-century moment represented a seismic cultural shift and a great leap forward. At the very least, a great down payment on living out "all men created equal."

An Aside. We should be honest about the progress achieved since our great civil rights moment. We merit praise, not scorn, for sincere repentance and 55 years of tangible achievements in cultural integration and racial unity American style. 
Almost inconceivably, we now live in a world in which myriad African American icons and heroes populate the uppermost elite echelons of our society: Oprah, Lebron, Barack and Michelle, Tiger, Beyoncé, Ta-Nehisi Coates, Shonda Rhimes, et al. Looking back from 2021, who are our most admired historical figures from the twentieth century? Number One (with no real competition): MLK. And the pantheon certainly includes Rosa Parks, John Lewis, Malcolm X, Colin Powell, and Jackie Robinson. In sports, Muhammad Ali has come to personify absolute excellence, integrity, and courage for the vast majority of Americans. Young people today worship a whole galaxy of African American sports stars with virtually no thought to race. Same for the arts and entertainment. No high school or college American lit survey seems complete without Toni Morrison, Langston Hughes, James Baldwin, Richard Wright, Ralph Ellison, or Maya Angelou. And, if we were enumerating heroes that one half of the nation adores but the other half detests for reasons only marginally connected to race, we would add Clarence Thomas, Thomas Sowell, Condoleezza Rice, and Tim Scott. An Aside. We should acknowledge that there are tens of millions of white Americans who would enthusiastically vote for Tim Scott, or any other conservative black candidate, over Joe Biden, or any other white male Democrat, for president of the United States. We should acknowledge that we have utterly shattered the ubiquitous assumptions of white supremacy that barred African Americans from participating in American politics solely on the basis of race just sixty years ago. They are virtually non-existent in our current political environment. But I HEAR YOU. This is not about Oprah or Barack Obama. 
Our problem is George Floyd, economic disparities including income, unemployment, the wealth gap, the inheritance gap, poverty rates, and home ownership, also disparities in health outcomes (COVID deaths) and incarceration, voter suppression, food deserts, and education. African Americans, statistically, in the aggregate, disproportionately suffer a lower quality of life in our nation compared to whites (and Hispanics and Asians). We have succeeded in empowering the Talented Tenth of Black America. We are now happily accustomed to seeing wealthy and powerful Black people among us, enriching our culture and strengthening our economy. But how do we achieve broader and deeper success? How can we make things better for more people? How can we make things right? Building on a half century of remarkable progress, how can we repair the residual damage resulting from a century of systemic discrimination? How can we honor our promises and live up to our founding ideals (not as white people) but as a united people? As one nation? How can we finally win the peace? I have a few ideas—and I think they are in keeping with the principles of Douglass, Lincoln, and King (without sacrificing Jefferson and Madison). Let’s talk about some possible economic and cultural solutions in my next installment, tentatively entitled, “A Just and Lasting Peace.” “What the Black Man Wants,” Frederick Douglass, 1865.
https://historynewsnetwork.org/article/179722
Can Abolition of Nuclear Weapons Overcome the Opposition?
Can Abolition of Nuclear Weapons Overcome the Opposition? White House vigil, June 2006. Photo moi 84, CC BY-SA 3.0 Given the fact that nuclear war means the virtual annihilation of life on earth, it’s remarkable that many people continue to resist building a nuclear weapons-free world.  Is the human race suicidal? Before jumping to that conclusion, let’s remember that considerably more people favor abolishing nuclear weapons than oppose it.  Public opinion surveys—ranging from polls in 21 nations worldwide during 2008 to recent polls in Europe, Japan, and Australia—have  shown that large majorities of people in nearly all the nations surveyed favor the abolition of nuclear weapons by international agreement.  In the United States, where the public was polled in September 2019 about the UN Treaty on the Prohibition of Nuclear Weapons, 49 percent of respondents expressed approval of the treaty, 32 percent expressed disapproval, and 19 percent said they didn’t know. Nevertheless, surprisingly large numbers of people remain unready to take the step necessary to prevent the launching of a war that would turn the world into a charred, smoking, radioactive wasteland.  Why? Their reasons vary.  Die-hard militarists and nationalists usually view weapons as vital to securing their goals.  Others are the employees of the large nuclear weapons industry and have a vested interest in retaining their jobs.  In the United States, that enterprise has long been very substantial, and the Trump administration, through massive infusions of federal spending, has succeeded in fostering its greatest expansion since the end of the Cold War.  According to a December 2020 article in the Los Angeles Times: “Roughly 50,000 Americans are now involved in making nuclear warheads at eight principal sites stretching from California to South Carolina.  And the three principal U.S. nuclear weapons laboratories . . . 
have said they are adding thousands of new workers at a time when the overall federal workforce is shrinking.”  Members of these groups are unlikely to change their minds about the importance of retaining nuclear weapons. But another constituency resistant to the abolition of nuclear weapons, and probably the largest, is comprised of people whose position could be changed.  They view nuclear weapons as a deterrent to a military attack—and especially a nuclear attack—upon their nation.  And their fear of external aggression is often inflamed by hawkish politicians, defense contractors, and the commercial mass media that whip up public hysteria about enemies abroad. Of course, it’s not at all clear that nuclear deterrence actually works.  If it did, the U.S. government, with its vast nuclear arsenal, wouldn’t be as worried as it is about Iran obtaining nuclear weapons or fomenting war.  Indeed, if U.S. officials really believed that possession of nuclear weapons reduced the likelihood of nuclear and other kinds of war, they would be welcoming the proliferation of nuclear weapons around the globe.  Unfortunately, though, as they apparently recognize, the presence of nuclear weapons makes the world even more dangerous than it already is. Even so, the advocates of nuclear deterrence make a very legitimate point about the reality of international affairs.  It is a dangerous world, and people have good reason to fear external aggression.  Although nuclear weapons provide an inadequate response to the dangers of military attack, there is considerable justification for people to be concerned about the security of their nation. But what if the danger of external aggression were diminished?  In those circumstances, wouldn’t a substantial portion of the people concerned about national defense come around to supporting a nuclear weapons-free world? Developing a stronger international security system would provide a useful way to foster this shift in attitudes. 
The launching of the United Nations in 1945 raised hopes for the creation of an international entity that, in the words of the UN charter, would save humanity “from the scourge of war.”  And, in subsequent decades, this world organization, unlike any individual nation, did attain widespread legitimacy in world affairs, particularly for its humanitarian accomplishments and for the fairness of its decisions on global issues.  Nevertheless, the major nations—reluctant to give up the dominant power that they had traditionally exercised in international affairs—saw to it that the United Nations was denied the authority and resources that would enable it to develop an effective international security system. If, however, the United Nations were granted that authority and those resources, thereby providing nations with safeguards against external aggression, that would do a great deal to allay the fears of many people who cling to nuclear weapons.  And that, in turn, would transform the popular support for the abolition of nuclear weapons that currently exists into massive support for it—support that would be so overwhelming that even the nuclear powers might find it difficult to resist. It is possible, of course, that hammering away relentlessly at nuclear dangers will be sufficient to finally convince the governments of nations—even the governments of the nuclear powers—to abolish nuclear weapons. Nevertheless, people who want to end the nightmare of nuclear destruction that has haunted the world since 1945 should consider widening the popular appeal of nuclear weapons abolition by strengthening the UN’s ability to provide international security.
https://historynewsnetwork.org/article/179723
Historians for Peace and Democracy Present Free Resources for History Educators
Historians for Peace and Democracy Present Free Resources for History Educators Historians for Peace and Democracy (H-PAD) is a national organization of progressive historians. As part of our mission to foster education on campuses and in communities, encourage activism, and facilitate networking with organizations that work for peace and justice, we are making a series of new resources available for use. They are totally free, so they fit your budget! These resources include a Virtual Speakers Bureau, short videos in the Liberating History series, and a syllabus on sanctions. H-PAD launched its Virtual Speakers Bureau in March 2021. Forty outstanding professional historians, activists, and independent scholars have volunteered to speak to classes, campuses, community-based groups, and other organizations. No honorarium is required or expected, just a mutually-agreed-upon date, time, and topic. The presentations can be tailored to meet both parties’ interests, expertise, convenience, and needs. H-PAD has organized speakers bureaus in the past, but the current widespread use of video conferencing technology allows us to extend the invitation beyond our own locales to include organizations across the United States and around the world.  If you would like to learn about the speakers and how to invite them, please click here. The new Sanctions Syllabus was developed by Renate Bridenthal, Molly Nolan, and Prasannan Parthasarathi, three members of the H-PAD Empire Working Group. It dissects “economic sanctions – their forms, legality, and effectiveness, their history across the twentieth century and their current deployment, as well as blowback from and resistance to them.”  The syllabus offers definitions, examples, and links to a wealth of articles, books, and films. 
It examines the use and impact of sanctions against Cuba, Venezuela, Iraq, Iran, Russia, China, apartheid South Africa, and Israel, with a particular discussion of the Boycott, Divestment, and Sanctions (BDS) campaign. To access the syllabus, click here.

We've recently expanded into video and audio production, too. Our Liberating History series features lightning video lectures of 3-4 minutes. In "Black Panthers Against Patriarchy," Robyn Spencer discusses why so many Black women saw the Black Panther Party as a place of feminist empowerment. In another new episode, Prasannan Parthasarathi puts "India's Far Right in Historical Perspective," explaining its origins in the country's caste system and the ideology of Hindu nationalism. And be sure to check out our earlier Liberating History episodes as well: Irene Gendzier on the roots of Trump's Middle East policy, Donna Murch on crack and mass incarceration, and Ellen Schrecker on McCarthyism past and present.

We encourage you to use, and share, these resources. At H-PAD we believe in using history to empower people to confront systems of hierarchy and oppression. If you'd like to collaborate in making that happen, please join us!
https://historynewsnetwork.org/article/179724
Can Biden Fulfill JFK's Incomplete Promise of a Peace Presidency?
Can Biden Fulfill JFK's Incomplete Promise of a Peace Presidency?

JFK and Nikita Khrushchev at the 1961 Vienna Summit. Talks there led toward the 1963 Partial Test Ban Treaty. NARA record: 3951647

President Joe Biden and his advisers appear to have studied the lessons of Franklin Roosevelt's presidency. Several executive orders have undone some of the damage wrought by President Donald Trump. The passage of the American Rescue Plan Act of 2021 provides aid to poor and working people and investment in the country's infrastructure. In addition, Biden has spoken out in support of a union vote by Amazon workers in Bessemer, Alabama. It would be wise for Biden and his advisers to study the lessons of the Kennedy and Johnson administrations, second only to the Roosevelt administration in achieving domestic reforms.

Like Biden, John Kennedy was supported by unions and articulated pro-union views. One year into his presidency, Kennedy signed Executive Order 10988 establishing collective bargaining for federal employees. Kennedy's more dramatic shift to progressive positions came in June 1963 with a speech to the nation advocating civil rights and a speech at American University advocating peaceful coexistence with the Soviet Union. The shifts in policy these speeches represented led to the passage of the Civil Rights Act of 1964 and the August 1963 Test Ban Treaty ratified by the U.S., the Soviet Union, and the United Kingdom.

Peace advocates found plenty to criticize in the foreign policies of the Kennedy and Johnson administrations. Prior to 1963, Kennedy's relationship with Cuba and its ally the Soviet Union was confrontational, leading to the 1962 Cuban missile crisis. The escalation of the U.S. war in Vietnam by the Kennedy and Johnson administrations and Johnson's invasion of the Dominican Republic are among the ways that the shift to a more peaceful foreign policy fell short. That Johnson undermined his own domestic goals by his expansion of the Vietnam War is well known. 
In the Cuban missile crisis, Kennedy stepped back from the brink of nuclear war and reached an agreement with the Soviet Union that included a promise by the U.S. to cease its effort to overthrow the Cuban government. In his American University speech, Kennedy called for rethinking the cold war. The ideas that Kennedy articulated in that speech remain relevant. Kennedy declared: "So, let us not be blind to our differences--but let us also direct attention to our common interests and to the means by which those differences can be resolved. And if we cannot end now our differences, at least we can help make the world safe for diversity. For, in the final analysis, our most basic common link is that we all inhabit this small planet. We all breathe the same air. We all cherish our children's future. And we are all mortal."

The world system is much changed from what it was in the 1960s. During the Kennedy and Johnson years, the U.S. Gross Domestic Product (GDP) was nearly half the world total. Even with that economic clout, attempting to maintain U.S. hegemony undermined domestic reform goals. Today, the U.S. share is about 25 percent of world GDP. When one takes into account the many social benefits not measured by GDP, the U.S. position is weaker still. The attempt to maintain U.S. dominance with outsized military spending – 37 percent of the world total in 2015 – has led to a series of endless unsuccessful wars. The damage inflicted on millions of people in other countries and on the tens of thousands of U.S. people involved in these conflicts is both sad and unnecessary. 
It's time to return to JFK's concept of paying attention to "our common interests," resolving our differences peacefully, and making "the world safe for diversity." Progressives, unions, and the left are seeking to achieve domestic reforms – the passage of the For the People Act of 2021 to protect voting rights, the Protecting the Right to Organize Act, Medicare for All, the $15 minimum wage, and the Green New Deal.

There are two ways to fund social programs and move to a more equal society. First, we need to increase taxes on the wealthy and corporations. The Ultra-Millionaires Tax Act proposed by Senators Elizabeth Warren and Bernie Sanders is a first step. Second, we need to shift funding from the military budget to social needs. To accomplish the latter goal means emphasizing peace advocacy.

President Biden needs to step back from his attack on Russian President Vladimir Putin. The intelligence reports Biden is reviewing are political, not scientific, documents. The military-industrial complex is now the military-industrial-intelligence complex. The so-called intelligence community is part of that larger complex and is seeking to consolidate the new cold war with both Russia and China. One of Kennedy's virtues was his ability to set aside advice from foreign policy and defense experts and to think independently. It helped that he had a sense of humor and, despite his high position, could show some modesty. After the Bay of Pigs failure, he commented to aides, "It's just like Eisenhower. The worse I do, the more popular I get." Whatever the truth of the charges of Russian interference in U.S. elections, Biden should remember that the U.S. intervened openly in the 1996 Russian election, helped overthrow the elected Ukraine government in 2014, and has a long record of interfering in other countries' internal affairs. President Barack Obama took some steps away from the new cold war campaign with the New START Treaty of 2011. 
He also took a step toward the relaxation of the blockade against Cuba.  Biden should follow up on Obama’s Cuba initiative by ending the sixty-year-old blockade of Cuba. Returning to the themes of Kennedy’s American University speech could lead Biden to make lasting contributions to world peace. On disarmament, Kennedy said: “Our primary long range interest in Geneva . . . is general and complete disarmament-- designed to take place by stages, permitting parallel political developments to build the new institutions of peace which would take the place of arms.” About the United Nations and disarmament, Kennedy said: “we seek to strengthen the United Nations, to help solve its financial problems, to make it a more effective instrument for peace, to develop it into a genuine world security system--a system capable of resolving disputes on the basis of law, of insuring the security of the large and the small, and of creating conditions under which arms can finally be abolished.” Today, a peace presidency would ensure access to vaccines against the coronavirus by the neediest nations. It would lend full support to the United Nations and the World Health Organization. It would shift funding from armaments to domestic needs and aiding the world’s needy. It would put an end to the new cold war and seek ways to cooperate with Russia and China. It would put an end to our endless wars and support Palestinian rights. It would set our sights, once again, on world disarmament.
https://historynewsnetwork.org/article/179725
America Does Have an "Original Sin": A Response to James Goodman
America Does Have an "Original Sin": A Response to James Goodman

Scrolling amongst all the clickbait on CNN's website these days, I was genuinely intrigued when I saw James Goodman's headline "It's Time to Stop Calling Slavery America's 'Original Sin'." I was, however—as I often am—left disappointed by the click after I read the contents. While I agree with Goodman's conclusion that slavery is NOT America's original sin, I disagree with how he arrives at this conclusion.

Goodman begins by critiquing the theological origins of the idea of "original sin," rejecting the concept as irrelevant to our polity, seeing it as an unnecessary confusion between our secular state and Christianity. Indeed, American scholars can sometimes be quick to dismiss the importance of religion in American society. If this were simply a response to the recent corrupt entanglement of evangelicalism with the American right, it might be more understandable, but the fact is, Goodman's reaction is illustrative of a much larger trend among American scholars that tends to ignore religion (and especially, conservative religion) and its influences in the hopes that it will just fade away. We can find the seeds of this in American life even among liberal theologians shortly after the end of the First World War, when Harry Emerson Fosdick, a prominent public theologian, intellectual, and modernist, asked, "Shall the Fundamentalists Win?" To understand American history, and the American present, means that we must also understand the lasting influence of American religion.

The idea of "original sin" is, in fact, an apt metaphor for what continues to plague American society. It is just that slavery is a symptom of this original sin, and not the first sin itself. But before I address America's original sin, I first want to defend the use of the metaphor. 
In 1967, in what is now a seminal classic article in the field of sociology of religion, the esteemed scholar Robert Bellah published "Civil Religion in America" in Daedalus, the Journal of the American Academy of Arts and Sciences. In this article, Bellah demonstrates how American politicians, as well as the American people, have practiced a "civil religion" that has allowed politicians to continue to use and co-opt the Deism of the American founding generation as a unifying religiopolitical force that acknowledges a "divine providence" generic enough for Americans from most religious persuasions to accept. Politicians used and continue to use the rhetoric of American Civil Religion because it works, and the vast majority of Americans accept it. For those who doubt, witness the recent use of religious language to describe the U.S. Capitol during the middle of the armed insurrection against President Joe Biden's election: American politicians and citizens alike were aghast that Trump's supporters would violate the "sacredness" of the Capitol Building and America's political institutions under the Constitution. Political crimes against the "spirit" of Democracy truly are sins to the American people because of American Civil Religion.

Further, Goodman misunderstands the term original sin. While he is correct that the concept in Christianity means that an original actor committed the sin, he is incorrect in his assertion that those who inherit the guilt of this sin are not responsible for it. While many Christian faith traditions reject the doctrine of original sin outright (it is, after all, originally a doctrine of the Catholic Church), those Christians that do hold to it believe that the original sin tainted all of humanity, to the point that it ensured continued sinfulness among all peoples, and it also meant that without a redeemer, all those who inherited this sin would be condemned to hell for its seriousness. 
Translated to American Civil Religion, such an original sin would mean that all of America is condemned for the nation's (in this case, white America's) crimes. Without redemption, the entire country is on the course to hell. However, in American Civil Religion, Abraham Lincoln, the Great Emancipator, has (in the eyes of certainly not all, but many) functioned as the redeemer, the divine agent of providence that would finally free us from the scourge of this original sin.

Yet as I look around American society, we still appear to be damned. Racial inequality remains long after both emancipation and the Civil Rights era. Unarmed black men and women continue to be shot by our law enforcement officers, who have been socialized to believe that the next person they contact might try to kill them, and even more so if that person fails to possess white skin. It is quite apparent that the damnation of America's original sin is still with us.

America's original sin, however, is not slavery; it is settler colonialism. As Goodman finally acknowledges in his piece, slavery would not be America's original sin, because America's sinfulness begins with the dispossession of indigenous lands. Settler colonialism, however, as an organizing principle, encompasses not just the theft of lands from the indigenous peoples of the Americas, but also the forced labor of slavery, both chattel and otherwise, in order to bolster the budding capitalistic project of imperialism. After all, the original form of British American colonialism was carried out by the Virginia Company, incorporated under the British crown in order to carry out the Empire's wishes and to make its original shareholders a significant profit. John Winthrop of the Massachusetts Bay Company and colony, in a letter to both potential supporters and detractors, described the right of the Puritans to take indigenous lands as a reaction to the closure of the English commons. 
The rich in England had enclosed the land that commoners had used for centuries to grow crops and graze upon in order to sustain themselves, and this natural right (granted by God, he believed) gave way to a civil right to make such land private by the act of erecting fences and gates. Believing that God had shown him and the Puritans a new commons, they argued that they could use any land not currently used and occupied to European standards by the indigenous peoples of America. While Winthrop made this argument, much of his letter is spent arguing against those who had warned that it was immoral to take indigenous lands because indigenous people had occupied it for centuries. While Winthrop thought he had the right to take indigenous lands, some back home in England thought the theft of the land was sinful. And they were right. Settler Colonialism continues to damn America. Today, in the midst of the COVID-19 pandemic, the virus kills more indigenous and African American people than it does white Americans. Where I live, on the Navajo Nation, many of my students are among the 30% of the Navajo population that in 2021 still do not have electricity or running water in their homes. Some of my students charge their cell phones, tablets, and laptops in their cars overnight and use mobile hotspots in order to turn in their homework, but can’t take a shower in their own homes and have to use camp stoves to cook. With 57-hour weekend lockdowns to stop the spread of COVID, many couldn’t even keep perishable food cold over the weekend because of a lack of access to ice. As part of settler colonialism, indigenous communities are denied their full sovereignty. Recently, two different indigenous nations in South Dakota shut down their borders to non-tribal members in order to keep the virus from infiltrating their communities, since the massively underfunded Indian Health Service would be quickly and completely overwhelmed by the arrival of COVID-19 to their reservations. 
In response, South Dakota Governor Kristi Noem threatened both tribal governments with lawsuits if they did not allow traffic to flow freely in and out of their reservations. The requirements of settler colonialism and capitalism demanded the death of more indigenous people in order to keep money and “liberty” flowing. Settler colonialism is America’s original sin, and it continues unabated to this very day. America will continue to find itself in hell—or at least, in purgatory—until it repents from and seeks redemption for its actual original sin. Additionally, American scholars will continue to misunderstand American society and culture if they fail to take stock of American civil religion and the spiritual beliefs of Americans in general. Original sin, after all, must be cleansed by a redeemer, but first, we must acknowledge our complicity in the sins of our ancestors. Editor’s Note: HNN excerpted the essay by James Goodman referred to here in our Roundup of op-eds in February.
a5a732e5e278e675d240da6285b1f803
https://historynewsnetwork.org/article/179726
Ammon Bundy's Ongoing Religious War
Ammon Bundy's Ongoing Religious War Ammon Bundy's arrest for failure to appear was triggered by his refusal to wear a mask into the courthouse. Still from Ammon Bundy YouTube. Ammon Bundy, right-wing malcontent behind the 2016 armed takeover of Oregon’s Malheur Wildlife Refuge and now a western anti-mask movement, believes he’s doing God’s work. Coming from a long line of religiously inspired men who have been “called” to defend the US Constitution, Bundy has varied in his focus, from rebelling against public land ranching regulations to protesting COVID-19 safety protocols. But in his view, these are all forms of government tyranny and affronts to constitutional rights. Arrested for the fourth time on March 15, 2021, Bundy was taken into custody for failing to appear at his hearing on past trespassing charges. Because he refused to wear a mask into the courtroom, thereby missing his trial, he was apprehended outside amid a throng of other protesters. Bundy’s crusade has been a long time in the making, but in the last year he successfully established a coalition of supporters that is broad, diverse, and a serious threat to federal law. His group is called the People’s Rights Network. Like the Oath Keepers and Proud Boys, it includes members who see the current government as a threat to perceived rights and are committed to defending their ideas of personal liberty, by force if necessary. So what has taken Ammon Bundy, who first came to prominence during the 2014 armed standoff in Nevada over his father’s unpaid grazing fees and trespassing cattle, into the life of an anti-government militant? The answer is a libertarian worldview and his take on Mormonism. Bundy’s ideology parallels the thinking of certain leaders in the Church of Jesus Christ of Latter-day Saints, who’ve had a history of government cynicism. He also shares with them a tradition of theo-constitutionalism--venerating the Constitution as a sacred document. 
The paradox here is that Bundy believes he is upholding the Constitution and fulfilling his religious duties in his acts of lawlessness. His impetus has roots in the early Church. After Joseph Smith (1805-1844), the founder and first prophet of Mormonism, was murdered, Brigham Young (1801-1877) assumed the reins of the Church and brought the Latter-day Saints into the Great Basin. By that point, Young and his brethren were disgusted with the US government after the years of mistreatment and bigotry they had faced. But the Mormon people kept great faith in the Constitution. While still an apostle of Smith, Young said “I find no fault with the Constitution or laws of our country; they are good enough. It is the abuse of those laws which I despise, and which God, good men and angels abhor.” He later avowed “Corrupt men cannot walk these streets with impunity, and if that is alienism to the Government, amen to it. The Constitution of the United States we sustain all the day long, and it will sustain and shield us, while the men who say we are aliens, and cry out ‘Mormon disturbance,’ will go to hell….But to proceed; the principal evil is in the rulers, or those who profess to be rulers, and in the dispensers of the law…” Young was not just the leader of the Church; like Smith, he was a prophet. Although he was not as prolific in his revelations as other Mormon prophet/presidents, these statements are memorialized in the Journal of Discourses and the History of the Church, texts not part of official Church doctrine, but significant, especially to those with radical leanings. Over the years, many Church prophets echoed Young’s sentiments, from Wilford Woodruff (1807-1898) to Ezra Taft Benson (1899-1994), reinforcing the idea that the Constitution is good, but not those who govern under it. Benson took that idea further, declaring that the Mormon people had a religious obligation to protect the Constitution, even if this meant violence. 
In 1979 he declared, “I say to you with all the fervor of my soul that God intended men to be free. Rebellion against tyranny is a righteous cause. It is an enormous evil for any man to be enslaved to any system contrary to his own will. For that reason men, 200 years ago, pledged their lives, fortunes, and sacred honor. No nation which has kept the commandments of God has ever perished, but I say to you that once freedom is lost, only blood – human blood – will win it back.” So this is where things get treacherous. If the Constitution is sacred, but those overseeing it are evil, then who determines and upholds the law of the land? Bundy has come to think that this is his duty—a chilling certainty that is likely to escalate during the current administration. As vaccines are more widely administered and mask mandates therefore become less of a concern, Bundy’s focus will return to his original cause. The new Secretary of the Interior, Deb Haaland, is now charged with overseeing public lands, including the place where Ammon Bundy’s father, Cliven, continues to illegally graze his cows. The patriarch Bundy and his most infamous son share the conviction that the federal government has no constitutional right or power to own land; hence the land belongs to those white people who have occupied and used it, and the requirement of grazing fees paid to the Bureau of Land Management is unconstitutional. Although Cliven has repeatedly lost his appeals in federal court, and currently owes over a million dollars in fines and fees, the old rancher’s cows still roam the same BLM land, years after the Nevada armed standoff. To Ammon, mask mandates and grazing regulations are the same thing—affronts to constitutional rights. Law and common good be damned—he sees both as tyranny. He is determined to protect the Constitution, even by unconstitutional means. 
Where the next action is again taken—Nevada, Oregon, or somewhere else on the 600,000,000 acres of American public lands—remains to be seen. In 2018, Bundy talked before an audience in South Jordan, UT during an event advertised as a “power packed four hours, with an LDS [Latter-day Saints] perspective, but practical info for all true friends of liberty.” He told them about his father’s dream, in which people approached a large building. Inside the building sat a golden calf, a biblical reference to a false idol, that Cliven understood to symbolize the American court of law. Ammon explained that the dream meant people are putting their faith in judges who do not have their best interests at heart. “You can’t worship the golden calf, you have to have faith in God,” he told the audience. “When you know for certain that those are your own rights, you do not allow them to be questioned. And I know that’s a strong thing I’m saying. But when you do that, then your friends need to come and protect you also. And it’s a duty of ours to do that.” Four months later, PeoplesRights.org was registered, a year and two months before the pandemic brought America to a screeching halt. COVID-19 gave him a cause that fit a long ongoing narrative. The pandemic swelled the ranks of People’s Rights because of conspiracy theory and righteous anger, but the group wasn’t invented in response to it. Ammon Bundy has been looking for a next battle since the takeover of Malheur, when he led a group of heavily armed militia to occupy government buildings in Harney County, Oregon. During that takeover, one man, LaVoy Finicum, was shot and killed by police. Ammon now has his own militia, the People’s Rights Network, an army of over 50,000 members in 50 states, according to the organization’s website. He recently finished a recruitment tour in Utah, talking God, liberty, and the need for vigilante action—antidotes to golden calves. 
His campaign is part of a long, drawn-out arc, and we shouldn’t expect his rebellion to end with a die-down of COVID-19. Bundy’s attention will return to public land battles, where the first insurgencies began. It wasn’t COVID-19 that spurred the formation of the People’s Rights Network and inspired Bundy’s mission; it was a deeply rooted sense of righteousness, Cliven’s dream, and a version of Mormon ideology.
1177d176cc3ee236c4a00ad324478311
https://historynewsnetwork.org/article/179727
Is History Ready to Judge the Trump Presidency?
Is History Ready to Judge the Trump Presidency? With the second Trump impeachment concluded, the (first) Trump presidency is officially confined to history. How should history understand the Trump presidency? Right now, we would be hard-pressed to find anyone who disagreed with the contemporary consensus that Trump shattered the norms of the presidency itself. Hovering like a specter over historical analysis, that consensus obscures other significant innovations that Donald Trump brought to the presidency. Understanding his political strategies will help historians and political scientists generate further insights into the nature of power inherent in the office of the President and the structures that enabled him. We know that Trump’s presidency was consequential. He single-handedly changed the presidency in several ways, from altering relationships with the press, to hollowing out bureaucracies, to garnering unprecedented media attention from all over the world. What makes Trump different, however, is the unusualness of his style and methods. Take his use of social media as an example, effective as it was in boosting his own political standing by stirring chaos through entertaining and inflammatory remarks on Twitter. His Twitter account ultimately did not serve the interest of the country (as Twitter itself determined in the wake of the January 6 attacks on the Capitol, with the controversial decision to suppress the President’s access to the platform). And yet future presidents might well adopt similar strategies to more traditional ends (what is beyond controversy is the hope that they use the Twitter pulpit to pursue national interests rather than personal ones). Another controversial president who could demonstrate the unprecedented nature of the Trump presidency is George W. Bush. 
Although few people draw comparisons between the two, Bush, like Trump, was plagued by historically low approval ratings and controversies, from his decision to invade Iraq in 2003 to his handling of the US economy in the wake of the 2008 financial crisis. Are both presidents destined to be remembered horribly? The Bush and Trump presidencies could not have been more different, as Bush, though awkward in conducting foreign policy, more plausibly rooted his intentions in what he believed was the morally righteous thing to do. By contrast, Trump was a tactician who applied unconventional methods in pursuit of his own political gains regardless of the nation. Concerning Bush, it was his policies that were out of touch with reality. He was simply not savvy enough to understand the political and military complexity of invading the Middle East. Though he perhaps had a point in assuming the danger of terrorism, given the shock that the nation endured with 9/11, his false judgment in invading Iraq, a nation with no credible evidence of possessing weapons of mass destruction, was of his own making. Like Trump, he handicapped himself by politicizing his own intelligence bureau, and the nation paid the price. Unlike Trump, Bush also paid the political costs. Bush was often depicted as a “war criminal” for the destabilization of the entire Middle East. In retrospect, at least it seems that Bush was reacting to a truly national emergency. Based on his course of action, we can assume that Bush was simply inept. The nation suffered from the opposite problem with Donald Trump, who apparently never acted in the interest of the nation but who was so adept at controlling media narratives that he remains king of the Republican Party (where is George Bush these days?). By repeatedly calling the news media fake news, he discredited negative stories. This tactic is effective in a rational choice framework if we disregard its broader implications. 
If we use the criterion that presidents should be judged by how rationally and effectively they pursue their political interests within a set of limited options, it should be noted that while Bush did react out of proportion to the crises he inherited, he did not necessarily use those crises to his own advantage. Bush used the resources of his office in a more traditional sense, though at the time many thought his tactics, ranging from the opening of Guantanamo Bay to the invasion of Iraq, approached the limits of the power of the American presidency. Though many might argue that his 2004 reelection indicated the successful selling of his “wartime president” status, that victory prolonged the festering of the crisis he had manufactured himself in Iraq. More money and time were wasted in the Middle East, creating a financial drain on the country that cemented his status as a controversial president, or “war criminal,” by the time he left office in 2009. It should also be noted that those were arguably bad political tactics: though Bush won reelection in 2004 as a “wartime president,” he left office with low approval and saw his own party move away from his leadership through the Tea Party. Trump, on the other hand, while unsuccessful in winning reelection, used a new method of conducting the presidency that made every scandal conducive to his own personal interest, retaining the loyalty of his base and command of the Republican Party. Compared to Bush, Trump played the role of the presidency unconventionally, manufacturing crises to his own advantage and completely changing the way the presidency is conducted and, possibly, basic expectations about its function. However much controversy Bush stirred, his legacy nonetheless pales in comparison to Trump’s. 
And yet, the Trump presidency might be the point of inflection for the country, and a moment for historians to recalibrate how they judge future presidents.
54d5ab8226a6c9b84129b19c3a1dee26
https://historynewsnetwork.org/article/179728
Telling the Story of the "Ghost Children" of Germans who Plotted Against Hitler
Telling the Story of the "Ghost Children" of Germans who Plotted Against Hitler I was excited. I recognized the tell-tale signs of discovery after a decades-long career of sharing under-told stories from history with children and teens. Excitement, yes, and a sense that I’d found my next book. But this was something more. My latest discovery literally took my breath away. Not only had I unearthed an episode of World War II history that had barely been told beyond Germany, I’d found one that was anchored by a little-known diary kept by a child eyewitness to the events. And she was still alive. And so were other “children” from this history. And I was going to be able to meet and interview them. Their stories were intimately intertwined with a much better known occurrence: the Valkyrie coup attempt of July 20, 1944, that began with Claus von Stauffenberg’s efforts to kill Hitler at the dictator’s Wolfsschanze, or Wolf’s Lair, hideaway. Stauffenberg’s explosive device killed four men, but not Hitler, and the associated coup crumbled. Within hours Stauffenberg and three associates had been executed by firing squad. Theirs were among the earliest deaths in a wave of retribution that would claim over 150 lives. But this trail of vengeance—embedded within the greater horror of the Holocaust, and as gruesome and unjust as it was—marked only the starting point for Hitler’s revenge. That’s where the children came in. And the diary, and two research trips to Europe, and six years of alternating work and fermentation along the path toward creating Ensnared in the Wolf’s Lair: Inside the 1944 Plot to Kill Hitler and the Ghost Children of His Revenge (National Geographic Kids: 2021). Children remained top of mind throughout my work—those who had experienced the past and those who would learn about it through my book. I was also concerned about the adults that my eyewitnesses had become. These people were now in their 80s or older. 
Most had dodged and battled with shadows of trauma for years, even a lifetime, and I didn’t want my project to become another triggering event. I would need to employ patience and discretion during our shared backward glance. Context is everything when writing for children and teens. I assume a baseline knowledge of, well, nothing. The trick is to feather in a framework of understanding without overpowering the history-driven narrative. In order for my readers to empathize with the child protagonists in this book, they had to understand a wealth of background: how Hitler rose to power, the shifting nature of German resistance to his rule, World War II history, and the events of Valkyrie itself. Opening chapters offer this context, but whenever possible I share information through the lens of the families that would become intertwined in Valkyrie and the punishments that followed. I particularly drew on the childhood memories of my interview subjects. I hoped readers would feel even more connected to history when viewing it through the eyes of an earlier cohort of children and young adults. When I write, I aim to make the history feel personal, urgent, and irresistible. For this project I relied on historical facts, photographs, and eyewitness recollections to draw my audience into the drama of the events. I needed readers to comprehend how challenging and daring—and likely doomed—it was for the fathers of these children to try to overthrow Hitler’s regime. Twelve-year-old Christa von Hofacker began keeping a diary shortly after her father’s arrest in Paris for his role in the attempted coup. No sooner did her father’s fate become uncertain than her mother’s did, too. And her older sister’s. And her older brother’s. All three family members were taken away by Gestapo agents with minimal explanation soon after her father’s arrest. Christa and her two younger siblings remained at home under a patchwork of care that included an unwelcome Nazi state nurse. 
Readers of Ensnared in the Wolf’s Lair know all about Christa and her family by the time these relatives start to go missing. They’ve been following them since the opening pages of the book with a growing sense of anxiety that makes the shock of their arrests that much more personal and distressing. I was able to build these bridges between readers and subjects because of direct connections I’d been fortunate enough to establish with Christa. Using interviews, correspondence, and access to her diary, I could share her perspective through her childhood writings as well as in statements made with a lifetime’s worth of lived experience and insight. Similarly, I was able to introduce readers to the memories of Berthold von Stauffenberg, the oldest son of Hitler’s would-be assassin, who was ten years old in 1944. During my second research trip in Germany I was fortunate enough to meet and interview him as well as Friedrich-Wilhelm (Friewi) von Hase, a retired professor who had been seven when his family became entangled in the aftermath of the failed coup. Friewi’s older sister, Maria-Gisela, is now in her mid-nineties, but her memories of those years remained fresh during our conversations. She was twenty in 1944 and was among the older family members swept up in early Gestapo arrests. These relatives became pawns for use as leverage against the conspirators, faced interrogations of their own, and were generally terrorized by their extra-judicial captivities. Younger family members experienced their own terror after they were removed from their parentless homes and spirited away to a remote compound for weeks or months on end. These girls and boys came to be known as the “ghost children” and were offered next to no explanation for their detentions or their fates. The children spent weeks and even months in suspended animation at their hideaway in the Harz Mountains of central Germany. 
After older relatives began to be released from prison, they found it next to impossible to locate the missing youngsters, both because there was a war going on and because the children’s surnames had been changed to obscure their identities. Christa, Berthold, and twelve others remained in captivity when Allied forces reached the property on April 12, 1945. These events are still living history for the children and young adults who have carried them into their senior years. They are also captured in the historical record through memoirs and letters from other eyewitnesses. My goal as an author was to transport new generations back to this era without losing the immediacy of those personal connections. Direct quotations from interviews, written accounts, and Christa’s diary are essential to that work, but so are my own memories from the research. During my investigative travels I had seen the Wolf’s Lair and explored the grounds of the wartime complex where Hitler had been attacked. I’d visited the Borntal, too—the Harz Mountain property where Christa and other children were detained—and I’d even been permitted to wander the abandoned interior of one of their former residences. I also had memories of seeing the cities the children had known, the places where conspirators had been executed, the room where the coup had failed. I infused my text with those details, too—and the emotions that accompanied them, from amazement at the grandeur of the lives these families had once led, to the paralyzing contrasts of the situations that followed, to the gritty terror of the places where the accused had been killed. Lives are built around memories, emotions, and personal connections. So are histories. For me the best way to engage readers in the past is to bring it to life, to make it fresh, to create such tangible connections through text and illustrations that readers almost fall into the pages of the book and travel back in time. 
I try to personalize the reading experience even further by capturing stories from the past that resonate in current times. Whether it’s the tutorial on the power of propaganda, or the account of the rise of a demagogue, or the conveyed terror of family separations, readers aren’t just learning about events from the past. They’re learning how to live in the present. That’s something to get excited about, too.
1f5f3c6512cdaa28dea5c05fb4c97065
https://historynewsnetwork.org/article/179731
The Lack of Federal Voting Rights Protections Returns Us to the Pre-Civil War Era
The Lack of Federal Voting Rights Protections Returns Us to the Pre-Civil War Era As Senate Democrats push to extend federal protection of voting rights, bills to restrict citizens’ access to the vote — many based on models produced by the conservative Heritage Foundation — have been introduced in 43 states. On Thursday, Georgia enacted a law that constrains voters’ options and makes it easier for the state legislature to interfere with election results. President Biden and others, criticizing the measure, have drawn parallels to the Jim Crow era from the 1890s to the 1950s. In a broader sense, though, the current absence of strong federal protection for voting rights better resembles the United States before it was transformed by the Civil War and Reconstruction. From the nation’s founding through the Civil War, the rights and protections that African Americans could expect varied dramatically from place to place. Free Black people might enjoy a broad range of civil rights in some jurisdictions, while elsewhere they faced drastic restrictions on their liberty, even to the point of enslavement. Meanwhile, no slave state permitted Black men to vote and only a handful of free states did, though in the 1840s and 1850s, northern Black activists and their White allies had agitated on the issue and were slowly gaining White support. The states had virtually unchallengeable power to define the status and rights of their residents. Slavery was only the most extreme example. Northern states such as Massachusetts and Pennsylvania chose to abolish slavery in the years following independence from Britain, but across the Southern tier of states, legislatures perpetuated slavery and codified it in law. Congress admitted new slave states to the Union, putting its imprimatur on states’ “right” to legalize slavery and on people’s “right” to hold Black people in bondage, denying the enslaved rights that many considered fundamental to personhood. 
Beyond slavery, all slave states and some free states placed strictures on the basic rights of free African Americans. After joining the Union as a free state in 1803, for example, Ohio adopted laws that required free African Americans who wanted to live in the state to prove their freedom and register with local officials. Laws forbade them from testifying in court cases involving White people, and barred Black children from public education. Ohio’s constitution granted voting rights to White men only. By contrast, Massachusetts placed no racial restrictions on men’s right to vote and had no racist residency or testimony laws. It did, however, bar interracial marriage and prohibit Black men from serving in the state militia. As a consequence of this patchwork of race and rights, the antebellum struggle for racial equality looked different in every state. In Ohio, Black activists pressed for repeal of the “Black laws” that marginalized them and made them vulnerable to exploitation and violence by White people. Yet, Black Ohioans were just 1 percent of the state population, and Black men could not vote. As a result, there was little they could do about the laws besides organize, petition the state legislature for justice, and call on White neighbors to see the world through their eyes and do what was right. In 1843, a convention of Black men called on White Ohioans to recognize the injustice of the laws, insisting that the state government’s policy toward them was “utterly at variance” with the promises of the Declaration of Independence.
91bfdc364cd4dc124bf9362c5407b10c
https://historynewsnetwork.org/article/179734
The Problem with Confederate Monuments
The Problem with Confederate Monuments During the summer of 1993, as I drove down I-59 from Hattiesburg, Miss., to New Orleans to do some historical research for my dissertation, I spotted a bumper sticker with a Confederate battle flag that read “Don’t Blame Me, I Voted for Jeff Davis!” It was a play on the bumper sticker that emerged following the election of Bill Clinton in 1992, which read “Don’t Blame Me, I Voted for Bush!” Jefferson Davis, the one and only president of the failed Confederacy, was long dead, but the message was an indication of how much the Lost Cause remained very much alive in the Deep South. I’d grown up in Greensboro, N.C., in the 1970s, and I don’t recall such attention to Confederate memory there as I found when I moved to Mississippi to attend graduate school. Now, after a lifetime of studying Southern history, this makes more sense to me, because in the 1990s, Lost Cause sympathies were far more entrenched the deeper into the South one traveled. That has changed significantly in the past 25 years, and especially since 2015, as states across the region, including my home state, have passed laws to protect Confederate monuments as part of an alleged dedication to “Southern heritage.” GOP-dominated state assemblies have passed draconian legislation as part of Republicans’ culture war against Black Lives Matter and racial progress more generally. Much of it is based on propagating myths that the South fought the Civil War to protect states’ rights (it was to preserve the institution of slavery) and that removing a monument is an erasure of history (it isn’t). Even when a cross-section of Southerners petition for removal, they’re prevented from doing so because these state legislatures have usurped local control through so-called heritage protection acts. As a historian of the American South, I feel a deep responsibility to share the long history of these statues alongside the stories of racial injustice with which they are associated. 
It’s why I often speak to community groups, and also why I decided to write No Common Ground. While my role is not to offer advice on whether a monument ought to be removed, I can assist local governments and organizations in their decision-making processes regarding monuments in their communities by providing the necessary historical context. I also believe it’s important that I, a Southern white woman, write and speak about this topic with blunt honesty. Monument defenders cannot dismiss me as a Northern liberal who has invaded the region to tell them what to do. I’ve grown up here, too. Maybe that makes me a scalawag in their eyes. But I love this region as much as the next Southerner, and Southerners, let’s be clear, are not all white. That’s the rub about this “Southern heritage” argument: it assumes only white heritage counts.
f9e456d156d5a546eec7f952cc74676a
https://historynewsnetwork.org/article/179736
Teaching Controversial History: Four Moves
Teaching Controversial History: Four Moves Inspired by some recent conversations and experiences, I have been thinking about how I approach the task of teaching controversial topics. Much of my approach, I think, is directly inspired by having been a fairly prickly kind of student myself. I still see a lot of myself in students who aren’t prepared to buy what their instructors are hoping to sell. (Let’s assume, for the sake of simplicity, that we instructors are correct, though of course that is not universally the case.) I think I can reduce my approach to four basic instructional moves. These moves strike me as both pragmatic and principled; I make these moves because they tend to work, but they work because they’re the morally right thing to do anyway. 1. Respect the intellectual autonomy of the student—even when they’re mistaken. Here is the axiom I have tried to work by, which I first formulated in words about five years ago: People don’t change each other’s minds. Instead, people change their own minds with tools other people provide. This statement probably varies in its level of truth. It may not be true at all for young children. And some adults are far more impressionable than others. However, it seems to hold up well as a general principle when you’re dealing with older children and fully grown students. In any case, nobody has the power to force another adult to change their mind. Opinions don’t work that way. Not even when they’re “opinions” that look an awful lot like facts or falsehoods. Here, I think of the fable of the wind and the sun. In this parable, the wind and the sun argue about which of them is more powerful. Looking down from the sky, they select a traveler wearing a cloak as an unwitting subject for a competition. You may remember what happens. First, the wind blows a ferocious gale, trying to tear the traveler’s cloak away. But the man only wraps his cloak around himself more tightly against the cold. 
Then the sun, beaming benevolently, shines down and gently warms the traveler. After a while, the man removes the cloak himself. The sun wins. A student changing their mind is like that traveler removing his cloak. It happens not because of the violence of external pressure but because the student feels the reason to change—feels it from the inside. But how can the student be encouraged to draw better conclusions? What does the sunshine in this analogy represent?
f2add09eea61c2709aa264bd4d8754c3
https://historynewsnetwork.org/article/179746
Mitch McConnell is Wrong. The Filibuster is, in Fact, Racist
Mitch McConnell is Wrong. The Filibuster is, in Fact, Racist In a recent news conference, Senate Minority Leader Mitch McConnell, R-Ky., made a startling observation that quickly went viral. The Senate filibuster, he remarked, "has no racial history at all. None. There's no dispute among historians about that." As a historian, I can tell you that this could not be further from the truth. Even a brief examination of U.S. history reveals that it is impossible to separate the filibuster from the history of racism and white supremacy. Then, as now, filibusters were often used to block measures to expand Black rights and political participation. In 1841, during a debate over the formation of a national bank, Sen. Henry Clay proposed a rule to "limit debate." The idea immediately sparked resistance, some of it from John C. Calhoun, a senator from South Carolina and renowned supporter of slavery. On the surface, the resistance stemmed from the belief that debates on the Senate floor should never be limited. It was framed as a matter of principle — the right to express oneself for as long as one desired to attempt to delay proceedings and even prevent the passage of legislation. A closer examination, however, reveals that Calhoun and others orchestrated the 1841 filibuster to protect the interests of Southern planters and, by extension, the institution of slavery. Calhoun, recognized as one of the architects of the modern filibuster, was deeply invested in upholding slavery and protecting the interests of slaveholders. And what he recognized during the 1840s was that he could effectively use the filibuster to obstruct efforts in the Senate that might undermine the South's vested interest in slavery. Other senators quickly followed suit. Of the 40 filibusters that took place in the Senate from 1837 to 1917 (when the cloture rule was established), at least 10 directly addressed racial issues. 
The use of filibusters to block Black political rights significantly expanded during the 20th century, as civil rights activists across the country fought to introduce legislation to empower Black Americans and white senators turned to the filibuster as one method to block their efforts. This continued through a series of coordinated filibusters from the post-World War I era to the advent of the modern civil rights movement.
3aef0d2f913de718eb3df0c19b232c15
https://historynewsnetwork.org/article/179748
Government has Always Picked Winners and Losers
Government has Always Picked Winners and Losers With the American Rescue Plan, the Biden administration has not only witnessed the passage of the most sweeping federal relief effort in recent history, but also advanced an argument about the role of public policy as a force for good in our economy. Polling shows that the legislation is broadly popular, but critics on the right and among centrist Democrats are sounding familiar alarms about “activist” government, warning that overreach will stifle economic activity and distort natural market processes, thus hindering growth. Pundits and experts echo these warnings and insist government should limit itself to protecting unfettered — or free — markets. But such arguments erase a fundamental reality: Throughout American history, public power has been a precondition of our markets, essential to their design and operations. Acts of governance and the exercise of public authority have always structured growth — and shaped the allocation of its rewards. More simply put, public power made our markets and regularly picks winners and losers. In fact, many of the goals being pursued by the Biden administration and promoted by the left aim to redress inequities created in no small part by past government action — government action that defenders of the status quo and critics of redistributive efforts would rather remain invisible. Control of property has always been key to building wealth in the United States. Yet, the government’s choices dating to before the Constitution have determined who can claim land and share in its benefits. In 1787, Congress enacted the Northwest Ordinance, setting the terms of governance and settlement in newly conquered and purchased Western territories. It carved land into one-mile square lots, then tasked federal officials with granting or selling it. Affluent and, often, politically connected Americans quickly monopolized markets for property. 
Subsequent policy interventions — land grants and management, forced removal of Indigenous populations, support for slavery — continued to privilege speculative property markets and this narrow class of investors at the expense of established communities and small landholders focused on self-sufficiency, family independence or communal provision. Even government investments in infrastructure — which theoretically benefited all Americans — concentrated benefits on the economic elite. In 1862, President Abraham Lincoln signed the Pacific Railway Act, facilitating completion of the Transcontinental Railroad in just seven years and setting the stage for decades of commercial and industrial expansion. Yet it did so by gifting millions of acres of federal land, including much of the prime real estate along the new transport corridors, to favored corporate entities and speculators. In exchange, recipients committed to building and operating the rail network and catering, for the war’s duration, to the Union’s transport needs.
5a57e31b6e9c69b431cc290e6d2ba3d7
https://historynewsnetwork.org/article/179752
Paleo Con
Paleo Con “They must be the most contented people in the world.” This is how the 1980 comedy The Gods Must Be Crazy introduces the San peoples of the Kalahari Desert, better known as the Bushmen. They eke out a simple living, the narrator explains, digging for roots and tubers, hunting with bows and arrows, and collecting dewdrops from leaves. “For the most part they live in complete isolation, quite unaware that there are other people,” the narrator continues. Thus they remain until a pilot drops a Coke bottle from his plane, and a hunter named Xi finds it. The shiny object quickly roils his otherwise happy community, so Xi sets out to hurl the “evil thing” off the end of the world. The South African movie broke box-office records, including the U.S. record for the highest-grossing foreign film, and spawned four sequels. Much of its success stemmed from the winning performance by the San actor N!Xau, in the role of Xi. The film’s director and writer, Jamie Uys, played up N!Xau’s image as a naïf, telling The New York Times that, before Uys found him, N!Xau had only ever seen one white man. Plucked from the bush, he was baffled by the ways of the wide world, Uys said: He didn’t understand what “work” meant, and the sight of a toilet amused him. Even after the film turned N!Xau into an international star, Uys had to keep the bulk of N!Xau’s pay in a trust fund, he explained, as N!Xau couldn’t “handle big sums yet.” Or so he said. N!Xau, when anthropologists interviewed him later in life, told a different story. He’d grown up on a farm, not in the bush. He understood money perfectly well; he’d worked as a cook at a local school and had been making a bow and arrow set to sell to tourists when he first met Uys. On the topic of money, N!Xau expressed annoyance that Uys was “living in luxury” after their first film’s success while N!Xau was still “living in a hut.” And Uys’s portrait of the San as living in placid isolation? 
“The image of the Bushmen given by the Gods films is not really good,” N!Xau felt, “because it does not show how people are really living.” Indeed, by the 1980s, foraging was a thing of the past, and most of the San were living impoverished lives in resettlement areas. N!Xau expressed surprise that audiences mistook the film for reality and didn’t get that he was “just acting.” Still, The Gods Must Be Crazy, more than any other movie or book, has cemented the image of the San in the public eye. This has been enormously frustrating to anthropologists, who tend to regard Uys’s portrayal of the San as both condescending and misleading, in that it conceals the wretched treatment of Kalahari populations by Southern African governments. The San’s prime chronicler, Richard Lee, deemed it a “thinly disguised piece of South African propaganda.” The idea that “some San in the 1980s remain untouched by ‘civilization,’” Lee added, was “a cruel joke.” Yet there is one anthropologist who has grudging respect for the film. In the eyes of the South African anthropologist James Suzman, The Gods Must Be Crazy carries a “subversive message,” one worth contemplating. The San, who in the middle of the twentieth century were “one of the last of the world’s few largely isolated hunting and gathering societies,” represent one of our closest links to the world before agriculture, Suzman writes. What we know of their foraging lifestyle suggests that they were, if not the “most contented people in the world,” then at least surprisingly well-off, heirs to an easy abundance that characterized most of Homo sapiens’ history. The San, Suzman writes, “by rarely having to work more than 15 hours per week had plenty of time and energy to devote to leisure.” In offering a glimpse into their disappearing world, The Gods Must Be Crazy makes a withering critique of our present-day, toil-obsessed labor regime.
5d558ee0ed0caadeb65a38cda3a19a5e
https://historynewsnetwork.org/article/179760
Who's Afraid of Antiracism?
Who's Afraid of Antiracism? Antiracism is not a threat to the state. It seeks to rectify and repair, not terrorize and destroy. Yet you wouldn’t know that if you looked at what the Macron administration in France has been doing over the past six months. In the midst of summer 2020’s protests against racial injustice and police violence, Macron likened social-science researchers—those investigating race, or discussing racism—to miners choosing to excavate a vein, whose opening could “only be secessionist. It is coming back to break the Republic in two.”1 Why does antiracism so threaten Macron, and his version of the French Republic? Put otherwise: When a state accuses racial-justice activists of threatening the values of the republic, what does that say about the state and about those republican values? And, most importantly—for France—can antiracism and French universalism be reconciled?2 Answering this last question would require a different approach, one that acknowledges the grand narrative of French universalism as just that: a narrative, a story that gets told. It would require an approach that demands the re-sounding of the many other stories that French universalism has silenced.3 The three books under consideration here offer an approach—a practice—for unsettling France’s universalist narrative. Modern-day Guadeloupe, a French overseas department that has recently been in the news following the revelation of the French government’s authorization of the poisonous chlordecone pesticide there, is the setting for Maryse Condé’s novel La Belle Créole. The main character’s personal crises—his past unresolved trauma and his present predicament—mirror Guadeloupe’s late-20th-century (post)colonial ecosystem, economy, politics, and society.
Meanwhile, Françoise Vergès’s The Wombs of Women: Race, Capital, Feminism tells a silenced story of French republican violence: the forced abortions and sterilizations of poor women in the French overseas department of La Réunion. Vergès recounts a story of French republican racialized violence that occurred in the recent past and reveals its connection to the longer French history of slavery and colonialism. Finally, Jean Casimir’s The Haitians: A Decolonial History is a history and a methodological treatise, grounded in understanding France’s universalist project as inextricably entwined in its racist, colonial history of conquest and civilization. Casimir’s decolonial reading “capsizes” the self-affirming colonial narrative of French universalism (“its inexhaustible discourse of self-adulation”) to tell a story of the Haitian people. Casimir’s story reinterprets Haiti’s postcolonial history to resurface ways of being, knowing, and existing outside of dominant Western conceptual categories. Taken together, these books make clear the multiplicity of other lived experiences—of lived consequences—of France’s universalist republican ideal. Each work grapples with the consequences of living with a self-fulfilling, self-congratulatory narrative of French universalism: consequences that are apparent on the bodies of nonwhite French former subjects and citizens, and of Black women most of all. Citing the French republican universalist ideal, Macron has deemed antiracism an anti-republican danger to the state. But why does France appear uniquely threatened by the idea of antiracism? The pretzel logic goes like this: France’s universalist republican ideal is color-blind and does not acknowledge racial, religious, or ethnic identification. If there is no place for race in the republic, racism cannot exist. 
These assumed propositions allow Republic officials to syllogistically conclude the following: those who assert the existence of race-related problems in France are, dangerously, creating division where there is none. Thus, according to such officials, asserting the existence of racism is an attack on “the values of the republic.” By this logic, antiracism and French republicanism are at best incompatible, and at worst locked in a zero-sum battle in which the fate of the nation hangs in the balance.
0f525520d9101fd5624709b7349c2341
https://historynewsnetwork.org/article/179761
America’s Longest War Winds Down
America’s Longest War Winds Down “Ours is the cause of freedom. We’ve defeated freedom’s enemies before, and we will defeat them again… [W]e know our cause is just and our ultimate victory is assured… My fellow Americans, let’s roll.” — George W. Bush, November 8, 2001 In the immediate wake of 9/11, it fell to President George W. Bush to explain to his fellow citizens what had occurred and frame the nation’s response to that singular catastrophe. Bush fulfilled that duty by inaugurating the Global War on Terror, or GWOT. Both in terms of what was at stake and what the United States intended to do, the president explicitly compared that new conflict to the defining struggles of the twentieth century. However great the sacrifices and exertions that awaited, one thing was certain: the GWOT would ensure the triumph of freedom, as had World War II and the Cold War. It would also affirm American global primacy and the superiority of the American way of life. The twentieth anniversary of the terrorist attack on the World Trade Center and the Pentagon now approaches. On September 11, 2021, Americans will mark the occasion with solemn remembrances, perhaps even setting aside, at least momentarily, the various trials that, in recent years, have beset the nation. Twenty years to the minute after the first hijacked airliner slammed into the North Tower of the World Trade Center, bells will toll. In the ensuing hours, officials will lay wreaths and make predictable speeches. Priests, rabbis, and imams will recite prayers. Columnists and TV commentators will pontificate. If only for a moment, the nation will come together. It’s less likely that the occasion will prompt Americans to reflect on the sequence of military campaigns over the two decades that followed 9/11. This is unfortunate.
Although barely noticed, those campaigns — the term GWOT long ago fell out of favor — give every sign of finally winding down, ending not with a promised victory but with something more like a shrug. On that score, the Afghanistan War serves as Exhibit A. President Bush’s assurances of ultimate triumph now seem almost quaint — the equivalent of pretending that the American Century remains alive and well by waving a foam finger and chanting “We’re number one!” In Washington, the sleeping dog of military failure snoozes undisturbed. Senior field commanders long ago gave up on expectations of vanquishing the enemy. While politicians ceaselessly proclaim their admiration for “the troops,” in a rare show of bipartisanship they steer clear of actually inquiring about what U.S. forces have achieved and at what cost. As for distracted and beleaguered ordinary Americans, they have more pressing things to worry about than distant wars that never panned out as promised.
c515893cc91dce1791d4b161d1a3b5a4
https://historynewsnetwork.org/article/179768
HNN Will Be OFF This Thursday and Friday (April 1 and 2)
HNN Will Be OFF This Thursday and Friday (April 1 and 2) HNN will be taking the end of this week off (April 1 and 2). We will not be emailing a newsletter on Friday morning, but will post the week's Roundup Top Ten, along with a slate of new op ed essays, on Sunday. HNN will resume normal news posts on Monday, April 5.
a7cf0aad064296e8eb9eda328c1dec5f
https://historynewsnetwork.org/article/179775
Working with Histories that Haunt Us
Working with Histories that Haunt Us When I first began to conceptualize my dissertation project, I knew I wanted to write about an iconic group of textile traders in Togo called the Nana Benz. These women are the stuff of legends in my native country and although many think of their wealth and political influence as a contemporary phenomenon, I sought to historicize their political activism in Togo by highlighting the instrumental role they played in the nation’s anti-colonial struggle. While I didn’t know the exact contours of this history, I knew I was going to end the dissertation in 1963. That year, a group of disaffected soldiers who fought in France’s colonial army assassinated Togo’s anti-colonial nationalist leader and first president, Sylvanus Olympio. A few days later, one of the soldiers, Etienne Eyadema, confessed to firing the shots that killed Olympio. Exactly three decades later, Eyadema forced my family into exile. In the early 1990s, the people of Togo staged a series of demonstrations pressuring Eyadema to institute democratic reforms. Eyadema—who had at that point ruled Togo for over 25 years under a single party system—refused and instead began a campaign of terror that left tens of thousands of people dead and forced an estimated 300,000 into exile. The conflict forced my family into refugee camps in Benin, where we lived with other displaced Togolese families for seven years. In a recent essay for The New Republic, historian James Robins asks: can historians be traumatized by history? Robins offers a number of heartbreaking examples to highlight the devastating toll research on historical atrocities can have on scholars. Yet Robins doesn’t take into account the fact that a scholar’s response to their work is as much a product of who they are as it is about the topics they study.
As a Togolese refugee, my mourning for what Togo could have been if Eyadema had not come to power in 1963 runs so deep that, for a long time, I feared I would lose myself in the abyss if I looked at the history of Eyadema’s rise to power too squarely in the face. Thus, the year 1963 was my Pandora’s Box and I designed my dissertation so that I would never have to open it. Not only would this allow me to conform to disciplinary fantasies of “objectivity,” I thought; it would also protect me from the emotional toll of writing about a history that haunts me. The Covid-19 pandemic and the disruptions it brought to my research schedule forced me to rethink this approach. Like much of the world, my plans for the latter half of 2020 were upended by the pandemic. Unable to travel to Togo for research, I began to look towards the limited sources on Togolese history in my university’s archives. In the fall, when the libraries were briefly opened, I requested a few boxes containing pamphlets, brochures and speeches from Togo, looking for anything that could inform my dissertation writing. In the archival reading room, I worked through the folders, jotting down notes and taking pictures of the documents. When I reached the last folder, however, I was jolted out of this rhythm. There, in bright red colors, was a front-page newspaper article documenting the violence that would eventually force my family to flee Togo.
4889ac48e62917a1b7ab1a39bb392a0a
https://historynewsnetwork.org/article/179776
The Painful History of the Georgia Voting Law
The Painful History of the Georgia Voting Law Seventy-five years ago this July, a World War II veteran named Maceo Snipes reportedly became the first Black man to cast a ballot in his rural Georgia county. The next day, a white man shot him in his front yard, and Mr. Snipes would soon afterward die from those wounds. Fortunately, three generations removed from the political reign of terror that claimed Mr. Snipes’s life, voter suppression seems much less likely to arrive by bullet. But we may not be as distant in our political moment from theirs as we might think: The long struggle to block access to the ballot has always relied on legal maneuvering and political schemes to achieve what bullets and bombs alone could not. What legislators in Georgia and across the country have reminded us is that backlash to expanded voting rights has often arrived by a method that our eras share in common: by laws, like Georgia’s Senate Bill 202, passed by elected politicians. Opponents of the new Georgia law denounce the legislation as “Jim Crow 2.0” precisely because they recognize the continuities between past and present. The bill’s most ardent supporters, who lined up in front of a painting of a building on the site of an antebellum plantation to watch Gov. Brian Kemp sign it into law, seem less interested in distancing themselves from that past and more eager for Americans to forget it. “Our country has changed,” Chief Justice John Roberts explained in 2013 in defending the Supreme Court’s gutting of a key provision of the Voting Rights Act in Shelby County v. Holder, a decision that helped clear the way for the current voter suppression campaigns. Yet the riot at the U.S. Capitol makes clear that concerted efforts to sow seeds of distrust in the democratic process can still stoke violent reaction. The methods in the fight against voting rights have a common objective — an electorate narrowed along predictable and demonstrable fault lines.
Many present-day proponents of voting restrictions are quick to distance themselves from the racist aims and attitudes of their forebears, but the most durable and enduring attacks on voting rights have long cloaked their goals in race-neutral language — at least in writing.
dd76b3a93b0fc6750c506d032c65799a
https://historynewsnetwork.org/article/179777
The Roundup Top Ten for April 1, 2021
The Roundup Top Ten for April 1, 2021 The Painful History of the Georgia Voting Law by Jason Morgan Ward The new wave of vote suppression bills, like the one in Georgia, reflect a less obvious but important aspect of Jim Crow law: the use of superficially race-neutral language to keep specific groups from voting. The danger is that courts today will similarly fail to see these bills for what they are. Mitch McConnell is Wrong. The Filibuster is, in Fact, Racist by Keisha N. Blain "Try as he might, McConnell cannot erase the historical record. To use his own words, 'There's no dispute among historians about that'." Working with Histories that Haunt Us by Marius Kothor The author responds to a recent essay on the traumatic aspects of archival research. As a political exile from Togo, her identity and experience converged with subject matter she couldn't study at a remove. Government has Always Picked Winners and Losers by David M.P. Freund Government action has always been tied to economic growth, and always involved policy choosing winners and losers. Policies proposed by the Biden administration as part of the COVID recovery aren't inserting the government into the market, they're changing the parties favored by government policy. The Problem with Confederate Monuments by Karen L. Cox "I also believe it’s important that I, a Southern white woman, write and speak about this topic with blunt honesty. Monument defenders cannot dismiss me as a Northern liberal who has invaded the region to tell them what to do. I’ve grown up here, too." Teaching Controversial History: Four Moves by Jonathan Wilson A reflection on the work of teaching controversial subjects argues that it's essential to respect students' autonomy and provide them with the tools with which to change their own minds. Who's Afraid of Antiracism? 
by Chelsea Stieber Recent books in different genres shed light on the limits of the French governing ideal of republican universalism for a society where racism is real and historically significant. Paleo Con by Daniel Immerwahr Why do the lifestyles of paleolithic hunter-gatherers repeatedly pop up as foils for western capitalist modernity? The Lack of Federal Voting Rights Protections Returns Us to the Pre-Civil War Era by Kate Masur New vote suppression bills in multiple states threaten to return the United States not to the Jim Crow era but to the period before the Civil War and Reconstruction when civil and political rights were protected or denied according to state politics. America’s Longest War Winds Down by Andrew Bacevich Public fatigue over the ongoing War on Terror must not allow political leaders to do what they seem to want most to do: avoid taking responsibility or learning lessons.
7c4e59ca8d8bb90314a41dff082041fc
https://historynewsnetwork.org/article/179779
Hidden Stories of Jewish Resistance in Poland
Hidden Stories of Jewish Resistance in Poland In 1959, writing about the Holocaust, scholar Mark Bernard highlighted that Jewish resistance was almost always considered a miracle, ethereal, beyond research scope. Still today, this impression generally persists. And yet, Jewish defiance was everywhere during the war, carried out in a multitude of ways, by all types of people. I first encountered this phenomenon several years ago, when I accidentally came across a collection of Yiddish writing by and about young Polish-Jewish women who rebelled against the Nazis. These “ghetto girls” paid off Gestapo guards, hid revolvers in marmalade jars, and built underground bunkers. They flung homemade explosives and blew up German trains. I was stunned. Why had I – a Jewish writer from a survivor family, not to mention a trained historian who held a Ph.D. in feminist art — never heard this side of the story? And so began my research. As I discovered, due to preconceived notions of gender, the girls’ educations, and the lack of evident markers of their Jewishness (i.e., circumcision), women played a critical role in the Jewish underground in Poland. But when I set out to write their story and sought a chronological context, it quickly became apparent that there was none. No comprehensive history of the men in the underground existed either. Sure, excellent academic biographies and case studies of rebellions in particular ghettos and camps had been published, but there were no recent English books that relayed the tale of Jewish resistance in the country as a whole. As much as I was baffled by the ferocious female fighters, I was equally baffled by the entire Jewish effort in Poland, the epicenter of the bloodshed, where 3 million Jews (90% of the pre-war population) were savagely murdered. The truth was, though I’d heard of the Warsaw ghetto uprising, I had no idea what actually happened. I certainly had no idea of the scope of Jewish revolt. 
Holocaust scholars have debated what “counts” as an act of Jewish resistance. Many take it at its broadest definition: any action that affirmed the humanity of a Jew; any solitary or collaborative deed that even unintentionally defied Nazi policy or ideology, including simply staying alive. Others feel that too general a definition diminishes those who actively risked their lives to defy a regime, and that there is a distinction between resistance and resilience. The rebellious acts that I discovered among Jewish women and men in Poland, my country of focus, ran the gamut, from those entailing complex planning and elaborate forethought, like setting off large quantities of TNT, to those that were spontaneous and simple, even slapstick-like, involving costumes, dress-up, biting and scratching, wiggling out of Nazis’ arms. Some were one-offs, some were organized movements. For many, the goal was to rescue Jews; for others, to die with and leave a legacy of dignity. As guerrilla fighters, the Polish-Jewish resistance took only a handful of Nazi casualties and achieved a relatively minuscule victory in terms of military success, but the effort was much more significant than I’d known. Over 90 European ghettos had armed Jewish resistance units. In Poland, where many of these were located, the units comprised “ghetto fighters” who used found objects (like pipes), manufactured items (such as homemade explosives), and smuggled-in weapons (including pistols and revolvers) to engage in spontaneous or, more often, organized anti-Nazi assaults. Most of these underground operatives were young, in their twenties and even teens, and had been members of youth movements, which now formed the core structures of resistance cadres. Ghetto fighters were combatants as well as editors of underground bulletins and social activists. The Warsaw Ghetto Uprising, I learned, was youth-driven, and strategically planned over months.
Most accounts agree that about 750 young Jews participated. (Roughly 180 of them were women.) Some Jews fought inside the ghettos, but 30,000 (ten percent were women) fled their towns and cities and enlisted in forest-based partisan units; many carried out sabotage and intelligence missions. ‘The Avengers,’ a Jewish-led detachment outside Vilnius, blew up German trains, vehicles, bridges, and buildings. They used their bare hands to rip down telephone poles, telegraph wires, and train tracks. Other Polish Jews joined Soviet, Lithuanian, and Polish-run detachments or foreign resistance units, while others still worked with the Polish underground, often disguised as non-Jews, even from their fellow rebels. Alongside military-style organizations, Jews organized rescue operations to help fellow Jews escape, hide, or live on the Aryan side as Christians. Warsawian Vladka Meed, a Jewish woman in her early 20s, printed fake documents, distributed Catholic prayer books, and paid Christian Poles fees for hiding Jews in their homes; she also helped save Jewish children by sneaking them out of the ghetto and placing them with non-Jewish families. In Poland, rescue networks supported roughly 10,000 Jews in hiding in Warsaw alone; they also operated in Krakow. Mordechai Paldiel, the former director of the Righteous Gentiles Department at Yad Vashem, Israel’s largest Holocaust memorial, was troubled that Jewish rescuers never received the same recognition as their Gentile counterparts. In 2017 he authored Saving One’s Own: Jewish Rescuers During the Holocaust, a tome about Jews who organized large-scale rescue efforts across Europe. Poland, he claims, had only a small number of these efforts, and still, they were significant. All these accompanied daily acts of defiance: smuggling food across ghetto walls, creating art, playing music, hiding, even humor.
Jews resisted morally, spiritually, and culturally, in public and intimate ways: by distributing Jewish books, telling jokes during transports to relieve fear, hugging barrack-mates to keep them warm, writing diary entries, and setting up soup kitchens. Mothers kept their children alive and propagated the next Jewish generation, in and of itself an anti-Nazi act. Jews resisted by escaping or by taking on false Christian identities. Roughly 30,000 Jews survived by dyeing their hair blond, adopting a Polish name and patron saint, curbing their gesticulations and other Jewish-seeming habits, and “passing.”

I was fascinated by this widespread resistance effort, but equally by its absence from current understandings of the war. Of all the legions of Holocaust tales, what had happened to this one? While I researched the lives of Jewish rebels, I simultaneously probed the trajectory of their tales. As I came to find, though there were waves of interest in Jewish defiance over the decades, the resistance narrative was more often silenced, for both personal and political reasons that differed across countries and communities.

The history of the Jewish underground has generally been suppressed in favor of a “myth of passivity.” Holocaust narratives were shaped by the need to build a new homeland (Israel), the fear of exposing wartime allegiances (Poland), and the redefining of identity (the United States). Early postwar interest in partisans turned into a 1970s focus on “everyday resilience.” A barrage of 1980s Holocaust publications flooded out earlier tales. Many fighters who survived kept their stories hidden. Many women were treated with disbelief; relatives accused others of having fled to fight instead of staying to look after their parents; still others were charged with sleeping their way to safety. Sometimes family members silenced them, fearing that opening old wounds would tear the family apart.
Many hushed their tales due to oppressive survivors’ guilt: they felt that, compared to others, they’d “had it easy.” Then there was coping. Women in particular felt a cosmic responsibility to mother the next generation of Jews. They wanted to create a normal life for their children and for themselves. They did not want to be “professional survivors.” Like so many refugees, they attempted to conceal their pasts and start afresh. The fighters’ formidable tales were buried with their traumas, but both stayed close to the surface, waiting to burst out.

The Warsaw Ghetto Uprising began in April 1943, on the first night of Passover. In her groundbreaking book, We Remember with Reverence and Love: American Jews and the Myth of Silence After the Holocaust, 1945–1962, Hasia Diner explains that Passover, a holiday on which Jews celebrate liberty, became the time around which American Jews commemorated the Holocaust. The uprising element, however, was forgotten. When my book comes out this April, I hope to bring the revolt to the fore once again. I cannot think of Polish Jewry without it; theirs is a story of persistent resistance and profound courage.
66ba326c2026c053022c26255dc446db
https://historynewsnetwork.org/article/179780
What Comes Next?
What Comes Next? Poster images by Amanda Phingbodhipakkiya, from I Still Believe in Our City, a recent public art campaign for the New York City Commission on Human Rights. “Until we address the discrimination and harassment against Asian Americans today, they will become deeply entrenched in the fabric of our nation, causing unimaginable harm and suffering and taking decades to undo,” Manjusha P. Kulkarni, Executive Director of the Asian Pacific Policy & Planning Council, explained in a recent written statement submitted to the House Subcommittee on Constitution, Civil Rights, and Civil Liberties. On March 18, activists, scholars, and artists across the Asian American and Pacific Islander community provided testimony on the increase in anti-Asian hate speech and violence since March of 2020. These attacks aligned with former president Donald Trump’s use of phrases like  “Chinese virus” on Twitter and in public statements. Stop AAPI Hate—a coalition of activists and scholars maintaining a database that contains nearly 3,800 documented incidents of verbal and physical abuse—has continued the legacy of Asian Americans pursuing protection by presenting evidence of racism. But what comes next? Forty-two years ago, Asian Americans spoke to legislators in DC as consultants on civil rights issues still faced by the AAPI community long after the legislative milestones of the 1960s. From May 8th to the 9th in 1979, the U.S. Commission on Civil Rights held its first hearing on specific rights violations encountered by Asian Americans. It coincided with increasing representation of Asian Americans in politics and Congress’s passing of Public Law 95-419, which designated the week of May 4th as Asian American Pacific Islander Week. 
Just a few days earlier, President Jimmy Carter had declared, “We have succeeded in removing the barriers [for Asian Americans] to full participation in American life.” Refugees from Southeast Asia fleeing the wreckage of the Vietnam War were also resettling in the US, adding diversity to the AAPI community and, Carter declared admiringly, “their successful integration into American society and their positive and active participation in our national life demonstrates the soundness of America’s policy of continued openness to peoples from Asia and the Pacific.”

Carter’s praise for the AAPI community and their “enormous contributions to our science, arts, industry, government, and commerce” bolstered the idea of Asian Americans as the model minority who had overcome adversity to achieve the American Dream. However, for those who appeared before the Commission, Carter’s comments did more harm than good. He described Asian Americans as economic drivers for the United States whose rewards were acceptance and economic comfort—an idea that glossed over the challenges they faced.

Dr. Ling-Chi Wang, then an assistant professor in Asian American Studies at the University of California at Berkeley, provided a historical overview of Asian American experiences that clashed with Carter’s simplistic characterizations. “I just want to add,” Wang stated during his testimony, “that current popular beliefs, held most firmly by government agencies—that Asians have no problems, that Asians have made it, that Asians take care of their own problems, and that Asians are too proud to seek government assistance—are but persistent manifestations of the highly institutionalized government attitude toward Asian Americans of benign neglect.” This neglect stemmed from a history of exploitation by the government and white employers.
“Almost without exception,” Wang continued, “each economic crisis was accompanied by an anti-Asian movement… each Asian group was imported to meet a concrete demand for cheap labor, and each was subsequently excluded by law when each was no longer perceived to be needed or when it was no longer politically and economically expedient to continue its utilization.” Racist policies excluded Asian immigrants, rendering them expendable outsiders and easy scapegoats in times of crisis. As Wang charged, the model minority myth “absolves the government of any responsibility of protecting the civil rights of Asian Americans and assigns Asian Americans to a permanent status of being neglected.”

Others presented evidence of the damage from more than a century of anti-Asian sentiment. Challenges faced by Asian Americans ranged from limited access to health services and a lack of bilingual educational resources to poverty—social problems also encountered by other communities. Participants in the consultation offered solutions such as promoting more representation in the federal government, directing more money to community grants, and developing a set of criteria for identifying civil rights violations specific to Asian Americans. There was hope—particularly after the movement for reparations for Japanese Americans who survived incarceration during World War II—that with more attention, Asian American civil rights would progress.

But in 1986, the Commission on Civil Rights issued a disturbing report. “Recent Activities Against Citizens and Residents of Asian Descent” noted an uptick in attacks on Asian Americans. Economic competition from Japan spurred a reinvigorated anti-Japanese movement in the US during the early 1980s.
In 1982, a Japanese American state legislator in California reported that someone had spray-painted the word “Jap” on his garage door, while a group called the White American Resistance Movement distributed anti-Asian pamphlets throughout San Francisco. The report connected these incidents to the death of Vincent Chin, a Chinese American draftsman who was murdered by two white men who had recently been laid off from an auto plant in Detroit. They blamed Chin for the layoffs—thinking he was Japanese—and brutally beat him.

The recent deadly, racially motivated attacks on Asian American women in the Atlanta metro area have brought attention to the historic trend highlighted by Wang during his 1979 testimony. In May of 2020, the Commission on Civil Rights promised to prosecute civil rights violations and hold public hearings on anti-Asian hate, but these initiatives have largely languished despite calls from the AAPI community for legislators to take verbal and physical abuse seriously. The promises made by the Commission in 1979 did not save Vincent Chin. And now—as Wang predicted—the COVID-19 pandemic is the latest in a long list of crises that have produced violent anti-Asian attacks.

The Chinese Exclusion Act and other historic moments are crucial to understanding where the nation is today, but historians have more contemporary examples to draw from. The pleas for help in 1979 before the Commission on Civil Rights largely went unanswered. Today, holding public officials accountable for the promises they will undoubtedly make to the AAPI community after recent hearings depends upon forcing Americans to reckon with a cycle of perpetual scapegoating and the racist language that makes it possible.
c3fb00e391799b4be9b1c1a12acda056
https://historynewsnetwork.org/article/179781
Economic Justice and Political Stability Require More Progressive Taxation
Economic Justice and Political Stability Require More Progressive Taxation Income Tax filers, 1920 The invasion of the Capitol on January 6th is a sign of deep anger at the course of American life among what is usually called the “white working class.” Alongside it is the protest of black America over continuing racism and poverty. People with little else in common count perceived economic unfairness among their complaints. What can we as citizens of a democracy do about it? Significant reforms, such as those usually ascribed to the left in the Biden administration, are going to cost money. A return to progressive tax rates through meaningful tax reform will be part of the solution.

Economists with a sense of history point out that inequalities began to grow around 1980, starting with the Reagan tax cuts. Emmanuel Saez and Gabriel Zucman of the University of California, Berkeley, have done a service to the republic by methodically tracing what has happened to equality over the last 40 years. Their book, The Triumph of Injustice (2020), is contentious, but it sets out uncomfortable facts that bear upon a solution, and it provides a complementary analytical tool.

They trace and measure the working, middle, and upper class divisions in American society all the way to the super-rich and the top 400 families. Fully 50% — half — of the American people are classified as working class, with annual income on average of $18,500. They earned 20% of national income in 1980; 40 years later, only 12%. Most gains in the economy due to technological progress and globalization went to the upper 10%. The next 40% of the people are middle class, earning an average of $75,000. The remaining 10% are reckoned as upper class, or the rich, earning $220,000 annually. But they have divisions, too. The top 1% earn $1.5 million per year. They earned half as much (10% of national income) as the whole working class in 1980; 40 years later, their share had grown to 20% (pp. 3-7).
At the very top are 400 families of the super-rich, including Warren Buffett, who earned $3.2 billion in 2015 and paid taxes of $1.8 million (a rate of 0.055%) (p. 129). Buffett is honorable in that he openly admits he should pay a higher rate, which he has famously stated is lower than his secretary’s.

Reversal of this pattern is absolutely vital to a sense of fairness in America. We have already had one insurgency. But won’t the rich, especially the very rich, resist any proposal that increases their taxes? Money is their property. A tax is an appropriation of private property for public benefit. People within democracies are resistant to taxes until convinced of their necessity and justice. This country began in a tax revolt. How can we convince the 400, from Jeff Bezos (net worth $179 billion) to Alice Walton ($62 billion), to share?

The principle of progressive taxation was established historically. The Constitution did not originally provide for an income tax, but it distinguished between indirect taxes (like customs duties) and direct ones (like taxes on land). Customs duties or tariffs were understood as taxes on consumption, which are regressive, but the citizenry of the early republic were so nearly equal — most were owners and cultivators of farms — that the slightly increased price of foreign imports was bearable. For 100 years the principal revenue of the U.S. federal government was drawn from tariffs.

Great national crises have been the settings for the introduction of a progressive income tax. During the Civil War, the first income tax was introduced — as a direct tax — to meet the threat to the Union. Its rates were gradually reduced until the 1890s, at the height of the industrial Gilded Age, when the Supreme Court ruled that the income tax was a direct tax the government had no right to impose without apportionment. That defect was removed by the 16th Amendment (1913), one of the high achievements of the Progressive Era (1905-15).
A regulated, orderly capitalist economy was steadily established by the Interstate Commerce Commission, the Sherman and Clayton Anti-Trust Acts, the Federal Reserve System, the Federal Trade Commission, later the Securities and Exchange Commission, and the income tax. Initial rates for the tax were quite modest (7% for the top bracket), but U.S. entry into the Great War increased the top marginal rate to 67% to counter war profiteering. An estate tax was also established at 10% for the largest bequests, a rate that grew to 20% by the late 1920s.

The great expansion of the income tax came with the supreme crises of the Depression and the Second World War. In the 1930s, with business in ruins for many owners and many workers reduced to poverty, President Franklin D. Roosevelt aimed to confiscate remaining excessive incomes. The top marginal rate rose to 79% in 1936. Roosevelt argued that in American democracy no one should, after taxes, have an income of more than $25,000 (equivalent now to about $1,000,000). The purpose was plainly to redistribute income to create a more equitable society. Roosevelt explained in his 1937 inaugural address, “The test of our progress is not whether we add more to the abundance of those who have much. It is whether we provide enough for those who have too little.”

During the war, the top marginal rate rose to a maximum of 94%. This progressive rate fell slowly in the 1950s and ’60s (even in Nixon’s time it was 70%), producing the most equitable, and hence just, U.S. society yet in the industrial age. This achievement, from about 1936 to 1980, has largely been undone by tax cuts (the effective rate on the very richest is now about 23%) and by the rise of an immense tax-dodging industry. Neoliberal economics argues that the optimal tax rate on capital should fall to zero, and that capital gains revenue should be replaced by higher taxes on labor income or consumption (p. 99).
Reversal of this 40-year pattern is vital to bringing the working class into the promise of America and the very rich into recognition of their obligations to a democracy. If you want peace, work for justice.
33e53836123e27e461fe8523cb14c2d7
https://historynewsnetwork.org/article/179782
Pamela, Randolph and Winston: The Wartime Discord of the Churchills
Pamela, Randolph and Winston: The Wartime Discord of the Churchills Pamela Digby, photographed in 1938, before her brief courtship and tumultuous marriage (1939-1945) with Randolph Churchill. In the spring of 1941, Averell Harriman, Roosevelt’s special envoy to Britain, started an affair. That both he and the woman in question were married was not a huge problem; there were different rules in wartime. What was more complicated was the identity of the woman’s husband: Randolph Churchill, the British prime minister’s adored, spoiled, turbulent son. Randolph had started the Second World War desperate for two things. He wanted to be wherever the fighting was fiercest, and was anxious to find a wife who would bear his child. Both were ways of pleasing his father, who placed an outsized premium on physical bravery, and was obsessed with the idea of building a powerful political dynasty. Randolph, he felt, had a duty to make sure the line was continued. Randolph’s first ambition was stymied by the artificial calm that followed Hitler’s invasion of Poland, as well as his father’s reluctance to get him reposted. He was more successful in achieving the second. In the course of a fortnight he proposed to, and was rejected by, eight women. Then he met Pamela Digby. Pamela had wide, deep-blue eyes, pink flushed cheeks and auburn hair streaked with a patch of white. Some saw her as “a red headed bouncing little thing regarded as a joke by her contemporaries,” but beneath the plumpness and a forced air of jollity was an adamantine desire to escape her dull, provincial life in Dorset. Winston embraced her immediately. Pamela was soon an essential part of the Churchill family, especially once Winston became prime minister and they moved to Downing Street. She had an uncanny instinct for sensing what people needed, and then giving it to them almost before they had realized themselves. 
She was a source of support for an embattled Winston, and a much-needed confidante for his wife, the lonely Clementine. In October 1940 she gave birth to a boy, named, inevitably, Winston. The only problem was her husband. Randolph was charming, clever, generous and funny. Most of the time. He was also rude, arrogant and incapable of understanding why marriage should stop him sleeping with other women. All of these qualities were exacerbated when he drank, which he did uncontrollably. Bills and arguments mounted up. When Randolph behaved appallingly, or ran up debts he couldn’t cover, it was to his parents that Pamela ran for help. They, increasingly, took her side, which was another source of friction in an already fraught web of relationships. It was after Randolph finally got his posting abroad that the problems really began. In January, 1941 his Commando unit set sail for Egypt. Before their ship had even docked on the other side of the Atlantic he’d lost more at the gaming table than he could possibly pay back. Once Pamela had fixed the financial disaster her wayward husband had forced upon her, she deftly, single-mindedly, began to fashion a new, independent life for herself. Before the end of spring she had started sleeping with Harriman. Randolph was incandescent when he discovered his wife’s infidelity. This was largely because he was convinced that his father had at the very least tolerated, and at worst actually encouraged, an affair that was being conducted beneath his nose. After all, the situation presented a clear political advantage to Winston. And so although Pamela did not create the tensions that run between father and son – they had a long history of their own – her actions brought matters to a head. Winston and Randolph’s bond had always had an almost romantic intensity. Winston was obsessed with his son, claiming that he would not be able to continue leading the country if anything were to happen to him; Randolph was devoted to his father. 
They had spent the last decade living in each other’s pockets: drinking, plotting, gambling, talking and quarrelling. But this closeness masked some profound difficulties. Throughout his life Randolph had struggled to find a way of marrying the outsized expectations Winston had thrust onto his shoulders with the need to provide his father with the asphyxiating loyalty he demanded. Every time Randolph tried to fashion an opportunity for himself, or attempted to assert an independent position, he found himself accused of sabotage. He had been Winston’s most passionate defender during his time in the wilderness, an unfailing source of affection and reassurance. And yet when Winston formed his government in 1940, there was no place for Randolph.

All of this had lain under the surface for years; now it erupted. Volatile, unable to control their emotions, the two men launched into rows that frightened anybody who witnessed them. Winston became so angry that Clementine feared he would have a heart attack; Randolph stormed out of rooms in tears, swearing that he would never see his father again. Although a fragile peace was restored, it could not last. Randolph was unable to reconcile the deep animal love he bore for his father with what he regarded as Winston’s treachery. Nor could he understand why his parents continued to show Pamela such open affection. Winston reacted violently to his son’s reproaches. Wrapped up in his own consuming sense of destiny, and unable ever to read what was going on in anybody else’s heart, he did not see that he had done anything wrong.

As Pamela moved serenely from one affair to the next, father and son fought, again and again, opening deeper and deeper wounds. Randolph and Pamela’s divorce was confirmed in 1945. Randolph could survive this, but the damage to his relationship with Winston was irreparable. They would never recapture the intimacy they had enjoyed before the war.
Randolph had married Pamela to make his father happy, and yet he only succeeded in alienating the man he loved more than anybody else on the planet.
985791cc5f7b84353a0be20ce0e5833e
https://historynewsnetwork.org/article/179783
Teachers, Keep Hope about the Minds You Influence
Teachers, Keep Hope about the Minds You Influence Professor Donald Treadgold of the University of Washington. How and to what degree does a teacher impact a student? I doubt that we will ever be able to gauge the matter. I believe it may boil down to matters of hopefulness and pessimism and the moral imperative of making a choice between them. Surely each one of us, professional educators and laymen alike, impacts the lives of the people we cross paths with, but we often don't know which ones or to what degree. We must remain hopeful as an article of faith.

I once had a memorable professor at the University of Washington, Donald Warren Treadgold, an eminent scholar of the Soviet Union. Let's be kind and just say that this man, a cold warrior extraordinaire, knew his own mind. He was more than a little famous for that. Professor Treadgold was a prolific author, and his work was known throughout the world. He authored Lenin and His Rivals; The Great Siberian Migration; Twentieth Century Russia; The West in Russia and China (two volumes); and Freedom: A History.

When I first wandered into one of Professor Treadgold's classes, almost all of my study had been of western Europe. I was a stranger in a strange land in my attempts to learn about Russia and the Soviet Union. Considering myself a hotshot, I plunged forward. But wait. The rub was that Professor Treadgold attempted to teach me a great deal that I found myself resisting at every turn. It all took place within the constraints of academic etiquette, but make no mistake, this was a slugging match. And it was a mismatch, for he knew so much, and I knew so little. I considered him an old relic. He considered me a dopey, misguided, poorly informed idealist. I dug in. He persisted.

Throughout the following years, my memory of him remained fresh. I continued to remember his disdain for my viewpoints, his deep learning, his patient demeanor, and the overall gentleness of his character.
And as the decades passed, I found myself incorporating much of what he had vainly tried to teach me. It dripped into me, consciously and subconsciously. I never swallowed it whole, but the slow drip never stopped. I can now firmly say that he had as great an impact on me, both morally and intellectually, as any person I have known.

One day, many years later, I was pecking away at my computer. Suddenly, for no conscious reason, I googled his name. I found that he had passed away two years earlier, of leukemia. Stunned, I gazed out my window. The sun was going down and it looked cold outside. The streets were empty. I placed both hands over my face and sobbed like a little child.
a5b157277d8bf48380389d815b352823
https://historynewsnetwork.org/article/179784
A Personal and Family History of Encountering Prejudice and Intolerance
A Personal and Family History of Encountering Prejudice and Intolerance Anti-Asian sentiment is nothing new in America. Consider the Chinese Exclusion Act of 1882 and the acts that preceded it, especially those that restricted Chinese women from living and working in the United States, as well as the laws that followed, which relegated any person of Asian descent to second-class citizenship. There has always been a deep-seated fear of what white supremacists call the yellow peril, a concept that some historians believe originated as far back as the Greco-Persian Wars. William Randolph Hearst is the villain who, in the 1900s, popularized the yellow peril in his newspapers as a major selling tool in the era of yellow journalism. Whether or not he believed it, he said that America was under threat of an invasion from Japan, thus what he called the yellow peril. We should never forget Hearst and the role he played in creating this deepest of inhumane prejudices. The threat of a Japanese invasion is long gone, but the fear of Asians, their look and skin color, remains deeply ingrained in America's collective psyche.

It would be easy to review, law by law, how masses of Asians have suffered because of discrimination, but that would be nothing new. I am here to tell you about my life, a personal history if you will, as I inadvertently became a part of the wider Asian community in many different countries. My late wife was from Saigon, Vietnam. We met in Saigon, married in Hong Kong, and lived in London, Washington and New York. We had three mixed race children, two boys and a girl, now thriving adults. I have three grandchildren, boys who, because of their antecedents, are part of the Asian continuum. We lived in dynamic cities.

As a soon-to-be-married couple we had a tough time in Saigon. Unmarried and still courting, we did not live together. When we appeared in public as a couple, Vietnamese soldiers mocked and chided us, accusing me of usurping their women and calling my future wife a whore.
We found it better to walk separately, with me behind her, and never to hold hands or otherwise touch in public. Incidental, but no less important, are the memories my wife had of being chased through the streets of Saigon by French soldiers from Africa. From an early age she understood what it meant to be sexually harassed.

We thought life in Hong Kong would be better, and for the most part it was, but prejudice tailed us everywhere we went. In the 1960s, Hong Kong was a progressive, dynamic city with many mixed race couples. European mixed race couples were more easily accepted than I was as an American with an Asian wife. I should note that during the Vietnam War, there was no love for Americans. The war was not very popular in Southeast Asia, of which Hong Kong was a part. Do I attribute the prejudice we felt to my being an American, for some reason easily detectable because of the way I walked, looked, dressed? To a degree, yes, but it was mostly because we were a couple. A Chinese doctor friend said, with a smile, that many Asian men could not understand why a beautiful Asian woman, particularly one from Vietnam, would consort with a pale-faced, big-nosed American. Beyond that popular descriptive utterance he had no answer for why prejudice should be part of anyone's life. I knew it was not part of his life.

Life in London, Washington and New York, at least on the surface, seemed to hold less prejudice than either Saigon or Hong Kong, yet it still existed in many forms, especially for my wife and then for my mixed race children as they grew and we established ourselves on Long Island. The outward expression of the prejudice we experienced was the hard stares of people who viewed us as beings out of the ordinary. Most of Long Island then was conservative and not very progressive. Seeing a mixed couple, often with their mixed race children, out for a meal in public was indeed strange, enough so that most people could not help staring, even for a moment.
We were uncomfortable, but we did nothing to stop the stares. We learned that keeping to ourselves in public was the best defense, though at times I wanted to strike out physically against their stupidity, as I once did in a movie theater in Hong Kong when we faced a crowd of teenagers who attacked us verbally for being a couple.

My wife worked for years on Long Island helping settle Vietnamese, Lao and Cambodian refugees. She served as a court interpreter for many refugees, helping them understand a language that for the most part they did not speak or understand. She worked with them to navigate the intricacies of the benefits they were due. The prejudice those new immigrants felt knew no bounds, but neither they nor she ever complained publicly. Getting and holding jobs, making life work, no matter how trivial what they did may have seemed, was more important than registering a complaint about a life they were trying to understand and manage to survive. Many of these former refugees, now adults, made it through to the new world of opportunity in America. When they first arrived, intolerance, though a concern, was not an issue. In time, they ignored, but never forgot, the unreasonable hatred they knew as newcomers to our so-called hallowed land. For many years hate crimes were not an issue for them. Now that they are, living their lives to the full and educating their children about the faults of hate and intolerance works best for them in our current climate.

I am a Caucasian Jew, my family from Lithuania and Russia. That is normally enough to attract full-bore bigotry. I grew up in a diverse neighborhood in Brooklyn and felt almost no prejudice. It was not until college that I suffered for being not only Jewish, but a New York Jew, a condition I survived with added strength into adulthood. My wife was South Vietnamese, part Chinese and a Buddhist.
By the sheer force of her personality she was able to overcome much of the racial intolerance that permeated Long Island, but she never understood why some people did not like her because she looked different.

My mixed race children, today all worldly adults, are part of a unique fabric that is more like an abstract quilt. They are white and Vietnamese, but with those other fragments blended in. It was quite a mix, and a serious burden for young children to carry. As children they knew they looked different. Every day in their preteen and young teen years they knew the slings and arrows of racism. "Chink" was the epithet with which intolerant and mostly ignorant kids and teenagers usually attacked my sons. My oldest son knew he looked different. He turned to judo in the hope that he could defend himself if attacked. My daughter simply said yes when asked if she knew bigotry, but she did not elaborate.

As a family we never talked about hatred and racial intolerance, but I know this: what my wife and children went through informs my children's lives to this day. In a backward sort of way, the bigotry in their lives has taught them to be better husbands, a better wife and better parents. They are better people for what they learned, for what many other people have never known or, sadly, will never understand.
a77dc042e029024d7146e3f89d0d044b
https://historynewsnetwork.org/article/179785
Richard Minear Reflects on Teaching History, Including Teaching Vietnamese History during the Vietnam War
Richard Minear Reflects on Teaching History, Including Teaching Vietnamese History during the Vietnam War Dr. Minear grew up in Newton outside of Boston and his wife was born in Northampton. He taught at Ohio State University from 1967 to 1970. “That was prime Vietnam time,” he said. “Columbus, Ohio is distinctly not New England. Ohio State is a huge school.” He also taught at the University of Massachusetts Amherst from 1971 to 2008. Dr. Minear graciously answered questions via phone about history, his career, teaching, and what he is doing now in retirement. Did your education in your early life prepare you to eventually pursue a career as a historian? Yes and no. I didn’t set out just to learn languages, but in this neck of the woods, being competent in Japanese, Chinese, or Vietnamese for that matter is a prerequisite. Did you think you would travel to so many continents and experience different cultures? By then, I had lived in Germany for two or three years and in Sweden for six months. When I was an eight-year-old, I went to a Swedish school while my father was on sabbatical. This was 1958-59: my brother was on a Fulbright scholarship in Germany and I was in Heidelberg for my junior year. My parents were in Holland that spring. European winters are god-awful, with not much sunshine. That spring, because of my dad's interest, he took my brother and me on a week-long tour, and we hit Istanbul, Palestine, then Israel on the Jordanian side – Palestine/Israel was my dad's turf – New Testament theology. Then we went to Rome on the way back to Europe. I had had a lot of travel. Which Japanese island have you been to the most? There are four major islands in Japan. I spent most of my time on the main island. UMass has a sister university, Hokkaido University. I spent a summer and part of another year there. My first three years were in Kyoto, with a little time in Tokyo. Is there a historical event that captivates you most? I was born in 1938.
I learned about World War II primarily after the fact. My first ‘political’ memory was of the atomic bomb while I was up in Vermont. I got on a boat on the lake and the sirens started going off and I remember a bonfire. That's part of my background in Japan and a natural focus or interest. I was too young to serve in Korea and I used the educational deferment, which got me through 1968, by which time I was 30 and married. The accident of chronology kept me out of the military during Vietnam, and Vietnam had a major impact on everything that I did afterwards. Ohio State has a quarter system, which means three ‘semesters’ each year. I was beginning to teach six courses that year and I quickly ran out of Japan courses. In the third quarter of my first year, which would have been spring of 1968, it dawned on me that there was no course on Vietnam at Ohio State. Here was a major university without a course on Vietnam and the war, and I proposed a course. Even though I had never had a course on Vietnam, my Asia background gave me some kind of entree. I taught the course in the spring of 1968, 1969, and 1970, and Vietnam had gotten very big. I brought in guest lecturers. Ever since, it has had a major effect on my politics, on my thinking, both having watched it and having read materials on it. I taught about Vietnam at UMass throughout the seventies and it has had an effect on all of my teaching. Was the effect related to how people perceived the Vietnam War or based on how you approached teaching and explaining about it? It very quickly dawned on me that this was more than textbook stuff. I had students who graduated from my class who were in the military. One of the faculty members at Ohio State was also an ROTC instructor; he and a colleague gave a single lecture in my course. It later dawned on me that they were only free to give the Pentagon line. He went back to Vietnam in late spring 1968 and a couple of weeks later was killed.
Students were graduating from the course and then going to Vietnam, and student populations were wrapped up in the anti-war movement. That gives a sense of urgency, a certain seriousness to what you do in the classroom. Growing up in the 1940s and 50s, the high school and college education I had was pretty straightforward and celebratory of the American master narrative. My involvement with teaching about Vietnam and reading about Vietnam basically knocked me off that master narrative. What influenced your interest in history? It all looks different in retrospect than in prospect. A while ago, I looked back at my high school yearbook and several people had said, “You're going to make a great professor!” They were way ahead of me! My background was liberal arts: English, history, and language. I knew I didn't want to go into theology. I was a history major as an undergraduate, although it wasn't much of a major. I spent my junior year abroad in Heidelberg and we went to classes, but there was no attendance, no grading, and no exams. It was great for languages and for other purposes, and then came graduate school. I can remember Christmastime in 1960, after I graduated from college, when my family was in the living room. The question was, “What will Richie do next year?” There wasn't any drive on my part or any consuming interest driving me to history, but once I got there, I was not sorry. I had seen enough of German and European history, which is what my undergraduate major was mainly about, to realize that it was a pretty trampled, congested field, and somebody had told me about two-year programs in Asian studies. Yale had one, Harvard had one, and so did Berkeley. It was only two years; what could go wrong? That's what got me into graduate school and into Japanese language. How did the perspectives acquired through your education influence your career as a historian? I think the steering was more from the outside world than the education itself.
The education that I got made possible the things that happened, but I think it was more stuff outside of the classroom. Looking back to the post-Vietnam era, Vietnam happened, and it had a major impact on my teaching. From 1964 to 1966 I was in Japan as a Fulbright graduate student, and by the time I got back, the war was a much bigger topic in this country; by the late 1960s it was heating up. One of the fortunate things for me, first at Ohio State and then here in Massachusetts, was that I was the only Japan historian, which meant that except for rare occasions I wasn't team-teaching or preparing my students to take an advanced course in the subject with someone else. I had an unusual independence when it came to coverage. One of the major problems in history teaching is the compulsion, whether felt or actual, to cover everything: teaching Japanese history is assumed to mean covering Japan from A to Z, just as others cover the United States or Germany from A to Z. If there are others in your department who are likely to get those students the next year, if your teaching has to cover what other colleagues expect it to cover, then that is one thing; but I never faced that issue. That's extraordinary freedom. It has been important for me all the way through. The standard introductory courses are large, and you have discussion sections taught by graduate students. All the way through, I was able to do my own discussion sections, so I rarely taught more than 60 people in one course. It was a Monday and Wednesday lecture and a discussion. I led the discussion sections and got to know the students as a result. It was important for me, not for them, to know where they were coming from. How different were students’ specialties when they came to your survey course? Here at UMass, the Japan survey was open to everybody, so the history majors were a small part of that.
I didn't really register which students were history majors or engineering or the sciences. This had an impact on my sense of audience, so I could give them some kind of perspective. One of your questions has to do with pedagogy: what we ought to be teaching and how to teach it. Every year I had one and often two Japan survey courses, taught to people who would never have another Japan course and who came from all parts of the university. This kind of shaped my ideas about teaching. We often think of covering the field, and in my experience, we don't teach history, we talk about history. We should teach a habit of mind, not a list of facts. This may have changed since my student days, but I'm not sure. Students can take our courses without really getting a sense of what it means to think historically. Nowadays, things are different. Back when I was studying Asia, there were a handful of Asia experts at a dozen major universities. Nowadays, the US has many Asia historians. It wasn't true back then. The name I knew was Edwin O. Reischauer. He had gotten into the field and was a major figure in the beginnings of Japanese studies. Later, Kennedy appointed him U.S. Ambassador to Japan. He was one of the names that attracted me to Harvard, but as soon as I got there he left, and I left before he got back. We rarely teach about the history of the field. We rarely teach about who the historians are, who Reischauer was. What were the American Japanists doing in the World War II era? I didn't have to cover more than six to eight people to cover the field. This was after Vietnam had shaped my thinking. Those scholars had come through World War II and its patriotic fervor, when Japan was the enemy. Hence, they had a certain take on Japan. Some of them had a negative experience of Japan, but certainly the idea of ‘an American nation spreading democracy’ was shared almost across the spectrum.
Part of this is teaching about the background of the field and part of it is more practical – in my syllabi, this is after I had gotten my feet on the ground, after teaching about Vietnam for a while – I gave biographical information on every author we encountered in the course, and I included myself. Date of birth, educational background. Every discussion session, once a week, started out with a quiz. The first question was, “Who is the author?” and another question likely was “When was this written?” and maybe a third question was, “Where was it published?” Was it Life magazine or was it the Harvard Journal? That kind of questioning underlined for the students that a major, major part of history is analyzing sources. Who is this person and why is he saying these things about Vietnam and Japan in World War II? Who is his or her audience? The emphasis for me in teaching: yes, the subject was Japan, but the underlying goal throughout was to get people to read critically, to think critically, not just about the authors we read, but also about me. At the end of the course, say, “OK, this was Professor Minear’s course; who is he and where is he coming from? Then factor that in.” How many history courses today have biographical data on the professor and everybody else? And the continued emphasis in discussions and lectures: Who is this author? When did he write? Was it before the Tet Offensive or after? For what audience? It makes the students into players rather than audience members. In your opinion, what is the purpose of history? Who are its intended consumers, and does the historian have a social responsibility? I think for everyone, it’s different! With the audience, something we tend to forget is that – in my case – I began graduate study as a 21-year-old, and I think that’s true for many of the folks. The sense of the audience then is nonexistent; you are just trying to get through the next exam, get your master’s, and decide whether to go on.
But once you get past that and into your thesis, your audience is the three or four guys – and they were all men back then – on your thesis committee, all of them distinguished academics. I can remember thinking for a while in my thirties that my audience for my writing wasn’t anybody at UMass. My audience was 30 or 40 Japan experts like me, scattered around the country but limited to the ‘in-group’ of the real experts. I can remember thinking, at that stage, that maybe my audience was, in part, historians like me in Japan. If I was really good, they might learn something about Japan from what I had to say. In my teaching and publishing, I gradually got away from that kind of hyper-professional focus on specialists and toward what was useful for non-experts, the students I was teaching in my courses or general readers. Each historian has a different path to follow, and maybe everyone has different expectations and a different take on this. My ‘5 minutes of fame’ was Dr. Seuss Goes to War: The World War II Editorial Cartoons of Theodor Seuss Geisel, and soon after that came out, I gave a talk in Dr. Seuss’s adopted hometown in California, at the University of California, San Diego (UCSD). They had posters around the campus with Dr. Seuss and my name on them. I knew one of the Japanists at UCSD. I bumped into him after the talk, and he said, “I saw this poster. I knew it wasn’t you because you were a Japan person.” The idea that a Japan person would write about Dr. Seuss didn’t compute, and yet that book got me on Good Morning America and All Things Considered. Part of teaching about writing got me into E.B. White, and I did an essay tracking the changes across the various editions of his book, The Elements of Style. The idea that a Japanist could do Dr. Seuss and E.B. White… Back on Japan and speaking to the Japanese: my second book was Victors' Justice: The Tokyo War Crimes Trial. I was writing it in the middle of the Vietnam War, in anger.
This was the Pacific counterpart of Nuremberg. When you look at the trial in retrospect, it was heavily a propaganda operation, and it had a serious impact. In that sense, there I was, writing a counter-piece on a trial that wasn’t exactly an exercise in justice. That gets translated into Japanese, and it reinforces what the hard right in Japan was saying about the war and about the Tokyo trial – that it was a put-up job. They were coming at it from a political position diametrically opposed to mine: the context makes a huge difference. They reacted; they loved the book. The Japanese have an expression, a proverb, that if something is big news about Japan abroad, it tends to feed back into Japan. The Japanese press sits up and takes notice; I guess it’s much less so for the United States, partly because of size, reach, and influence. What other people say matters [in Japan]. I have done a lot of translations of Hiroshima survivor accounts and, more recently, translations of "ephemera" (pamphlets, wall posters) produced by Japan's left-wing activists. It’s fascinating how stuff that you do for one audience can be read very differently by another audience. I think maximum clarity about your own politics, your own stance, your own commitments – not simply clarity, but not hiding your politics – can give your readers enough material, whether it’s a biographical squib on a syllabus or a translator’s introduction to a translation. That gives your audience some clue as to who you are and where you are coming from. One of the first major translations I did was of a WW2 battleship epic, Requiem for Battleship Yamato. The battleship sailed out at the end of the war into the Okinawa campaign on essentially a suicide mission. What were they going to do with the battleship? They turned it into a floating platform that would maybe have some minor effect on the battle. Without air cover, it would be destroyed rapidly.
One of the officers on Yamato survived, one of only three hundred or so out of a crew of 3,332. He wrote his account. We tend to look down on military history, but it was a stunning, gruesome, yet gorgeous account of his own experience, of truth-seeking. I showed a draft to colleagues, among them a European historian and a Canadian classicist. One of them said to me, “Any classicist (of whatever tradition) would appreciate Requiem for Battleship Yamato.” No matter which classics (for example, if you're a classical scholar in the European tradition or the Indian tradition or the Chinese tradition), there’s horror on one hand and human nobility on the other, but also underlying human need, a common humanity. I think that’s part of what we owe to the public and to our kids, to get across with our work. When I started teaching the Vietnam course, I very quickly found a classic Vietnamese poem, The Tale of Kieu, written before the French takeover of Vietnam. It’s beautiful, utterly unconnected to the war, and yet. Kieu is a woman who undergoes great suffering, largely not of her own devising, and yet survives. The author is Vietnam's Shakespeare. I can remember one fellow here at UMass in the Vietnam course who had to read this; he had served in Vietnam. He came up after I had him write a paper, and he said, “I feel closer to Kieu than I ever felt with any Vietnamese.” If you approach Japan, China, Vietnam, or Russia through classics and poetry, it becomes a little harder to accept unthinkingly what used to be in the textbooks and the press. What used to be in the newspapers and comics. I used to do a lecture on the Sergeant Rock comic book, Ali My; it was a story of a U.S. operation in a war, and Ali My is an anagram for Mỹ Lai. A gruesome American massacre of Vietnamese civilians gets transmuted into a heroic battle. How many of us grew up reading comics and war comics? Somebody needs to study videogames for their images.
Who is the ‘other,’ who is the bad guy, how are they depicted, what are the gender dynamics? Videogames are having a far greater impact on our kids than any teacher in a classroom. Who is our audience and what do we know about our audience? What de-programming needs to be in place? One of the major influences on my intellectual development was Orientalism by Edward Said. That book blew my mind! I was already coming off of Vietnam disillusioned. Said's book takes the entire tradition of European and American thinking about the Arab world and points out what a coherent, self-congratulating, and denigrating constellation it is. When the book came out, the Journal of Asian Studies commissioned essay reviews from three Asia specialists: one on Japan, one on China, and one on India. I was the Japan person, and almost everything that Edward Said says about orientalism transposes beautifully onto prewar and wartime American thinking about Japan. You’re inside a tradition and you can’t see it as a tradition because it’s the world; but when somebody points it out from outside the tradition, or from a position within it, when somebody nails it so beautifully, you can say, a-ha. This is a world view. This is a coherent system, and we need to re-examine all of it. There was true excitement there. What we ought to be doing in teaching is somehow start conveying that excitement, that possibility, to the folks who are in classrooms. And then to say, OK, who is Edward Said? Where’s he coming from? And who am I – either as a professor or as a student – and where am I coming from? How does this all factor into how I read Edward Said and how I look at the American or European hang-ups about Japan or the Orient? It’s a game of mirrors, but it’s a deadly serious game of trying to be aware.
Not simply aware of what the tradition is, as a matrix, of what’s handed down, but also of myself and how I’m reacting and how I’m contributing in one way or another to the perpetuation or the challenge. It’s only when you’re getting into it at that level, that order of operation, that you begin to see what a fascinating and difficult and impossible task we all have. But that’s where it goes back to the syllabus – biographical sketches of all of the authors, and the dates when they’re writing. Who is this person? When was she writing? Where was this published? For the most part, we just don’t make our students aware that there is this whole level of endeavor and thinking. How many times have you run into people who said, “I had history in high school and I hated it”? Don’t blame the teachers; they’re doing the best they can, given the constraints of SATs and covering the waterfront and all that. Part of the problem is that history is not exciting for most people because they don’t see it for what it is. They can get into a historical novel because in one way it comes alive, but when you read a book like Said’s Orientalism, all of a sudden the whole board game shifts. The whole perspective gets challenged in ways that can only be useful. Who writes history? By and large, of course, it’s the victors, but we don’t know who the victors are until much later. They cover stuff up. It has to be uncovered by oddballs like historians who don’t buy into the master narrative. A story about Vietnam: when I was teaching the Vietnam course in the mid- to late 70s, the class size was smaller. There was less interest after a while. It was a class of maybe 40 kids, and we got two-thirds of the way through, and I said, “OK, you’ve got some play here in the last several weeks; what topics would you like to cover?” I listed several possibilities, including Mỹ Lai.
After the class, one of the guys who had been sitting in the back all semester said, “Well, Professor, if you were going to cover Mỹ Lai, I’d be happy to answer questions.” He had been in Lt. Calley's platoon at Mỹ Lai. He did two class periods and took us through training. He had been through Vietnamese language training. Your jaw drops. What have you been doing since your retirement? I retired in 2008, when I was 69. Since then, I’ve published three or four book-length translations. I’ve kept some of that going and I’m doing a little bit now. I’m still living in Amherst; I stopped teaching cold turkey and haven’t gone back to part-time teaching. It’s been 12, 13 years now since I gave a talk. For a while, with the Dr. Seuss book, I was giving talks on a regular basis, but I did stop and I’ve been happily [retired]. Amherst is a neat place to retire; it’s a beautiful fit. The town is close to the hills and the roads are good for biking. I do a 25-30-mile ride when I go out. I hike in the hills; I bike north and south along the Connecticut. There’s an online journal, The Asia Pacific Journal: Japan Focus, and my most recent stuff is there, including, as I mentioned, a Japanese leftist pamphlet about a Japanese massacre of Chinese forced laborers in the summer of 1945. I’ve kept a toe, or two toes, in. I loved teaching while I was doing it, but I’m happy not to be doing it now. I’ve always been active. I have two sons, now in their 50s, and for a while we did triathlons as a team. I swam, one of them biked, and one of them ran. If you did a great time you could qualify for the Ironman. We weren’t in that category, but it was fascinating just to see how fit some of the folks were. For many students then and now, martial arts offers a way into Japanese culture. One of my students from 20 years ago sat in on and took one of my courses. She's now an MMA practitioner, in the top ten in her weight category.
You take them where they are, try to figure out where they’re at and what you can do that might be useful, not in terms of profession, but in thinking about Japan, about life, about what it means to be human. Those folks are maybe less likely to doubt the basic humanity of the Vietnamese or Japanese. Martial arts practitioners—or fans of anime or Zen meditators—have an advantage. One toe in the door.
419d8b182c70f7a90891845ec813f3ab
https://historynewsnetwork.org/article/179786
Paying the Price: Our Veterans and the Burden of Parkinson's Disease
Paying the Price: Our Veterans and the Burden of Parkinson's Disease Parkinson’s disease is the world’s fastest-growing brain disease, growing even faster than Alzheimer’s. The number affected worldwide has doubled in the past 25 years and, absent change, will double again in the coming generation. In the U.S., 1.1 million Americans bear its burden, up 35% in just the past decade. The toll is especially great on veterans; 110,000 have the debilitating disease. Veterans are at high risk for at least three reasons. First, many were exposed to toxic herbicides like Agent Orange during the Vietnam War and other conflicts. Richard Stewart is one of those affected. He is a former Green Beret who served as a platoon leader in Vietnam for the U.S. Army’s famous 101st Airborne Division. He, like thousands of other veterans and millions of Vietnamese, was often soaked by the 45 million liters of Agent Orange (“pretty nasty stuff” in his words) that were sprayed in the country. The chemical, which derived its name from the large orange barrels in which it was stored, killed vegetation and crops and contributed to birth defects, cancer, and Parkinson’s disease. Today, Stewart lives in upstate New York with his wife, a “flower child who peacefully protested the war.” He still walks 2.5 miles and does 200 push-ups daily, is a member of local veterans’ groups, and says, “I only have Parkinson’s. A lot of people are worse off.” Herbicides are not the only chemicals contributing to Parkinson’s disease among veterans. Trichloroethylene, or TCE, is another. TCE has been used to decaffeinate coffee, clean silicon wafers, and remove grease. The military used the dangerous chemical to clean engines and vehicles. At Marine Corps Base Camp Lejeune in Jacksonville, North Carolina, TCE and 70 other chemicals poisoned the base and its water supply for 25 years.
Over one million service members, their spouses, and children were exposed to its toxic effects, leading to miscarriages, birth defects, cancer—and Parkinson’s disease. Many drank contaminated water or inhaled TCE that had evaporated into their homes, like radon, from polluted groundwater. The consequences of that exposure are still being felt 30 years later. Finally, head trauma contributes to Parkinson’s disease. A single head injury causing loss of consciousness or memory loss can triple the risk of Parkinson’s. Repeated head trauma raises the risk even further. These injuries are common in the military. According to the U.S. Department of Defense, nearly 400,000 service members have had a traumatic brain injury since 2000. Another eight million veterans have likely experienced such an injury. Of those with moderate or severe injury, one in fifty will develop Parkinson’s within 12 years. So what can we do to help our veterans? The first and most important step is to prevent those who serve from ever developing the disease. Banning harmful pesticides and chemicals like TCE, which the Environmental Protection Agency has proposed to do, is an important step. We also need to clean up contaminated sites throughout the country, many of which are located on current or former military bases. In addition, service members must have proper equipment to minimize the risk of head injury. Next, we need to advocate for those who have already been harmed. Veterans who have Parkinson’s and were exposed to Agent Orange are now eligible for disability compensation and health care. Some efforts have been made to help those who have Parkinson’s tied to their service at Camp Lejeune. But these efforts are insufficient and have excluded many who have been injured. For example, in 2019, the U.S. Navy denied civil claims from about 4,500 people harmed at Camp Lejeune. We also need more research to prevent, measure, and treat the condition.
Despite Parkinson’s growth over the past decade, funding from the National Institutes of Health for the condition, adjusted for inflation, has actually decreased. Anyone anywhere with Parkinson’s should receive the care that they need.  The Veterans Health Administration has long had dedicated centers to research and treat Parkinson’s.  However, not every veteran lives near one of these centers.  Telemedicine is one way to expand the reach of care, but some veterans do not have internet access.  Others need in-person in-home care and support.  Increased access and novel care models can help ensure that no one suffers in silence. Finally, better treatments for Parkinson’s disease are lacking.  The most effective medication for the condition is now 50 years old, and we have had no major therapeutic breakthroughs this century.  The economic burden of Parkinson’s disease is over $50 billion per year.  Federal and foundation support is less than 1% of that total.  That will not get the job done.  We must increase our research efforts ten-fold to change the course of Parkinson’s as we did for polio, HIV, and COVID-19. Veterans have served and sacrificed too much to have Parkinson’s be their fate.
05b7b47b416b2e9cd33c463281256c37
https://historynewsnetwork.org/article/179797
“The Greatest Purveyor of Violence in the World”
“The Greatest Purveyor of Violence in the World” Fifty-four years ago, standing at the pulpit of Riverside Church in New York City, Martin Luther King, Jr., delivered his now-famous “Beyond Vietnam” sermon. For the first time in public, he expressed in vehement terms his opposition to the American war in Vietnam. He saw clearly that a foreign policy defined by aggression hurt the poor and dispossessed across the planet. But it did more than that. It also drained this country of its moral vitality and the financial resources needed to fight poverty at home. On that early spring day, exactly one year before his assassination in 1968, Dr. King warned that “a nation that continues year after year to spend more money on military defense than on programs of social uplift is approaching spiritual death,” a statement that should ring some bells in April 2021. In his sermon, Dr. King openly wrestled with a thorny problem: how to advance nonviolent struggle among a generation of Black youth whose government had delivered little but pain and empty promises. He told the parishioners of Riverside Church that his years of work, both in the South and the North, had opened his eyes to why, as a practitioner of nonviolence, he had to speak out against violence everywhere — not just in the U.S. — if he expected people to take him at his word. As he explained that day: “As I have walked among the desperate, rejected, and angry young men, I have told them that Molotov cocktails and rifles would not solve their problems… But they asked, and rightly so, ‘what about Vietnam?’ They asked if our own nation wasn’t using massive doses of violence to solve its problems, to bring about the changes it wanted. 
Their questions hit home, and I knew that I could never again raise my voice against the violence of the oppressed in the ghettos without having first spoken clearly to the greatest purveyor of violence in the world today: my own government.” A Global Pandemic Cries Out for Global Cooperation In 2020, the planet was swept up in a devastating pandemic. Millions died, tens of millions suffered. It was a moment, in Reverend King’s spirit, that would have been ideal for imagining new global approaches to America’s ongoing wars of the past century. It would similarly have been the perfect moment to begin imagining global cooperative approaches to public health, growing debt and desperation, and intellectual property rights. This especially given that the Covid-19 vaccines had been patented for mega-profits and were available only to some on this suffering planet of ours, a world vulnerable to a common enemy in which the fault lines in any country threaten the safety of many others. Internationally, at the worst moment imaginable, U.S.-backed institutions like the World Bank and International Monetary Fund continued to demand billions of dollars in debt payments from impoverished countries in the Global South, only forgiving them when their governments fell into step behind the U.S. and Europe, as Sudan has recently done. Moreover, Washington had a golden opportunity when the search for a Covid-19 vaccine threatened to change patent laws and force pharmaceutical companies to work with low-income nations. Instead, the U.S. government backed exclusive deals with Big Pharma, ensuring that vaccine apartheid would become rampant in this country, as well as across the rest of the world. By late March, 90% of the nearly 400 million vaccines delivered had gone to people in wealthy or middle-income countries, with vaccine equity within those countries being a concern as well.
53bc22670d4b14f5802348e142637668
https://historynewsnetwork.org/article/179802
If It’s Not Jim Crow, What Is It?
If It’s Not Jim Crow, What Is It? The laws that disenfranchised Black Americans in the South and established Jim Crow did not actually say they were disenfranchising Black Americans and creating a one-party racist state. I raise this because of a debate among politicians and partisans on whether Georgia’s new election law — rushed through last month by the state’s Republican legislature and signed by Gov. Brian Kemp, a Republican — is a throwback to the Jim Crow restrictions of the 20th century. Democrats say yes. “This is Jim Crow in the 21st century. It must end,” President Biden said in a statement. Republicans and conservative media personalities say no. “You know what voter suppression is?” Ben Shapiro said on his very popular podcast. “Voter suppression is when you don’t get to vote.” The problem with the “no” argument here is that it mistakes both the nature and the operation of Jim Crow voting laws. There was no statute that said, “Black people cannot vote.” Instead, Southern lawmakers spun a web of restrictions and regulations meant to catch most Blacks (as well as many whites) and keep them out of the electorate. It is true that the “yes” argument of President Biden and other Democrats overstates similarities and greatly understates key differences — chief among them the violence that undergirded the Jim Crow racial order. But the “no” argument of conservatives and Republicans asks us to ignore context and extend good faith to lawmakers who overhauled their state’s election laws because their party lost an election. Southern lawmakers at the turn of the 20th century weren’t shy about their motives — “Whenever there were political questions involved, of course, we looked to the interests of the party, because they are the interests of the State,” one Democratic delegate to the 1898 Louisiana constitutional convention, which sharply restricted the franchise, said at the time — but their laws had to be more circumspect. 
“Those who sought to prune the Southern electorate were hampered by various constitutional restrictions,” the historian J. Morgan Kousser explained in his 1974 book, “The Shaping of Southern Politics: Suffrage Restriction and the Establishment of the One-Party South, 1880-1910.” Between the 15th Amendment, which prohibited overt discrimination on the basis of “race, color, or previous condition of servitude,” and the 14th Amendment, which allowed Congress to slash the representation of states that disenfranchised adult males for any reason other than crime or rebellion, Southern lawmakers could not just write Black voters out of the electorate. “The disenfranchisers were forced to contrive devious means to accomplish their purposes,” Kousser writes. According to Kousser, the first wave of suffrage restriction after Reconstruction relied primarily on laws and practices that “decreased the influence of opposition voters but did not actually prohibit them from exercising the franchise.” Some states, for example, took the right to name their local officials away from voters and granted it to governors and state legislatures, a practice that “guaranteed that white Democrats would rule even in Republican areas.”
06606e5879b48c0f77a2f15919d79bdb
https://historynewsnetwork.org/article/179806
Methods of Power: How do Authoritarians Rule? (Review)
Methods of Power: How do Authoritarians Rule? (Review) The intellectual left reacted to Donald Trump’s election in 2016 in two very different ways. One group, like so many in the general public, immediately fell into full panic mode. The historian Timothy Snyder, for instance, rushed into print with a book called On Tyranny and in an interview declared it “pretty much inevitable” that Trump would follow Adolf Hitler’s example by declaring a state of emergency and staging a coup. Others urged caution. Snyder’s Yale colleague Samuel Moyn and Oxford’s David Priestland insisted in a New York Times opinion piece that “there is no real evidence that Mr. Trump wants to seize power unconstitutionally, and there is no reason to think he could succeed.” Trump, they claimed, was in reality a weak leader, despite his ability to exploit populist discontent. What was needed, they implied, was a focus less on his tweets and more on the neoliberalism and endless war that had provoked the discontent that brought him to power in the first place. The debates continued right through the 2020 election, with Snyder and many others continuing to warn of jackboots in the streets and Moyn and numerous other commentators insisting that the warnings themselves mostly worked to distract our attention from the staggering structural problems that the country faces. The events of January 6 might seem to have resolved the debate. Trump’s incitement of the Capitol attack was a treasonous crime. The ragtag rioters caused five deaths and put many other lives in danger. But what Moyn in these pages called a “parodic coup” (others dubbed it the “Q d’état”) in fact had no chance of delaying the certification of Joe Biden’s victory for more than a few hours, let alone of overthrowing the federal government. The sharply different views of the Trump presidency reflect two very different understandings of politics. 
The “ring the alarm bells” camp has tended to see right-wing authoritarianism as a powerful, malevolent force that can operate in at least partial independence from prevailing social and economic conditions. It can arise and destroy democracy wherever people lack the moral and institutional force to successfully oppose it. Even the erosion of relatively minor norms can have serious consequences, because it sets a precedent for more important transgressions. The “let’s focus on the larger problems” group, on the other hand, attributes the current manifestations of authoritarianism to broader social and economic conditions. Its members hold that the United States, while pathologically dysfunctional, is pathologically dysfunctional in a different way from the societies in which fascist dictators came to power in the 20th century. There, the virtual collapse of political order and civil society as a result of world war and economic depression created an opening for revolutionary right-wing mass movements. Here, on the other hand, neoliberal forces have proved perfectly capable of preserving their economic and political power through America’s existing, deeply imperfect but fundamentally stable constitutional system. It is the very dominance of these forces that generated the recent populist upsurge—and under Trump, the same forces also managed very largely to co-opt and neutralize it. (It is no coincidence, in this view, that among Trump’s major legacies are corporate deregulation and tax cuts for the superrich.) America’s problems, in the final analysis, can only be overcome through fundamental economic and political reform. Ruth Ben-Ghiat, a distinguished historian of Italian fascism and a prolific political commentator, belonged firmly to the alarm-bells camp over the past four years. 
Less than two weeks into Trump’s presidency, she wrote an article titled “Donald Trump and Steve Bannon’s Coup in the Making.” Her new book, Strongmen: Mussolini to the Present, elaborates on that position in a full-length survey of the ways ambitious strongmen can damage or destroy democratic regimes. The book features Trump prominently, but it sets him in a rogues’ gallery of authoritarians and would-be authoritarians ranging from Hitler and Benito Mussolini to late-20th-century dictators like Augusto Pinochet, Moammar El-Gadhafi, and Idi Amin to present-day populists like Viktor Orbán, Narendra Modi, and Jair Bolsonaro. These strongmen, Ben-Ghiat argues, all followed roughly the same “playbook” for seizing power and holding on to it, despite the very different societies in which they emerged. The strongman, she insists, is a modern political type—indeed, the modern political type. “Ours is the age of the strongman,” she states categorically. Ben-Ghiat’s story, like Snyder’s, is at its heart a moral drama. The crucial factors at play are not social and political conditions but rather unscrupulous ambition and greed, on the one hand, and the determination (or the lack thereof) to resist it, on the other. This point of view is a provocative one. Unfortunately, like many in the alarm-bells camp, Ben-Ghiat tends to treat it as self-evidently true, and she therefore devotes far more attention to the strongmen’s own actions than to the factors that allowed them to rise and determined whether or not they succeeded. The problem, as her own book reveals, is that authoritarians do not simply prevail through violence: They seduce, they appeal, they exert charisma. And to understand why the seduction works, we cannot look at the strongmen alone; we also need to consider the people who fall under their spell.
21fa1b2ac31c37c0c9848f98556684a1
https://historynewsnetwork.org/article/179808
Can America’s Problems Be Fixed By a President Who Loves Jon Meacham?
Can America’s Problems Be Fixed By a President Who Loves Jon Meacham? In February 2019, Joe Biden paid the University of Delaware a visit to celebrate the renaming of its public policy school in his honor. Biden, a famously middling student, feigned sheepishness over his alma mater’s tribute and suggested the honor really belonged to his sister and perennial political adviser, Valerie. “She graduated with honors,” Biden explained. “I graduated.” But Biden had no doubts about the brilliance of the man seated next to him on the stage: Jon Meacham, a Pulitzer Prize–winning biographer who has spent the last two decades pounding out bestselling accounts of American presidents such as Thomas Jefferson, Andrew Jackson, and George H.W. Bush. “You tend to find genius in those with whom you agree,” Biden said in a very loose paraphrase of Ralph Waldo Emerson. “I think he’s a genius.” Meacham, sporting a black suit and wide cornflower tie nearly identical to the vice president’s, gave a hearty laugh through his wide and toothy grin, the faintest of blushes spreading across his face. To mark the occasion, Biden had invited Meacham to Newark, Delaware, for a conversation about the biographer’s recent volume, The Soul of America: The Battle for Our Better Angels, a 416-page meditation on how enlightened political leaders, propelled by a civic-minded citizenry, have rescued America at its darkest hours. Meacham had just finished explaining that the country’s soul “is not all good or all bad” but rather an abiding conflict between “our better angels” and “our worst instincts.” Biden’s praise came in response to Meacham’s assertion that politicians are far more often “mirrors of who we are,” rather than “molders” of it. “That’s an uncomfortable truth,” Meacham said, an oblique reference to the white supremacist who at the time was behind the Resolute Desk. But history proves not all is lost, Meacham explained. 
“If we realize that we’ve come through, in this journey to make a more perfect union, storm and strife, and that that’s far more the rule and not the exception,” he said, “that history gives us an orienting capacity.” Two months before Biden announced his third run for the presidency, the intellectual underpinnings of the campaign were already in place. His ensuing candidacy was an exercise in moving Meacham’s thesis from the page to the stump. Biden cribbed Meacham’s book title for his campaign framing, a “battle for the soul of the nation.” Meacham occasionally weighed in on the narrative and thematic elements of Biden’s major speeches. He even made a five-minute appearance at the Democratic National Convention over the summer to endorse Biden and define the stakes of the election on the terms he presented in his book. As the Biden candidacy gave way to the Biden presidency, Meacham lingered on as a sort of historical and spiritual adviser to a White House beset by crisis. Biden has an agenda that reflects the center of his party, not the center of the nation, but he remains a staunch institutionalist who insists on bringing the country together through leadership that, in Lincoln’s words and Meacham’s assessment, appeals to the “better angels of our nature.” The question now facing the Presidency According to Jon Meacham is what to do when all those better angels are crushed beneath America’s institutions.
9106e6e9e096eb6fae5c37403f9ab065
https://historynewsnetwork.org/article/179810
Women’s College Sports Was Growing. Then the NCAA Took Over
Women’s College Sports Was Growing. Then the NCAA Took Over Fifty years ago, Carole Oglesby helped establish a governing body for women in college athletics at a time when the National Collegiate Athletic Association only oversaw men’s sports. When she saw images earlier this month of the inferior training facilities provided for the women’s NCAA basketball tournament—a few dumbbells and some yoga mats—she realized that some things still haven’t changed. “I really shake my head on this one,” Oglesby said. “Who is going to make these powerful bodies, like the NCAA, do the things that they promised they were going to do?” As the first president of the Association for Intercollegiate Athletics for Women in the 1970s, Oglesby was at the heart of the long and contentious history of seeking equal opportunity, resources and exposure for women’s college sports. The AIAW was briefly the dominant governing body for women’s collegiate athletics but lost a bruising legal battle with the NCAA that sank the organization in 1982. The AIAW was founded in 1971 and thrived on a shoestring budget until 1978, when federal courts forced universities to comply with Title IX. The NCAA subsequently launched a takeover of women’s sports that drastically curtailed the percentage of women coaches and administrators in college sports. Oglesby and others say that what resulted from that bitter fight is a system in which, 40 years later, women remain underrepresented and their basketball championship remains underfunded, under-marketed and undervalued compared to men’s March Madness. “Just looking at the results, it’s never been treated on an equal basis,” she said of women’s college sports. “They [the NCAA] talk that line because they have to, that’s what the law requires technically.” The NCAA did not respond to a request for comment.
23a6b6e8062c0ec5d0c1f019f76bb2f6
https://historynewsnetwork.org/article/179812
Philip Roth Was His Own Favorite Subject. What’s Left for a Biographer?
Philip Roth Was His Own Favorite Subject. What’s Left for a Biographer? Philip Roth, who stopped writing in 2010 and died eight years later at age 85, was not sure if he wanted to be the subject of a biography. He was the narrator of his story. King of sitzfleisch, Roth sat at his desk banging out his legacy 340 days a year, starting in his early 20s, returning in over 30 books to protagonists who resembled him: a son of Newark, secular Jew, younger brother and childless bachelor free to indulge his ego and appetites in a country without pogroms. In two senses, his legacy would be the writing: He never had children, so books would be all that would survive him; and his life was there, between all those covers. He insisted that his work not be read as autobiography, but Roth made a career out of doppelgängers and authorial stand-ins, an ongoing game of hide-and-seek with readers. In the 1993 novel “Operation Shylock,” a character named Philip Roth travels to Israel to confront a look-alike, named Philip Roth, who peddles Middle East peace plans while pretending to be the real Roth. He brackets his 1988 memoirs, “The Facts” — one of his few works of ostensible nonfiction — with letters to and from Nathan Zuckerman, his fictional alter ego. When embarking on “The Facts,” he wrote that he was trying memoir because he was tired of the “makeup and the false whiskers and the wig” of fiction — an implicit confession that he was always lurking just beneath his characters. In the end, Roth decided on a biography because he wanted to be known. His fiction courted misunderstanding, but he was wounded when misunderstood. Though living in rural Connecticut got him tagged as a recluse, Roth was a compulsive connector, always pressing himself on people, seducing them. 
After his death, the novelist Nicole Krauss wrote of “the sincerity and absorption with which he listened,” calling him “the most generous audience one could hope to have.” In a group, he was a cutup, a mimic, a gentle teaser, a raconteur, the embodiment of what Zadie Smith, another friend in his old age, called literature’s “Rothian spirit” — “so full of people and stories and laughter and history and sex and fury.” Here was a famous controversialist who needed to be liked or, failing that, to be right: He had scores to settle with ex-wives and, not incidentally, an ex-biographer. By 2012, when Roth gave Blake Bailey access to his papers, friends, little black book and innermost thoughts, Roth had parted ways with two previous biographers, courted another and threatened to sue a third. But Bailey, who had appealed to Roth with a sympathetic ear and a brazen request for the job, persuaded the aging author. On April 6, W.W. Norton is publishing “Philip Roth: The Biography.” It is the fourth biography of an American writer by Bailey, a former public-school teacher who has become one of the great chroniclers of this country’s literary lives. In 2003, he published “A Tragic Honesty: The Life and Work of Richard Yates,” which helped earn the author of “Revolutionary Road” the fame that eluded him during a long, poor, drunken life. Six years later, Bailey returned with a biography of another midcentury drunk of gargantuan talent, John Cheever. When Bailey met Roth, he had just finished work on his biography of the “Lost Weekend” author Charles Jackson, whose aptly titled 1944 novel drew on his personal knowledge of blackout alcoholism. Early in their courtship, Roth asked Bailey, “Do you ever write about people who aren’t constantly drunk, or dead?” Bailey replied, “You would be my first.”
c7d7a3b1f864afa83b234304ec7bf36f
https://historynewsnetwork.org/article/179813
Slavery ‘Not Just About Profit And Suffering’, UK Government-Backed Race Report Claims
Slavery ‘Not Just About Profit And Suffering’, UK Government-Backed Race Report Claims Slavery was not just about making profit and British schools should instead tell a “new story” about culturally African people, a controversial government-backed report has said. The race and ethnic disparities commission said the UK’s education system should focus on parts of the “Caribbean experience” that show how “culturally African people transformed themselves into a re-modelled African/Britain”. It said education about the British Empire should focus on how “Britishness” influenced former colonies and those colonies “influenced what we know about modern Britain”. “One great example would be a dictionary or lexicon of well known British words which are Indian in origin,” the report’s controversial chair Tony Sewell wrote in the foreword. It went on: “There is a new story about the Caribbean experience which speaks to the slave period not only being about profit and suffering but how culturally African people transformed themselves into a re-modelled African/Britain.” Labour called on the government to explain how it published content which “glorifies the slave trade” and urged ministers to “immediately disassociate themselves from these remarks”. The review, commissioned after the Black Lives Matter protests, has been controversial since its inception after Boris Johnson gave control of it to his top policy adviser Munira Mirza, who has previously accused an “anti-racism lobby” of fostering a “culture of grievance”. The government has also been criticised for appointing Sewell as chair of the review after he previously claimed evidence of institutional racism was “flimsy”.
fba1cd93419df46d15400c3b3eda7de9
https://historynewsnetwork.org/article/179814
G. Gordon Liddy, Undercover Operative Convicted in Watergate Scandal, Dies at 90
G. Gordon Liddy, Undercover Operative Convicted in Watergate Scandal, Dies at 90 G. Gordon Liddy, the undercover operative whose bungling of the Watergate break-in triggered one of the gravest constitutional crises in American history and led to the resignation of President Richard M. Nixon, died March 30 at his daughter’s home in Fairfax County, Va. He was 90. His son Thomas P. Liddy confirmed the death but did not give a cause, saying only that it was unrelated to covid-19. A theatrical personality whose event-filled career included more twists and turns than a fictional potboiler, Mr. Liddy was at various times an FBI agent, jailbird, radio talk-show host, best-selling author, candidate for Congress, actor and promoter of gold investments. The role for which he is best remembered was in the plot to bug the Democratic Party headquarters in the Watergate complex in June 1972. At the same time, he was viewed by his superiors as “a little nuts,” in Nixon’s phrase. “I mean, he just isn’t well screwed on, is he?” the president complained to chief of staff H.R. Haldeman a week after the break-in. With his intense stare, cannonball head, bristling mustache and machine-gun style of speaking, Mr. Liddy looked like the archetypal bad guys he later depicted in television shows including “Miami Vice.” His friend and fellow Watergate conspirator, the late E. Howard Hunt, described him as “a wired, wisecracking extrovert who seemed as if he might be a candidate for decaffeinated coffee.” Mr. Liddy often boasted of his transformation “from a puny, fearful boy to a strong, fearless man” through a regime of intense exercise and physical bravado such as eating rats and holding his hand over a candle until the flesh burned.
0ac9dc80cedee65f89a912defdb82370
https://historynewsnetwork.org/article/179815
Stolen Confederate Monument will Become a 'Toilet' Unless ‘White Lies Matter’ Demands are Met, Group Vows
Stolen Confederate Monument will Become a 'Toilet' Unless ‘White Lies Matter’ Demands are Met, Group Vows A group claiming responsibility for the theft of a Confederate monument in Selma, Ala., laid out ransom terms in emails to local media Monday. The price for the relic’s return? Not cash, but a demand that the headquarters of the United Daughters of the Confederacy in Richmond hang a banner quoting a Black radical on Friday, the 156th anniversary of Confederate Gen. Robert E. Lee’s surrender at the end of the Civil War. The Jefferson Davis Memorial Chair, which was first reported missing from Live Oak Cemetery in Selma last month, is an ornately carved stone chair that was dedicated in 1893 to the Confederate president’s memory and is estimated to be worth $500,000. Calling themselves “White Lies Matter,” the group sent a message to the Montgomery Advertiser and AL.com that included a proof-of-life type photo of the chair, a ransom note styled to look like it came from the 1800s and a photoshopped image of what their banner might look like hoisted above the UDC headquarters more than 700 miles away. “Failure to do so will result in the monument, an ornate stone chair, immediately being turned into a toilet. See enclosed photograph,” the group said in the email to AL.com, with the photoshopped image below. This maybe the wackiest Civil War memory story in a while. 20 years from now this will make a great vignette in a book on Civil War memory. https://t.co/sc8T1PwbhU — Dr. Adam H. Domby (@AdamHDomby) April 5, 2021 Until local media reported on the ransom emails Monday, many in Selma didn’t even know the chair had been stolen, including the local district attorney. He confirmed it with the police chief. “Nobody knows what to make of this, it’s just really strange,” Dallas County District Attorney Michael Jackson told The Washington Post. “But you get used to ‘The Twilight Zone’ in Selma. Rod Serling would have a good time if he were down here himself.”
eebc4f9d26cc4daba5de23df03c8157e
https://historynewsnetwork.org/article/179817
How a Chicago Teacher Sparked a 'Memory War,' Forcing Lithuania to Confront its Nazi Past
How a Chicago Teacher Sparked a 'Memory War,' Forcing Lithuania to Confront its Nazi Past VILNIUS, Lithuania — As her mother lay dying, Silvia Foti made a promise. She vowed to continue her plans to write a book about her mother's father, Foti's grandfather, a Lithuanian hero known as "General Storm." He was among the young soldiers who fought the Soviet Union in its brief but brutal first occupation of Lithuania in 1940, and he was later shot in a KGB prison. He, like many of his comrades, is considered a national hero. But Foti, a high school English teacher from Chicago, said that after years of researching the man, whose name was Jonas Noreika, she discovered that her grandfather collaborated with the Nazis by facilitating the extermination of thousands of Lithuanian Jews. "He agreed with the Nazis on the elimination of the Jews," she said. Foti's revelations ignited a firestorm in Lithuania when they emerged two years ago. Laid out in painstaking detail in a book published last month, they have contributed to an increasingly toxic public debate over Noreika's legacy and what role Lithuanians played alongside Nazi Germany during the Holocaust. An estimated 95 percent of Lithuania's Jews, more than 200,000 people, were massacred as the Third Reich took hold — one of the highest proportions of any country affected by the Holocaust. Yet the dominant narrative in Lithuania has long been one of resistance to both the Soviets and the Nazis, a hallmark of national identity that state officials have worked to reinforce. In January, a lawmaker and longstanding defender of Noreika's legacy sparked outrage by suggesting that local Jewish leaders may even have borne some responsibility for the Holocaust. And on Thursday, the Lithuanian Parliament voted to dismiss the head of the country's genocide research center amid growing controversy surrounding the center's work. 
It's a bitter dispute that, more than 75 years after the end of World War II, highlights the degree to which Lithuania is still struggling to come to terms with its own history. Foti maintained that the official story has been a "cover-up."
d63e1a8a58a44e4c0429a14d40e393b1
https://historynewsnetwork.org/article/179818
'This is still being suppressed': OU professor's book of recovered photos preserves history of Tulsa Race Massacre
'This is still being suppressed': OU professor's book of recovered photos preserves history of Tulsa Race Massacre Once a gathering place for the city’s Black community, Mount Zion Baptist Church stands empty with smoke billowing from it, shortly before being burned to the ground, in an image from the Tulsa Race Massacre. Today, it continues to act as a place of community for its members, who meet in a large building similar to the one in the image. But its members haven’t forgotten its history. Sharlene Johnson, chair of Mount Zion’s joint board, said when the church started in 1909, it was held in a one-room frame building. Construction began on a larger building, on the same land the church is on now, in 1916. The first services were held in the new building in April 1921 — two months before white Tulsans would burn the building to rubble. Johnson said all of the Greenwood District was attacked because of racism and bigotry, but Mount Zion was a special target because white rioters wrongly believed it to be the headquarters and ammunition storage for the Greenwood community. She said she learned about the Tulsa Race Massacre growing up in Chicago, but when she moved to Oklahoma in 1977, she found that event wasn’t taught locally. “This is your history, it’s national history,” Johnson said. “But it wasn’t taught here, it was ignored for years and years. … This is a history that you can’t keep silent.” After half a century without pictures of the massacre readily available, OU professor Karlos Hill compiled images like the ones of Mount Zion and others as part of his latest project, “The 1921 Tulsa Race Massacre: A Photographic History.” His photobook is centered on the experiences of Black survivors and is intended to contextualize images taken by white participants. In his research on the massacre, Hill has seen countless images depicting destruction, damaged buildings and, simultaneously, the wrecking of the hopes and dreams of a prosperous Black community. 
But in his mind, one stands out from the rest — an aerial image of a smoky sky above a smattering of buildings, with a caption scratched across the bottom of the picture.
ac6f4ac96bf79533626c9190fa75999f
https://historynewsnetwork.org/article/179819
AHA Issues Letter Regarding Proposed Termination of Tenured Faculty Members at Salem State University (April 2021)
AHA Issues Letter Regarding Proposed Termination of Tenured Faculty Members at Salem State University (April 2021) Editor's Note: there is dispute about the nature of the proposal to eliminate specific tenured faculty positions at Salem State. After a faculty member received (from an unrelated public records request) a spreadsheet documenting three scenarios of cost-saving from the termination of named faculty members, the university's administration characterized the document as an "exercise" rather than a "plan." That dispute is addressed here. The AHA has written a letter to the president and provost of Salem State University strongly discouraging them from proceeding with the reportedly proposed termination of four tenured members in the history department. “This drastic reduction in faculty would severely diminish the department’s ability to maintain the impressive pedagogical and research standards that the department sets for itself and apparently maintains, along with its striking level of engagement with local communities,” the AHA wrote. The letter noted the Salem State history department’s participation in AHA Tuning, the data at Salem State showing history ranked #1 of 30 majors in the “fill rate” of its courses, and the fact that “Salem is a site of considerable historical importance,” making the role of historical work at Salem State “in many ways a special case.” Download the letter as a PDF. Dr. John Keenan President, Salem State University Dear President Keenan: The American Historical Association strongly discourages you from proceeding with the reportedly proposed termination of four tenured members in the Salem State University history department, whether now or in the foreseeable future. 
This drastic reduction in faculty would severely diminish the department’s ability to maintain the impressive pedagogical and research standards that the department sets for itself and apparently maintains, along with its striking level of engagement with local communities. The AHA recognizes the logical inclination to roll eyes when a scholarly association questions plans to terminate faculty in its own discipline. But we are not a labor union. Our interest lies in the promotion of historical work, historical thinking, and the influence of history in public culture. In this case we are concerned about the quality of undergraduate education and the role of Salem State historians in the community. Both stand to suffer from this short-sighted proposal. This concern with undergraduate education in general at Salem State and history education in particular is rooted in our experience with Salem State faculty in the Association’s “Tuning” initiative, and in well-documented accomplishments in this arena. Salem State was one of 120 history departments that participated in AHA Tuning, which involved thinking intentionally about the core value of the discipline to liberal education, and the utility of historical thinking to career and lifetime learning. The history department has employed these standards on behalf of its majors and the general education curriculum to serve the needs of your students. This reinvention is reflected in Salem State’s data, which shows history ranked number 1 of 30 majors in the “fill rate” of its courses. The department’s average class size is 21.82, higher than the 17.44 university average. It stands well above the university average in revenue compared with cost per credit hour. These data demonstrate that history, in addition to serving the mission of the university, has been profitable. 
Through Tuning the AHA has put work into history education at Salem State, and we are obviously unhappy to see that work diminished by a major reduction in the resources necessary to maintain the integrity of the program. The AHA has also recognized the work of department faculty and is especially disturbed that one of the colleagues apparently on the chopping block is a winner of the Association’s prestigious James Harvey Robinson Prize, which recognizes “the most outstanding contribution to the teaching and learning of history in any field for public or educational purposes.” The role of historical work in the community surrounding Salem State is in many ways a special case. Salem is a site of considerable historical importance, both in terms of its local and global histories and its expansive role as a center of heritage tourism. The history faculty at Salem State have tied their areas of expertise and research to Salem and offered knowledge and professional development to its numerous historical institutions on both an informal and formal basis. History department faculty have served as trustees and advisory board members of the Salem Maritime National Historic Site, the House of the Seven Gables, the Witch House, Historic Salem, Inc., the Salem Athenaeum, Voices Against Injustice (formerly the Salem Award), and the regional Essex National Heritage Area, and these same institutions have benefitted from a succession of interns trained and supervised by your history department. Many of these institutions are staffed by history graduates. Your recent campaign featuring the most accomplished “40 Under 40” alumni included seven history graduates, more than any other department. Your history graduates are thriving in museums, archives, public policy, law, education, and business. From an educational and civic perspective, therefore, this reduction makes little sense.
Eliminating faculty in a core liberal arts degree like history is an especially odd move at a time when civic leaders from all corners of the political landscape have lamented the level of historical knowledge of American citizens. In addition, overwhelming evidence shows employers seek the kind of skills a history degree can provide. To decimate a history department is a lose-lose proposition: it deprives students of essential learning and skills, even as it strips a university of the essential perspectives and intellectual resources so necessary to confront the present and shape the future. We certainly understand the pressure of budgets and do not underestimate the financial necessities you confront at this particular moment. This realignment plan, however, will have serious and deleterious consequences for the practice of history and the quality of undergraduate and graduate education at Salem State, as well as the relationships between the university and surrounding communities. Rather than looking at history as a drain on university resources, you might want to look at how well-connected historians are to local culture and appreciate the likelihood that history is instead an asset to the university’s need to increase overall enrollment. Sincerely, James Grossman Executive Director Jacqueline Jones President cc: Dr. David Silva, Provost
4c46ce24dbfa3fe6176420b7a35b7f42
https://historynewsnetwork.org/article/179820
Your Weather Forecast Update: Warmer Climate Will Be The New 'Normal'
Your Weather Forecast Update: Warmer Climate Will Be The New 'Normal' It's become so common, perhaps you've stopped noticing how often your local weather forecast is "above normal." It's noted during extreme heat in the summer, when mild temperatures persist through the winter, or when nights don't cool down like they used to. But on May 4, the hotter Earth will officially become the new normal. That's when the National Oceanic and Atmospheric Administration (NOAA) releases its once-a-decade update to "climate normals." They are the 30-year averages for temperature and precipitation that local meteorologists rely on as the baseline for their forecasts. To be sure, some updates will be minuscule. But the fastest-warming places will see a real bump in their averages that could make some forecasts seem confusing and pose a challenge to meteorologists. The current "normals" are from 1981-2010, based on data collected by thousands of monitoring stations around the country operated by the National Weather Service. The NOAA update will shift the time frame for those averages later, to the period from 1991 to 2020. The decade from 2011-2020 is one of the hottest on record in the U.S. "It was a very substantial upward trend in temperature, especially along the West Coast, in the South and along the East Coast," says Mike Palecki, with NOAA's National Centers for Environmental Information. There were exceptions; some places in the North Central part of the U.S. actually cooled a bit. But globally, the decade ending in 2020 was the hottest decade recorded since 1880. .... After the NOAA update in May, with the baseline for normal shifting to higher temperatures, forecasters in Phoenix and many other places might not be pointing out as many "above normal" days as before. In fact, some days considered warm now may become officially "cooler" when compared to the new temperature average. Sullins plans to take more time in her daily forecasts to explain the shift. 
"We're going to have to remind people, especially this year, 'Hey, if we're at 115, that is 5 degrees above the average. But remember that this average has changed," she says. That context is important. Research shows that as unusual weather events happen more frequently, people simply reset their perception of what's normal. One study found a common reference point for "normal conditions" was only two to eight years ago. Frances Moore, a co-author of the study and professor of environmental policy at the University of California, Davis, has seen this speed of "normalization" in her own state. After five years of extreme smoke events from wildfires, she says people now simply say, "Oh, fire season's coming, I guess we'd better get ready for it."
3325b27ff80519c444e08de33330c2fb
https://historynewsnetwork.org/article/179823
SSU Faculty Retrenchment Plan Accidentally Released
SSU Faculty Retrenchment Plan Accidentally Released Already tense relations between faculty and administrators at Salem State University have only further deteriorated after an unfinished planning document was accidentally released in a records request — a document that contained multiple scenarios to significantly reduce or eliminate staff that could save the university millions of dollars. On Friday, university faculty received an internal planning spreadsheet that outlined three scenarios under which specifically named faculty members would be retrenched. The scenarios, if acted upon, would have saved the university between $1.8 and $3.3 million, according to Tiffany Chenault, president of Salem State's chapter of the Massachusetts State College Association, which represents university professors and librarians at Massachusetts' nine state universities. "It was a detailed document that had three different scenarios. It had people's names, departments that would be retrenched, moved," Chenault said. "It was in-depth." The document was accidentally released by the university as part of a public records request from a member of Salem State's faculty, according to Rita Colucci, Salem State's general counsel. The request, which was unrelated to retrenchment, turned up about 6,000 documents, of which about 1,000 were turned over after duplicates and privileged information was removed. The spreadsheet, which was mistakenly included with those documents, was then shared by the faculty member in an unofficial online group that faculty are part of. "The spreadsheet came about because it was an unfinished exercise. We were figuring out retrenchments and what they look like in real life. We looked at some scenarios but never finished the document," Colucci said. "It was an unfinished exercise — never meant to be shared, never meant to be distributed in any way." .... This latest twist comes after close to a year of tense relations between administrators and faculty. 
With a multimillion-dollar budget deficit expected last year, administrators unveiled a plan for all university employees to take unpaid furlough time, a move that is expected to save $3.3 million once all employees take their two weeks by the end of the current academic year. The local MSCA chapter, following a legal dispute that resolved in the university's favor, took its first week of furloughs during spring break, March 14-20, with the second week set for May 23-29, the week after commencement. .... "It wasn't meant to be shared, but they got it nonetheless. It was put out there," Chenault said. "So how do you regain trust after that? How do you rebuild trust in faculty, and how do faculty know if this is an exercise? Is there going to be one that will be finalized?"
f723531898951630dde35e5e2f6d00bc
https://historynewsnetwork.org/article/179829
Can Joe Biden Replicate FDR’s Success in Rebuilding the Democrats’ Coalition?
Can Joe Biden Replicate FDR’s Success in Rebuilding the Democrats’ Coalition? Eric Rauchway’s latest book about the New Deal begins nearly two years before Franklin D Roosevelt took office in 1933, at the time of a clash over pensions for veterans of the First World War. This era of US history is often portrayed in American textbooks through images of the Wall Street crash of 1929, the breadlines of the unemployed, and portraits of hollow-eyed Dust Bowl farmers. But Rauchway takes us to the scene of the 1932 Bonus March, when thousands of veterans walked across the country to demand Washington policymakers allow them access to their pensions. Some of the march leaders had fascist sympathies, Rauchway argues, organising squads of khaki shirt militias in emulation of Benito Mussolini’s minions. Set against them was General Douglas MacArthur, who disregarded direct orders from the White House and broke up the encampments with extreme force. On both sides of the clash were forces that felt America’s democratic government had failed, and were willing to act outside of it. “I think it's very important to see that episode through FDR's eyes, which is to say that there was the possibility of a fascist movement in this country,” said Rauchway, who is a professor of history at the University of California. “It could have gone in a number of different directions. It's important to illustrate the real live threat to democracy.” Why The New Deal Matters, Rauchway's fourth book on Roosevelt’s time in office, is short, accessible, and mostly unfolds far from the corridors of power. Instead of focusing on economic policy conferences and harried diplomatic cables, Rauchway devotes a chapter to rural development and another to the little-known story of the New Deal in Native American lands. “The book focuses on the way the New Deal created a country Americans could lay hands on, and that belongs to everybody,” Rauchway told me. 
“There's almost no place [in the US] where we're not looking at some legacy of the New Deal: in the American South, the Tennessee Valley Authority, the Native lands.” Although the New Deal era is bemoaned by modern conservatives as a time of burgeoning centralisation of government, Rauchway shows how the laws of the 1930s weren’t administered in a top-down way on the ground. In many cases, he argues, they empowered locals and helped democratise America through electricity cooperatives and Works Progress Administration councils. But as is often the case in the United States, the empowerment of local leaders doesn’t always create the conditions in which democracy can flourish. Black Americans were often excluded from such benefits, chiefly in the American South, where local leaders were allowed to apply oppressive Jim Crow standards to their work sites. When New Deal programmes were administered by private interests, such as mortgage and real estate policies, they left lasting racialised scars on America’s cities and suburbs.
0304cce5591368d110d5a81fe9c90b94
https://historynewsnetwork.org/article/1802
The Vietnam War Crimes You Never Heard Of
The Vietnam War Crimes You Never Heard Of On October 19, 2003, the Ohio-based newspaper the Toledo Blade launched a four-day series of investigative reports exposing a string of atrocities by an elite, volunteer, 45-man "Tiger Force" unit of the U.S. Army's 101st Airborne Division over the course of seven months in 1967. The Blade goes on to state that in 1971 the Army began a four and a half year investigation of the alleged torture of prisoners, rapes of civilian women, the mutilation of bodies and killing of anywhere from nine to well over one hundred unarmed civilians, among other acts. The articles further report that the Army's inquiry concluded that eighteen U.S. soldiers committed war crimes ranging from murder and assault to dereliction of duty. However, not one of the soldiers, even of those still on active duty at the time of the investigation, was ever court martialed in connection with the heinous crimes. Moreover, six suspected war criminals were allowed to resign from military service during the criminal investigations specifically to avoid prosecution. The Toledo Blade articles represent some of the best reporting on a Vietnam War crime by any newspaper, during or since the end of the conflict. Unfortunately, the articles tell a story that was all too common. As a historian writing his dissertation on U.S. war crimes and atrocities during the Vietnam War, I have been immersed in just the sort of archival materials the Toledo Blade used in its pieces, but not simply for one incident but hundreds if not thousands of analogous events. I can safely, and sadly, say that the "Tiger Force" atrocities are merely the tip of the iceberg in regard to U.S.-perpetrated war crimes in Vietnam. 
However, much of the mainstream historical literature dealing with Vietnam War atrocities (and accompanying cover-ups and/or sham investigations) has been marginalized to a great extent -- aside from obligatory remarks concerning the My Lai massacre, which is, itself, often treated as an isolated event. Unfortunately, the otherwise excellent reporting of the Toledo Blade draws upon and feeds off this exceptionalist argument to a certain extent. As such, the true scope of U.S.-perpetrated atrocities is never fully addressed in the articles. The men of the "Tiger Force" are labeled as "Rogue GIs" and the authors simply mention that the Army "conducted 242 war-crimes investigations in Vietnam, [that] a third were substantiated, leading to 21 convictions... according to a review of records at the National Archives" – facts of dubious value that obscure the scope and number of war crimes perpetrated in Vietnam and feed the exceptionalist argument. Even an accompanying Blade piece on "Other Vietnam Atrocities" tends to decontextualize the "Tiger Force" incidents, treating them as fairly extraordinary events by listing only three other relatively well known atrocity incidents: former Senator, presidential candidate and Navy SEAL Bob Kerrey's raid on the hamlet of Thang Phong; the massacre at Son Thang -- sometimes referred to as the "Marine Corps' My Lai"; and the war crimes allegations of Lt. Col. Anthony Herbert -- most famously chronicled in his memoir Soldier. This short list, however, doesn't even hint at the scope and number of similar criminal acts. For example, the Toledo Blade reports that its "review of thousands of classified Army documents, National Archives records, and radio logs reveals [the "Tiger Force"] ... carried out the longest series of atrocities in the Vietnam War [from May to November, 1967]...." Unfortunately, this seven-month atrocity spree is not nearly the longest on record.
Nor is it even the longest string of atrocities by one unit within its service branch. According to formerly classified Army documents, an investigation disclosed that from at least March 1968 through October 1969, "Vietnamese [civilian] detainees were subjected to maltreatment" by no fewer than twenty-three separate interrogators of the 172d Military Intelligence (MI) Detachment. The inquiry found that, in addition to using "electrical shock by means of a field telephone," an all too commonly used method of torture by Americans during the war, MI personnel also struck detainees with their fists, sticks and boards and employed a form of water torture which impaired prisoners' ability to breathe. Similar to the "Tiger Force" atrocities chronicled by the Blade, documents indicate that no disciplinary actions were taken against any of the individuals implicated in the long-running series of atrocities, including 172d MI personnel Norman Bowers, Franciszek Pyclik and Eberhard Gasper, who were all on active duty at the time that the allegations were investigated by Army officials. In fact, in 1972, Bowers's commanding general pronounced that "no disciplinary or administrative action" would be taken against the suspected war criminal, and in a formerly classified memorandum to the U.S. Army Chief of Staff, prepared by Colonel Murray Williams on behalf of Brigadier General R.G. Gard in January 1973, it was noted that the "...determination by commanders to take no action against three personnel on active duty who were suspected of committing an offense" had not been publicly acknowledged. Their crimes and identities kept a secret, Bowers, Pyclik and Gasper apparently escaped any prosecution, let alone punishment, for their alleged actions.
Similarly, the Toledo Blade pays particular attention to Sam Ybarra, a "notorious suspect," who was named in seven of the thirty "Tiger Force" war crimes allegations investigated by the Army -- including the rape and fatal stabbing of a 13-year-old girl and the brutal killing of a 15-year-old boy. Yet Ybarra's notorious reputation may well pale in comparison to that of Sergeant Roy E. "the Bummer" Bumgarner, a soldier who served with the 1st Cavalry Division and later the 173d Airborne Brigade. According to a former commander, "the Bummer" was rumored to have "personally killed over 1,500 people" during a forty-two-week stretch in Vietnam. Even if the number was exaggerated, clues on how Bumgarner may have obtained high "body counts" came to light in the course of an Army criminal investigation of an incident that took place on February 25, 1969. According to investigation documents, Bumgarner and a subordinate rounded up three civilians found working in a rice paddy, marched them to a secluded area and murdered them. "The Bummer" then arranged the bodies on the ground with their heads together and a grenade was exploded next to them in an attempt to cover up their crime. Assorted weapons were then planted near the mutilated corpses to make them appear to have been enemy troops. During an Army criminal investigation of the incident, men in Bumgarner's unit told investigators that they had heard rumors of the sergeant carrying out similar acts in the past. Said one soldier in a sworn statement to Army investigators: "I've heard of Bumgarner doing it before -- planting weapons on bodies when there is doubt as to their military status. I've heard quite a few rumors about Bumgarner killing unarmed people. Only a couple weeks ago I heard that Bumgarner had killed a Vietnamese girl and two younger kids (boys), who didn't have any weapons."
Unlike Sam Ybarra, who had been discharged from the military by the time the allegations against him came to light and then refused to cooperate with investigators, "the Bummer" was charged with premeditated murder and tried by general court martial. He was convicted only of manslaughter and his punishment consisted merely of a demotion in rank and a fine of $97 a month for six months. Moreover, after six months, Bumgarner promptly re-enlisted in the Army. His first and only choice of assignments -- Vietnam. Records indicate he got his wish! Military records demonstrate that the "Tiger Force" atrocities are only the tip of a vast submerged history of atrocities in Vietnam. In fact, while most atrocities were likely never chronicled or reported, the archival record is still rife with incidents analogous to those profiled in the Blade articles, including the following atrocities chronicled in formerly classified Army documents:

- A November 1966 incident in which an officer in the Army's Fourth Infantry Division severed an ear from a Vietnamese corpse and affixed it to the radio antenna of a jeep as an ornament. The officer was given a non-judicial punishment and a letter of reprimand.
- An August 1967 atrocity in which a 13-year-old Vietnamese child was raped by an American MI interrogator of the Army's 196th Infantry Brigade. The soldier was convicted only of indecent acts with a child and assault. He served seven months and sixteen days for his crime.
- A September 1967 incident in which an American sergeant killed two Vietnamese children -- executing one at point blank range with a bullet to the head. Tried by general court martial in 1970, the sergeant pleaded guilty to, and was found guilty of, unpremeditated murder. He was, however, sentenced to no punishment.
- An atrocity that took place on February 4, 1968, just over a month before the My Lai massacre, in the same province by a man from the same division (Americal). The soldier admitted to his commanding officer and other men of his unit that he gunned down three civilians as they worked in a field. A CID investigation substantiated his confession and charges of premeditated murder were preferred against him. The soldier requested a discharge, which was granted by the commanding general of the Americal Division, in lieu of court martial proceedings.
- A series of atrocities similar to, and occurring the same year as, the "Tiger Force" war crimes, in which one unit allegedly engaged in an orgy of murder, rape and mutilation over the course of several months.

While not yielding the high-end body count estimate of the "Tiger Force" series of atrocities, the above incidents begin to demonstrate the ubiquity of atrocities committed by American forces during the Vietnam War. Certainly, war crimes such as murder, rape and mutilation were not an everyday affair for American combat soldiers in Vietnam; however, such acts were also by no means as exceptional as often portrayed in recent historical literature or as tacitly alluded to in the Blade articles. The excellent investigative reporting of the Toledo Blade is to be commended for shedding light on war crimes committed by American soldiers of the 101st Airborne Division in 1967. However, it is equally important to understand that the "Tiger Force" atrocities were not the mere result of "Rogue GIs" but instead stem from what historian Christian Appy has termed the American "doctrine of atrocity" during the Vietnam War -- a strategy built upon official U.S. dictums relating to the body count, free-fire zones, search and destroy tactics and the strategy of attrition as well as unofficial tenets such as "kill anything that moves," intoned during the "Tiger Force" atrocities and in countless other atrocity tales, or the "mere gook rule" which held that "If it's dead and Vietnamese, it's VC."
Further, it must also be recognized that the "Tiger Force" atrocities, the My Lai massacre, the Herbert allegations and the few other better-known war crimes were not isolated or tangentially-related incidents, but instead are only the most spectacular or best publicized of what was an on-going string of atrocities, large and small, that spanned the entire duration of the war. The headline of one Blade article proclaims, "Earlier Tiger Force probe could have averted My Lai carnage," referring to the fact that the 101st Airborne Division's "Tiger Force" troops operated in the same province (Quang Ngai), with the same mission (search and destroy) months before the Americal Division's men committed their war crimes. But atrocities were not a localized problem or one that only emerged in 1967. Instead, the pervasive disregard for the laws of war had begun prior to U.S. buildup in 1965 and had roots in earlier conflicts. Only by recognizing these facts can we hope to begin to understand the "Tiger Force" atrocities and the history of American war crimes in Vietnam, writ large.

Related Links: Seymour Hersh, "Uncovered" (New Yorker); ABC News Report.

This article was first published by http://www.zmag.org/ and is reprinted with the permission of the author.
3971feddf154f6bc0eca4ed885036290
https://historynewsnetwork.org/article/1811
FROM OUR ARCHIVES What Should We Make of the Charge Linking the Bush Family Fortune to Nazism?
FROM OUR ARCHIVES What Should We Make of the Charge Linking the Bush Family Fortune to Nazism? Editor's Note: Over the course of the past month several articles have been published on the Internet and in a handful of print publications that raised questions about the financial ties between the founder of the Bush dynasty and certain Nazi businessmen. We wondered what to make of the story and asked Herbert Parmet to investigate. This is his report. John Buchanan is a free-lance journalist with a mission. He intends to alert the media, and all who will listen, about how Prescott Bush, the progenitor of two presidents, was in league with some of Hitler’s “willing helpers.” Minimized or totally dismissed by the public, the story was revived from its World War II roots by Webster Tarpley and Anton Chaitkin in their George Bush: The Unauthorized Biography (1992) and given a strong additional push in John Loftus’s sensationalist The War Against the Jews (1994). Allegations involving the “father” of what has become the Bush dynasty relate to his association with Brown Brothers Harriman and Company, the Wall Street investment banking firm, which evolved from a 1931 merger of W. A. Harriman and Company and Brown Brothers, which was brought together by George Herbert Walker, president of the former, and his son-in-law, Prescott. The younger Bush, by then, was one of the seven directors of Brown Brothers Harriman, a board that included W. Averell Harriman and his brother Roland. Buchanan’s charges of a Bush-Nazi past are hard to ignore, largely because of his passion as a true-believer and an effective series of articles in a highly independent New England publication. His enthusiasm for getting “at the truth” of all this has been further emboldened by Loftus, who has suggested that Prescott Bush “should have been tried for treason, because they continued to support Hitler after the U.S. entered the war.” Loftus, who describes himself as “a former prosecutor with the U.S. 
Justice Department’s Nazi-hunting unit,” has added the reassurance that he “could have made the case.” Treasury and Justice department files, including what was then the Office of Alien Property, declassified as recently as September, do indeed show that the U. S. government acted to seize numerous assets held by Harriman affiliated companies. “After the war,” Buchanan has written, “a total of 18 additional Brown Brothers Harriman and UBC-related [Union Banking Corporation] client assets were seized” under the Trading With the Enemy Act, which Franklin D. Roosevelt signed right after Pearl Harbor. George Herbert (Bert) Walker’s relationship with Averell Harriman went back to 1919, reported Buchanan, when both went to Paris to set up “the German branch of their banking and investment operations, which were largely based on critical war resources such as steel and coal.” Other corporate entities, all with ties to similar German interests, were then created by UBC, which had Prescott Bush on its board – most notably, the Hamburg-American Line, the Holland-American Trading Corporation, and the Seamless Steel Corporation. On October 12, 1920, the St. Louis Globe-Democrat headlined “Ex-St. Louisan Forms Giant Ship Merger,” explaining that Bert Walker was the “moving power” behind the “merger of two big financial houses in New York, which will place practically unlimited capital at the disposal of the new American-German shipping combine.” In the summer and fall of 1942, Congress, under the authority of the Trading With the Enemy Act, seized the first group of entities, the UBC, the Holland-American Trading Corporation, and the Hamburg-American Line. Buchanan’s diligence has discovered that the latter “reportedly smuggled Nazi spies into the U.S. before the war and encouraged U.S. ‘Patriots’ to travel to Germany and proselytize for Hitler in the early 1930s.” Much of this is confirmed by the new documentation. 
The UBC was not a “bank” at all but “in reality a clearing house” for many assets and enterprises held by Fritz Thyssen, a German steel magnate who has written about his role in helping to finance the Third Reich. Located close to Bush’s 59 Wall Street office, it was “founded in 1924 by W. Averell Harriman on behalf of Thyssen and his Bank voor Handel en Scheepvaart N. V. of Holland.” The UBC was seized by the United States under Vesting Order 248 on October 20, 1942, and, according to Buchanan, Bush and Harriman later received $1.5 million in compensation. Similar vesting orders leading to the divestiture of “enemy national” assets continued until well after the war. (A total of ten such vesting orders that indicate the firm’s investments are in the files in my possession.) Other holdings, associated with Bush, are more problematical, such as the relationship with the Silesian Holding Corporation and Consolidated Silesian Steel, which was bought from Thyssen in 1931. In 1943, after press reports that the Polish mining interest was employing forced labor by using prisoners from the Auschwitz concentration camp, we are informed that “Prescott Bush distanced himself from UBC and had even engaged in the collection of funds for the victims of the war in his role as president of the National War Fund.” He had, in fact, taken over as head of the United Service Organizations soon after Pearl Harbor, raising “millions for the National War Fund,” according to Mickey Herskowitz, Prescott’s recent biographer. The declassified papers confirm questionable transactions in violation of the Trading With the Enemies Act, but, as with all examinations of corporate malfeasance, more is needed to establish individual responsibility. Buchanan himself, when pressed for more details about Auschwitz, was uncharacteristically hesitant. Consolidated Silesian was the only direct link to the notorious death camp. 
A file in the Library of Congress confirms the business part of the relationship, but does not give any financial details. “More secretive,” says Buchanan, “is where the cloaking arrangement with Sullivan and Cromwell [most often associated with its best-known partner, John Foster Dulles] and Schroeder Rock, which is the Schroeder Bank and the Rockefeller family trust and investment arrangement – that links to the Rockefeller dealings (which John Loftus has written about) to the New York banks, some of which had to do with I. G. Farben through City National Bank of New York in back of those transactions.” Buchanan contends that there are also records involving the City National Bank, which he cites as “definitely the hot-blood area for all the Nazi money, especially I. G. Farben and Hermann Schmidt, the infamous managing director of I. G. Farben,” which was represented in court by Dulles’s Sullivan and Cromwell. Their relationship with German enterprises, moreover, began during the years of Germany’s Weimar Republic, well before Hitler's rise. An international investment banking firm, they also did business with the Soviets during the 1920s (which was not in violation of any statute), and, all in all, with forty-five different countries. Their correspondent relationships spanned the world and numbered some five thousand, according to Walter Isaacson and Evan Thomas's The Wise Men, transactions that hardly were confined to Nazis or the Soviet Union. Still, “following the money trail” is a tricky matter. Loftus points out that the Weimar government, pressed to pay their reparations bill, had to borrow gold from Sullivan and Cromwell’s American clients, and that some 70 percent of the gold that “flowed into Germany during the 1930's” came from U. S. Investors, and heavily from clients of the Dulles firm. 
An internal government memo on August 18, 1941, also noted that the UBC had made “extensive” purchases of gold amounting to over eight million dollars, most of which was then shipped to Europe, presumably Germany. That transaction, speculated J. W. Pehle, an assistant to the secretary of the treasury, may have been the basis for rumors that Fritz Thyssen “has large gold deposits hoarded in the United States.” His own examination of the UBC books and ledgers, however, showed that “all of the purchases have been satisfactorily accounted for.” In view of all the financial transactions involving Germany during this period, the role of that major Jewish banking family, the Warburgs (which also did business with the Harriman group), demonstrates some of the realities of the flow of capital. The Warburgs, backed by such American groups as B’nai B’rith and the American Jewish Committee, demanded in 1934, according to Tarpley and Chaitkin, that “American Jews not ‘agitate’ against the Hitler government” or participate in any anti-Nazi boycott. Such denial about Nazi objectives was not unusual at the time. Aryan laws, at that point in German history, were still less tangible than the marketplace. Understandably, the American media has indeed been skittish about the Bush-Nazi story. The association of the Tarpley-Chaitkin book with the Lyndon LaRouche organization (it was published by the Executive Intelligence Review of Washington, a LaRouchian press) has not been helpful, to say the least. Nor has the sensationalist tone and dubious message of Loftus’s The Secret War Against the Jews. One prominent critic of the administration in Washington, journalist Joe Conason, while acknowledging that the “involvement of Prescott Sr. and other members of the American business aristocracy with Nazi-era industry was shameful,” protests that “neither his offenses, nor the Republican Party’s politics of personal destruction, can justify using such [smear] tactics.
Imputing Nazi sympathies to the President or his family ought to be beneath his adversaries.” After a story on the allegations appeared in the Polish edition of Newsweek, it was “spiked” by the news magazine’s American version. Major U.S. outlets, Buchanan contends, have also bypassed a chance to investigate the story “when information regarding discovery of the documents was presented to them.” Although the story was carried by the Associated Press, few member outlets picked it up. One that did, Newsday on New York’s Long Island, ran it under the headline “Bush Ancestor’s Bank Seized by Gov’t.” A notable exception to the skepticism about Buchanan has been the New Hampshire Gazette. Published as a fortnightly in Portsmouth with a circulation of some seven thousand and owned and edited by journalist Steven Fowle, a descendant of the Daniel Fowle who first brought it out in 1756 (making it “the nation’s oldest newspaper”), it has, to date, run two Buchanan stories plus his interview with John Loftus. Fowle, called an example of “Yankee Spunk” by a writer for the St. Petersburg Times, explained that he thought it vital to gauge to what extent the Bush family fortune was derived from the Nazis. “If it’s true, it ought to be said,” he emphasized, “and it’s not my fault that it’s ugly.” Asked to account for the skittishness of most of the media, he responded by doubting that the president’s staff would care much about revelations in his paper and added that most of the other media outlets “don’t have the courage to stick their necks out if it involves challenging power. The only trouble with that is that challenging power is their job.” Conason, of course, is right. But, at least to some extent, it did happen, even if the details are far from clear. As with all such examples of infatuation with power, or the control of power, or the interests of sheer survival, the story should be told. As John F.
Kennedy once said, “Let the chips fall where they may.” The most judicious and succinct appraisal of all this was offered by Christopher Simpson ten years ago in a book called The Splendid Blond Beast: Money, Law and Genocide in the Twentieth Century: By 1944 and 1945, leaders of major German companies such as automaker Daimler Benz, electrical manufacturers AEG and Siemens, and most of Germany’s large mining, steelmaking, chemical, and construction companies found themselves deeply compromised by their exploitation of concentration camp labor, theft, and in some cases complicity in mass murder. They committed those crimes not so much out of ideological conviction, but more often as a means of preserving their influence within Germany’s economy and society. For much of the German economic elite, their cooperation in atrocities was offered to Hitler’s government in exchange for its aid in maintaining their status. All this, especially considering the number of American businesses that were engaged in the German market, says more about finance and capitalism than about ideology. It is a story of power, totalitarianism on one hand, and sheer greed and economic survival on the other – and with no relationship to “morality.” We need to do more than merely sift through the essence of Buchanan's assertions, as troubling as they may be, to appreciate the value of his labors, and wonder at the contribution to public knowledge of Steven Fowle’s maverick newspaper. What all this means for the reputation of Prescott Bush's descendants should be as relevant as Joseph P. Kennedy’s for his descendants, just as it was for the connection of the Dulleses with Sullivan and Cromwell. Similar associations did not keep John Foster Dulles from becoming secretary of state or his brother Allen from heading the C.I.A. Nor did it stop Averell Harriman from becoming governor of New York.
502ec946b474878b0e1a9ca2f7f7d62f
https://historynewsnetwork.org/article/2940
Judenrein Palestine?
Judenrein Palestine? Rachel Neuwirth, writing in Israel National News (Jan. 6, 2004): Why is it that people are proposing a Middle East peace plan that will make Judea and Samaria Judenrein (the Nazi term for a place with no Jews)? It is the historic homeland and birthplace of the Jewish people, yet many world leaders - including every American president - believe that the removal of Jewish communities from Judea and Samaria is a crucial prerequisite for a peaceful resolution to the Arab-Israeli conflict. Unfortunately, every Israeli prime minister has been pressured to follow this policy. Jews have lived in Judea and Samaria for thousands of years. In fact, the Jewish religion and people were birthed in Hebron. We know of the ancient Jewish presence there from both the Hebrew and Christian Bibles and from abundant archaeological and documentary evidence. No one denies that the oldest document showing the historical connection of the Jewish people to the land of Israel, including Judea and Samaria (a.k.a. the West Bank), is the Bible. Genesis 13:18 says: “And Abram moved his tent, and came and dwelt by the terebinths of Mamre, which are in Hebron.” And the world's oldest documentation of real estate being purchased for full price is also in the Bible (see Genesis 23:9). And for those who doubt biblical references, there is substantial evidence in archaeological findings (see http://www.sciencedaily.com/encyclopedia/History_of_ancient_Israel_and_Judah ). Historically, the Jewish homeland included what is today called Judea and Samaria, the Golan Heights, and a considerable part of today's Jordan. The land was inhabited mainly by Jews and was ruled by Jews. Therefore, Lord Robert Cecil, former acting British foreign secretary, was right to use the name "Judea" for the whole land in his famous remark: "Our wish is that Arabian countries shall be for the Arabs, Armenia for the Armenians, and Judea for the Jews."
(December 2, 1917; see http://www.esek.com/jerusalem/iudaea.html .) The Jewish presence there has been continuous, except for 19 years from 1948 to 1967 when the area became Judenrein. And during that 19-year period, the Jordanians and Arabs of the remaining portion of "Palestine" desecrated Jewish holy sites and cemeteries in an attempt to deny that the Jews ever lived there. Those who advocate the dismantling of the Jewish communities in this territory are advocating a policy of ethnic cleansing. This may sound extreme, but from the early 1900s, the Arabs carried out a policy of ethnic cleansing that included the massacre and pogroms in 1929 and 1936 in Hebron. Both the spirit and practice of ethnic cleansing are being continued in the current conflict (see http://www.palestinefacts.org/pf_mandate_grand_mufti.php ). So, what did UN Secretary-General Kofi Annan mean in his 2001 Nobel Peace Prize acceptance speech when he said, “A genocide begins with the killing of one man — not for what he has done, but because of who he is. A campaign of 'ethnic cleansing' begins with one neighbor turning on another.” Does this not also apply to the Israeli Jews who have re-established homes in Judea and Samaria? Should they be ethnically cleansed from the heart of their historical homeland? Does the Nobel recipient not know a real victim of ethnic cleansing when he sees one? The same people and countries that condemned ethnic cleansing in the Balkans, Cyprus, Rwanda and Tibet totally reverse themselves when it comes to the right of Jewish people to live in the lands of their historic patrimony. If Chinese people were forbidden to live in China, Buddhists barred from Tibet, or Irish-Catholics banned from South Boston, there would be a tremendous outcry against such injustices. But where is the outcry against the removal of Jews from Judea — their historical homeland? Is there any other nation on earth that has such a legitimate birth certificate as Israel?
And if the Jews have no such document, then the Old and New Testaments are worthless. The war for Israel's independence ended in 1949 with the Jordanians in full control of Judea and Samaria and the Old City of Jerusalem (the "West Bank"), cutting the Jewish people off from their most holy religious sites. The official status of these areas, then, was disputed territories, as no one had held sovereignty there since the defeat of the Ottoman Empire. Only two countries, Pakistan and Britain, recognized the 19-year Jordanian "illegal occupation". Even the entire Arab world refused to recognize it and, consequently, it was illegal and illegitimate ab initio. Since the 1967 war, the Jewish people have simply been returning to the land from which they were forcibly expelled during the first Arab-Israeli war of 1948-49. This territory has always been known as Judea and Samaria. Do the names "Jew" (for Judea) and "Samaritan" (as in "good Samaritan") sound familiar? In fact, Shemer, founder of Asher, a clan of one of the twelve tribes of Israel, was the owner and eponym of the hills of Samaria. Is there anything Arab or "Palestinian" about either? Even UN Resolution 181, the Partition Plan of 1947, refers to these territories as Judea and Samaria ( http://www.yale.edu/lawweb/avalon/mideast/mideast.htm ). The word "occupiers" does not apply to the Jews. Prior to the illegal Jordanian occupation of 1948-67, Jews had maintained several thousand years of continual residence in the area. However, the term does apply to both the Jordanians and the "Palestinian" Arab squatters of today ( http://www.tzemachdovid.org/Facts/islegal1.shtml ). In the early part of the 20th century, the Arab population carried out a war against the Jewish inhabitants of the area.
This resulted in a series of massacres in Hebron, the birthplace of Judaism, in 1929, as well as numerous other violent attacks, such as the 1936-39 pogroms against Jews, ending in the total expulsion of the Jewish population from much of Judea, Samaria and the Old City of Jerusalem. As a result of the Israeli victory in 1967, Jewish people returned to this area and re-unified the historic capital of Jerusalem. Many of the Jews who had been expelled from this territory, or whose parents and grandparents were murdered by rampaging Arabs, have merely returned to their previous homes. And in subsequent years, additional Jewish communities (not "illegal settlements") were built, mainly for security purposes, and others for historical and emotional reasons on mainly state-owned land and historical outposts. Judea and Samaria were liberated, not stolen or occupied, from Jordan (see http://www.tzemachdovid.org/Facts/islegal3.shtml and http://www.internationalwallofprayer.org/A-143-A-Settlers-History-of-Settlements ). Since 1967, 261 new Arab settlements have been built in Judea and Samaria. According to international law, all of these are illegal, as no sovereignty was ever recognized over these territories; yet no one calls for their removal. Why is it that no one talks about those Arab settlements as obstacles to peace — especially when they are bases for carrying out terrorism, and their inhabitants are constantly taught virulent hatred toward the Jewish people and the West? Dismantling the Jewish communities in these territories will only reward terrorism. The Jewish communities in Judea and Samaria are a litmus test of Arab intentions. Why can't Jews live in their historic homeland if there really is peace? After all, there are 1.2 million Arabs living as citizens of Israel in the one Jewish country in the world, while there are only a handful of Jews living in any of the 22 Arab countries.
In fact, in Jordan and Saudi Arabia, not only is it illegal for Jews to be citizens, they are not even allowed to live there. Therefore, instead of Israel being the "apartheid state" in the region, it is the Arab world that is not only apartheid, but also racist and religiously exclusive.
d73c8605d814b4cfde0f09bdf2e493f5
https://historynewsnetwork.org/article/30519
About the Herbert Aptheker Sexual Revelations
About the Herbert Aptheker Sexual Revelations My first reaction to the shocking news in Bettina Aptheker’s book that her father, Herbert, sexually abused her as a child (as described in Chris Phelps’s article in the Chronicle of Higher Education) was contained in my October 3 letter to Chris (see the end of this note). The first thing to say is that what Herbert did to his daughter Bettina was just awful, and Bettina has my sympathy. The second thing is that it was dead wrong for me to use careless language that suggested that this news pleased Ron Radosh, whose first reaction was honorable and humane (see below) despite the immense political distance between him and Herbert. I continue to wish for discussion as to how the attitudes expressed in Herbert’s awful acts might have been reflected in books like the centrally important American Negro Slave Revolts and/or the truly terrible The Truth about Hungary. I can’t see it, but discussion may bring out some continuity. I think Chris implies but does not show a connection. There is much to be said about the Communist Party and issues of sex and gender. Women (including my mother) played significant roles in the Party, and Red Diaper Babies were brought up believing that women had achieved equality in the USSR – an utter fiction, as I found experientially in academic visits to Moscow in 1978 and 1991. It’s my impression that many Second Wave feminists were Red Diaper Babies who had picked up something important from this ambiguous heritage but that the Party was not friendly to feminism and the Women’s Liberation Movement: the everyday life of the Party was hardly as egalitarian as its expressed ideals, and it clung to the notion that class trumped gender, and saw discussions of, for instance, orgasm, as trivial and selfish. And Betty Friedan’s attack on lesbians – the “lavender menace” – is certainly relevant. Nonetheless, many Communist women were and are immensely supportive of younger feminists.
There is no doubt that there was a very repressive side to the Party. Personal things, including illness, were sometimes thought to be self-indulgent luxury as against the Larger Struggle, e.g., “How can we speak of our individual mortal illnesses when the President has resumed the bombing?” (Not from Herbert.) Herbert lived a life blacklisted and under fire, with horrendous insult from a wide range of people, including Eugene Genovese and the Liberal Southern Gentleman C. Vann Woodward. Coming under a little fire myself, about thirty years ago I asked Herbert, “How do you take it?” He answered with Communist courage but with utter blindness to the emotional costs of a life under fire, “You redouble your efforts.” The New Left had some of this, but of course the Women’s Liberation Movement, originating in rebellion against organizations like the CP, SNCC and SDS, had a deeper connection to the emotions and their importance. And that connection (as well as other important insights), expressed in one of the most influential political movements of the twentieth century, made American culture radically better (despite all the present horrors); who would want to go back to the 50s? I see I’m not getting to Herbert’s acts. I won’t attempt to psychoanalyse. And I don’t think that the life of the Party was any worse than the lives of other Americans. (Certainly the events of the day remind us that such acts seem to be very much in the American grain.) But as I suggested in my letter to Chris Phelps, I think some of the lefts that I have been in have been less than candid with Americans about uncomfortable truths, and building a radical and just movement for a better America -- and keeping it off the backs of the rest of the world -- requires total candor with Americans, acknowledgement of faults and errors, admissions of failures as well as successes.

October 3 Letter to Chris Phelps

Good for you, Chris, and good for Bettina [for speaking these truths].
This is an awful and amazing story, reading like something in Doctorow's Ragtime. As you know, I have always said that the left should speak uncomfortable truths, even if they please Ron Radosh [Radosh had sent Chris’s article to his list under the subject heading “Another side of Comrade Aptheker,” but with the notation, “Of course, she should have done this when he was alive, so he could answer. Who knows if it’s true. But what a shock!”]… This material certainly sheds light on Aptheker, and by extension on the gap or connection between the personal and the political in the US CP (of which we have much evidence). It's full of irony that this comes out at the time of the delayed [Representative] Foley revelations. But. I think you are a little too agnostic in your rhetorical question, "To what extent should disheartening revelations about a scholar's conduct be held against his oeuvre?" Without positing a major disconnect between the personal and the public, I can't see how these revelations of despicable sexual behavior make American Negro Slave Revolts or the horrifying Truth about Hungary any more true or false. But I am interested in seeing what connections people might be able to sketch in. There might be some. Best,
Jesse

Related Links: Clare Spark: Doubts about his daughter's story of incest
5beb642a758e146c7e3d76c1991a2575
https://historynewsnetwork.org/article/31400
What Happened When Democrats in Congress Cut Off Funding for the Vietnam War?
What Happened When Democrats in Congress Cut Off Funding for the Vietnam War? The prospect of Democrats controlling the 110th Congress has raised speculation over a possible suspension of funds for the war in Iraq. Given control of the purse strings, a Democratic Congress would be in the position to force the government to begin the withdrawal of troops. Although they have been hesitant to define their plan for Iraq, some Democrats have hinted at a drastic reduction in funds. When asked in a recent interview how a Democratic Congress could stop the war, Rep. Charles Rangel (D-NY), who is set to chair the Ways and Means Committee should the Democrats win the majority, pointedly answered, “You’ve got to be able to pay for the war, don’t you?” A fellow member of the Out of Iraq Caucus, Lynn Woolsey (D-CA), has stated that “Personally I wouldn’t spend another dime on the war,” and notes that Congress helped force an end to the Vietnam War by refusing to pay for it. (1) What happened when Democrats in Congress cut off funding for the Vietnam War? Historians have directly attributed the fall of Saigon in 1975 to the cessation of American aid. Without the necessary funds, South Vietnam found it logistically and financially impossible to defeat the North Vietnamese army. Moreover, the withdrawal of aid encouraged North Vietnam to begin an effective military offensive against South Vietnam. Given the monetary and military investment in Vietnam, former Assistant Secretary of State Richard Armitage compared the American withdrawal to “a pregnant lady, abandoned by her lover to face her fate." (2) Historian Lewis Fanning went so far as to say that “it was not the Hanoi communists who won the war, but rather the American Congress that lost it."
(3) In January of 1973, President Richard Nixon approved the Paris Peace Accords negotiated by Henry Kissinger, which implemented an immediate cease-fire in Vietnam and called for the complete withdrawal of American troops within sixty days. Two months later, Nixon met with South Vietnamese President Thieu and secretly promised him a “severe retaliation” against North Vietnam should they break the cease-fire. Around the same time, Congress began to express outrage at the secret illegal bombings of Cambodia carried out at Nixon’s behest. Accordingly, on June 19, 1973, Congress passed the Case-Church Amendment, which called for a halt to all military activities in Southeast Asia by August 15, thereby ending twelve years of direct U.S. military involvement in the region. In August 1974, Nixon resigned under the pressure of the Watergate scandal and was succeeded by Gerald Ford. Congress cut funding to South Vietnam for the upcoming fiscal year from a proposed $1.26 billion to $700 million. These two events prompted Hanoi to make an all-out effort to conquer the South. As the North Vietnamese Communist Party Secretary Le Duan observed in December 1974: “The Americans have withdrawn…this is what marks the opportune moment." (4) The NVA drew up a two-year plan for the “liberation” of South Vietnam. Owing to South Vietnam’s weakened state, this would only take fifty-five days. The drastic reduction of American aid to South Vietnam caused a sharp decline in morale, as well as an increase in governmental corruption and a crackdown on domestic political dissent. The South Vietnamese army was severely under-funded, greatly outnumbered, and lacked the support of the American allies with whom they were accustomed to fighting. The NVA began its final assault in March of 1975 in the Central Highlands. Ban Me Thuot, a strategically important town, quickly fell to North Vietnam.
On March 13, a panicked Thieu called for the retreat of his troops, surrendering Pleiku and Kontum to the NVA. Thieu angrily blamed the US for his decision, saying, “If [the U.S.] grant full aid we will hold the whole country, but if they only give half of it, we will only hold half of the country.” (5) His decision to retreat increased internal opposition toward him and spurred a chaotic mass exodus of civilians and soldiers that clogged the dilapidated roads to the coast. So many refugees died along the way that the migration along Highway 7B was alternately described by journalists as the “convoy of tears” and the “convoy of death.” (6) On April 21, President Thieu resigned in a bitter televised speech in which he strongly denounced the United States. Sensing that South Vietnam was on the verge of collapse, the NVA accelerated its attack and reached Saigon on April 23. On the same day, President Ford announced to cheerful students at Tulane University that as far as America was concerned, “the war was over.” The war officially concluded on April 30, as Saigon fell to North Vietnam and the last American personnel were evacuated.

Notes
1. Bob Cusack, “Anxious Dems eye the power of the purse on Iraq,” The Hill, September 26, 2006.
2. Edward J. Lee, Nixon, Ford, and the Abandonment of South Vietnam (McFarland & Co., 2002), p. 105.
3. Fanning, Betrayal in Vietnam.
4. Lee, 82.
5. Ibid., 91.
6. Ibid., 98.

Related Links: David E. Kaiser: Stabs in the Back?
d3436e7932f26c1b134b53534cebc39b
https://historynewsnetwork.org/article/33300
History at Yale In the Dark Ages, 1953-76
History at Yale In the Dark Ages, 1953-76 I was at Yale, in and around the programs in Directed Studies, Scholar of the House, American Studies and History; living in Farnam, Berkeley, Silliman; as undergraduate, graduate student, teaching assistant, and finally instructor, most of the time from 1953 to 1963. I have stayed in touch since. I have been active, in my fashion, in alumni affairs, with the perhaps utopian goal of bringing Yale down: I don’t think the place can be reformed, and I see very little place for it in the vastly expanded system of public higher education that this increasingly backwards country desperately needs. I enjoy visiting Yale and lecturing there now and then (invited by indwelling subversives): I re-visit the sites of my various past crimes, eat a bagel with cream cheese and other organic foods in the Berkeley College dining room (after checking the menu on the Web in advance), and I stay in touch with what people at Yale are saying and thinking – in particular, people at History and American Studies, and my friends and allies, the marvelous scholar-activists in the teaching assistants union, Graduate Employees and Students Organization (GESO: www.geso.org), as well as the members of the nascent chapter of the struggling-to-be-reborn Students for a Democratic Society. So what you are reading is part history and part memoir. I was at Yale (BA Yale College 1957, PhD 1963), right smack between the two Bushes, who were ’48 and ’68. (My classmates feel a shock of recognition when they look at W’s transcript and see how many courses they had in common with him.) Bushes and Bushies contributed to making Yale a poisonous presence on the national and international scene.
This was the period that produced, along with a few virtuous exceptions, many evil-doers, e.g.: Porter Goss ’60 (CIA Director), John Negroponte ’60 (National Intelligence Director), Richard Posner ’59 (free market judge), Richard Gilder ’54 (founder of right-wing Manhattan Institute), my student Benno Schmidt ’63, privateer of schools, warrior against CUNY as Chair of its Board of Trustees, Joe Lieberman ’64 (Bush’s American poodle), and earlier, McGeorge Bundy ’40 of Vietnam fame, and James Jesus Angleton ’41 (OSS-CIA). The barbarities of undergraduate culture at the time helped to prepare these people to commit barbaric acts on a world scale later on in adult (?) life. The culture honored heavy drinking and public vomiting and urinating – long before the homeless picked up these virtuous behaviors from Yalies. During this period, W’s fraternity, DKE – he was President – held an annual “Pig Night.” New Haven girls – “townies,” as they were called – were invited to the fraternity for a dance. At midnight, the announcement was made: they had been selected for ugliness, “pigs.” Homophobia was rampant, and women were barred from Yale College, Mory’s, Linonian & Brothers Library (like Harvard’s Lamont), and the Elizabethan Club (get that, a club without women named after somebody named “Elizabeth”!). And the Whiffenpoofs, the leading Yale singing group, sang: “’Twas a cold winter’s evening, the guests were all leaving, O’Leary was closing the bar, When he turned and he said to the lady in red, ‘Get out, you can’t stay where you are.’ She shed a sad tear in her bucket of beer as she thought of the cold night ahead, When a gentleman dapper stepped out of the crapper, And these are the words that he said: ‘Her mother never told her. About the ways of college men. And how they come and go, mostly go, Now age has taken her beau-who--who-ty.
And sin has left its sad scar; So remember your mothers and sisters, boys, And let her sleep under the bar.’” And: “Your Daddy is a Yale Man, We may be married soon There’s no room for rent, So we may pitch a tent in the backyard of Mory’s Saloon… … the home is where the heart is, So we’re thinkin’ of leasin’ a Quonset on Neeson, To do daddy while mommy does Yale” As an undergraduate I was part of a small and embattled social and political group, which clustered, generally dateless, around the John Dewey Society, the Yale chapter of the Student League for Industrial Democracy (which later became, through twists, turns and schisms, Students for a Democratic Society). We were pretty much what passed for a Left in the hostile atmosphere on campus in those days, although there was also a Young Peoples Socialist League (YPSL). A disproportionate number of us from JDS have stuck with it: Founding Father Andre Schiffrin, who went on to head Pantheon Books, published a distinguished list, and was fired and founded New Press; Paul Chevigny, later an Attica lawyer and staff lawyer at the New York Civil Liberties Union, now creative litigator on behalf of New Yorkers’ fundamental right to dance; Joel Kovel, reformed psychiatrist, writer and activist, Green Party candidate for Senator from New York State; Jonny Weiss (’60), later head of Legal Services for the Elderly in New York City; Roy Jackson and Paul Asselin, both now sadly dead; and me, Left historian, writer and activist. We resisted the resounding silence of our generation as well as we could, bringing in Left speakers on such topics as “The Politics of Oil” (Robert Engler). Outside of the somewhat stolid and Fabian confines of JDS, some of us ridiculed Yale traditions in every way that we could: I brought women into the Elizabethan Club at times when they were not permitted, and wrote to the Yale Daily News in favor of admission of women to Yale.
We inaugurated a new “tradition” with a marble contest on the steps of Sterling Memorial Library – which brought a hostile response from Calvin Trillin in the Yale Daily News: “Who Put Marbles in Schiffrin’s Head?” We hassled the secret societies, toting down from Weir Hall into Skull and Bones’s backyard live chickens labeled with the names of Bones members, and on Tap Day Schiffrin, so admirably, refused them all from inside a Berkeley toilet stall. I was hurled to the ground by a dark-suited Bonesman while observing the Tap Day procession of prominentoes and “Patriarchs” into Bones – and I learned a lesson in law enforcement and power when the campus cop who saw this urged me to “Go along, Sonny.” We named one of the frat boys in our entry in Berkeley “What’s-the-Score?” in imitation of his continual cry, and Paul Asselin was regularly beaten by these upstairs jock/frat neighbors. We did obscene pre-political things, ridiculing the “shoe” culture of the day, in my case dressing up very Fenn-Feinstein, but then revealing myself, to Whiffenpoof-like groups and in the Elizabethan Club, to be Nelson Algren’s character, Raincoat the Perfect Lover. We were a little beat: Roy Jackson was the first person I knew who could recite Ginsberg’s “Howl.” And we were early fans of Elvis, who we thought was named “Aldous.” Paul Chevigny and I spent the summer of 1957 literally on the road, hitch-hiking across the country and ending up in North Beach just as Kerouac’s book was coming out. We had a few faculty friends and allies: Charles Blitzer of Political Science, Paul Weiss of Philosophy, Bob Herbert of History of Art, Bob Bone of American Studies.
Later, William Sloane Coffin became chaplain, preached “set our hearts on fire” to an aghast audience at the 1963 Commencement on the Old Campus, and took me to jail with him, his wife, Richard Sewall of the English department, and 200 clergymen including the Stated Clerk of the Presbyterian Church in a July 4, 1963 civil rights protest at Gwyn Oaks amusement park outside Baltimore, in the jurisdiction of Spiro Agnew. I’ve never told publicly the story about how I came to leave Yale prematurely. It was January of 1963, and Norman Pollack was delivering the last lecture of the first semester in the US History Survey in Sterling-Sheffield-Strathcona, perhaps Yale’s largest lecture hall. Concluding the semester neatly and on time, Norman stood on the stage and said, “and then one April night in 1865, with the cares of office heavy on his shoulders, Lincoln went to Ford’s Theatre, when suddenly….” This was my cue: I rose up in the balcony, fired off my Ruger starter pistol, and cried out – you know what’s coming – “Sic Semper Tyrannis.” Six hundred Yalies’ heads jerked back, forming a wave that swept through the auditorium – as if it were Yale Bowl now, in the time of waves. Norman squashed the ketchup pellet under his jacket, and fell to the floor. It was the end of the semester; it was the beginning of the Sixties in New Haven. Although this imaginative pedagogy happened before the Kennedy Assassination, which re-introduced assassination as a somber part of the culture, the History Department was not happy with it. Ed Morgan called me in. Before this, I had begun, albeit in a still primitive and stupid way, to understand that I might be rubbing the Department the wrong way. I had urged Ed to tell me if he heard any bad talk about me. Now, he had indeed heard such talk. “Alright, what happened?” I knew I was in trouble, but couldn’t tell the story without laughing.
It emerged that George Wilson Pierson, chair of the History Department -- and owner of a blue tuxedo which he wore at Yale “Smokers” (now they are called receptions) at annual meetings of the American Historical Association -- had found this to be conduct unbecoming a member of the Yale faculty. I had offended against the genteel code, and it was even worse that I had done this together with Norman Pollack, who, as we will see, our senior colleagues had begun to detest. Shortly thereafter, I found my name scribbled in by some mysterious force on a list posted on a Hall of Graduate Studies bulletin board for a specific time slot for what turned out to be a nice job interview with the wonderful Carl Schorske, who was then at Berkeley, looking for a Colonial Historian focused in the seventeenth century. But I was squarely in the Revolutionary period. So fate would prevent me from showing up for the Free Speech Movement. Instead, on Secret Society Tap Day I got the call that would start me on my way to the University of Chicago -- passed off via the Mafia-like patronage system, in perhaps the same way that Hotchkiss ejectees were immediately admitted to Choate -- to right-wing hysteric Daniel Boorstin, and thus I was set up for my next more overtly political firing three years later. Talk about frying pans and fires! But that’s a story for another time. The “assassination” was a comical event, and I have neither regrets nor bad feeling about it, although it did mean that I left Yale and therefore became separated from my nude posture photos. 
It was a pre-political beatnik-like preview of what would come to be called “guerilla theatre,” or the kind of thing that would get you a Great Teacher Award later in the sixties (“Man, he really brings the text to life!”) But, comical as it was, it was to be the beginning of an exit parade of radical historians, a kind of New Haven death march, which lacked only some New Orleans-style trumpeter and a couple of bobbing blue parasols. In those Cold War years, Yale cooperated with the FBI, giving on-campus space to the agency. That courageous liberal, Yale President Charles Seymour, stood up to the Red Scare, saying “There will be no witch hunts at Yale,” since “There will be no witches at Yale. We do not intend to hire Communists!” Huh? This is courage? Similarly, Harvard’s President James Conant said that so far as he knew, there were no Communists there, but if there were, “I hope the Government will ferret them out and prosecute them” (Lemisch, On Active Service in War and Peace: Politics and Ideology in the American Historical Profession [1975], p. 50). At the University of Chicago, Robert Maynard Hutchins said, “The faculty number 1000; none of its members is engaged in subversive activities” (ibid.). An often uttered A. Whitney Griswold era homily, “men of good will may disagree and yet remain friends” encouraged me until I later discovered the invisible corollary: if you really disagreed, you were not a man of good will, and they would smash you. It was a dreadful time, the Dark Ages, a time when the institution endorsed bigotry of every kind: anti-semitism, anti-Catholicism, racism, nativism, homophobia, sexism, class contempt. As Geoffrey Kabaservice has shown in The Guardians: Kingman Brewster, His Circle, and the Rise of the Liberal Establishment (2004), A. Whitney Griswold’s oft-stated ideal of Yalies as “well-rounded men” was in fact a motto of exclusion, in each of its three words. 
Some of us who were not, by Yale’s definition, well-rounded; nor, for that matter, men (like my wife, Naomi Weisstein, then at Bronx High School of Science, and on her way to Wellesley), sensed this exclusive underside at the time, and it has been forcefully underlined by fine scholarship by Geoffrey Kabaservice, Jerome Karabel, The Chosen: The Hidden History of Admissions at Harvard, Yale and Princeton (2005), and Dan Oren, Joining the Club (1986, 2001). We see in George W. Bush the reductio ad absurdum of the era’s well-rounded man: Andover, DKE, Skull and Bones, football, baseball, basketball and rugby, and a classic gentleman’s C. I majored in American Studies as an undergraduate and got my doctorate in it in 1963. American Studies at Yale had been founded with clear ideological purposes. The American literature canon as then defined was watched over by an actual CIA man, Norman Holmes Pearson. The syllabus for Pearson’s American Studies 59a literature course, which W. took in the fall of 1966, oozed love of male authority, sexual mystique, etc. Pearson was to be one of a number of CIA-connected faculty who taught me. Yale Historian Robin Winks has described the ties between Yale faculty and the CIA in his Cloak and Gown: Soldiers in the Secret War, 1939-1961 (1987). (Some of you may have had the recent misfortune of sitting through two hours and thirty-seven leaden minutes of the 2006 film, “The Good Shepherd,” which attempts [without filming in New Haven] to show the connections among Skull and Bones, OSS and CIA. In this version, the full corps of Whiffenpoofs are always singing in the Tomb, amidst the naked mud wrestling and male consciousness-lowering.)
In my years at Yale, when we needed to know the forms of citation, we turned to Sherman Kent, Writing History (1941); it was, as the CIA describes it, “a ‘bible’ for a generation of undergraduates charged with completing a competent term paper.” It was only years later that I learned that Kent had worked for the Research and Analysis Branch of the World War II Office of Strategic Services and from 1950-67 full time for CIA. The CIA’s official biography of Kent says that Writing History was “meant for college students but contains many of the themes that he would later develop for [intelligence] analysts”; it’s my understanding that the book we used for term papers was also used by CIA analysts. The History Department was overseen by George Wilson Pierson – he of the blue tuxedo – anti-semitic, anti-Catholic, nativist. As Thorstein Veblen gazed down quizzically from the picture frame on the wall in the Hall of Graduate Studies -- now there’s also a portrait of C. Vann Woodward, wearing a severely moiré-patterned jacket -- Pierson conducted the once-a-year full-Department ceremonial “meeting,” even including humble TA’s such as myself. The anti-Catholicism that was deep in Protestant Yale (at least as deep as anti-semitism) came out when he reported uncomfortably that the Cuban Revolution had caught the Department with its pants down, without a historian in that area, and so they had been forced to turn – here there was a palpable grimace – “to a Catholic college in Bridgeport” (Fairfield University?) for a temporary fill-in. What could be worse, both Catholic and from lower-class Bridgeport? On another occasion, as Peter Novick tells us in That Noble Dream: The ‘Objectivity Question’ and the American Historical Profession (1988), Pierson expressed doubt that the children of immigrants (and of optometrists, like myself?) could understand American history. And when I left Yale for a masters year at Columbia (1957-58), Leonard W. 
Labaree, History Professor and Editor of The Papers of Benjamin Franklin, wrote on my behalf to Columbia Colonial Historian Richard B. Morris, whom Labaree described to me as “an energetic little man, of your same religious background.” Funding for the Franklin project and the other papers of American leaders had been justified by the American Historical Association as a weapon in the Cold War, as stated forthrightly by AHA Executive Director Boyd C. Shafer, who saw such papers as missiles in the War (Lemisch, “The American Revolution Bicentennial and the Papers of Great White Men: A Preliminary Critique of Current Documentary Publication Programs and Some Alternative Proposals,” American Historical Association Newsletter, IX [November 1971]). “In those dark ages,” I later wrote, “academic thought contained much bigotry haughtily presented as political neutrality: contempt for the lower classes, racism, antiradicalism, fancy reactionary theories, and a worship of strong men” (In Search of Early America [Williamsburg, Va. 1993], 137). The curriculum and scholarly output in those years were loaded, heavily ideological, corrupted by Cold War and anti-democratic values. The notion of class was under attack, with Charles Beard as a stand-in for Marx, particularly in the writing and teaching of my mentor, Edmund Morgan, who participated in the wave of conservative Beard debunkings of those years. An Economic Interpretation of the Constitution (1913, 1935) was not in tune with the Great American Celebration of the time. America was and always had been classless – we were all middle class – and was marked by consensus. (Morgan handed me a reprint of a Commentary article to this effect by his pal Daniel Boorstin.) Historians (as well as publishers) were gulled by Robert E. Brown’s wretched tracts on the Constitution and on “Middle-Class Democracy” as if they were scholarship.
In Morgan’s view of the Revolutionary period, drunken mobs of sailors rioted without reason, manipulated by their betters. Radicals in all periods were denigrated as pointless or insane, going up against an otherwise happy consensus: true believers, guilt-driven Abolitionists. The survey course presented such deeply political messages as a – shall we say – fair and balanced account of slavery by Yale holy David Potter offering equal time to testimony by slaves who looked back on slavery days as happy times – Herbert Aptheker’s American Negro Slave Revolts (1943) was missing from the graduate curriculum, or sometimes mentioned, accompanied by hysterical warnings against the Dangers of Communism – and white supremacist Ulrich Phillips’s presence was still strong at Yale. The pendulum between management and labor in contemporary America was alleged to have shifted to the point where labor was too powerful, and thus what Harvey Swados would correctly label “The Myth of the Happy Worker” (The Nation, August 17, 1957) dominated. There was what I called in 1975 “a tremendous condescension in attitudes toward popular judgment and democracy” (Lemisch, On Active Service in War and Peace, 135). Nobody in New Haven had yet heard of the work of the British Marxists (as late as 1965 the senior British Historian at the University of Chicago responded to my mention by saying “Edward Who?”) and the field in general remained defended for many years against such alien stuff. The American people could not be trusted in the area of foreign policy (Yale’s Gabriel Almond, The American People and Foreign Policy), and a strong presidency was necessary; according to Sam “Wave-the-Flag” Bemis, the only thing Jefferson ever got right was the Louisiana Purchase. McCarthyism was seen as the latest outburst from below of anti-intellectual populism.
And John Blum wrote, in the midst of the war in Vietnam, that his The Promise of America (1967) reflected his “endeavor to rejoice, to describe those patterns that disclose – even for the impatient, perhaps especially for them – the nobility and the power, the mission and the magnificence of the United States.” As student and then as researcher and teacher, I tried to present alternatives to this abysmal swamp, and my efforts were rewarded with disapproval from senior colleagues, laying the groundwork for my eviction from Yale after the Lincoln caper. I began to study the American Revolution from the bottom up in an atmosphere that was indifferent and sometimes hostile to my approach. I went searching for Jack Tar in the scholarly darkness in those years before the sixties became The Sixties. Looking in particular at seamen’s role in the Stamp Act Riots and in opposition to impressment, I came to see a certain rationality in the Revolutionary crowd, just as George Rude, E.P. Thompson and Eric Hobsbawm were finding in their work. Nobody in New Haven had yet heard that a “mob” might in fact be simply a “crowd,” and thus there was not yet sophisticated discussion about how a mob might in fact be a mob, despite Marxist contempt (Lemisch, “Communication: The ‘Mob’ versus the ‘Crowd’: The British Marxists and Early American History…” William and Mary Quarterly, January 1999). Edmund and Helen Morgan’s The Stamp Act Crisis: Prologue to Revolution (1953) incited what was to be my dissertation with remarks like “Merchants, lawyers, and plantation owners directed the show from behind the scenes” (181) and “How… did the Sons of Liberty rouse these people to fury and, more important, how did they control that fury once they had aroused it?” (187) This encapsulates an entire theory of radicalism which has no notion of agency from the bottom up: weak in evidentiary base, it’s anti-radical theology.
Having had a fine scholarship job on The Papers of Benjamin Franklin, I worked my way through the enormous amount of documentation there. I looked at Franklin’s social attitudes in a critical way that produced horror in the Franklin Factory. “I didn’t know you had such a scunner on Franklin,” said Editor Leonard Labaree, using a Scottish term I had never heard. My critical book-length Scholar of the House paper is still banned on the second floor of Sterling Memorial Library. (Later I criticized the American celebrationist avoidance of social history and the preference for the history of “Great White Men”; Lemisch, “The American Revolution Bicentennial and the Papers of Great White Men,” AHA Newsletter, 1971). To counter the propagandistic notions of happy workers and pendulums swinging in the direction of too much labor power, I borrowed from CBS a copy of Edward R. Murrow’s “Harvest of Shame” for showing to my sections of the survey course. In the purple ditto of the day, I provided my classes with narratives from other than the happy slaves who populated the course reader. This precipitated the anger of historian (and later Acting Yale President) Howard Lamar – who headed the course -- and it was seen as a kind of lese-majeste towards David Potter, who had put together the course readings. This conflict set the stage for my premature exit after the Lincoln assassination. When I shot Lincoln, I was a TA – or as Yale, with its talent for obfuscation and disdain for the realities of course staffing, called us, an “assistant-in-instruction.” Within four years, the Yale History Department was to divest itself of three Left Americanists. (A fuller account of the narrow limits of dissent at Yale at that time would include a gay firing in History of Art [Martin Duberman, Cures: A Gay Man’s Odyssey (1991), 43; conversation with author, December 2006], and the 1961 buying out of the tenure of Buckleyite Political Science Professor Wilmoore Kendall. 
Anyone trying to make sense of the latter will run into stories of drinking, “sexual indiscretions,” etc., but there remain questions as to just what was cause and what was pretext. Perhaps this broadens our discussion by suggesting that Yale punishes deviance from the political mainstream, though it does so more frequently when the deviance is to the left.) My co-conspirator in the Lincoln assassination was Norman Pollack. In highly crafted landmark articles, he offered data that revealed the inaccuracy and bias of anti-Populist historians Oscar Handlin (Harvard) and Richard Hofstadter (Columbia) and pioneered in rehabilitating the Populists, taking them out of H & H’s hostile grip. Pollack’s Populist Response to Industrial America (1961) deserves to be thought of as the first work of New Left history. (Many of its central themes were later supported by Michael Paul Rogin’s magnificent The Intellectuals and McCarthy: The Radical Specter [1967].) With Pollack’s arrival at Yale in 1962, a new breeze blew, redolent of Harvard Square in the time of Joan Baez, from whence he came wearing a blue work shirt – at the time a serious lifestyle deviation for a member of the Yale faculty, where today they wear fine French Blue shirts, like workers in the Metro. With Pollack came the soft Marx of the 1844 Philosophical Manuscripts. Yale’s tolerance for Norman’s critiques of Hofstadter and Handlin was limited, and soon, he, too, was toast, on his way out. He had gotten into public tangles at professional meetings, and it is hard not to see elements of anti-semitism in the perception of his argumentative manner both face-to-face and at the podium as another violation of the gentlemanly code. And Norman had further tainted himself by introducing a Yale talk by Communist Herbert Aptheker. A year after I left Yale, I provided what turned out to be a kind of understated orientation to Staughton Lynd, who was about to fill the slot for Yale radical activist Colonial Historian.
We didn’t know what horrors and hypocrisy lay ahead. By the time Staughton was denied tenure, because of his anti-Viet Nam war activism, including a trip to Hanoi with Tom Hayden and Herbert Aptheker, it was totally clear that there was a pattern of hostility to Leftists in the Yale History Department. This pattern continued in 1975-76 with Vann Woodward’s incredible vendetta – to the embarrassment of some of his colleagues – his campaign to keep Communist historian Herbert Aptheker from teaching a one-semester course on his friend and co-worker W.E.B. DuBois (Aptheker was DuBois’s literary executor) in a student-initiated Davenport College seminar program (Davenport had been George Bush’s college a few years earlier) which was so unofficial as to include Howard Cosell teaching “Big Time Sports and Contemporary America” – the Yale equivalent of what Lenny Bruce used to call non-scheduled airlines (Lemisch, “If Howard Cosell Can Teach at Yale, Why Can’t Herbert Aptheker?” Newsletter of the Radical Historians Caucus, May 1976). In his rage, Woodward treated this one-semester once-a-week-train-from-New-York gig as if it were a tenured appointment to the Yale College faculty, and fought against the appointment although it originated in Political Science. The appointment worked its way through the process and reached the pro forma stage of the Board of Permanent Officers, where, according to the chair of the Political Science Department, there was a “massive attack on a minor appointment.” In an unprecedented move, Woodward brought with him to the BPO ten members of the History Department. The result was the first College seminar appointment ever rejected at this level.
To justify his stand, Woodward wrote a letter to the Yale Daily News (February 2, 1976) which is a classic of snotty expression by this supposed gentleman scholar, written on the arrogant assumption that Aptheker was merely a humble applicant for a job: … [Aptheker’s] writings did not measure up… A great many negative decisions are made every term. Hundreds of people apply to teach at Yale and only a handful are appointed. Neither the time nor the taste for debate with the candidates over their scholarly qualifications, such as Mr. Aptheker proposes, really exists. Neither the applications nor the reasons for the decisions regarding them are normally made public. It is to be hoped that this unfortunate exception to the rule will not become a precedent. It might discourage people from applying. The more applications from teachers and students we get the better we like it. We like to think that many more want to come here than we accept. It gives us a greater range of choice… [For] applicants… there are other good colleges available, even some good community colleges. As I commented in 1976, one of the risks involved in achieving the power and deference which have come to Woodward is that no one will tell you when you have done something awful. (Nonetheless, several of Woodward’s colleagues voiced to me their otherwise silent embarrassment over Woodward’s behavior in regard to Aptheker.) Woodward, like Hofstadter, was a liberal who moved rightwards in reaction to the sixties. I worked to have the Organization of American Historians investigate Yale, and the membership found Yale’s conduct so blatant that they voted to investigate. Yale stonewalled, but the pattern of hostility to leftists had been revealed in, among other places, the front page of the New York Times. And what of Yale today? 
Admissions policies have changed radically, although it should be noted that that took place in the context of the enormous social changes brought about by the movements of the sixties. I said that Yale could not be reformed. Has the institution changed? Are Yale’s Dark Ages over? I don’t think so. As we have noted, Yale is a major polluter of the national and international scene. At home, the institution and its chief investment officer David Swensen do their best to bust GESO, and then-History Department Chair Jon Butler spoke against GESO at the last Washington business meeting of the American Historical Association. In some ways, the fifties are back at Yale: Political Scientist Ted Marmor urges surrender to the right on healthcare issues by parading again that hoary fifties Yale homily, “politics is the art of the possible” – as if there had never been the Sixties, when even Yale academics stopped saying it for a while. One Yale classmate treated me at a recent AHA meeting to an alcohol-fueled rant on why his son hadn’t been admitted to Yale: he said, with hostility to me, that Yale has been taken over by Jews and leftists. A leading class liberal, another Americanist, tried to get me to stop posting critical views of Yale on the class listserv on grounds that I could be more effective if I followed the advice of another classmate who remained anonymous and was characterized as “not of the old guard… thinks of himself as firmly on the left.” This anonymous guy felt that my critical tone about Bush would make people angry. The Yale Alumni Magazine grows lyrical about Yale’s role in developing aerial warfare – another great Yale contribution to civilization: “Flight to Glory,” September-October 2003. This was written in a Snoopy-vs.-the-Red-Baron tone, as if there had not later been Dresden, Hiroshima, London, Hanoi, Baghdad, and so on.
In a shameful recent episode, Maya Lin was called in to head off the candidacy for the Yale Corporation of a black pro-union New Haven minister and graduate of the Divinity School. In a seeming abandonment of professional ethics, Archives and Manuscripts colludes with the History Department to grant Department member Gaddis Smith privileged access to archival material concerning Staughton Lynd’s firing while barring historian Carl Mirra. A new cabal, a kind of Woodward-Morgan-Blum redivivus, occupies the History Department, consisting of Paul Kennedy, John Lewis Gaddis, and Donald Kagan. In a notorious recent instance, they rejected an appointment of Juan Cole. The non-hiring of the University of Michigan Middle East expert was partly the result of opposition within the History Department by Kagan and Gaddis, which killed the appointment at a higher level (Senior Appointment Committee), as had been done with the Aptheker appointment. (Real Clear Politics, August 3, 2006: www.realclearpolitics.com) What is Yale for? It doesn’t boast anymore that it graduates, as Kingman Brewster put it in 1967, “1,000 male leaders.” Its aim now is to produce male, female, multi-racial, LGBT and other leaders. But consider the kind of leaders that Yale has produced, endless cohorts of people who struggle to maintain a slightly bandaged version of the status quo and to preserve the power of dominant elites in a broad spectrum of human activities, including scholarship, business, the arts, and government. Yale’s existence and values obstruct the development of the expanded and egalitarian system of public higher education that we need. Perhaps, like the equally anachronistic prep schools (Lemisch, “Hotchkiss in the Fifties: Myths and Realities,” History News Network, November 29, 2004), units like a miniaturized Yale may have a role as places of experimentation, free of both government and corporate control, as yardsticks by which to measure public and corporate-free higher education. 
Meantime, who does this country owe more to, Yale or CCNY? Copyright Jesse Lemisch 2006
https://historynewsnetwork.org/article/41698
How Paranoid Was Nixon?
It wasn’t the crime, but it wasn’t the cover-up, either. Something more basic took down a president 33 years ago. Long before prosecutors identified him as an unindicted coconspirator, Richard Nixon was a conspiracy theorist. In the last 10 years, the government has systematically declassified hundreds of hours of White House tapes recorded on a voice-activated system that President Nixon had the Secret Service install in the Oval Office. They reveal a textbook example of what historian Richard Hofstadter called “The Paranoid Style in American Politics.” Any group can be the target of a conspiracy theory. Nixon targeted three – Jews, intellectuals, and Ivy Leaguers. Their connection wasn’t logical, but political. Historian Arthur M. Schlesinger, Jr., summarized the reaction of the Republican bureaucratic old guard in the 1930s, when Franklin Roosevelt’s New Deal brought new kids to town: “There were too many Ivy League men, too many intellectuals, too many radicals, too many Jews.” So when Congressman Dick Nixon, a young Republican from California on the House Un-American Activities Committee in the 1940s, played a prominent role in exposing the Alger Hiss spy ring (which contained the tiniest fraction of the Jews, intellectuals and Ivy Leaguers who worked in the New Deal, but more than enough to make the right wing feel vindicated), he rocketed to political stardom. As Garry Wills has noted, Nixon entered his 30s having never held public office and exited his 30s having been elected Vice President of the United States. The Hiss case made him. Later it would unmake him. Nixon drew lessons from the Hiss case about Jews, intellectuals and the Ivy League.
“Remember that any intellectual is tempted to put himself above the law.” “The guys from the best families are most likely to develop that arrogance that puts them above the law.” “If they’re from any Eastern schools or Berkeley, those are particularly the potential bad ones.” “The Jews are born spies,” with “an arrogance that says – that’s what makes a spy. He puts himself above the law.” What’s important is that Nixon said the same thing about all three groups – that they were arrogant and put themselves above the law. Hofstadter would have seen what was coming next. “A fundamental paradox of the paranoid style,” he wrote, “is the imitation of the enemy.” His examples include anti-Catholic Ku Klux Klansmen “donning priestly vestments” and the anti-Communist John Birch Society forming cells and employing front groups. Had he lived long enough to hear the Nixon tapes, Hofstadter could have added to the list an anti-Semitic, anti-intellectual, anti-Ivy League president arrogantly putting himself above the law. The Nixon quotes above come from June and July tapes of 1971, when he was on the verge of creating a secret police organization, the Special Investigations Unit (SIU), without congressional authority. The SIU is better known as “The Plumbers,” since one of its purposes was to plug “leaks” like that of the Pentagon Papers, a classified multi-volume Defense study of Vietnam War decision-making that the New York Times had begun publishing on June 13, 1971. By coincidence (a common phenomenon conspiracy theorists have a hard time accepting), the man who leaked the Pentagon Papers, Daniel Ellsberg, had Jewish ancestors, a career as a defense intellectual, and a degree from Harvard. By further coincidence, the man who conducted the Pentagon study, Leslie H. Gelb, and the man who recruited Gelb to the Defense Department, Morton H. Halperin, were also Jewish intellectuals with Ivy League degrees.
While Halperin and Gelb let Ellsberg see a copy of the Pentagon Papers, at a time when Ellsberg had a security clearance and needed the study for Vietnam research, no investigation, legal or illegal, ever found evidence that either Halperin or Gelb took part in the leak. But political paranoids don’t need evidence. Nixon quickly formed a conspiracy theory and never let it go. In the privacy of the oval office, he lumped Halperin and Gelb together with Ellsberg as “the three Jews.” It’s not like no one warned Nixon. The day the Times started publishing, former national security adviser Walt Rostow, after talking it over with Lyndon Johnson, called the White House and fingered Ellsberg. Alexander M. Haig, Nixon’s deputy national security adviser, asked about Halperin and Gelb, but Rostow didn’t think either would do it. “He said whoever did this could not be a good Democrat,” Haig reported to Nixon the next day. “He said he would have to be a radicalized individual.” Anyone leaking thousands of pages of classified documents must abandon all hope of future government employment. Ellsberg burned that bridge, but Gelb would later work for President Carter, Halperin for President Clinton. Like other political paranoids, Nixon did have some real worries. Not necessarily the ones he put in his memoirs, about potential leaks threatening his diplomatic opening to China or nuclear arms negotiations. Both of these initiatives involved legitimate national security secrets. But the tapes show that Nixon’s first concern was with the potential exposure of an illegitimate secret, his bombing of Cambodia. It’s questionable whether Nixon ever had the right to keep the bombing of North Vietnamese infiltration routes through Cambodia secret. He claimed later it was necessary for Cambodian Prince Sihanouk to preserve his public neutrality regarding the Vietnam War. 
By the time the Pentagon Papers were published in 1971, however, Sihanouk had been overthrown, and Cambodia’s government was no longer officially neutral, but pro-American. The foreign policy rationale for secrecy was gone, but a pressing political one remained. The bombing of Cambodia, once revealed, was bound to cause controversy. Nixon had won the 1968 election only after publicly pledging support for Lyndon Johnson’s decision to halt the bombing of North Vietnam. How would he explain that in his first months in office he had secretly started bombing another country? On this subject Nixon was plagued by another coincidence. Halperin knew about the secret bombing. Henry A. Kissinger, Nixon’s national security adviser, had hired Halperin onto the National Security Council staff in 1969, at the start of Nixon’s presidency. (One might wonder how someone with Nixon’s views of Jews, intellectuals and the Ivy League could employ, as his most trusted foreign policy adviser and de facto Secretary of State, a Jewish refugee from Nazi Germany whose intellectual credentials started with three degrees from Harvard. Bigots can make exceptions for some of their best friends, and Nixon did for Jews, intellectuals and Ivy Leaguers he personally deemed “loyal.” Kissinger, in his judgment, rose to the level of “loyal bastard.”) When someone leaked a story on one of the Cambodian bombing runs to the Times, the president had the FBI tap Halperin’s phone. The wiretap lasted 21 months. The FBI found no evidence that Halperin revealed any classified information. That was not enough for Nixon. Halperin remained at the top of one enemies list (the Nixon White House had multiple lists) along with Gelb and the Washington think tank where both were scholars, the Brookings Institution. A White House aide claimed that Gelb took a Top Secret report on the 1968 bombing halt with him to Brookings. That was
https://historynewsnetwork.org/article/45042
Scott Horton and David T. Beito: Why Ron Paul Is Right About Terrorism ... A Letter to the GOP Base
520fae4735234bc72a2e6b291cd6e976
https://historynewsnetwork.org/article/51384
Historians and Facebook: In the Halls of an Electronic AHA
Historians and Facebook: In the Halls of an Electronic AHA Why should historians be on Facebook? I think it has the potential to be an electronic version of the halls of the AHA: a place of lively and utterly informal talk about what historians are doing and saying, and what’s going on in their lives. Just as Facebook threatens to replace college reunions, it can constitute something like a professional meeting, between professional meetings. (Note that “something like”: I have no desire with this proposal to replace professional meetings, but rather to extend them.) I’ll show below the ways that Facebook can foster rewarding communication among historians. But first, the negatives: 1) Time. Oy, vey: Here’s another Internet time waster. We just got finished consuming several hours looking at: all available YouTubes of the Beelzebubs and other college a cappella singing groups (they have left the Whiffenpoofs in the ancient dust); the Third Man theme (not just Anton Karas, but also, playing in the background as I write this, the Lugano Mandolin Orchestra); the Red Army Chorus singing “Eets a Lung Vay to Tipperary”; and “He’s Got the Whole World in his Hands” (with an additional couple of minutes searching fruitlessly for the Lonnie Donegan version). Our inbox has 400 messages in it, some of which require reading, some of which, g_d help us, require answers, and some of which even require thought. And the Provost is telling us to rush in the syllabi for our Second Life courses. 2) Privacy. It seems that Facebook is constructed so as to make Oprah-esque self-revelation the default position: “in a relationship”? it asks; “in an open relationship”? “it’s complicated,” etc. 
And every time you edit your profile, a message goes out to all your “friends.” Thus if, out of motives of privacy, you prudently neglected to mention that you are in a relationship but subsequently decided that it would be more prudent to do so, a flash goes out to all your friends (and their friends?): “[heart] Jesse Lemisch is now in a relationship.” (People are writing to congratulate me, and I have much to explain to others.) Your picture is displayed -- if you send them one -- and, if you are female, stalkers come after you. All of this of course can be traced back to the singles-bar dating site milieu out of which Facebook arose. It’s fine, for those who are looking. But if not, and you try to beat it by holding back personal information, one way or another it will get you. We want more human contact than our profession has allowed, but those of us who grew up in a different world, before the general collapse of society, probably want to limit what we display on the Internet. 3) “Friends” and the presentation of self. And of course there are numerous questions as yet unresolved: are “friends” actually friends? Just what is a friend? How do we select what we communicate to our friends, and what is left out? What are we vibing out by the picture we choose to display, if we do display a picture? Should we be smiling, or soberly wise? Finally, such questions will lead us to consider just what is true on Facebook, and what is inflated self-advertisement. Do we choose, in answer to the question, “What are you doing?” to convey to our friends the moment in which we write a piece like this one, selecting this moment out of 24 hours of sloth and passivity? We need to think of Facebook material as first-person testimony, and to think -- as historians do about other sources -- about the value and biases built into such testimony. Is Facebook a reasonable sample of reality, or is it rigged towards certain kinds of content? 
Why are people there, and just what is it they are seeking? So, there are plenty of negatives. (And it should be added that Facebook is a money-making, commercial milieu, with ads.) But I’m here to argue that, despite the above (and much else that might be added), the virtues outweigh the drawbacks if we think of Facebook as, among other things, an opportunity to construct a meeting place for historians. I value the papers given at the AHA and OAH, but I generally come away from these meetings as well educated by conversations in the halls, and while prowling the book exhibits. Somebody has mounted a stupid and uncomprehending attack on me in a book whose galleys are available at booth 432. And there he is, at booth 927, hiding, but available for animated conversation. Here’s somebody you haven’t seen in years, and, thank goodness, she has a name badge. And, you find, she is doing fascinating work. Here is somebody who responds to regards to the spouse with a facial expression that tells you immediately that your information is no longer accurate. And here are historians of all stripes, and information about new sources and new work and controversies not yet erupted. And so on: readers of HNN know what happens in the halls of the AHA. For better or worse, all these things can happen on Facebook. What can historians do on Facebook? Pretty much all of the things we do in the halls of the AHA, as above. And more. Of course we want to maintain human contact and keep up with developments in the lives of our friends insofar as they are willing to mention such developments; even some people in their seventies can be pleased that we have lived on to the 21st century with its new and more candid ways. Aside from the personal, what can we do, and how do we do it? 
As I mentioned, I’m just a recent arrival, but even from my limited experiences, I see a world of possibilities: “Notes,” “Walls” to write on, chats, “status updates” on what we are doing, “News Feeds,” “Groups,” group messages, “notifications,” “discussion boards,” videos (e.g. a talk presented), links to other sites. What are we doing? What have we read? What do we think of what we have read, and what have we written, and what are we writing, or what are we thinking about writing? Do we dare, as one courageous young historian has done, to announce that we have writer’s block? And speaking of that young historian, by its very nature, Facebook fosters intergenerational communication -- an obvious good thing. And we can contact all of our friends or select those to whom we want to send stuff. In advance of my call to historians, there is evidence that some historians are already doing these things, though, so far, they are just taking baby steps. Consider these Facebook groups, with pictures of members who are mainly young people: American Historical Association (20 members); Organization of American Historians (13 members); H-Net Editors (4 members); Clio (x members); Progressive Historians (55 members and a discussion board: “there are no discussions.”); and even HNN (218 members). There remain many questions and problems, and more down the road. Why is this form of communication better than the serial discussions that take place on lists like those on H-Net? Shall we break our groups and friends down by discipline, period, etc.? How do we define “historian,” and is membership closed (as with the AHA and OAH groups), or open to all? With all its drawbacks, Facebook could be a good place for historians to be. Let’s think about using it for our own purposes. Literacy in the 21st century requires that we experience this mass phenomenon, and we should cook up our own ways of doing it. 
Acknowledgment: Joanne Landy, a newcomer herself, has kept a step ahead of me and guided me through the strange new geography of Facebook.
e972e3bfaa05a0e2fd5e104f38f148cb
https://historynewsnetwork.org/article/695
Bobby Seale’s Confession: David Horowitz Was Right On
Bobby Seale’s Confession: David Horowitz Was Right On Seale is best remembered for his 1969 courtroom histrionics as one of the Chicago Eight, the eight moral degenerates who were put on trial for inciting riots at the 1968 Democratic Convention. Seale’s behavior caused the judge to order him shackled to a chair and gagged. Seale is now apparently ungagging himself about the Panther past. In a recent speech at a Panther reunion, he confessed that the Panthers were little more than extortionists, gangsters and murderers and that they killed Betty Van Patter -- whose murder remains unsolved to this day. As expected, the former Panther engaged in selective memory and exonerated himself from any personal wrongdoing in Panther crimes. He also called David Horowitz a liar -- even though he (Seale) simultaneously admitted that the Panthers were everything Horowitz has been saying they were. Seale’s confession serves as yet another reminder of the Left’s practice of historical amnesia, since the Liberal Establishment has yet to reconcile itself with who and what the Panthers really were. This explains why there has been a literal blackout by the national media on this issue. To fight this assault on historical memory, Horowitz has devoted much of his life to exposing Panther criminality. He has done so because the Panthers abducted and killed his friend -- Betty Van Patter. For speaking the truth about the criminality of the Left’s revolutionary vanguard, Horowitz has paid a large personal price. His life has been put in danger and his intellectual scholarship has been banned by the Nazi-like Leftist censors in academia. Seale’s confession now serves as yet another vindication of Horowitz. Eldridge Cleaver’s confession did the same several years back. In the now famous 1998 60 Minutes program during which he admitted the pernicious ruthlessness of the Panthers, the former Panther leader discussed his change of heart. 
Cleaver stated, "If people had listened to Huey Newton and me in the 1960s, there would have been a holocaust in this country." Betty Van Patter was one of the tragic victims of that holocaust in its beginning stages -- and fortunately that potential holocaust did not animate itself into a larger force. Betty had been recruited by Horowitz in the early 1970s to keep the books of a "Learning Center" in Oakland that he had created to run a school for the children of Black Panthers. A Leftist radical at the time, Horowitz had become affiliated with the Panthers after he met their infamous leader, Huey Newton, and became enchanted with him. Horowitz didn’t have a clue that the "Learning Center" served as a cover for Panther criminal activity; it was a military training center that was also being used as a vehicle to embezzle millions of dollars in California state and local education funds. After Newton killed a teenage prostitute and fled to Cuba in 1974, Elaine Brown took over as leader of the Panthers. She asked Horowitz to recommend an accountant to run the Party’s finances. Horowitz suggested Betty. Extremely naïve about what she was dealing with, Betty found something wrong with the Panthers’ record books and went to inform Brown. She subsequently disappeared. In January 1975, Betty’s battered body -- with her head caved in -- was found floating in San Francisco Bay. Horowitz was horrified by the murder of his friend. He felt a personal responsibility because he had brought Betty into the fold. He began to ask questions about her death, but he faced a disturbing lack of curiosity among his Leftwing associates. Horowitz was soon to learn that, in the mind of the Leftist, curiosity about Betty’s fate was tantamount to disloyalty to the cause. Jean-Paul Sartre had set the example long before: appealing to Leftists to avoid speaking, let alone seeking, the truth about Stalin's gulags, since doing so would demoralize the French proletariat. 
In his autobiography Radical Son, Horowitz explains: "To doubt the Panthers was to jeopardize the faith that the Left had placed in them. Even though the era of revolutionary enthusiasm was over, they had remained a symbolic vanguard, the embodiment of black America’s revolt against white oppression and the incalculable odds every radical faced." (p. 243) In his essay, "Black Murder Inc.," published in Hating Whitey, Horowitz notes, "The existence of a Murder Incorporated in the heart of the American left is something the left really doesn't want to know or think about. Such knowledge would refute its most cherished self-understandings and beliefs. It would undermine the sense of righteous indignation that is the crucial starting point of a progressive attitude. It would explode the myths on which the attitude depends." (p. 121) Thus, Betty’s murder, and the eerie indifference shown to it by her Leftist friends and colleagues, forced Horowitz to face the unfathomable: that the revolutionary vanguard of his own socialist dream was a criminal entity. As a result, the radical’s utopian odyssey came to an abrupt and sudden end. His Whittaker Chambers-like conversion began. As Horowitz considered the insignificance of Betty’s life and death in the eyes of his comrades, he began to recognize a familiar historical reality being played out in the surroundings of his own life: totalitarian and ruthless means were being perpetrated to build the fantasy of an earthly paradise. Real human flesh and blood was being sacrificed on the altar of ideals. While Horowitz could no longer blind himself about the Panthers, the American Left continued to do just that. It explains why, even though many radicals of the counter-culture have knowledge about what happened to Betty Van Patter, no one has ever been charged in her death. It also explains why, after more than two decades, the national media have yet to conduct even one serious investigation into any Panther murders. 
Now Bobby Seale has come forward and acknowledged that the Panthers murdered Betty. He has admitted that the Panthers were what the Left has always denied they were. His confession is no Twentieth Party Congress -- that landmark watershed in Soviet history (1956) that witnessed Nikita Khrushchev expose and denounce Stalin’s crimes. But one can hope that it might be the foundation for something that can become analogous to Khrushchev’s secret speech. This is not to suggest that the Panther reality is equivalent to the Stalinist horror. Implying such a thing would only trivialize and minimize the large-scale diabolical evil that Stalinism was. But it is to suggest that many of the ingredients that spawned the Panther nightmare and the Stalinist terror were exactly the same. Seale might have just let the genie out of the bottle, and maybe we will soon be told more truth about what a mutated form of Stalinism, albeit on a much smaller scale, perpetrated in America. This article first appeared on FrontPageMagazine.com on April 25, 2002.
be540c61c11d58aba2bb929abbcfb97b
https://historynewsnetwork.org/article/7886
Kerry Got that de Gaulle Story Half-Right
Kerry Got that de Gaulle Story Half-Right Note from the Editor: This article is based, in part, on information graciously provided by the H-Diplo historians who answered our request for fast help. In the first presidential debate John Kerry argued that America has lost credibility in the eyes of its allies. In the context of a controversial call for a “global litmus test,” Kerry claimed that whereas now America would fail such a litmus test, in the past America would have passed. To illustrate America’s past credibility, Kerry related an anecdote about the French president, Charles de Gaulle: We can remember when President Kennedy in the Cuban missile crisis sent his secretary of state to Paris to meet with de Gaulle. And in the middle of the discussion, to tell them about the missiles in Cuba, he said, "Here, let me show you the photos." And de Gaulle waved them off and said, "No, no, no, no. The word of the president of the United States is good enough for me." Though Kerry’s story contains a minor factual error (Kennedy sent Dean Acheson, the former Secretary of State), the rest of the statement is accurate according to several sources, most famously Jean Lacouture’s biography of de Gaulle.* In his anecdote, however, Kerry failed to provide vital contextual information, which belies his use of the example to demonstrate American multilateralism. Before the photos were shown to him, General de Gaulle stated: "I understand that you have not come to consult me, but to inform me." Upon being informed that this was indeed the case, de Gaulle then continued to say that America was perfectly right in defending its interests, but if he had been consulted instead of informed he would have had to disagree with U.S. actions. 
On the one hand, then, Kerry was correct in using the example of de Gaulle to assert that many nations did once put more faith in American intelligence gathering than they do at present: de Gaulle was not questioning the veracity of the U.S. intelligence photographs, whereas in the case of Iraq, allies such as France have increasingly doubted American intelligence. On the other hand, Kerry’s comments vis-à-vis de Gaulle do not demonstrate American multilateralism. Rather, they highlight the structural difference in the world system at the time—a structure in which the French more often quietly acquiesced in American decision making. In the case of the Cuban Missile Crisis, the U.S. was certainly not attempting to pass some global litmus test before acting. *Jean Lacouture, Le Souverain (Editions de Seuil, Paris, 1989), pp. 364-365.
9d824c8cc40fb1bee4e152acce251e83
https://historynewsnetwork.org/article/7982
The Cuban Missile Crisis Myth You Probably Believe
The Cuban Missile Crisis Myth You Probably Believe Several months after the publication of Averting ‘The Final Failure’: John F. Kennedy and the Secret Cuban Missile Crisis Meetings (Stanford University Press, 2003), my narrative history of the Cuban missile crisis ExComm meetings, I received a call from a production company preparing a television program about letters by American presidents. They asked if I might be interested in discussing John F. Kennedy’s missile crisis letters to Nikita Khrushchev. I explained that these letters were not really JFK letters at all, since they had been composed by committee rather than by Kennedy himself. I suggested instead that we might discuss one of the most famous incidents relating to the Kennedy-Khrushchev correspondence: on the evening of October 26, the Soviet leader sent a letter offering to remove the missiles from Cuba if the U.S. pledged not to invade the island nation. But, early on October 27, Khrushchev demanded that the U.S. also withdraw its Jupiter missiles from Turkey. According to the traditional view, Robert Kennedy suggested accepting the proposal in Khrushchev’s first letter and simply ignoring the second message. This strategy, which presumably led to resolving the crisis, came to be called the “Trollope Ploy”—a reference to a plot device by nineteenth-century British novelist Anthony Trollope, in which a woman interprets a casual romantic gesture, such as squeezing her hand, as a marriage proposal. The producer seemed interested in including a “revisionist” perspective in the program and we later did fifteen minutes of filming in which I carefully explained that the Trollope Ploy is a great story, but the ExComm tapes prove that it never really happened. When the program was broadcast, however, the editors cut quickly from my five seconds to actor Martin Sheen—who had played JFK in a 1983 dramatization of the missile crisis. 
Sheen recapitulated the standard account of the Trollope Ploy and praised its brilliance in helping the U.S. and the U.S.S.R. avoid nuclear war. The filmmakers apparently decided that the conventional explanation was less complicated and made a more dramatic story. In fact, even among historians and ExComm participants the Trollope Ploy remains an all but immovable fixture in the legend and lore of the Cuban missile crisis. Stewart Alsop and Charles Bartlett, writing in the Saturday Evening Post less than two weeks after the crisis and exploiting leaks from the Kennedy brothers, first created the notion that Robert Kennedy “had dreamed up the ‘Trollope Ploy’ to save the day.” Several years later, anticipating a run for president in a nation bitterly divided by the Vietnam war, RFK was eager to take credit for hitting upon a path to peace in 1962: “I suggested, and was supported by Ted Sorensen and others, that we ignore the latest Khrushchev letter and respond to his earlier letter’s proposal…that the Soviet missiles and offensive weapons would be removed from Cuba under UN inspection and verification, if, on its side, the United States would agree with the rest of the Western Hemisphere not to invade Cuba.” [1] Arthur Schlesinger, Jr., writing three years after the crisis, also claimed that RFK “came up with a thought of breathtaking simplicity and ingenuity: why not ignore the second Khrushchev message and reply to the first?” Ted Sorensen, an ExComm participant, had initially suggested in 1965 that JFK himself had “decided to treat the latest [October 27] letter as propaganda and to concentrate on the Friday night [October 26] letter” and had delegated RFK and Sorensen to come up with the right wording. 
However, when Sorensen completed the manuscript of Thirteen Days, published in 1969 after RFK’s assassination, he did not challenge Bobby Kennedy’s claim to have first suggested this strategy.[2] Some controversy has continued about who actually initiated the Trollope Ploy. Some ExComm participants and scholars have suggested that Llewellyn Thompson, former Ambassador to Moscow, came up with the idea. Others have pointed to National Security Adviser McGeorge Bundy or Assistant Secretary of State Edwin Martin. Dean Rusk, the secretary of state, believed that Thompson “originally suggested” the idea but argued that RFK first “brought it out at the table.” Several historians have insisted that “the idea was hardly Robert Kennedy’s alone … [and] entered the discussion gradually and was embraced by several members” and that the Trollope Ploy “is a little too elegant to explain the muddle and confusion of the debate on Saturday, October 27.” There has, nonetheless, been a long-standing consensus that the Trollope Ploy was “a brilliant way to handle it,” “an ingenious ploy,” “an extraordinary diplomatic move,” and that RFK met with the Soviet ambassador on the evening of October 27 “to execute the Trollope Ploy.”[3] President Kennedy himself immediately seized on the political benefit in this explanation of the settlement of the crisis since the secret agreement to remove the U.S. missiles from Turkey was just that—top secret—and remained so for decades. Only hours after Khrushchev publicly agreed to remove the missiles, JFK phoned former Presidents Eisenhower, Truman and Hoover—and deliberately misinformed them. He accurately reported that Khrushchev, on Friday, had privately suggested withdrawing the missiles in exchange for an American promise not to invade Cuba; but, on Saturday, the Kremlin leader had sent a public message offering to remove the missiles if the U.S. pulled its Jupiter missiles out of Turkey. 
President Kennedy informed Eisenhower, “we couldn’t get into that deal;” assured Truman, “they … accepted the earlier proposal;” and told Hoover that Khrushchev had gone back “to their more reasonable [Friday] position.” Eisenhower, who had dealt personally with Khrushchev, asked skeptically if the Soviets had tried to attach any other conditions. “No,” Kennedy replied disingenuously, “except that we’re not gonna invade Cuba.” The former president, aware of only half the truth, concluded, “this is a very, I think, conciliatory move he’s made.” Such deceptions shaped the administration’s cover story and helped generate the notion of the Trollope Ploy—which was indelibly fixed in public consciousness by the 1974 television film, “The Missiles of October,” based on RFK’s book. In fact, listening carefully to the recently declassified ExComm tapes proves conclusively that the notion of the Trollope Ploy was actually invented to conceal the real agreement to remove U.S. missiles from Turkey. It is a myth; it simply did not happen that way—much like the resilient fable that Lincoln dashed off the Gettysburg Address on the back of an envelope. At the morning ExComm meeting on Saturday, October 27, [4] barely twelve hours after receiving Khrushchev’s Friday evening letter--the first of the two letters--JFK read aloud a press statement just handed to him: “Premier Khrushchev told President Kennedy in a message today he would withdraw offensive weapons from Cuba if the United States withdrew its rockets from Turkey.” The president and the ExComm were clearly startled and puzzled. “He didn’t really say that, did he?” Sorensen recalled. “No, no,” Bundy insisted. But JFK speculated, “He may be putting out another letter,” and called in press secretary Pierre Salinger. 
“I read it pretty carefully,” Salinger asserted, “and it didn’t read that way to me either.” “Well,” the president concluded, “let’s just sit tight on it.” Rusk finally articulated the emerging realization in the Cabinet Room: “This appears to be something quite new.” President Kennedy had actually been probing the Turkish option for more than a week and asked, “where are we with our conversations with the Turks?” Assistant Defense Secretary Paul Nitze responded firmly, “The Turks say that this is absolutely anathema” and view it “as a matter of prestige and politics.” JFK understood the world of prestige and politics as well as anyone in the room, but told Nitze, “Well, I don’t think we can” take that position “if this is an accurate [report].” Bundy argued that if Khrushchev had backed away from the “purely Cuban context” in last night’s letter, “There’s nothing wrong with our posture in sticking to that line.” “Well maybe they changed it overnight,” JFK persisted. “He’s in a difficult position to change it overnight,” Bundy reasoned, “having sent you a personal communication on the other line.” “Well now, let’s say he has changed it,” JFK snapped, “and this is his latest position.” “Well, I would answer back,” Bundy retorted testily, “saying that ‘I would prefer to deal with your interesting proposals of last night.’” Someone egged Bundy on, whispering, “Go for it!” JFK’s reply represents a turning point in the discussions—leaving no doubt about his evolving position: “Well now, that’s what we oughta be thinkin’ about. We’re gonna be in an insupportable position on this matter if this becomes his proposal. In the first place, we last year tried to get the missiles out of there because they’re not militarily useful, number one. Number two, it’s gonna—to any man at the United Nations or any other rational man, it will look like a very fair trade.” “I don’t think so,” Nitze countered, as someone muttered “No, no, no” in the background. “Deal with this Cuban thing. 
We’ll talk about other things later.” Salinger soon brought in a news ticker report which JFK read aloud, confirming Khrushchev’s new public offer to link the missiles in Cuba and Turkey. “Now we’ve known this might be coming for a week,” Kennedy asserted impatiently, “This is their proposal.” “How much negotiation have we had with the Turks this week?” JFK grumbled again, “Who’s done it?” “We haven’t talked with the Turks,” Rusk tried to explain, “The Turks have talked with us.” “Where have they talked with us?” JFK demanded. “In NATO,” Rusk replied. “I’ve talked about it now for a week,” the president protested again. “Have we got any conversations in Turkey with the Turks?” Rusk reiterated, “We’ve not actually talked with the Turks.” Under Secretary of State George Ball declared that approaching the Turks on withdrawing the Jupiters “would be an extremely unsettling business.” “Well,” JFK barked, “this is unsettling now George, because he’s got us in a pretty good spot here. Because most people will regard this as not an unreasonable proposal. I’ll just tell you that.” “But, what ‘most people,’ Mr. President?” Bundy asked skeptically. The president shot back: “I think you’re gonna have it very difficult to explain why we are going to take hostile military action in Cuba … when he’s saying, ‘If you get yours out of Turkey, we’ll get ours out of Cuba.’ I think you’ve got a very tough one here.” “I don’t see why we pick that track,” Bundy repeated, “when he’s offered us the other track in the last 24 hours.” JFK interrupted irritably, “Well he’s now offered us a new one!” … “I think we have to assume that this is their new and latest position, and it’s a public one.” Rusk guessed that the personal Friday night letter had been sent by Khrushchev “without clearance,” and a consensus quickly developed that “The Politburo intended this one.” “This should be knocked down publicly,” Bundy demanded. 
“Privately we say to Khrushchev: ‘Look, your public statement is a very dangerous one because it makes impossible immediate discussion of your private proposals and requires us to proceed urgently with the things that we have in mind. You’d better get straightened out!’” CIA director John McCone, backed by several others, affirmed, “This is exactly right!” Ball subsequently revealed, at the late afternoon ExComm meeting, that the Soviet UN Ambassador had told Secretary General U Thant that Khrushchev’s private Friday letter had been “designed to reduce tension but, so far as he was concerned,” the public Saturday message, just as the president had argued that morning, “contained the substantive proposal.” Bundy continued to resist, “I think if we sound as if we wanted to make this trade to our NATO people and to all the people who are tied to us by alliance, we are in real trouble.” The national security adviser admonished the commander-in-chief: “I think that we’ll all join in doing this if this is the decision. But I think we should tell you that that’s the universal assessment of everyone in the government that’s connected with these alliance problems.” JFK nonetheless maneuvered to put the Turkish option on the fast track, but RFK insisted that even considering the Turkish trade “blows the possibility of this other one, of course, doesn’t it?” “Of what?” JFK asked impatiently. “Of getting an acceptance of the [Friday] proposal,” RFK replied. Rusk soon proposed new language for JFK’s message to Khrushchev: “‘As I was preparing this letter, I learned of your broadcast message today. 
That message raises problems affecting many countries and complicated issues not related to Cuba or the Western Hemisphere.’” After the crisis in Cuba is resolved, “‘we can make progress on other and wider issues.’” President Kennedy recognized immediately that Rusk’s wording did not reflect his persistent stance on pursuing a Turkey-Cuba trade—his advisers appeared to be trying a rather transparent end run around his position. “Well, isn’t that really rejecting their proposal of this morning?” JFK countered irritably. “I don’t think so,” Bundy replied, supported by Rusk. “It’s rejecting the immediate tie-in [on Turkey],” Treasury Secretary Douglas Dillon affirmed, “But, we’ve got to do that.” “We’re not rejecting the tie-in,” President Kennedy responded forcefully. “Mr. President,” Ambassador Thompson admonished, “if we go on the basis of a trade, which I gather is somewhat in your mind, we end up, it seems to me, with the Soviets still in Cuba with planes and technicians and so on. Even though the missiles are out, that would surely be unacceptable and put you in a worse position.” President Kennedy replied with practical and determined logic: “But our technicians and planes and guarantees would still exist for Turkey. I’m just thinking about what we’re gonna have to do in a day or so, which is 500 sorties in 7 days and possibly an invasion, all because we wouldn’t take missiles out of Turkey.” Perhaps recalling his own wartime experience, JFK continued, “And we all know how quickly everybody’s courage goes when the blood starts to flow and that’s what’s gonna happen in NATO.” If the Soviets “grab Berlin, everybody’s gonna say, ‘Well, that was a pretty good proposition.’ Let’s not kid ourselves,” he repeated for the third time, “that’s the difficulty. Today it sounds great to reject it, but it’s not going to after we do something!” If the Turks were adamant, JFK continued, then the U.S. ought to get NATO to “put enough pressure on them. 
I just tell you,” he lectured, “I think we’re better off to get those missiles out of Turkey and out of Cuba because I think the way of getting ‘em out of Turkey and out of Cuba is gonna be very, very difficult and very bloody, one place or another.” Bundy finally seemed to be coming to terms with the president’s resolve: “If you…are yourself sure that this is the best way out, then I would say that an immediate personal telegram of acceptance [of the trade] was the best thing to do.” But JFK objected to forcing the deal on Turkey and NATO. “I’d rather go the total blockade route which is a lesser step than this military action. What I’d like to do is have the Turks and NATO equally feel that this is the wiser move.” Sorensen pressed the president to delay replying to Khrushchev’s public Saturday offer and instead respond privately to the secret Friday letter: “There’s always a chance that he’ll accept that. … We meanwhile won’t have broken up NATO over something that never would have come to NATO.” “The point of the matter is,” Kennedy snapped again, “Khrushchev’s gonna come back and refer to his thing this morning on Turkey. And then we’re gonna be screwing around for another 48 hours. … He’ll come back and say, ‘Well we’re glad to settle the Cuban matter. What is your opinion of our proposal about Turkey?’ So then we’re on to Monday afternoon, and the work goes on. … He can hang us up for three days while he goes on with the work.” “For three weeks!” Dillon muttered. “Let’s start with our letter,” JFK continued. “It’s got to be finessed … we have to finesse him.” President Kennedy, nonetheless, had no illusions about Khrushchev’s response to U.S. pressure to go back to Friday’s proposal, “which he isn’t gonna give us. He’s now moved on to the Turkish thing. 
So we’re just gonna get a letter back saying, ‘Well, he’d be glad to settle Cuba when we settle Turkey.’” Thompson repeated that Khrushchev might still accept the Friday deal since he could still say that he had removed the U.S. threat to Cuba. “He must be a little shaken up,” RFK pointed out, “or he wouldn’t have sent the [Friday] message to you in the first place.” “That’s last night,” JFK retorted impatiently. “But it’s certainly conceivable,” RFK replied, “that you could get him back to that. I don’t think that we should abandon it.” JFK halfheartedly agreed that there was no harm in trying. “All right,” he finally conceded, “Let’s send this” letter dealing with Cuba first. But, he cautioned that the key question remained, “what are we gonna do about the Turks.” “Actually, I think Bobby’s formula is a good one,” Sorensen observed; “we say, ‘we are accepting your offer of your letter last night and therefore there’s no need to talk about these other things.’” The president seemed willing to go along with this scheme on the slim chance that Khrushchev would at least agree to a cessation of work, but he clearly remained unconvinced and unenthusiastic: “As I say, he’s not gonna [accept] now [after his public offer on Turkey]. Tommy [Thompson] isn’t so sure. But anyway, we can try this thing, but he’s gonna come back on Turkey.” Bundy jumped on the bandwagon as well: “That’s right, Mr. President. I think that Bobby’s notion of a concrete acceptance on our part of how we read last night’s telegram is very important.” After news arrived that a U-2 had been shot down over Cuba by a Soviet surface-to-air missile, the president tried to placate the opponents of a Turkish deal by reiterating that “first we oughta try to go the first route which you suggest and get him back [to the Friday offer]. 
That’s what our letter’s doing.” But, at the same time, he again underscored his lack of conviction about that strategy and made clear that he was determined to keep the Turkish option alive: “Then it seems to me we oughta have a discussion with NATO about these Turkish missiles.” At the end of the late afternoon ExComm meeting, Defense Secretary Robert McNamara, Deputy Defense Secretary Roswell Gilpatric, Ball, Bundy, RFK, Rusk, Sorensen and Thompson joined President Kennedy, at his invitation, in the Oval Office. JFK revealed that his brother Bobby was about to meet with Ambassador Anatoly Dobrynin and requested advice on what to tell the Soviet diplomat. The group quickly agreed that RFK should warn Dobrynin that military action against Cuba was imminent and make clear, consistent with Khrushchev’s Friday letter, that the U.S. was prepared to pledge not to invade Cuba if the missiles were withdrawn. But, the president also continued to press for a deal on the Turkish missiles. Rusk, finally recognizing JFK’s determination, suggested that RFK advise the ambassador that a public quid pro quo for the missiles in Turkey was unacceptable, but the president was prepared to remove them once the Cuban crisis was resolved. The proposal was quickly accepted. Robert Kennedy was instructed to tell Dobrynin that any Soviet reference to this secret proposal would make it null and void. JFK clearly had no faith in the strategy of accepting Khrushchev’s Friday offer and ignoring his public Saturday message and instead worked secretly with Rusk to put together another fall-back plan. The secretary of state arranged to have former deputy UN Secretary General Andrew Cordier put in place an emergency back channel strategy by which U Thant would announce, after receiving private word from Rusk that negotiations had failed, a UN plan through which the U.S. and the U.S.S.R. would mutually agree to remove their missiles from Turkey and Cuba. JFK was prepared to gamble that if the U.S. 
publicly accepted this supposedly neutral plan, it would be very difficult for the Soviets to reject it. Khrushchev’s unexpected decision the following morning made the Cordier gambit moot and Rusk did not reveal this closely-held secret for over twenty-five years. Listening to the October 27 meeting tapes proves that ExComm participants and scholars have read far too much cunning and coherence into the discussion of the Trollope Ploy. President Kennedy, as the tapes document, stubbornly and persistently contended that Khrushchev’s Saturday offer could not be ignored precisely because it had been made public. In fact, JFK’s eventual message to Khrushchev did not ignore the Saturday proposal on Turkey, but left the door open to settling broader international issues once the immediate danger in Cuba had been neutralized. JFK ultimately offered the Kremlin a calculated blend of Khrushchev’s October 26 and 27 proposals: the removal of the Soviet missiles from Cuba, an American non-invasion pledge (contingent on UN inspection), a willingness to talk later about NATO-related issues and a secret commitment to withdraw the Jupiters from Turkey. The Trollope Ploy is essentially a myth. Robert Kennedy did tirelessly press his brother not to give up on Khrushchev’s Friday proposal. JFK, although skeptical and reluctant, finally agreed to try this scheme despite repeatedly predicting that the Soviet leader would inevitably “come back” to his public offer on the Turkish missiles. The president had no illusions about forcing Khrushchev to settle for the terms in his earlier message and assented to this strategy largely to placate unyielding ExComm opposition. In fact, as revealed by RFK’s meeting with Dobrynin and the other secret steps taken later that day and kept from much of the ExComm, JFK was determined not to let this chance to avert nuclear catastrophe slip away. 
As he had reminded the gung-ho Joint Chiefs on October 19, an attack on Cuba could prompt the firing of nuclear missiles against American cities and result in 80-100 million casualties—“you’re talking about the destruction of a country.” In fact, President Kennedy’s inclination to pursue the Turkish option actually seems to have hardened in response to the dogged intractability of his advisers at the October 27 meetings. The ExComm toughened JFK’s determination simply by repeatedly and all but unanimously opposing his preferred course of action—a deal on the Turkish missiles. The later conclusion, based on the incomplete transcripts prepared by Bundy in the 1980s, that “Llewellyn Thompson certainly persuaded the President that it [the Trollope Ploy] might actually work” is not corroborated by the definitive primary source—the ExComm tapes. This celebrated diplomatic sleight of hand, in essence little more than a cosmetic concession to the full ExComm by JFK, ultimately served to conceal the real agreement that secretly—and peacefully—resolved the Cuban missile crisis.[5] ENDNOTES [1] Evan Thomas, Robert Kennedy: His Life, Simon and Schuster, 2000, 438; Robert F. Kennedy, Thirteen Days: A Memoir of the Cuban Missile Crisis, Norton, 1999, 77. [2] Arthur M. Schlesinger, Jr., A Thousand Days: John F. Kennedy in the White House, Houghton Mifflin, 1965, 828; Theodore C. Sorensen, Kennedy, Harper and Row, 1965, 714-5. [3] James G. Blight and David A. Welch, On the Brink: Americans and Soviets Reexamine the Cuban Missile Crisis, Noonday, 1990, 162, 179, 369; Thomas, Robert Kennedy, 438; Robert W. Merry, Taking on the World: Joseph and Stewart Alsop—Guardians of the American Century, Viking, 1996, 389; Graham Allison, Essence of Decision: Explaining the Cuban Missile Crisis, Little, Brown, 1971, 227; David A. Welch and James G. Blight, “The Eleventh Hour of the Cuban Missile Crisis: an Introduction to the ExComm Transcripts,” International Security, Winter 1987/88, 16. 
[4] This account of the October 27 ExComm meetings is adapted from the author’s concise narrative for classroom use, The Week the World Stood Still: Inside the Secret Cuban Missile Crisis, to be published in January 2005. (Copyright: Stanford University Press) For a more complete discussion of the “Black Saturday” meetings and the Trollope Ploy, see Stern, Averting ‘The Final Failure’, 310-86, 419-26. [5] Blight and Welch, On the Brink, 369.
1754c85c3a376c02c4c34d1f9bbce285
https://historynewsnetwork.org/article/814
Did J. Edgar Hoover Really Wear Dresses?
Did J. Edgar Hoover Really Wear Dresses? In 1993, Anthony Summers, in his book Official and Confidential: The Secret Life of J. Edgar Hoover, claimed that Hoover did not pursue organized crime because the Mafia had blackmail material on him. In support of that, Summers quoted Susan L. Rosenstiel, a former wife of Lewis S. Rosenstiel, chairman of Schenley Industries Inc., as saying that in 1958, she was at a party at the Plaza Hotel where Hoover engaged in cross-dressing in front of her then-husband and Roy Cohn, former counsel to Senator Joe McCarthy. "He [Hoover] was wearing a fluffy black dress, very fluffy, with flounces and lace stockings and high heels, and a black curly wig," Summers quoted Susan as saying. "He had makeup on and false eyelashes."[1] Susan claimed Cohn introduced Hoover to her as "Mary." Hoover allegedly responded, "Good evening." She said she saw Hoover go into a bedroom and take off his skirt. There, "young blond boys" worked on him in bed. Later, as Hoover and Cohn watched, Lewis Rosenstiel had sex with the young boys. A year later, Susan claimed, she again saw Hoover at the Plaza. This time, the director was wearing a red dress. Around his neck was a black feather boa. He was holding a Bible, and he asked one of the blond boys to read a passage as another boy played with him. It was episodes such as these, Summers declared, that the Mafia held over Hoover's head. "Mafia bosses obtained information about Hoover's sex life and used it for decades to keep the FBI at bay," the jacket of the book says. "Without this, the Mafia as we know it might never have gained its hold on America." Rosenstiel, a former bootlegger during Prohibition, was well-acquainted with Mafia figures such as Frank Costello, originally Francesco Castiglia. He was also friends with Hoover, having endowed the J. Edgar Hoover Foundation in 1965 with $1 million. But Susan was Summers's primary source for the cross-dressing story, and she was not exactly a credible witness. 
In fact, she served time at Riker's Island for perjuring herself in a 1971 case. Convinced that Hoover had somehow stacked the cards against her during the divorce proceedings, Susan had long tried to interest anyone who would listen in her claim that Hoover was a cross-dresser. Susan had taken her allegations to Robert M. Morgenthau, the U.S. Attorney in New York, who himself had no use for Hoover. "She used to call me after 5:30 p.m. when my secretary had left, so I wound up having to listen to her," Morgenthau said. He said he found her claims baseless. But Morgenthau shared her allegations with William Hundley, who had a Justice Department attorney look into them.[2] "Susie Rosenstiel had a total ax to grind," Hundley said. "Somebody who worked for me talked to her. It was made up out of whole cloth. She hated Hoover for some alleged wrong he had done. Plus the story was beyond belief. I told Summers this. Then he goes ahead and uses it."[3] Now seventy-seven and living in a single room in a Manhattan hotel where rooms rent for $98.85 a night, Rosenstiel said Summers paid her for the interviews she gave him, and she wanted to be paid for an interview for this book. Like most journalists and news organizations, I believe paying for information calls into question its credibility. When I told Rosenstiel this and suggested she could generate publicity for herself by telling the truth and admitting she made up the cross-dressing story, she said, "It did happen."[4] Summers said that after Rosenstiel told him the cross-dressing story, she told him that she intended to give the story to another journalist. Summers said he paid her a fee to hold the story until his book came out. The producer of a documentary made for Frontline and the BBC also paid Rosenstiel for her appearance with Summers, he said. 
In an Esquire piece, Peter Maas, a world-class journalist who died in 2001, pointed out that Summers's rendition of events has a fatal flaw: After the alleged incident at the Plaza, Hoover assigned agents to investigate Lansky, who supposedly had the goods on him. When the Miami Field Office complained that the investigation of Lansky was not producing enough information to justify the manpower, Hoover wrote back, "Lansky has been designated for 'crash' investigation. The importance of this case cannot be overemphasized . . . The bureau expects this investigation to be vigorous and detailed." Still presumably cowering because Lansky had incriminating photos of him, Hoover followed up with an order to install bugs in Lansky's apartment. Having been ordered by Robert Kennedy to attack the Mafia as the FBI had attacked Communism, Hoover wrote in the January 1962 FBI Law Enforcement Bulletin, "The battle is joined. We have taken up the gauntlet flung down by organized crime. Let us unite in a devastating assault to annihilate this mortal enemy." Yet even before Kennedy took over, Hoover, stung by the disclosure of the 1957 Apalachin meeting, had been pursuing the mob aggressively. Doesn't that torpedo Summers's theory? No, Summers told me, by that time it didn't matter to Hoover. But, of course, if there were such photos, they would have been just as embarrassing in the 1950s and 1960s as in earlier years. Summers pointed out that he wrote a lengthy rebuttal to Esquire, and he called the Maas article "inaccurate and abhorrent." Despite the clear implication in the book that her story was true and the declaration on the book's jacket that the Mafia knew that Hoover was a "closet homosexual and transvestite" and held that over his head, Summers told me that he merely reported what Rosenstiel said, along with what others claimed. 
He said he holds "no firm view one way or the other" as to whether she told the truth.[5] While there was always speculation about Hoover and Tolson, there were never any rumors about Hoover cross-dressing. Oliver "Buck" Revell, a former associate director of the FBI, noted that if the Mafia had had anything on Hoover, it would have been picked up in wiretaps mounted against organized crime after Apalachin. There was never a hint of such a claim, Revell said. Hoover was more familiar to Americans than most presidents. The director of the FBI simply could not have engaged in such activity at the Plaza, with a number of witnesses present, without having it leak out. The cross-dressing allegations were as credible as McCarthy's claim that there were 205 known Communists in the State Department, yet the press widely circulated the claim without further investigation. That Hoover was a cross-dresser is now largely presumed to be fact even by sophisticated people. [1] Summers, Anthony, Official and Confidential, page 254. [2] Richmond Times-Dispatch, March 27, 1993, page A14. [3] Hundley, William G., July 24, 2001; Morgenthau, Robert, January 2, 2002. [4] Rosenstiel, Susan, December 23, 2001. [5] Summers, Anthony, December 24, 2001 and January 2, 2002, and email of January 4, 2002.
1daae367401549cfa91170fc004dda48
https://historynewsnetwork.org/article/83554
American Torture: No Knowledge of History, No Sense of Tragedy
American Torture: No Knowledge of History, No Sense of Tragedy Recently in the New York Times, Scott Shane and Mark Mazzetti showed that the Bush Administration, the CIA, and the Senate and House Intelligence Committees failed to ask for any historical context before approving so-called “harsh interrogation techniques,” including waterboarding, in 2002.  No one apparently knew, or wanted to know, that the U.S. had defined waterboarding as torture and prosecuted it as a war crime after World War II.  Did our leaders think the events of 9-11 constituted an entirely new reality, one in which historical precedent was rendered nugatory? Perhaps so, but their failure to ask historically-based questions also highlights the narrowness of their intellectual training.  Like the accused Nazi judges before the bar in the movie Judgment at Nuremberg (1961), they asked themselves only what the law is (or what it became under John Ashcroft and John Yoo), not whether it is just.  If a legal brief authorized brutal methods such as waterboarding, who were they to question, let alone challenge, the (freshly minted) legal opinion? Clearly, the leaders making and implementing decisions on torture constituted a single, self-referencing, self-identified Washington elite almost entirely divorced from thinking historically, let alone tragically.  And because they could think neither historically nor tragically, they found false comfort in picturing themselves as stalwart defenders of the nation, not recognizing the mesmerizing power of vengeance and hate. Our elected officials who find history books too onerous would do well to invest three hours of their time to watch Judgment at Nuremberg.  They might learn that a compromised judiciary will uphold any action -- discriminatory race laws, involuntary sterilization, even mass murder -- all in the name of defending the people from supposedly apocalyptic threats. 
Indeed, defending the country from apocalyptic threats is a popular line for those wishing to uphold the Bush Administration’s policy on torture.  After the tragedy of 9/11, and subsequent panic in the wake of anthrax attacks, our leaders were compelled to “take the gloves off” in our defense, even compelled to exact vengeance as a way of deterring future attacks -- or so these torture apologists claim. In their haste to make America safe, Bush and Company effectively declared vengeance was theirs and not the Lord’s.  But the human lust for vengeance is blinding, even more so when it’s perceived as righteous.  Here our wrathful lawyers/politicians might consider the lessons of Giuseppe Verdi’s opera, Rigoletto.  The hunchbacked court jester, Rigoletto, delights in other people’s misfortune, and for this he is cursed by a cuckolded husband.  Soon, his own daughter, Gilda, the joy of his life, is kidnapped and despoiled, the first bitter fruits of the curse.  Despite Gilda’s pleas to forgive the transgressor, Rigoletto, blinded by his own murderous desire for vengeance, sets in motion a chain of events that ends with the sacrificial death of his beloved Gilda and the annihilation of any vestige of goodness in his tortured soul. In Rigoletto, the desire for total vengeance produces total tragedy.  In Judgment at Nuremberg, man’s ability to justify the worst crimes in the name of “safeguarding the people” is memorably exposed and justly condemned. What we need today in Washington are fewer leaders who base their decisions on vengeance empowered by legal briefs and more who are willing to embrace the toughest lessons to be gleaned from history and tragedy.  What we need today as well is our own version of Judgment at Nuremberg -- our own special prosecutorial court -- one that is unafraid to elevate justice, truth, and the value of a single human being above all other concerns -- especially political ones.
f17a1e56f2f68b49a359bd7029b07480
https://historynewsnetwork.org/article/8420
Are Gilder and Lehrman Tilting American History to the Right? A Case in Point
Are Gilder and Lehrman Tilting American History to the Right? A Case in Point The central themes of the Hamilton exhibit announce themselves fairly garishly even before you enter. A huge, multi-colored banner stretching a full block along Central Park West reproduces the ten dollar bill (take a look in your wallet). Leaving a small space for entry, the banner otherwise covers the entire four-story facade: standing at the corner of Central Park West and 77th, I found it impossible to get the whole thing in a picture. (Let's hope that, like the characters in Macy's Thanksgiving Parade that assembles nearby, it's well anchored against the wind.) The exhibit is entitled "The Man who Made Modern America," reflecting a theory of how history happens, an archaically hagiographic approach (which is coming back into style in Bush's America), and a certain political partisanship. An opening high-tech slide show ridicules those contemporaries who dared to utter critical words about Hamilton. Actors' voices represent Jefferson as haughty and aristocratic; John Adams is whiny, kvetchy, failing to recognize Hamilton's greatness. Gilder Lehrman has decided to pursue a popular audience by presenting Hamilton as somewhat populist, anti-slavery (unlike that bad Jefferson), a humble immigrant, illegitimate at that, who acted out the American dream and rose to the heights: a Great Man who rose from the people. (In the "Time Line" section, there is one mention of Hamilton's role in putting down the Whiskey Rebellion by armed force, but no effort to square this with the otherwise benign picture of him.) Gilder and Lehrman must have spent millions on this high-tech exhibit. 
(Overall, the New York Times reports, the exhibit cost the Society $5 million: "Shift at Historical Society Raises Concerns"; the article quotes historian Mike Wallace as fearing that the Historical Society could "wind up as a subsidiary of the Gilder Lehrman Institute.") As we enter the main hall, we see, straight out of 1984, several gigantic video screens showing modern scenes: the floor of the NY Stock Exchange, commuters at Grand Central Terminal, high rises under construction, and -- no kidding -- military paratroopers jumping out of planes. This is the exhibit's idea of Hamilton's heritage today, and it leaves no doubt as to the heroic quality of that heritage. So that we can see the video images, the items in the exhibit are mainly in semi-darkness, with many displayed almost at floor level and with illumination that makes them almost impossible to make sense of. A forty-minute theatre presentation, "Alexander Hamilton in Worlds Unknown," well acted on stage by a man and a woman (a little Oedipally, she plays Hamilton's mother, his wife, and Mrs. Reynolds, with whom Hamilton had an affair) is nicely integrated with large video images. We move from the opening to the strains of "Yankee Doodle," through Hamilton's life, and on to The Duel, with his life poignantly and sympathetically presented partly as a search for his absent father, with Washington and others as surrogates. G and L's notorious takeover and purge at the N-YHS (the July New York Times article mentioned above caught only the tip of the iceberg, with more to come) is part of their larger penetration of American history. These two wealthy Yalies ('60), supporters of the right-wing Manhattan Institute (Gilder is founder and a former chair), have a clear ideological program. To me, the strategy seems reminiscent of the CIA's support for the Congress for Cultural Freedom, including the funding of Encounter (see Christopher Lasch's classic article on this in Barton J. 
Bernstein, ed., Towards a New Past, 1968). Gilder and Lehrman are buying legitimacy by buying historians, giving money to Yale and to the Organization of American Historians, constructing a board with some stellar left-liberal types on it (what is with this, guys?). Put this together with the horrors at NEH and we have a clear picture of the theft of history for ideological purposes. But the N-YHS exhibit has neither subtlety nor historiographical sophistication: the codpiece slips, and the right-wing agenda comes right out in your face, starting with that huge $10 bill, hanging over Central Park West.
aa28ecac8893b3117894adead34a0685
https://historynewsnetwork.org/article/8588
Hotchkiss in the Fifties: Myths and Realities
Hotchkiss in the Fifties: Myths and Realities This article was adapted from a presentation at the fiftieth reunion of the Hotchkiss Class of 1954, which was held at the Hotchkiss School, Lakeville, CT, on October 30, 2004.
fbef82f5c3316c03920d5f21f6d29fa7
https://historynewsnetwork.org/article/9245
Why Did Truman Really Fire MacArthur? ... The Obscure History of Nuclear Weapons and the Korean War Provides the Answer
Why Did Truman Really Fire MacArthur? ... The Obscure History of Nuclear Weapons and the Korean War Provides the Answer The media claim that North Korea is trying to obtain and use weapons of mass destruction. Yet the United States, which opposes this strategy, has used or threatened to use such weapons in northeast Asia since the 1940s, when it did drop atomic bombs on Japan. The forgotten war -- the Korean war of 1950-53 -- might better be called the unknown war. What was indelible about it was the extraordinary destructiveness of the United States' air campaigns against North Korea, from the widespread and continuous use of firebombing (mainly with napalm), to threats to use nuclear and chemical weapons (1), and the destruction of huge North Korean dams in the final stages of the war. Yet this episode is mostly unknown even to historians, let alone to the average citizen, and it has never been mentioned during the past decade of media analysis of the North Korean nuclear problem. Korea is also assumed to have been a limited war, but its prosecution bore a strong resemblance to the air war against Imperial Japan in the second world war, and was often directed by the same US military leaders. The atomic attacks on Hiroshima and Nagasaki have been examined from many different perspectives, yet the incendiary air attacks against Japanese and Korean cities have received much less attention. The US post-Korean war air power and nuclear strategy in northeast Asia are even less well understood; yet these have dramatically shaped North Korean choices and remain a key factor in its national security strategy. Napalm was invented at the end of the second world war. It became a major issue during the Vietnam war, brought to prominence by horrific photos of injured civilians. 
Yet far more napalm was dropped on Korea and with much more devastating effect, since the Democratic People's Republic of Korea (DPRK) had many more populous cities and urban industrial installations than North Vietnam. In 2003 I participated in a conference with US veterans of the Korean war. During a discussion about napalm, a survivor who lost an eye in the Changjin (in Japanese, Chosin) Reservoir battle said it was indeed a nasty weapon -- but "it fell on the right people." (Ah yes, the "right people" -- a friendly-fire drop on a dozen US soldiers.) He continued: "Men all around me were burned. They lay rolling in the snow. Men I knew, marched and fought with begged me to shoot them . . . It was terrible. Where the napalm had burned the skin to a crisp, it would be peeled back from the face, arms, legs . . . like fried potato chips." (2) Soon after that incident, George Barrett of the New York Times had found "a macabre tribute to the totality of modern war" in a village near Anyang, in South Korea: "The inhabitants throughout the village and in the fields were caught and killed and kept the exact postures they held when the napalm struck -- a man about to get on his bicycle, 50 boys and girls playing in an orphanage, a housewife strangely unmarked, holding in her hand a page torn from a Sears-Roebuck catalogue crayoned at Mail Order No 3,811,294 for a $2.98 'bewitching bed jacket -- coral'." US Secretary of State Dean Acheson wanted censorship authorities notified about this kind of "sensationalised reporting," so it could be stopped. (3) One of the first orders to burn towns and villages that I found in the archives was in the far southeast of Korea, during heavy fighting along the Pusan Perimeter in August 1950, when US soldiers were bedevilled by thousands of guerrillas in rear areas. On 6 August a US officer requested "to have the following towns obliterated" by the air force: Chongsong, Chinbo and Kusu-dong. 
B-29 strategic bombers were also called in for tactical bombing. On 16 August five groups of B-29s hit a rectangular area near the front, with many towns and villages, creating an ocean of fire with hundreds of tons of napalm. Another call went out on the 20 August. On 26 August I found in this same source the single entry: "fired 11 villages." (4) Pilots were told to bomb targets that they could see to avoid hitting civilians, but they frequently bombed major population centres by radar, or dumped huge amounts of napalm on secondary targets when the primary one was unavailable. In a major strike on the industrial city of Hungnam on 31 July 1950, 500 tons of ordnance was delivered through clouds by radar; the flames rose 200-300 feet into the air. The air force dropped 625 tons of bombs over North Korea on 12 August, a tonnage that would have required a fleet of 250 B-17s in the second world war. By late August B-29 formations were dropping 800 tons a day on the North. (5) Much of it was pure napalm. From June to late October 1950, B-29s unloaded 866,914 gallons of napalm. Air force sources delighted in this relatively new weapon, joking about communist protests and misleading the press about their "precision bombing." They also liked to point out that civilians were warned of the approaching bombers by leaflet, although all pilots knew that these were ineffective. (6) This was a mere prelude to the obliteration of most North Korean towns and cities after China entered the war. China joins the war The Chinese entry caused an immediate escalation of the air campaign. From November 1950, General Douglas MacArthur ordered that a wasteland be created between the fighting front and the Chinese border, destroying from the air every "installation, factory, city, and village" over thousands of square miles of North Korean territory. 
As a well-informed British attaché to MacArthur's headquarters observed, except for Najin near the Soviet border and the Yalu dams (both spared so as not to provoke Moscow or Beijing), MacArthur's orders were "to destroy every means of communication and every installation, and factories and cities and villages. This destruction is to start at the Manchurian border and to progress south." On 8 November 1950, 79 B-29s dropped 550 tons of incendiaries on Sinuiju, "removing [it] from off the map." A week later Hoeryong was napalmed "to burn out the place." By 25 November "a large part of [the] North West area between Yalu River and south to enemy lines is more or less burning"; soon the area would be a "wilderness of scorched earth." (7) This happened before the major Sino-Korean offensive that cleared northern Korea of United Nations forces. When that began, the US air force hit Pyongyang with 700 500-pound bombs on 14-15 December; napalm dropped from Mustang fighters, with 175 tons of delayed-fuse demolition bombs, which landed with a thud and then blew up when people were trying to retrieve the dead from the napalm fires. At the beginning of January General Matthew Ridgway again ordered the air force to hit the capital, Pyongyang, "with the goal of burning the city to the ground with incendiary bombs" (this happened in two strikes on 3 and 5 January). As the Americans retreated below the 38th parallel, the scorched-earth policy of torching continued, burning Uijongbu, Wonju and other small cities in the South as the enemy drew near. (8) The air force also tried to destroy the North Korean leadership. During the war on Iraq in 2003 the world learned about the MOAB, "Mother of All Bombs," weighing 21,500 pounds with an explosive force of 18,000 pounds of TNT. Newsweek put this bomb on its cover, under the headline "Why America Scares the World." 
(9) In the desperate winter of 1950-51 Kim Il Sung and his closest allies were back where they started in the 1930s, holed up in deep bunkers in Kanggye, near the Manchurian border. After failing to find them for three months after the Inch'on landing (an intelligence failure that led to carpet-bombing the old Sino-Korean tributary route running north from Pyongyang to the border, on the assumption that they would flee to China), B-29s dropped Tarzan bombs on Kanggye. These were enormous 12,000-pound bombs never deployed before -- but firecrackers compared to the ultimate weapons, atomic bombs.

A blocking blow

On 9 July 1950 -- just two weeks into the war, it is worth remembering -- MacArthur sent Ridgway a hot message that prompted the joint chiefs of staff (JCS) "to consider whether or not A-bombs should be made available to MacArthur." The chief of operations, General Charles Bolte, was asked to talk to MacArthur about using atomic bombs "in direct support [of] ground combat." Bolte thought 10-20 such bombs could be spared for Korea without unduly jeopardising US global war capabilities. Bolte received from MacArthur an early suggestion for the tactical use of atomic weapons and an indication of MacArthur's extraordinary ambitions for the war, which included occupying the North and handling potential Chinese -- or Soviet -- intervention: "I would cut them off in North Korea . . . I visualise a cul-de-sac. The only passages leading from Manchuria and Vladivostok have many tunnels and bridges. I see here a unique use for the atomic bomb -- to strike a blocking blow -- which would require a six months' repair job. Sweeten up my B-29 force." At this point, however, the JCS rejected use of the bomb because targets large enough to require atomic weapons were lacking; because of concerns about world opinion five years after Hiroshima; and because the JCS expected the tide of battle to be reversed by conventional military means.
But that calculation changed when large numbers of Chinese troops entered the war in October and November 1950. At a famous news conference on 30 November President Harry Truman threatened use of the atomic bomb, saying the US might use any weapon in its arsenal. (10) The threat was not the faux pas many assumed it to be, but was based on contingency planning to use the bomb. On that same day, Air Force General George Stratemeyer sent an order to General Hoyt Vandenberg that the Strategic Air Command should be put on warning, "to be prepared to dispatch without delay medium bomb groups to the Far East . . . this augmentation should include atomic capabilities." General Curtis LeMay remembered correctly that the JCS had earlier concluded that atomic weapons would probably not be useful in Korea, except as part of "an overall atomic campaign against Red China." But, if these orders were now being changed because of the entry of Chinese forces into the war, LeMay wanted the job; he told Stratemeyer that only his headquarters had the experience, technical training, and "intimate knowledge" of delivery methods. The man who had directed the firebombing of Tokyo in 1945 was again ready to proceed to the Far East to direct the attacks. (11) Washington was not worried that the Russians would respond with atomic weapons because the US possessed at least 450 bombs and the Soviets only 25. On 9 December MacArthur said that he wanted commander's discretion to use atomic weapons in the Korean theatre. On 24 December he submitted "a list of retardation targets" for which he required 26 atomic bombs. He also wanted four to drop on the "invasion forces" and four more for "critical concentrations of enemy air power." In interviews published posthumously, MacArthur said he had a plan that would have won the war in 10 days: "I would have dropped 30 or so atomic bombs . . . strung across the neck of Manchuria." 
Then he would have introduced half a million Chinese Nationalist troops at the Yalu and then "spread behind us -- from the Sea of Japan to the Yellow Sea -- a belt of radioactive cobalt . . . it has an active life of between 60 and 120 years. For at least 60 years there could have been no land invasion of Korea from the North." He was certain that the Russians would have done nothing about this extreme strategy: "My plan was a cinch." (12)

A second request

Cobalt 60 has 320 times the radioactivity of radium. One 400-ton cobalt H-bomb, historian Carroll Quigley has written, could wipe out all animal life on earth. MacArthur sounds like a warmongering lunatic, but he was not alone. Before the Sino-Korean offensive, a committee of the JCS had said that atomic bombs might be the decisive factor in cutting off a Chinese advance into Korea; initially they could be useful in "a cordon sanitaire [that] might be established by the UN in a strip in Manchuria immediately north of the Korean border." A few months later Congressman Albert Gore, Sr. (father of former vice-president and 2000 Democratic presidential candidate Al Gore, Jr., and himself later a strong opponent of the Vietnam war) complained that "Korea has become a meat grinder of American manhood" and suggested "something cataclysmic" to end the war: a radiation belt dividing the Korean peninsula permanently into two. Although Ridgway said nothing about a cobalt bomb, in May 1951, after replacing MacArthur as US commander in Korea, he renewed MacArthur's request of 24 December, this time for 38 atomic bombs. (13) The request was not approved. The US came closest to using atomic weapons in April 1951, when Truman removed MacArthur. Although much related to this episode is still classified, it is now clear that Truman did not remove MacArthur simply because of his repeated insubordination, but because he wanted a reliable commander on the scene should Washington decide to use nuclear weapons; Truman traded MacArthur for his atomic policies.
On 10 March 1951 MacArthur asked for a "D-Day atomic capability" to retain air superiority in the Korean theatre, after the Chinese massed huge new forces near the Korean border and after the Russians put 200 bombers into airbases in Manchuria (from which they could strike not just Korea but also US bases in Japan). (14) On 14 March General Vandenberg wrote: "Finletter and Lovett alerted on atomic discussions. Believe everything is set." At the end of March Stratemeyer reported that atomic bomb loading pits at Kadena Air Base on Okinawa were again operational; the bombs were carried there unassembled, and put together at the base, lacking only the essential nuclear cores. On 5 April the JCS ordered immediate atomic retaliation against Manchurian bases if large numbers of new troops came into the fighting, or, it appears, if bombers were launched from there against US assets. On that day the chairman of the Atomic Energy Commission, Gordon Dean, began arrangements for transferring nine Mark IV nuclear capsules to the Air Force's 9th Bomb Group, the designated carrier for atomic weapons. The JCS again considered the use of nuclear weapons in June 1951, this time in tactical battlefield circumstances (15) and there were many more such suggestions as the war continued to 1953. Robert Oppenheimer, former director of the Manhattan Project, was involved in Project Vista, designed to gauge the feasibility of the tactical use of atomic weapons. In 1951 young Samuel Cohen, on a secret assignment for the US Defence Department, observed the battles for the second recapture of Seoul and thought there should be a way to destroy the enemy without destroying the city. He became the father of the neutron bomb. (16) The most terrifying nuclear project in Korea, however, was Operation Hudson Harbour. 
It appears to have been part of a larger project involving "overt exploitation in Korea by the Department of Defence and covert exploitation by the Central Intelligence Agency of the possible use of novel weapons" -- a euphemism for what are now called weapons of mass destruction.

The 'limited war'

Without even using such "novel weapons" -- although napalm was very new -- the air war levelled North Korea and killed millions of civilians. North Koreans tell you that for three years they faced a daily threat of being burned with napalm: "You couldn't escape it," one told me in 1981. By 1952 just about everything in northern and central Korea had been completely levelled. What was left of the population survived in caves. Over the course of the war, Conrad Crane wrote, the US air force "had wreaked terrible destruction all across North Korea. Bomb damage assessment at the armistice revealed that 18 of 22 major cities had been at least half obliterated." A table he provided showed that the big industrial cities of Hamhung and Hungnam were 80-85% destroyed, Sariwon 95%, Sinanju 100%, the port of Chinnampo 80% and Pyongyang 75%. A British reporter described one of the thousands of obliterated villages as "a low, wide mound of violet ashes." General William Dean, who was captured after the battle of Taejon in July 1950 and taken to the North, later said that most of the towns and villages he saw were just "rubble or snowy open spaces." Just about every Korean he met, Dean wrote, had had a relative killed in a bombing raid. (17) Even Winston Churchill, late in the war, was moved to tell Washington that when napalm was invented, no one contemplated that it would be "splashed" all over a civilian population. (18) This was Korea, "the limited war." The views of its architect, Curtis LeMay, serve as its epitaph. After it started, he said: "We slipped a note kind of under the door into the Pentagon and said let us go up there . . .
and burn down five of the biggest towns in North Korea -- and they're not very big -- and that ought to stop it. Well, the answer to that was four or five screams -- 'You'll kill a lot of non-combatants' and 'It's too horrible.' Yet over a period of three years or so . . . we burned down every town in North Korea and South Korea, too . . . Now, over a period of three years this is palatable, but to kill a few people to stop this from happening -- a lot of people can't stomach it." (19)

NOTES

(1) Stephen Endicott and Edward Hagerman, "First victims of biological warfare," Le Monde diplomatique, English language edition, July 1999.
(2) Quoted in Clay Blair, Forgotten War, Random House, New York, 1989.
(3) US National Archives, 995.000 file, box 6175, George Barrett dispatch of 8 February 1951.
(4) National Archives, RG338, KMAG file, box 5418, KMAG journal, entries for 6, 16, 20 and 26 August 1950.
(5) See the New York Times, 31 July, 2 August and 1 September 1950.
(6) See "Air War in Korea," Air University Quarterly Review 4, n° 2, autumn 1950, and "Precision bombing," ibid, n° 4, summer 1951.
(7) MacArthur Archives, RG6, box 1, Stratemeyer to MacArthur, 8 November 1950; Public Record Office, FO 317, piece n° 84072, Bouchier to Chiefs of Staff, 6 November 1950; piece n° 84073, 25 November 1950 sitrep.
(8) Bruce Cumings, The Origins of the Korean War, vol. 2, Princeton University Press, 1990; New York Times, 13 December 1950 and 3 January 1951.
(9) Newsweek, 24 March 2003.
(10) New York Times, 30 November and 1 December 1950.
(11) Hoyt Vandenberg Papers, box 86, Stratemeyer to Vandenberg, 30 November 1950; LeMay to Vandenberg, 2 December 1950. Also Richard Rhodes, Dark Sun: The Making of the Hydrogen Bomb, Touchstone, Simon & Schuster, New York, 1995.
(12) Bruce Cumings, op cit; Charles Willoughby Papers, box 8, interviews by Bob Considine and Jim Lucas in 1954, published in the New York Times, 9 April 1964.
(13) Carroll Quigley, Tragedy and Hope: A History of the World in Our Time, Macmillan, New York, 1966; Quigley was Bill Clinton's favorite teacher at Georgetown University. See also Bruce Cumings, op cit.
(14) Documents released after the Soviet Union collapsed do not bear this out; scholars who have seen these documents say there was no such major deployment of Soviet air power at the time. However, US intelligence reports at the time indicated that the deployment had happened, perhaps because of effective disinformation by the Chinese.
(15) This does not mean the use of "tactical" nuclear weapons, which were not available in 1951, but the use of the Mark IVs in battlefield tactical strategy, much as heavy conventional bombs dropped by B-29 bombers had been used on battlefields since August 1950.
(16) Samuel Cohen was a childhood friend of Herman Kahn. See Fred Kaplan, The Wizards of Armageddon, Simon & Schuster, New York, 1983. On Oppenheimer and Project Vista, see Bruce Cumings, op cit; also David Elliot, "Project Vista and Nuclear Weapons in Europe," International Security 2, n° 1, summer 1986.
(17) Conrad Crane, American Airpower Strategy in Korea, University Press of Kansas, 2000.
(18) Jon Halliday and Bruce Cumings, Korea: The Unknown War, Pantheon Books, New York, 1988.
(19) J F Dulles Papers, Curtis LeMay oral history, 28 April 1966.

This article originally appeared in Le Monde Diplomatique (December 2004) and was reprinted by Japan Focus with permission of the author.
https://historynewsnetwork.org/blog/131559
Cordoba: 1010, in the 20:20 of hindsight
I have had brewing for a while a post about the rhetoric surrounding the proposed Islamic cultural center in New York near the erstwhile site of the World Trade Center.  This takes careful phrasing, because it's absurdly tempting just to give in and call it "The Ground Zero Mosque" so that people will immediately identify what I'm talking about.  This would be wrong because none of the nouns there are true, as UK columnist Charlie Brooker has pointed out in his amazed piece about the way this business is being handled in the British media: The "Ground Zero mosque" is a genuine proposal, but it's slightly less provocative than its critics' nickname makes it sound. For one thing, it's not at Ground Zero. Also, it isn't a mosque. Wait, it gets duller. The whole piece is worth reading, especially perhaps from within the U.S., partly for a perspective on what the situation looks like to less involved eyes, partly because of Brooker's especially blunt grade of lampoon, but also because it unwittingly harks back to a golden age in which the popular media's mission was to inform, rather than the opposite, which is as he rightly says exactly what is happening every time someone allows the GZM phrase into print or online.  I'm not sure, especially after this long as a member of Cliopatria, if there ever was such an age, but it's always bracing to meet ideas which you thought were dead, apparently still breathing. The misinformation has however been much deeper than just the name, and some of it strays into my bailiwick because it's difficult for people to talk about Islam in hate-speech without invoking the Middle Ages.  That genuinely is, after all, when Islam can be fairly described as having been a unified politico-religious entity bent on world conquest and the subjugation of all non-believers.  That time lasted about twenty minutes in the seventh century and then the entity split into differently concerned bits, but it was there.
I mean, it's problematic that, even while their troops were rolling over late Roman North Africa and into Visigothic Spain, the Umayyad Caliphs were failing to enforce religious sanctions on the Christians they'd conquered in the East, and that repressive measures against Christians, much like Christian ones against Jews, only really cropped up when political and military fortune had ebbed again, but there's still enough truth in it to float a newspaper boat.  And it's in that light that Newt Gingrich, a figure of bewildering influence for someone outside the U.S., started talking about what Córdoba could be made to mean in this context.  His piece was ridiculous, of course, even though it didn't use the GZM phrase, and Carl Pyrdum at Got Medieval took it down all guns blazing (as Ralph notes here) but even for Carl there were problems.  One commentator (among about fifty-fifty applause and wingnuttery) put it like this: All that matters here is that everyone agrees on the facts, and these facts are just as Newt Gingrich stated them: 1) Cordoba was the capital city of an Islamic theocracy. 2) After their conquest of Cordoba, the Muslims built a Mosque on a site where a Church had previously been. 3) That Mosque came to be the third largest Mosque in the world. ... Most of the time you can count on being right just by blindly disagreeing with Newtie. But it takes a real genius like Carl Pyrdum to prove that even that doesn't work 100% of the time. It should be no surprise that Córdoba, as the capital of an Islamic theocracy that outstripped anywhere in the West for richness and sophistication at exactly the time the Christian countries that became modern Europe were forming up, presents a very complex face to the modern enquirer.  Obviously it can mean many things.
Compare, for example, the different senses you might get from "Venetian"; "Venetian splendor," "Venetian politics," "Venetian gondola" and "Venetian blind" all send the auditory brain to fairly different places, for me being sympathetically impressed, regretfully distasteful, annoyed by cliché and basically neutral.  Similarly we have the Córdoba mosque, but also Cordoban convivencia as picked up by Carl, Córdoba leather and (of course!) the Chrysler Cordoba as ingeniously detected in the plans by Michael Berubé.  But, there is still the basic problem that for people who know only one thing about Muslim Spain, Córdoba is where the Muslims built a mosque over a cathedral (after a century-odd of sharing the space with the Christians), whereas for the Arab world at large, as other commentators at Got Medieval pointed out, it means something quite different: I've lived and worked in several Arab-Muslim countries in the course of a diplomatic career spanning nearly three decades.  Years before 9/11, much less the plan for this Muslim cultural center, I'd often heard Muslims describe the caliphate in Cordoba as a golden age of scholarly learning, enlightenment, and tolerance among Muslims, Christians, and Jews.  In Morocco today, conferences study Maimonides and Averroes together, and celebrate Morocco's Andalusian history and culture.  I agree with one poster above that the subsequent transformation of the Cordoba mosque into a Catholic cathedral is much closer to the type of symbolic conquest gesture that Newt Gingrich wants us to read into the proposed Cordoba House name. I'm sure this is right myself, but it doesn't stop the person who said that Gingrich is also, in some sense, right, being right too.  A third way to read Córdoba is needed.  So okay, here's one that may be new to you:  ever hear of the sack of Córdoba in 1010? 
This is one of the sorrier tales of the Spanish Reconquest, and it's a tale that's very rarely told all together, because it manages to fall across the divide between the historiography of Castile, from which a unified Spain has tended to derive itself historically, and of Catalonia, some of whose inhabitants daub slogans proclaiming that "Catalunya no es Espanya" over any available surface in election seasons.  These two rarely converse historically, and so a battle that was fought between them is hard to find in the textbooks.  Such, however, this was.  By 1010, the Caliphate of Córdoba was not doing well.  Why this was would be a book in itself, but part of an answer might lie in the attempt by the caliphs, who had until 'Abd al-Rahman III al-Nasir (ruled 928-961) been emirs, to raise their status and importance and decrease their vulnerability to the noble military aristocracy by their self-promotion to the caliphate, the development of an increasingly elaborate palace ceremonial, and the professionalisation of both administration and army. This had resulted, once it ceased to be superintended by a brilliant, ruthless and adult ruler, in an almost total isolation of the actual caliph from government.  Real power was by now held by military leaders, in charge of a fiscally-funded mercenary army that made itself a total terror in northern Spain for some years under its most successful hajib (roughly 'first minister'), Abu Amir Muhammad Ibn Abdullah Ibn Abi Amir, known as al-Mansur. This, of course, provoked discontent among the Umayyad royal family (which was large and generously defined) and the traditional military groups, especially the Berber settlers, who were not much loved by the non-Berber population, because they were losing their principal sources of safety and power.  The paradox was, then, that at its military peak in the 980s this state was already becoming brittle at the foundations.
Al-Mansur's son was a fair successor to his power, but his son went one step too far in trying to obtain the succession to the caliphate in 1008; the tattered dignity of the rank was fatally damaged and a number of rival caliphs were set up, each faction raising their own forces wherever they could, and fatally weakening the defences against the state's external enemies.  Enter the Christians. By 1010, the contenders for the highest office were as follows.  First, there was Hisham II, the leading Umayyad, who had ruled as caliph from 976 (when he was ten) till 1009; it was he whom the hajib 'Abd al-Rahman Sanchuelo had persuaded to name him as heir.  This had caused outrage and a rising by another Umayyad, Muhammad II, in defence of the dynasty.  Muhammad II, who took the title al-Mahdi, was the second contender, and his coup of 1009 had succeeded not only in taking the throne, but also in killing 'Abd al-Rahman Sanchuelo on his return to the capital.  The third would-be caliph was another Umayyad, Sulaiman al-Musta'in, who had entered the fray in 1010 on a legitimist ticket, aiming to restore Hisham but who, having briefly done so, declared the ex-caliph unfit to rule and took his place after only two days. All these contenders had different support bases.  Sulaiman's army was made up mainly of Berbers on military tenures whom the state had had increasingly little use for, and who felt threatened by this loss of livelihood.  Muhammad II had secured the service of part of the old state mercenary army, but another part remained loyal to Hisham.  No one side had a plausibly conclusive advantage, and for this reason they sought outside help.  Sulaiman's success over Muhammad had been achieved with troops and supplies provided by Count Sancho García of Castile.  Perhaps inevitably, therefore, Muhammad turned to the other northern Christian power in any sort of fighting order, the county of Barcelona.
Count-Marquis Ramon Borrell (of Barcelona, Girona and Osona) and his brother Count Ermengol of Urgell negotiated a huge salary for themselves and their troops, arranged to split the proceeds of the trip 1:4, Urgell to Barcelona, and set out.  They, with the largely Slav mercenary troops of Muhammad, met the Castilians and the Berbers loyal to Sulaiman at 'Aqabat al-Bakar north of the city on May 22, 1010, and the Catalans won.  Sulaiman still managed to get most of his troops away, however, and the Catalans promptly held Muhammad's recaptured capital to ransom, so both from outside and inside his position was very far from secure.  The war would go on until a much more serious sack in 1013 that more or less finished Córdoba as a political center, but 1010 is far enough for the point I want to make. The thing about Córdoba as a symbol is that it doesn't just depend whom you are asking, it depends when you choose to look.  For Christians living under Islam in Iberia in the 860s, Córdoba was perhaps a place where some idiots with a death wish were going to bring down the wrath of the previously tolerant state on their heads and their coffers, but otherwise just a place where the largely-irrelevant kings, to whom the local lord sometimes listened, ruled.  For the world at large in the 950s, on the other hand, Córdoba and its suburban palace of Madinat al-Zahra' were impossibly rich and lofty locations of power where the almost invisible caliph would sometimes deign to let a foreign emissary through the veils of secrecy to make terms with him.  It was widely reported to be the richest city in the dar al-Islam by Arabic writers at this sort of time, and even more so to have been once it had fallen.  
For the inhabitants of Barcelona in 986, on a third hand fitted to improve our shadow-boxing, Córdoba was the implacable source of the unstoppable armies that had just sacked their city; it was a place of terror where many of their nearest and dearest were now held prisoner, and many other recently-attacked cities in northern Spain would have held the same view.  The two bishops who died on the Catalan campaign of 1010, at least, must have been old enough to remember this, though the two counts would both have been infants then.  Count Ermengol also died on the campaign; he would go down in history as "el Cordobès," the Cordovan, for his part in cracking the rotten fruit open.  Christians, allied with Muslims on both sides, fought each other for the prize, already bruised by two ineffective and bloody coups in the previous two years. Nobody won except the Catalans who survived.

[Image: the ruins of the caliphal palace of Madinat al-Zahra', outside Córdoba.]

Córdoba in 1010 had none of the meanings that Gingrich or his opponents in debate have given it:  in fact, 1010 robbed it of most of its previous meanings and converted this great symbol into a stamped-down boxing arena where the Christians kept running off with the prize money.  In a wider sense, Córdoba in 1010 specifically symbolized the collapse of the Caliphate and of al-Andalus, and the beginning of Christian dominance in Spain.  Gingrich seems to be thinking of Córdoba in the 840s or 850s.  1010 might make him happier; but as long as there was the 720s, 860s and 950s as well, he can't have the symbolism all his way, and neither can the apparently-peaceful group behind the proposed Cordoba House.  History contains multitudes, of perspectives among other things, and very very few of them are right enough to stand unchallenged.
Most of the political history here can be found in Hugh Kennedy, Muslim Spain and Portugal: a Political History of al-Andalus (Harlow 1996), but my pictures of the court and periphery respectively also rest heavily on Miquel Barceló, "The Earliest Sketch of an 'Oriental Despot'? (A Note on the Exchange of Delegations between the Ottonides and the Caliphes of Qurtuba 338-339/950-367/974)" in L'Histoire à Nice. Actes du Colloque International « Entre l'Occident et l'Orient » (Antibes – Juan les Pins, 29-31 octobre 1981) (Nice 1983), pp. 55-85; transl. as "¿El primer trabazo de un 'despota oriental'? (Una nota sobre el intercambio de delegaciones entre los Otones y los Califas de Qurtuba 338-339/950-367/974)" in Barceló, El sol que salió por Occidente (estudios sobre el estado omeya en al-Andalus) (Jaén 1997), pp. 163-186, and Eduardo Manzano Moreno, La Frontera de al-Andalus en época de los Omeyas, Biblioteca de Historia 9 (Madrid 1991). It's also interesting to see how the two schools of Iberian historiography keep their blinkers: notice how Derek W. Lomax, The Reconquest of Spain (London 1978), covering 1010 at pp. 49-51, omits the Catalans, while Josep María Salrach, El Procés de Feudalització (segles III-XII), Historia de Catalunya 2 (Barcelona 1987), p. 296, omits the Castilians.
https://historynewsnetwork.org/blog/154090
A Perfect Recipe for a Nuclear War?
Murray Polner, formerly HNN's senior book review editor, blogs at There's No There There. He is the author of No Victory Parades: The Return of the Vietnam Veteran, Branch Rickey: A Biography, and co-editor of We Who Dared Say No To War. John le Carré, my favorite spy novelist and a former MI5 and MI6 agent, was speaking last fall to friends and readers at the Royal Festival Hall in London about Donald Trump's America. "Something truly, seriously bad is happening and from my point of view we have to be aware of .... I think of all things that were happening across Europe in the 1930s, in Spain, in Japan, obviously in Germany. To me, these are absolutely comparable signs of the rise of fascism and it's contagious, infectious." Yet as Cold War II replaces Cold War I, the behavior of powerful rivals is no different, as Elisabeth Asbrink put it in her extraordinarily illuminating 1947: Where Now Begins: "the lines that divide the world are now more sharply drawn. The Cold War map is reduced to black and white. Power against power, light against darkness. Nuances of gray: nonexistent. Doubts, compromise, signs of weakness: ditto." A perfect recipe for a nuclear war. Now that the missiles have fallen on Syria, presumably allowing Assad to wreak havoc on rebel Syrians so long as they don't do it with chemicals, Trump and his fellow home front warriors such as John Bolton and Nikki Haley can ruin the anticipated talks with North Korea while hugging the Saudis, armed with U.S. weapons, as they kill and maim thousands of Yemenis. Later, they can even threaten war with China, Iran and Russia.
Given the passivity and gullibility of the mass media (remember: with virtually no exceptions, they backed the invasion of Iraq) and the absence of skepticism among most of the Imperial City's subsidized think tanks and Congress, Trump surely recognized he could then forge ahead with his imitation of TR's "Splendid Little War" (as Secretary of State John Hay dubbed it), though I'm inclined to think this one sounded more like "Wag the Dog," given the investigations by Robert Mueller, the FBI's seizure of Michael Cohen's papers, and James Comey's appearances seemingly everywhere as he promotes his new, critical book about our leader. Too much heat? I have no idea if le Carré's warning is correct. But historian Benjamin Carter Hett's intriguing and gripping new book, The Death of Democracy: Hitler's Rise to Power & the Downfall of the Weimar Republic, concludes with words of warning aimed at all of us today: "Few Germans in 1933 could imagine Treblinka or Auschwitz, the mass shootings at Babi Yar or the death marches of the last months of the Second World War. It is hard to blame them for not foreseeing the unthinkable. Yet their innocence failed them and they were catastrophically wrong about their future. We who come later have one advantage over them: we have their example before us."
https://historynewsnetwork.org/blog/154294
How Tolstoy's War and Peace Can Help Us Understand History's Complexity
David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington; his most recent book is Through a Glass Brightly: Using science to see our species as we really are (Oxford University Press, 2018). We have extraordinary access to ideas and data, and yet it has become a truism that getting to the bottom of things has become more difficult. Truth itself seems increasingly unclear and under assault. Call it Nietzschean perspectivism if you wish, but I submit it is because things are rarely linear. Even without the astounding barrage of half-truths, outright lies, “alternative facts” and demented tweets emanating from the Orange Presence in the White House, the Internet alone has muddied the waters of our comprehension. Our times are best understood not via relativism or subjectivity of “truth claims” but by recognizing that although events and other facts are indeed real, they mostly result from a complex, interlocking web of causation with no one factor necessarily determinative and no single interpretation necessarily correct. In the brief essay to follow, I summon in support of this proposition no less a presence than Leo Tolstoy. (If you resist such arguments from authority, feel free to skip the quotation that is about to come!) War and Peace can, not unlike our increasingly muddled grasp of reality, be perceptually overwhelming in its recounting of detailed human stories. Yet one of the author’s notable asides in this masterpiece begins with a seemingly simple and mundane natural occurrence (“A bee settling on a flower has stung a child”) and then proceeds to view it from various angles, eventually arriving at a grand generalization: And the child is afraid of bees and declares that bees exist to sting people. A poet admires the bee sucking from the chalice of a flower and says it exists to suck the fragrance of flowers.
A beekeeper, seeing the bee collect pollen from flowers and carry it to the hive, says that it exists to gather honey. Another beekeeper who has studied the life of the hive more closely says that the bee gathers pollen dust to feed the young bees and rear a queen, and that it exists to perpetuate its race. A botanist notices that the bee flying with the pollen of a male flower to a pistil fertilizes the latter, and sees in this the purpose of the bee's existence. Another, observing the migration of plants, notices that the bee helps in this work, and may say that in this lies the purpose of the bee. But the ultimate purpose of the bee is not exhausted by the first, the second, or any of the processes the human mind can discern. The higher the human intellect rises in the discovery of these purposes, the more obvious it becomes that the ultimate purpose is beyond our comprehension. All that is accessible to man is the relation of the life of the bee to other manifestations of life. And so it is with the purpose of historic characters and nations. Despite the human yearning for simple, cause-and-effect explanations, people increasingly understand that nothing can be grasped in isolation, even as complexity makes things less graspable in their entirety. This is particularly true when it comes to the elaborate kaleidoscope that is human behavior, where no single factor explains everything – or indeed, anything. Tolstoy himself was especially concerned with supporting his idea that history in general, war in particular, and the Napoleonic Wars most especially were due to processes that the human mind could not discern, and that contrary to Great Man notions, “historic characters and nations” are influenced by an array of factors such that every individual is no less influential than are deluded, self-proclaimed leaders. And so, Napoleon comes across in War and Peace as more than a bit ridiculous with his insistence that he and he alone drives events. 
(This, in service of Tolstoy’s urging that if citizens refuse to participate, there would be no wars; i.e., an early version of the bumper sticker, “What if they had a war and no one came?”) Taking a hedgehoggy view of that brilliant old fox, today’s world is no less multi-causal and therefore confusing than Tolstoy proclaimed with his parable of the honeybee. Afflicted with an unending stream of presidential lies and gaslighting, constant yet shifting claims of “fake news” and “alternative facts,” a dizzying array of Internet information overload, and a blizzard of wild conspiracy theories, it is tempting to give up on any coherent interpretation … of anything. Nor is this problem new. Social scientists have long disagreed about what is predominant when it comes to causation: language, socialization, learning, cultural tradition, historical circumstance, evolutionary inheritance, and so forth. Was World War I, for example, due to interlocking alliances, frustrated yearning for colonial empire on the part of Germany, Austria-Hungary’s anxiety about losing its empire, incompetent national leaders, Europe bored with decades of more-or-less peace, the rigidity of war plans combined with strict mobilization time-tables, the assassination of a certain Archduke, machinations by the “merchants of death,” a combination of these, or something else? Was the 2003 invasion of Iraq due to George W. Bush’s yearning to outdo his father, W having been manipulated by Messrs Cheney, Rumsfeld, Wolfowitz et al., a genuine hope of bringing democracy to the Arab Middle East, greed for Iraqi oil, a real if misguided belief that Western “liberators” would be welcomed with chocolate and flowers, illusions about weapons of mass destruction, a combination of these, or something else? 
Is the climate crisis due to corporate greed, consumer indifference, technological over-reach, the cowardice and short-sightedness of politicians, human overpopulation in the face of limited energy resources of which fossil fuels are the most available and at the cheapest immediate cost, a collision between atmospheric physics and growth-oriented economics, the inexorable push of energy-gobbling civilizations, a combination of these, or something else? Is the danger of nuclear annihilation due to the military-industrial complex, a human penchant for war, distrust of “the other,” excessive reliance on deterrence as a savior, a kind of psychic numbing due to the unimaginable consequences of thermonuclear holocaust, perceived helplessness on the part of ordinary citizens, widespread feelings of fatalism, a sense that if something really bad hasn’t happened yet it never will, a mesmerized delight in extreme power and potential violence whatever the consequences, a combination of these, or something else? Although ultimate causes and to some extent even reality itself are often beyond our comprehension – possibly even beyond our ability to repair – it is nonetheless our duty to behave as though they aren’t. And certainly our duty to acknowledge such critical realities as war, climate change and nuclear weapons. Just as “All that is accessible to man is the relation of the life of the bee to other manifestations of life,” we can conclude with the existentialists that, like Albert Camus’s Sisyphus, who was heroic because he persevered in pushing his rock, we are equally obliged – and privileged – to push ours, even though what is accessible to us turns out to be not one rock but many, going in different directions and with uncertain outcomes. Or as Rabbi Tarfon (70 CE – 135 CE) proclaimed, “It is not your responsibility to finish the work of perfecting the world, but neither are you free to desist from it.” Tolstoy would agree.
89b76e4147aec3988bfdf387632e0b20
https://historynewsnetwork.org/blog/154370
A Renaming Everyone Can Get Behind
For a decade at least, Washington, D.C., has been stuck in ugly political gridlock. As a step toward renewed bipartisanship, I offer this modest proposal. At the turn of the last century, Republicans engaged in a wave of memorials and renamings for President Ronald Reagan. In 1998, a Republican Congress passed a bill requiring the renaming of Washington National Airport for Reagan. The airport authority and many D.C. residents pointed out that it was already named for one president, but "Ronald Reagan Washington National Airport" went into effect nevertheless. After George W. Bush entered the White House in 2001, the renaming went on in earnest. Historians know that memorials in the U.S. have often sprouted in waves. Union monuments began to go up immediately after the Civil War. Most Confederate memorials were dedicated much later, in the period 1890 to 1940. Why? Because victors usually put up memorials, and in about 1890, the Confederacy — or more accurately, since it was a new generation, neo-Confederates — won the Civil War. And, Republicans argued, had not Reagan similarly won the Cold War? A year or so after the breakup of the Soviet Union, I heard an interview about it with Eduard Shevardnadze, who had been Foreign Minister of the U.S.S.R. Asked if Ronald Reagan deserved partial credit in some way for the downfall of Communism and the breakup of the U.S.S.R., he was momentarily struck dumb. Clearly he had never thought of that hypothesis. Having considered it, he rejected it out of hand, citing more basic economic, societal, and ideological contradictions within the Communist system. But this made no difference to Republicans. Years ago Walt Kelly had mocked such thinking in his comic strip Pogo, in a scene in which Albert Alligator, claiming some political mantle at the time, took credit for the weather, a fine sunny day. "Why not?" he protested. "It happened during my administration, didn't it?" 
The resulting mania for memorializing Reagan thus reflected a political rather than historical judgment. Historically, Ronald Reagan surely ranks no higher than the third best Republican president of the twentieth century, well below Teddy Roosevelt and Dwight Eisenhower. No matter. Grover Norquist, leader of the Ronald Reagan Legacy Project and even more famous for his no-tax-increase pledge, called for a monument to Reagan in each of America's 3,067 counties and on the national mall in Washington, D.C.; his face on the $10 bill, replacing Alexander Hamilton's; and perhaps his profile added to Mount Rushmore! "Or we could have our own mountain," suggested Norquist. An article by Greg Kaza in National Review called Mt. McKinley a "precedent" for renaming some other peak for Ronald Reagan. Of course, more recently McKinley has given way to Denali, its aboriginal name, but at the time the example made sense. I have a suggestion for Mount Reagan that I think will never get renamed for someone else. Each of our United States has by definition its highest point. The highest point in Reagan's home state of California is already named, of course, for Josiah Dwight Whitney, who founded the California Geological Survey. So is the highest point in Reagan's native state, Illinois, 1,235' high Charles Mound. In fact, the highest point in every state is already named, even Florida's Britton Hill, a mere 345' above sea level — except Delaware's. Indeed, Delaware's tallest spot was misidentified until recently. It was thought to be marked by a National Geodetic Survey azimuth on Ebright Road between Brandywine and Brandywood in far northern Delaware. The Ebright Azimuth turns out not to be the highest point in Delaware, however. The highest point in Delaware, at 451' a full two feet above the Ebright Azimuth, is in a mobile home park some 300 yards west. It is "the elevation in front of the first trailer," according to William S. Schenck of the Delaware Geological Survey. 
Of course, Ronald Reagan had nothing particularly to do with Delaware. But then he had nothing to do with aviation, either, except for smashing the air traffic controllers' union, which didn't stop Republicans from renaming Washington National Airport Ronald Reagan Washington National Airport. And, like McKinley with Spain, Reagan did win a war — with Grenada. So perhaps he does deserve to have a mountain named for him. Delaware's tallest spot — “Mount Reagan” — is perfectly appropriate. It matches exactly the size of the war Ronald Reagan won.
491fcbca3723a8709a7a96ab550fa77e
https://historynewsnetwork.org/blog/154401
A Conversation with Seattle Author Dr. Lawrence Matsuda on His Debut Historical Novel "My Name is Not Viola"
Lawrence Matsuda portrait by Alfredo Arreguin On December 7, 1941, forces of the Japanese Empire attacked the American naval base at Pearl Harbor and left thousands of American military members and civilians dead or wounded. In response to the surprise attack, the United States declared war on Japan the next day. The attack on America inflamed anti-Japanese sentiment and hysteria that led to hate crimes, particularly on the West Coast, against aliens and US citizens of Japanese extraction—and those who looked like them. Under President Franklin D. Roosevelt’s February 1942 Executive Order 9066, the US government forcibly removed 120,000 people of Japanese ancestry from their homes and incarcerated them in concentration camps. Most of these interned people were kept in the camps until 1945, with the exception of early releases of a few, such as the valiant souls who volunteered to serve in the American armed forces, including members of the Japanese American 442nd Regiment that became the most decorated American unit of the war. Others were released to attend college or work in defense industries like munitions factories in areas away from the West Coast. The unfortunate internees subjected to the harsh and dehumanizing conditions of the prison camps had committed no crime but were rounded up, dispossessed, and detained unconstitutionally based only on their ancestry and race. And about two-thirds of the internees were United States citizens. The detainees included Hanae and Ernest Matsuda who, with their removal in 1942, lost their home and grocery business in Seattle. Like thousands of others, they were evacuated without due process and were incarcerated at the Minidoka concentration camp in Idaho where Hanae gave birth to two sons and a stillborn child. Hanae and Ernest Matsuda’s youngest son Lawrence was born in 1945 in Block 26, Barrack 2, of Minidoka Camp. 
Their baby’s prisoner number was 11464d. Now Dr. Lawrence Matsuda, a renowned Seattle writer and human rights activist, brings to life his mother’s travails, traumas, and triumphs in mid-20th century America in his debut historical novel My Name is Not Viola. The events experienced by the fictional Hanae of the novel mirror actual incidents in the life of his mother including her girlhood in Seattle’s Japantown; her pre-war journey to Hiroshima, Japan; her removal from her Seattle home and incarceration at the brutal Minidoka concentration camp; her quest for Hiroshima relatives after the atomic obliteration of the city; her marital woes; her severe depression and incarceration at Western State Hospital, a psychiatric facility; her resilience grounded in Japanese and western beliefs; and her evolution as a force for good. The novel captures the rhythm of life in Seattle’s Japantown, the unrelenting misery of internment at the Minidoka camp, and the pain and loss of internees as they returned home after the war to face dispossession and poverty. This history through the eyes of the fictional Hanae grips the reader with its lively writing and evocative imagery while sharing an important and heartbreaking chapter from our American experience. Yet it is also a story of hope and triumph despite recurrent traumas—and quite timely as we face an unprecedented pandemic and political crises today. Dr. Matsuda is known in Seattle as a voice for social justice, equality, and tolerance. He is a former secondary school teacher, administrator, principal, and professor. He received an MA and PhD at the University of Washington. As a writer, Dr. Matsuda is most well-known for his poetry. His first book of poems, A Cold Wind from Idaho, was published by Black Lawrence Press in 2010. 
He has published two other books of poetry, one in collaboration with renowned American poet Tess Gallagher, as well as a graphic novel about the Second World War experiences of the Japanese American 442 Regimental Combat Team. Chapters one and two of that graphic novel were animated by the Seattle Channel and both won regional Emmys, one in 2015 and the other in 2016. His poems have appeared in many publications including Raven Chronicles, New Orleans Review, Floating Bridge Review, Poets Against the War website, Nostalgia Magazine, Plumepoetry, Surviving Minidoka (book), Meet Me at Higos (book), Minidoka-An American Concentration Camp (book and photographs), the Seattle Journal for Social Justice, and many others. And he co-edited the book Community and Difference: Teaching, Pluralism and Social Justice, winner of the 2006 National Association of Multicultural Education Phillip Chinn Book Award. And Dr. Matsuda continues to work tirelessly for a more just and tolerant nation. He graciously talked about his new novel and his writing career by telephone from his home in Seattle. Robin Lindley: You had a successful career as an educator, administrator, and professor. How did your “encore career” as a poet and writer come about? Dr. Lawrence Matsuda: When I got my PhD, I decided to take something fun because the PhD was tough sledding and not always enjoyable. So, I took a poetry class from Nelson Bentley. Robin Lindley: He was a beloved professor at the University of Washington. Dr. Lawrence Matsuda: Yes. I enjoyed it a lot. I attended his class several times and read for the Castalia reading series for several years. He always encouraged me to publish my poetry. He was a good person and took great pride in having his students published. I moved my energy into poetry after my PhD, and continued to write poetry when I was working. Most of it was not great, but mediocre poetry. In about 2008, I decided to get good at poetry. I worked with Tess Gallagher. 
She helped me with my first book of poetry A Cold Wind from Idaho. I thought I was done because I had worked with some other people who helped. I gave the manuscript to my friend, the artist Alfredo Arreguin, and he said Tess Gallagher was coming to his house, and that he would show the book to her. Evidently, she was taken by the manuscript, but decided it needed revisions. She worked with me for about a year, mostly electronically. We finally met and I submitted it to Black Lawrence Press as part of a contest. It didn't win first prize, but received honorable mention, and it was published in 2010. Currently more than 1,300 copies are in print. Robin Lindley: Thanks for sharing that story. It’s wonderful that one of our great American poets, Tess Gallagher, helped launch your writing career. Now you've written this historical novel, My Name is Not Viola, based on the life of your mother. What sparked a novel at this time? Did you see it as a memoir for you as well as the story of your mother? Dr. Lawrence Matsuda: It started as a play in the Minidoka [concentration camp] canteen where old guys were sitting around and talking in a general store--cracker barrel scene. I decided that the play wasn't going anywhere. It was just talking, and it needed a little more action. So, I looked to my own life and I compared it to my mother's and my mother had a much better story. It's not a memoir because some of it is fiction, and it’s not an autobiography. It follows the same character in the first person from beginning to end. It’s a historical novel that looks very much like a memoir. The bones of the novel are my mother’s story and that structure is true. My mother was born in the United States. She went to Japan and was educated there. She came back to the United States, and then got married. She was incarcerated. And she went to a mental hospital. So, all the bones are true, and to add flesh, I borrowed some of the stories that she told me. 
I filled in the blanks and then, to move the story farther, I added stories that I heard from other people about Minidoka. I’ve made pilgrimages to Minidoka six or seven times. They have a story time when former internees talk about being there. I borrowed some of those stories, and then farther out, I brought in stories of my friends, and then way out farther it was just fiction. So, the book is historical fiction based on the general outline of my mother's life. What motivated me is, I have always thought that each person has a good story, and at least one novel. I decided I needed to write and find my one novel, but it wasn't my story. It was my mother's story. The other thing is that I’ve always felt an artist should keep moving. I went from poetry to a graphic novel, to a kind of poetry exchange with Tess, and then to a novel. I'm always trying to do different things. I think an artist should always try something new. Because the incarceration is so powerful it is very tempting to dwell on it and not move forward. For the novel, I wanted to present the context of the incarceration and the afterward to give a larger perspective. Robin Lindley: Thanks for your words on your process. How did you decide on the novel’s title, My Name is Not Viola? Dr. Lawrence Matsuda: I found my mother’s high school annual and there were inscriptions like “Good Luck, Viola.” I asked her who Viola was, and she said her teacher gave her that name. Robin Lindley: In your novel, you take your mother’s life and add to the story. Picasso said that art is the lie that tells the truth. You share an engaging human story that deals on so many levels with the forces of history such as racism and injustice and the aftermath of war. It’s incredible how much she dealt with in her life. Dr. Lawrence Matsuda: There are 120,000 stories of people who were forcibly incarcerated and each one is different but similar. They all experienced the same thing at different levels. 
My story is only one of 120,000. Robin Lindley: You were born at Minidoka in 1945 so you must not have any direct memory of the internment. Dr. Lawrence Matsuda: No, but I do have borrowed memories. No matter what, at every Christmas, every Thanksgiving, every New Year's party, every wedding, funeral, the evacuation and the incarceration always came up. It's just a part of life. And I have these borrowed memories that usually focus on the worst of the experience. I don't have clear memories in the traditional sense, but my friend, a psychiatrist, says that, when my mother was pregnant, more than likely some chemicals were sent to me in her womb and that affected me in terms of fear and stress that made up my personality. And he also has said that, when he talks to someone who has deep problems, oftentimes he asks if their grandparents suffered any problems. He says big traumas are passed down for three generations. He feels that what happened to your grandparents and your parents is relevant to your current situation. Robin Lindley: I’ve heard about studies on genetics and past trauma. There are several studies with grandchildren and children of Holocaust survivors. Dr. Lawrence Matsuda: So the trauma is passed down, and somehow you adjust. The third generation of trauma can still affect you. Robin Lindley: So, we’re haunted by the traumas of earlier generations. You deal with almost a century of modern American history in the book. What was your research process as you wrote the novel? Dr. Lawrence Matsuda: I went to Minidoka about six or seven times. In 1969, I taught the first Oriental American history class in the state of Washington at Sharpless Junior High School—now Aki Kurose Middle School. So I was interested in history and, while there, a number of things happened. I met Mineo Katagiri, a reverend who founded the Asian Coalition for Equality, and we worked together. 
Later on, some members of the Asian Coalition for Equality and I confronted the University of Washington because they were not admitting Asian students into their educational opportunity program (EOP). At the time, it was called the Special Opportunity Program, which served poor whites, blacks, Latinos, and Native Americans, but not Asians. And so, my interest in history took a step into activism. Ironically, it did again with the kids in the Oriental American history class. At that time, we were still referred to as “Orientals” and the term “Asian” was emerging. The class made a display of miniature barracks like those at Minidoka for an exhibit called “The Pride and the Shame,” a Japanese American Citizens League traveling exhibit for the University of Washington Museum. Bob Shimabukuro in his book, Born in Seattle, writes about how the traveling exhibit was the impetus for the reparations movement for Japanese Americans. So, my history interest moved me into activism, and my activism was rooted in history, especially anti-Asian, anti-Chinese, and anti-Japanese prejudice which culminated in the forced incarceration. Robin Lindley: Thank you for your work for change. To go back to your novel, I’m curious about the story of your main character Hanae, who is based on your mother, and your mother's actual experiences. Did your mother go to Hiroshima, as in the novel, when she was about nine and have a rather dismal experience with her relatives, especially her older brother’s wife? Dr. Lawrence Matsuda: That was not true. She was born in Seattle and she went to Japan at age one and she returned with her mother and brothers about eight years later. Her father stayed in Seattle and sent money home to Hiroshima when the family was there. And when she was nine years old, she came back to Seattle. When she was 21, she returned to Hiroshima to live with her older brother and that's when she couldn't get along with her sister-in-law and left after a year. 
Robin Lindley: And did she have an older brother Shintaro who was an officer in the Japanese Navy? Dr. Lawrence Matsuda: Yes. He was a submarine officer. He was not a captain, but he was a high-ranking officer on a submarine. He mentioned that the warlords were feeling very confident because of the victory over a Western power in the Russo-Japanese War. Robin Lindley: The militarists were building sentiment for war in Japan in the early 1930s. In your novel, you depict the removal, the evacuation, and the internment vividly. Was your depiction of Hanae’s story in the novel similar to what your mother experienced in the shocking removal and then the incarceration? Dr. Lawrence Matsuda: Yes, it was as described. I think most of the Japanese were shocked. They knew that the Japanese nationals were at risk as non-citizen aliens. There was a law that wouldn't allow them to become naturalized citizens, so they were aliens. That would be her father's generation. But the initial thought among the Japanese was that they would not take the Nisei [second generation] who were US citizens. So, they were shocked when citizens were taken because it was totally unconstitutional and un-American. You don't round up and arrest citizens for no crime without due process, right? Robin Lindley: Didn’t the US government contend that the order of evacuation and internment was to protect people of Japanese origin because of extreme anti-Japanese sentiment after the Pearl Harbor attack? Dr. Lawrence Matsuda: Some people used that excuse, but that wasn't the reason that they were evacuated. If you read the actual evacuation notice, it says all persons of Japanese ancestry, alien and non-alien, were to report to designated locations. And overnight the Nisei, who were citizens, became non-aliens. Robin Lindley: And weren't the families and others of Japanese ancestry actually rounded up by troops armed with rifles with fixed bayonets? Dr. Lawrence Matsuda: Yes. There were troops. 
The people were told to report to certain places. The earliest pickups were done by the FBI. They took mostly first-generation people who were leaders of the community shortly after Pearl Harbor while the bulk of Japanese were taken in April. Robin Lindley: It was a heartbreaking violation of human rights and the rule of law. What happened once these citizens and non-citizens were rounded up? What happened to their property and possessions? Dr. Lawrence Matsuda: It was different in every region of the country, but here the Japanese obviously sold off a lot of their goods at fire sale prices. And they stored some items. My parents actually stored some goods at a storage company and also at the Buddhist church. There were people in rural areas who left their land to others to care for. For example, on Bainbridge Island, some leased their land to their Filipino workers. They did take care of it and when the Japanese returned, the land was in good shape. And some of the Japanese split the land with the Filipino workers. Other Japanese left the land and it was totally in disrepair when they came back. Many couldn’t keep their properties because they couldn't pay the taxes. So it was lost. There are countless stories. One storeowner left his ten-cent store to a Jewish man to care for. I think he was a jeweler who watched the boarded-up store and took care of it. Nothing happened to that store, but other places such as farmhouses were destroyed, especially when they came back. A farm house was burned on Vashon Island. There were farm houses vandalized in anti-Japanese incidents in Hood River where the whole town signed a petition not to permit the Japanese to return--but the Japanese did anyway. Each place has a different story, but overall, most of the people lost their businesses. Most of them lost their jobs. Most of them lost their homes. Most of them sold whatever they had at a huge discount. So it was a very difficult time. 
Goods were sold for a penny on the dollar and customers took advantage because they knew that the Japanese were vulnerable. Robin Lindley: You have some remarkable scenes in your novel. I was struck when some white person wanted to buy a piano for a dollar. Dr. Lawrence Matsuda: Yes. The Japanese knew they couldn't take it with them. And, if a store was going out of business, they would sell at a huge discount on all goods. They were trying to make something, no matter how small. Robin Lindley: Were there physical attacks on people of Japanese origin following the Pearl Harbor attack? Dr. Lawrence Matsuda: I hadn’t heard of any physical attacks. I know some Filipinos were beaten up because they were thought to be Japanese. The Chinese wore buttons saying “I am Chinese.” And I know that there was a man who was impersonating an FBI agent and he tried to do some bad things to Japanese women. Robin Lindley: That was such a time of fear and hysteria. What are some things you’d like people to know about the conditions of the concentration camp at Minidoka where your parents were held and where you and your brother were born? You describe the circumstances vividly in your novel. Dr. Lawrence Matsuda: They were in the desert. The food was not always sanitary. The quarters were cramped. There was no privacy. People had to use the latrines instead of regular toilets. There were scorpions and rattlesnakes and dust storms. All of that was just a given, but the worst part of it was being betrayed by your country. I compare it to rape. The whole community was raped and we handled it like rape victims. Some were in denial and others tried to prove that they were good citizens. Some committed suicide. Others were just depressed. So, the worst part of it was the mental realization that the whole community was raped. And very few on the outside really cared. I compare it to a rape by your uncle--by someone you trust in your family. It was a rape by our Uncle Sam. 
Robin Lindley: And wasn’t the internment out of sight and out of mind, without much press coverage or any outside attention? Dr. Lawrence Matsuda: Yes. Minidoka was tucked into a ravine and 9,000 people were imprisoned there. If you drove by, you wouldn't even see Minidoka even though it was the third largest city in Idaho at the time. The physical conditions were bad, but I think the mental trauma was really devastating. The fact that your country betrayed you. And afterwards. Think about it. Who can you trust if you can't trust your government to protect you and maintain your rights? Who can you trust? Robin Lindley: That history is devastating. What sort of housing did your mom and dad live in there at the concentration camp? I understand the shelters were very crude and crowded with little privacy. Dr. Lawrence Matsuda: They lived in barracks that were hastily constructed. They had tar paper on the outside and weren't shingled or sided. It was like army barracks. It was open and they used blankets as curtains, and several families shared each building. The noises and the smells spread. The barracks were heated by a pot belly stove that burned coal. At the first relocation center, my parents were given ticking and sent to a pile of straw to stuff a mattress. That's what they slept on at Camp Harmony in Puyallup, which was actually a county fairground. Some of the bachelors lived in the horse stalls that still had horse smells. My cousin got the measles and was quarantined in a horse stall. When they moved to the permanent camps, like Minidoka and the other camps, they lived in hastily-constructed, army-style barracks with cracks in the floors, cracks in the walls. The wind would blow through. And the barracks all looked alike so people could get lost and wander into your area at night. Robin Lindley: And there were extreme temperatures in the hot summers and cold winters. The weather must have been miserable. Dr. Lawrence Matsuda: It was cold and muddy in winter. 
The residents had to walk on boards that were laid down on the mud. And that was how they got to the mess hall. My mother would never eat Vienna sausage because it caused dysentery several times. Robin Lindley: And wasn’t healthcare limited? Dr. Lawrence Matsuda: There was a patient hospital on site. When there was an outbreak of dysentery, you had to line up at the latrine with everyone else, because everyone who ate at the same mess hall had dysentery. One night, the lines were so long and the internees were upset, the guards thought there was a riot. Soldiers were going to shoot. The residents shouted, “No, no, it's dysentery. We've got the trots.” And so, the soldiers left them alone. Robin Lindley: When your parents were released from Minidoka with you and your brother, they returned to Seattle where they had been dispossessed. And your mother was facing the additional trauma of dealing with the probable deaths of her relatives in the atomic bombing of Hiroshima. Dr. Lawrence Matsuda: They actually released many people at Minidoka before the end of the war to work, attend college or join the army. My father left several times to find housing, which he never found.  So, they stayed in camp until it closed. The administration shuttered it down, turned off the electricity, and told them to leave, and gave them a train ticket and $25. Back in Seattle, my family stayed in the basement of my mother's friend's house for a while. We lived there until my dad could find proper housing, but it was in short supply because of the war and the GIs coming back. It was not an easy time. And, there was racial real estate redlining in Seattle, so we couldn't move to the best part of town. We could only move to certain parts of town. If those areas were taken, it was tough luck. And in fact, some of the Japanese who moved out of the Central Area returned and found that African-Americans who came up from the South to work during the war had moved into the redlined area. 
Robin Lindley: That's another tale of discrimination in America, and we're still living with the results of racist redlining. Thanks for sharing that insight. I didn't realize the effect on the Japanese community. Your mother must have been shaken by the terrible atomic bombing of Hiroshima and the lack of news about her relatives.

Dr. Lawrence Matsuda: Yes. The first news they heard was that Hiroshima was bombed. Tokyo had suffered firebombing with more or less conventional bombs like napalm, but the residents did not understand what an atomic bomb was and what its results were. Recently, I read an article about how the US was suppressing news about the Hiroshima destruction until John Hersey visited Hiroshima and wrote his famous book, which revealed the aftermath. The news came in very slowly. It wasn't like today when, if something happens, CNN is there by the next day. This news dribbled in. They knew that Hiroshima was destroyed, but they didn't know quite what that meant. It was the instantaneous destruction that was hard to comprehend. You could understand something being destroyed slowly, but everything in Hiroshima was vaporized or destroyed in an instant. My mother didn't know what happened to our relatives. It was only because of our relatives in the countryside that she found out the full story. But it was tough for her because she had lived in Hiroshima and she knew the city, so it was really devastating to realize that the city and many of her relatives were gone instantly. The people of Hiroshima were not soldiers. Soldiers expect to be put in harm's way and die, but these were civilians: old women, old men, young children, and workers. They were evaporated and destroyed instantly, or many died later of radiation sickness.

Robin Lindley: Have you traveled to Japan and visited Hiroshima?

Dr. Lawrence Matsuda: Yes. I was actually in Hiroshima during the 50th anniversary of the bomb. It is a strange city. Kyoto is very old. You see the shrines and the old architecture. Hiroshima is modern. It doesn't look like a Japanese city, but a modern city, because it was totally destroyed. And in real life, our family home was only a thousand meters from ground zero.

Robin Lindley: That visit must have been very moving for you then. Now it's the 75th anniversary.

Dr. Lawrence Matsuda: Yes. But I was surprised too when I met my relatives, the children and grandchildren of my mother's oldest brother. They were all very positive, very healthy, and very energetic. They were generally happy people. I met Akkiko, who survived the bomb. She was in the family home at the time. I met her son, and her son's son. So it seems life goes on.

Robin Lindley: Yes, that's encouraging. Didn't Akkiko suffer radiation illness and severe burns?

Dr. Lawrence Matsuda: Yes. She's mentioned in the book.

Robin Lindley: Your description of Hanae's treatment for depression at Western State Hospital, a psychiatric facility, is very moving. It happened in 1962, and you juxtapose her experience with the Cuban Missile Crisis. You also destigmatize mental illness. Does your portrayal in the novel parallel your mother's own "incarceration" at the hospital when she was admitted for severe depression?

Dr. Lawrence Matsuda: I really couldn't say for sure because she never talked about it. But I did talk to my friend who is a psychiatrist. He took me to the Western State Hospital Museum and I saw what it was like, and I knew what they did at the time. I studied the hospital's history and learned that doctors there specialized in lobotomies at the time.

Robin Lindley: Did you visit your mother when she was in the hospital? You must have been a teenager then.

Dr. Lawrence Matsuda: I visited her once. They wouldn't let me go inside. We had to meet her in front of the hospital, in the parking area, at the turnaround. She came out to see us.

Robin Lindley: What do you remember about that visit?

Dr. Lawrence Matsuda: She was very thin, and she looked worse than when she entered.

Robin Lindley: And what kind of treatment did she receive? Did she actually have shock treatment or electroconvulsive therapy?

Dr. Lawrence Matsuda: I'm sure she did. My psychiatrist friend told me that was pretty standard.

Robin Lindley: Did your mother seem depressed to you before she was hospitalized? Did she talk about suicide?

Dr. Lawrence Matsuda: Yes, she seemed depressed, and she was very distant and not engaged. But she did admit to her sister-in-law that she was contemplating suicide.

Robin Lindley: Wasn't there almost an epidemic of suicide among the internees after the war?

Dr. Lawrence Matsuda: Yes. There's no real data on that because nobody kept track of it. But I talked to Tets Kashima, who was a professor of Asian American studies, and he said that in California suicide was prevalent. There were just a lot of suicides. And the other thing was, few people talked about it.

Robin Lindley: From some history I've read, such as The Nobility of Failure by Ivan Morris, it seems that suicide is honorable in Japanese culture and tradition. And in your novel, some characters see suicide as an acceptable way to cope with loss and depression.

Dr. Lawrence Matsuda: That's the samurai tradition. If you dishonor your master, or yourself, you must die too. That led to a custom of ritual suicide: hara-kiri, which translates as "cut your stomach." And that's what samurai did. And my friend, [the artist] Roger Shimomura, had ancestors who were famous for a double suicide. They stood face to face and stabbed each other simultaneously. So they committed ritual suicide together.

Robin Lindley: That's an elaborate way to go. You indicate that Hanae and your mother were influenced by both Japanese and Christian traditions. Were those traditions a source of your mother's strength and resilience through the catastrophes in her life?

Dr. Lawrence Matsuda: Yes. I think both of them helped her.
She could call on Japanese tradition to deal with her stress if an American tradition did not help. So she had a little more of an arsenal, if you will, or two toolboxes to pull from. However, some tools that helped her survive became counterproductive. Take the Japanese word shikatanganai: "It can't be helped." That word helps you get through, but after a while it doesn't move you forward.

Robin Lindley: Yes. "It can't be helped." When I read that phrase in your book, it reminded me of Vonnegut's refrain: "So it goes." "It can't be helped" seems a pessimistic adage rather than "we can change this" or "we can do better."

Dr. Lawrence Matsuda: It isn't really. Japan was a harsh land of starvation, earthquakes, and typhoons. When your house fell down, no one in the village wanted to hear you crying, because their house fell down too. And so it's shikatanganai, it can't be helped. It's just what happened. And in America, a rich country, not a poor country like Japan, there is no shikatanganai. Here, your house falls and you call your lawyer. You sue the city. You sue the architect. You sue your neighbors. It's not that it couldn't be helped; you've got to sue somebody. And it's really an irony that in a poor country they accept their fate, but in a rich country they always want to contest what happens. Not always, but there's a different feeling. So this Japanese value helped my mother and others cope with overwhelming forces.

Robin Lindley: Maybe that's akin to the acceptance stage of grief.

Dr. Lawrence Matsuda: Yes, you accept fate rather than get angry.

Robin Lindley: It's a different perspective. I was interested in your influences, and you have mentioned the naturalist writers such as Frank Norris and his classic novel The Octopus. Naturalism concerns how characters deal with the forces of nature, the forces aligned against them, and you write beautifully of how your characters take on fate. Do you see the influence of writers like Norris in how Hanae deals with forces beyond her control and then, it seems, becomes a force herself?

Dr. Lawrence Matsuda: Yes. The naturalists felt that the forces of nature superseded human ambition. Human beings have to deal with natural forces at work in this world, and these forces often overcame individuals. In The Octopus, the novel by Norris, the railroad was a force which had to reach from coast to coast to deliver grain to the starving people in India. So that was another force to deal with. And even though the ranchers resisted the railroad, they couldn't stand up to it, because the force was more potent. It had to deliver the grain to feed the starving masses. If you look at our situation today, there are numerous outside forces at play. One is obviously the pandemic. The other is the political situation. And these forces are largely out of our control. But in the novel, Hanae managed to survive the adverse forces, learned to surf the waves of the tsunami, and became a force herself, not a capital-letter-F force like feeding the starving in India, but a small force that is filled with equality and social justice. We're in that kind of a situation now. The large forces out there can destroy us, but we must learn to use them and to survive them and become forces for good. And if many people get together and become forces themselves, they can become a large force, like a natural force, like the starving masses in need of grain. We need to persevere and make it to the other side and become forces ourselves.

Robin Lindley: And you have been a force for social justice and for democracy in your writing and in your activism and teaching.

Dr. Lawrence Matsuda: I have tried.

Robin Lindley: I've read about your many accomplishments. You're too humble. You've written now about atrocious incidents and the resulting trauma, but you have also shared triumphs of the human spirit. Where do you find hope today?

Dr. Lawrence Matsuda: When I was a kid, I read all the Greek mythology in the Beacon Hill Library by grade three. And that helped me. I think that mythology is something like history. I recall that Pandora opened a box and unleashed all these horrible things. But the thing that was left in the box was hope. There is still hope.

Robin Lindley: Is there anything you'd like to add for readers?

Dr. Lawrence Matsuda: I'd like to speak to why the Japanese were incarcerated. Three presidents, Reagan, Bush Senior, and Clinton, said in their letters of apology that the causes were racial discrimination, wartime hysteria, and failed leadership. And I ask you to take a look at what we have now regarding racial discrimination. My hope is that things get better. As for wartime hysteria, which was called propaganda then and is now called fake news, I hope that the network that peddles fake news crashes and burns. And the last one, failed leadership: I hope that our failed leaders are repaired or replaced soon. So those are my three hopes.

Robin Lindley: Those are powerful thoughts to end on. Thank you for sharing your thoughtful comments, Dr. Matsuda, and congratulations on your moving new novel, My Name is Not Viola. It's been an honor to speak with you.

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer's Chronicle, Crosscut, Documentary, Huffington Post, Bill Moyers.com, Salon.com, NW Lawyer, ABA Journal, Real Change, and more. He has a special interest in the history of human rights, conflict, medicine, and art. He can be reached by email: robinlindley@gmail.com.

Dr. Lawrence Matsuda, a renowned Seattle writer and human rights activist, brings to life his mother's travails, traumas, and triumphs in mid-20th century America in his debut historical novel My Name is Not Viola.
https://historynewsnetwork.org/blog/154473
Congressional Leadership Experience and the Presidency
Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

As Joe Biden becomes President of the United States, one question that has arisen is whether he will be able to accomplish his domestic goals with such an evenly divided Congress, rather than with a mandate of substantial control of both houses, as Franklin D. Roosevelt and Lyndon B. Johnson had in the 1930s and 1960s.

History informs us that only a small number of the 45 people who have been president were men of substantial congressional experience, including leadership positions. At the top of the list is Joe Biden, with his 36 years in the Senate representing Delaware, including his time as chairman of the Senate Judiciary Committee from 1987 to 1995 and as chairman of the Senate Foreign Relations Committee from 2001 to 2003 and from 2007 to 2009. These experiences and challenges, along with being an extremely active and engaged vice president for President Barack Obama, mean he has navigated the problems of dealing with the congressional opposition party in a manner not matched by any other president to the same extent.

However, other presidents did have extensive experience when one combines their Senate service with their time in the House of Representatives. The two presidents with the most years and leadership after Joe Biden were Gerald Ford and Lyndon B. Johnson.

Ford represented Michigan in the House of Representatives for nearly 25 years, from 1949 to late 1973, when he became Richard Nixon's vice president under the terms of the 25th Amendment after the resignation of Spiro Agnew.
Ford had served as House Minority Leader for nearly nine years, from 1965 to late 1973; he had made many contacts and connections with the majority House Democratic leadership, and he was warmly endorsed as the right person to replace the disgraced Agnew.

Johnson had spent 12 years in the House (1937-1949) and 12 years in the Senate (1949-1961) representing Texas, giving him almost the same amount of time in Congress as Gerald Ford. Johnson rose in the Senate leadership, becoming Majority Whip (1951-1953), Minority Leader (1953-1955), and Majority Leader (1955-1961). In that last position he was the most powerful figure in American history, considered a master of legislative procedure, with an innate ability to convince his colleagues on both sides of the aisle to follow his lead, even though President Dwight D. Eisenhower was the leader of the opposition party. While his nearly three years as vice president (1961-1963) were an unhappy period, he came to the presidency with unmatched skills that would lead to the Great Society, the most active domestic program since FDR's New Deal.

The only other president with combined congressional experience of more than 20 years was James Buchanan of Pennsylvania, who served in the House of Representatives (1821-1831), including his last two years as chairman of the Judiciary Committee, and in the Senate (1834-1845). Despite his years in Congress, as well as in appointed positions in government, he sadly failed to meet the challenge of the pre-Civil War years and is seen as a presidential failure, usually at or near the bottom of scholarly rankings of presidents.

Additionally, four other presidents served between 12 and 18 years in Congress and held important leadership positions and prominent roles there.

James A. Garfield of Ohio served nearly 18 years in the House of Representatives (1863-1880) before being elected as the only president to go directly from that chamber to the presidency, but he was sadly assassinated within months of taking the oath. Garfield played a leading role in the House and was House Appropriations Committee chairman from 1871 to 1875, heading one of the most crucial committees of the Reconstruction period. He was at the center of many political controversies in the tumultuous years after the Civil War.

James K. Polk of Tennessee served 14 years in the House of Representatives (1825-1839), was chairman of the Ways and Means Committee (1833-1835), and remains the only president to have held the position of Speaker of the House (1835-1839). He later served one very active term as president (1845-1849), doubling the territory of the United States through diplomacy with Great Britain and war with Mexico.

William McKinley of Ohio served 13 years in the House of Representatives (1877-1884, 1885-1891), including as chairman of the Ways and Means Committee (1889-1891), and took a national leadership role as the sponsor of the McKinley Tariff of 1890. He then served as president for four and a half years before being assassinated in September 1901, having gained territory through war with Spain and the annexation of Hawaii.

Finally, John Quincy Adams of Massachusetts, after having earlier served in the US Senate (1803-1808), in diplomatic posts, and as Secretary of State under James Monroe before his one term in the presidency (1825-1829), became the only president to be elected to the House of Representatives after his term, serving for 17 years (1831-1848). With his stature and outspoken nature, Adams became engaged in controversies over domestic and foreign policy under presidents Andrew Jackson, Martin Van Buren, John Tyler, and James K. Polk, most notably in his opposition to slavery and the Mexican War.
While Adams held no leadership position in Congress as the other seven presidents did, his unique role as a former president gave him prominence unmatched in American history.

So, in conclusion, Joe Biden comes into the presidency with a track record unmatched in many ways, as one of only eight presidents to have had extensive experience on Capitol Hill. Whether that will give him an edge in accomplishing his goals and dealing with crises comparable to those faced by Barack Obama, Franklin D. Roosevelt, and Abraham Lincoln is something only time will tell, with the nation hoping for the best in a difficult time.
https://historynewsnetwork.org/blog/154478
Presidents Who Look Better or Worse by Comparison
President-elect Obama with President G.W. Bush and former Presidents Carter, Clinton, and G.H.W. Bush, January 7, 2009. Photo Pete Souza, CC BY 3.0.

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

America has had two presidents who each "shine" in historical reputation while sandwiched between predecessors and successors rated among the worst presidents America has experienced in its 232-year history. America has also had four presidents who, while not "failures," are perceived as less successful and outstanding than the presidents before and after them.

The first case in the first category is Abraham Lincoln (1861-1865), who came to office after James Buchanan (1857-1861) and was succeeded by Andrew Johnson (1865-1869). Lincoln has certainly been criticized by many scholars for his civil liberties violations during the Civil War, and for the often failed military leadership he had until later in the war. His racial views have been seen with a critical eye by many as well. And yet it is acknowledged by most historians and political scientists that Lincoln is our greatest president, despite his shortcomings.

James Buchanan is ranked in most scholarly polls as our worst president, as he presided over the disintegration of the nation in the late 1850s, leading up to the secession of the southern states in his last months in office. He further proved unwilling to challenge southern states that claimed the right to seize US military property in the interim four months between the election and the inauguration on March 4, 1861.
Andrew Johnson, an accidental president due to the assassination of Lincoln, proved a disaster, unable and unwilling to work with the Republican Congress on Reconstruction policy, and he faced the first presidential impeachment trial. He also possessed one of the worst personalities of any president: constantly confrontational, highly condemnatory of all critics, and hostile to racial equality. Johnson is ranked either just above Buchanan or, alternatively, as the absolute nadir of the presidency.

The second case in the first category is Barack Obama (2009-2017), coming to office between George W. Bush (2001-2009) and Donald Trump (2017-2021). Obama faced great opposition in his two terms but had major accomplishments despite that reality. He is perceived as a president who overcame the Great Recession he inherited, saved the auto industry, promoted reform in health care, the environment, and foreign policy, and utilized executive orders to advance many initiatives when the opposition prevented progress through Congress. He ranked Number 12 in the C-SPAN Presidential Poll of 2017 and Number 8 in the American Political Science Association Poll of 2018, and he is highly ranked in public opinion polls of everyday Americans.

George W. Bush has been ranked low, mostly in the bottom ten of presidents, or at best the bottom third, due to the tragedies of the prolonged wars in Iraq and Afghanistan, the poor response to Hurricane Katrina, and the collapse of the American economy in 2008-2009 into what has been termed the Great Recession, the worst economic downturn since the Great Depression of the 1930s.

Donald Trump has not been formally ranked yet, but after one year in office he was rated at the bottom of the American Political Science Association poll, moving Buchanan and Johnson up a peg in each case.
With the reality of Trump's tumultuous four years, which brought two impeachments in just over one year, plus the fact that he never won the popular vote in 2016 or 2020 and refused to accept the results of the 2020 presidential election, leading to the US Capitol insurrection of January 6, 2021, it seems likely that in the short run, as well as the long run, Trump will languish in the basement of the rankings or, at most but not likely, sit just above Buchanan and Johnson. This assessment must also account for the high level of corruption and incompetence, and the botched reaction to the COVID-19 pandemic, the greatest public health crisis in a century.

That a president's reputation has been diminished by comparison with the presidents before and after him does not mean the four presidents discussed below were as horrendous as those who surrounded Abraham Lincoln and Barack Obama, but their rankings do suffer by comparison.

The first such case is John Adams (1797-1801), between George Washington (1789-1797) and Thomas Jefferson (1801-1809). Adams had only one term, was defeated for reelection, and is harshly criticized for the Alien and Sedition Acts of 1798. But he still ranks in the top 20 of presidents, while Washington is usually ranked Number One or Two and Jefferson usually around Number Seven for their contributions and accomplishments in their time as presidents.

The second such case is James Madison (1809-1817), between Thomas Jefferson (1801-1809) and James Monroe (1817-1825). Madison presided over the failed War of 1812, which included the British attack on the US Capitol and the White House in August 1814. There is a perception that Madison, despite his great historical accomplishments before his presidency, was weak and subject to pressure from the "War Hawks," who recklessly wanted war with Great Britain in a vain effort to gain all of Canada.
So Madison is rated in most polls near the bottom of the top 20, while James Monroe is rated higher for his foreign policy accomplishments.

The third such case is William Howard Taft (1909-1913), between Theodore Roosevelt (1901-1909) and Woodrow Wilson (1913-1921). Taft had one term and was defeated for reelection in a four-way race with TR, Wilson, and Socialist Eugene Debs in 1912; he is the only president running for reelection to come in third, not second, in the popular vote (he claimed only 23 percent of the vote and won 8 electoral votes). Taft had promoted some progressive ideas in office and two progressive constitutional amendments, but he had divided the Republican Party, which was in disarray at the time. Taft also faced opposition from his predecessor, who had promoted him for the presidency in 1908 but then turned vehemently against him; Theodore Roosevelt's 1912 run remains the best third-party performance in American history. Taft would go on to be Chief Justice of the Supreme Court in the 1920s, and he ends up ranked in the mid-to-high 20s as president. Roosevelt and Wilson, however, even with their shortcomings, will always rank higher, with TR often 4th or 5th, and Wilson, while slipping in recent scholarly estimation, still Number 11 in the C-SPAN Poll of 2017, after having regularly been in the top ten and as high as 6th in earlier polls.

The fourth and last such case is George H.W. Bush (1989-1993), between Ronald Reagan (1981-1989) and Bill Clinton (1993-2001). Bush had one term and was defeated for reelection in 1992, in a three-way race with Bill Clinton and H. Ross Perot, who ran as an independent and took 19 percent of the vote, leaving Bush with only 37 percent of the popular vote, the second-worst popular vote showing of any sitting president. Only William Howard Taft, eighty years earlier, did worse.
Bush had been successful in promoting the Persian Gulf War against Saddam Hussein's Iraqi regime, had championed a significant civil rights law, the Americans with Disabilities Act of 1990, and had presided over a peaceful and stable end of the Cold War after the Berlin Wall came down in 1989, but he was undermined by the recession of the early 1990s, which lingered as the election year arrived. So Bush is usually ranked at the bottom of the top 20 presidents, while Reagan is rated around Number 10 and Clinton is usually ranked in the top third of presidents, usually around Number 15.

So, in six cases, who served as president before and after a given president has had a big impact on that president's scholarly ranking in American history. Joe Biden has had an active first two months in office, and the odds look likely that Donald Trump will be another case of a president who makes his predecessor and successor both look even better by comparison.
https://historynewsnetwork.org/blog/154483
Fate and Fortune in Presidential Elections
Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

Fate and fortune play a major role in the American presidency, as so many of those who have become president were not perceived, even a year before their elections, as likely to reach the Oval Office. The effort to project "frontrunners" in presidential races has not worked very well, when one looks back at the year before many elections, as outlined below.

In the year 1843, it was clear that the frontrunner for the Democratic presidential nomination in 1844 was former President Martin Van Buren. Former Speaker of the House James K. Polk was not in public office in 1843, and it seemed clear that Van Buren was the likely choice of his party against Whig nominee Henry Clay. But Van Buren was unable to meet the Democratic Party's requirement of support from two-thirds of the delegates at the national convention. On the 9th ballot, Polk, seen as the first "dark horse" presidential nominee, triumphed. He then overcame the much better known Henry Clay, with the assistance of a small third party, the Liberty Party, to become the 11th president of the United States. Polk would gain, through war with Mexico and treaty with Great Britain, more territory for the nation than anyone since Thomas Jefferson and the Louisiana Purchase of 1803.

In the year 1859, former one-term Whig Congressman Abraham Lincoln, fresh off a losing Senate race against Stephen A. Douglas, had gained notice, but New York Senator William Henry Seward was seen as the likely choice for the Republican presidential nomination in 1860, and Douglas was clearly the leading Democratic candidate.
But Lincoln went on to win the Republican nomination, and while Douglas was the Democratic nominee, that party would become divided between Douglas and outgoing Vice President John C. Breckinridge, who became the Southern Democratic nominee for the presidency.  John Bell also ran as the nominee of a one-time third party, the Constitutional Union Party.  Lincoln ended up winning the presidency, despite having less than 40 percent of the total national vote, and then led the Union in the Civil War.  While highly controversial in office, Lincoln brought about the Emancipation Proclamation and the 13th Amendment to the Constitution, preserved the Union over the secessionist movement, and is widely regarded as the greatest American president. In the year 1911, Woodrow Wilson had just become the new Democratic Governor of New Jersey, after a career as an educator, scholar, and nationally noticed president of Princeton University. Speaker of the House Champ Clark of Missouri was seen as the front runner for his party’s nomination in 1912, but the Democratic National Convention was in a stalemate due to the thirds rule that was required, just as it was in 1844 when Polk won the nomination on the 9th ballot.  This time, the convention went through 46 ballots, before Wilson, seen like Polk as a “dark horse,” with only a year and a half as an elected politician, became the nominee.  Wilson was fortunate that the opposition Republicans split between incumbent President William Howard Taft and former President Theodore Roosevelt, giving Wilson and his party the advantage.  Despite only winning 42 percent of all votes cast, Wilson won a landslide victory in the Electoral College, and went on to promote extensive domestic reform in his first term, and become a wartime leader in the First World War in his second term. In the year 1931, Franklin D. 
Roosevelt had recovered enough from polio, although still in a wheelchair, to be in his second term as New York Governor, promoting the “Little New Deal.” He was well aware that the major barriers to winning the Democratic presidential nomination in 1932 were the 1928 nominee Alfred E. Smith and Speaker of the House John Nance Garner of Texas. Famed journalist Walter Lippmann was skeptical of FDR, writing that while Roosevelt was a pleasant man, he lacked any important qualifications for the office but clearly would very much like to be president. FDR won the nomination, meeting the two-thirds requirement on the 4th ballot, selected Garner as his vice presidential running mate, and went on to a landslide victory over President Herbert Hoover. He would promote the New Deal, take America through World War II, and be regarded as one of the top three presidents of all time.

In the year 1959, Senator John F. Kennedy seemed ready to seek the Democratic presidential nomination in 1960, but his reputation in the Senate was that of a “lightweight” compared to his rivals, Senate Majority Leader Lyndon B. Johnson of Texas, Hubert Humphrey of Minnesota, and Stuart Symington of Missouri. Kennedy also had the negative factor of being the first serious Catholic contender for the White House since the failed candidacy of Alfred E. Smith in 1928. JFK needed to overcome the “Catholic issue” by showing strength and winning in states with small Catholic populations and large Protestant majorities, and he did so by defeating Humphrey in the Wisconsin and West Virginia primaries. But even then, he still faced the challenge of Johnson, who had Southern backing and a strong image as a challenger, and only at the end of the roll call of the states on the first ballot at the Democratic National Convention, when Wyoming swung to him, did JFK win the nomination.
Similar to FDR selecting Garner in 1932, Kennedy now chose Johnson as his running mate, and they would win the hotly contested election of 1960 over Richard Nixon. Kennedy would go on to promote change as the youngest elected president, and though his time in office was cut short by assassination near the end of his third year, he would be seen as one of the more popular and admired presidents of modern times.

In the year 1967, Richard Nixon, who had lost the close presidential race of 1960 and then been soundly defeated in his run for California Governor in 1962, decided to try again for the Republican presidential nomination, against the challenge of Michigan Governor George Romney, New York Governor Nelson Rockefeller, and California Governor Ronald Reagan. Nixon was seen as a long shot, since a comeback from his loss eight years earlier seemed highly unlikely. But Nixon triumphed at the Republican National Convention in 1968, and surprised many observers when he overcame incumbent Vice President Hubert Humphrey, who led a Democratic Party divided over the war in Vietnam and facing a defection of Southern conservatives after Lyndon B. Johnson’s signing of the Civil Rights Act of 1964 and the Voting Rights Act of 1965. The civil rights issue led to Alabama Governor George Wallace running on the American Independent Party line as one of the most serious third-party challengers in American history. Nixon won with only 43.4 percent of the total national vote, but went on to accomplish significant goals in American foreign policy and to cooperate with the Democratic-controlled Congress in promoting some major domestic reforms. But his insecurity, perceived paranoia, and inability to accept criticism led to illegal actions, culminating in the Watergate scandal, the movement toward impeachment, and his resignation in 1974.

In 1975, Jimmy Carter had finished his one term as Georgia Governor, and announced his plans to run for president in 1976.
This evoked laughter and cynicism, as he was seen as a quite obscure political leader, despite a successful term in office. Carter had far better-known rivals, including Arizona Congressman Morris Udall, California Governor Jerry Brown, Idaho Senator Frank Church, Indiana Senator Birch Bayh, and Washington Senator Henry Jackson, making him a clear dark horse. But Carter organized early and efficiently, and surprised political observers by winning a majority of the newly expanded state primaries and caucuses, portraying himself as an outsider, political centrist, and moderate reformer. In choosing Minnesota Senator Walter Mondale, a Hubert Humphrey protégé, as his running mate, Carter united the party, and went on to defeat President Gerald Ford. Carter would come to be perceived as a champion of human rights, but he had difficulty uniting the nation while in office and lost reelection. He is now seen as an elder statesman with a positive public image, more than 40 years after leaving office, and is the longest-lived president.

In 1979, Ronald Reagan had been out of public office since serving two four-year terms as California Governor from 1967 to 1975, and had made two failed bids for the presidency, in 1968 and 1976, the latter against President Gerald Ford. He was nearing the age of 70, and the fact that, if elected, he would surpass Dwight Eisenhower to become the oldest president in American history made him seem like a long shot. But Reagan overcame his chief rival, George H. W. Bush, then chose him as his running mate, before going on to be elected over President Carter and independent candidate Congressman John Anderson of Illinois, winning a landslide in the Electoral College. Reagan would go on to promote a transition in American government from New Deal-Great Society liberalism to Reagan conservatism, which would be a dominant force in the Republican Party for the next forty years.
He would transform the presidency and become a very popular president.

In the year 2007, after only two years as an Illinois Senator, Barack Obama would start a long-shot campaign for the Democratic presidential nomination in 2008, with his chief rival being former First Lady and New York Senator Hillary Clinton. The idea that a mixed-race candidate with a Muslim-sounding last name could beat Clinton seemed improbable, but Obama triumphed after a long, heated contest, chose establishment Democratic Senator Joe Biden as his running mate, and overcame Republican nominee John McCain. He chose Hillary Clinton as his Secretary of State, following the example of Abraham Lincoln with William Henry Seward a century and a half earlier. Obama would accomplish a major change in health care, promote new initiatives in many other areas of domestic and foreign policy, and be very popular and highly regarded when he left office in 2017, seen as having impacted America in many positive ways.

And finally, in 2019, Joe Biden, while highly respected for his 36-year career in the US Senate and his productive eight years as Barack Obama’s vice president, seemed an unlikely successor to the White House in 2020, as he would become the oldest president. With an extensive list of prominent contenders for the Democratic presidential nomination, and his own poor early performances in Iowa and New Hampshire, Biden seemed like a lost cause until the South Carolina Primary. But then Biden had an amazing revival. With the COVID-19 pandemic raging and President Donald Trump unable to deal with it after four years of dividing the nation, Biden triumphed in November 2020, and has given many observers the impression of a presidency starting off strong. Some are comparing his moment to the crisis times of FDR and the New Deal, nine decades ago.

So the cases of James K. Polk, Abraham Lincoln, Woodrow Wilson, Franklin D.
Kennedy, Richard Nixon, Jimmy Carter, Ronald Reagan, Barack Obama, and Joe Biden are joined together as examples of how fate and fortune have so often determined the history of the American presidency!
3d7989f60bdae799b51391d252f7c300
https://historynewsnetwork.org/blog/154485
Inequality, Labor Unrest, and Police Brutality in Early 20th Century Spokane, Washington: Jess Walter on His New Historical Novel "The Cold Millions"
Inequality, Labor Unrest, and Police Brutality in Early 20th Century Spokane, Washington: Jess Walter on His New Historical Novel "The Cold Millions" Spokane, Washington. 1909. The City Council bans downtown speeches to curb labor agitation. The Industrial Workers of the World (the IWW—the “Wobblies”) organizes a mass protest against this restriction of free speech. Local police under notorious Spokane Police Chief John Sullivan brutally break up the nonviolent labor protest. Hundreds of union supporters are arrested and jailed. Many are injured. IWW firebrand “Rebel Girl” Elizabeth Gurley Flynn arrives on the scene to secure the release of the jailed workers and to organize for the IWW. She is just 19 years old and pregnant, yet she courageously organizes working people in her travels around the Northwest. She later becomes a leading suffragist and one of the co-founders of the American Civil Liberties Union. In union circles, she is exalted still for her leadership, humanity, and bravery. Celebrated Spokane novelist Jess Walter brings to life this fraught history in his new historical novel, The Cold Millions, a titular reference to the many poor and forgotten souls of early 20th century America. With a cast of real and fictional characters, he takes on issues from more than a century ago that resonate today, including intolerance, income inequality, police brutality, violence, and human rights. At the same time, the novel plumbs emotional depths as it explores the complexities of friendship, sacrifice, betrayal, lust, cruelty, and love. The story unfolds through the perspective of two orphaned and jobless young men, the Irish American Dolan brothers from Montana, who seek new lives in the metropolis of Spokane. Police jail the idealistic brother Gig, 23 years old, who embraces the promises of the IWW, while younger brother Rye, 16, yearns only for a modicum of stability and a home.
Yet it’s Rye who accompanies the fiery Gurley Flynn on her campaign for workers, as he also becomes enmeshed in the dark schemes of a wealthy Spokane mining magnate. Other characters include a burlesque actress and her performing cougar, a hired assassin, anti-union scabs, hoboes, labor organizers, a crusading attorney, and more. Mr. Walter’s extensive historical research is on full display in The Cold Millions. In the creation of his novel, he pored over period newspapers, maps, diaries, letters, postcards, and more. The novel captures the mood and rhythm of the time, the arcane language, the passion of average people for fairness and justice, as well as the moments of debauchery and humor. Walter’s writing conveys his affection for his hometown of Spokane with full awareness of its fraught history, a reflection of the larger checkered history of the United States. Mr. Walter is best known for his literary novels including Beautiful Ruins and The Financial Lives of the Poets, the National Book Award Finalist The Zero, and Citizen Vince, the winner of the Edgar Award for best novel. He also wrote a critically acclaimed book of short stories, We Live in Water, and his short fiction has appeared in Harper's, McSweeney's, and Playboy, as well as The Best American Short Stories and The Best American Nonrequired Reading. He began his writing career as a reporter for the Spokesman-Review and wrote a nonfiction volume, Ruby Ridge (originally entitled Every Knee Shall Bend). He lives with his wife Anne and children, Brooklyn, Ava and Alec, in Spokane. Mr. Walter generously responded by email to a series of questions on his writing career and his new novel. Robin Lindley: Thank you for connecting with me, Mr. Walter, and congratulations on your powerful new historical novel, The Cold Millions. Before getting to your new book, I’m also interested in your writing career. You have a background in journalism and a career as a prominent literary writer.
Did you want to be a writer when you were young? What drew you to a writing career? Jess Walter: I wanted to be a writer as long as I can remember. I created a family magazine with my siblings when I was six or seven (called Reader’s Indigestion) and was the editor of my junior high and high school newspapers. I read voraciously and used to visit the library as a 13-year-old, imagining where my future novels would go. In college, I was a young father, and so I had to switch from majoring in English and creative writing to journalism, so that I could support my young family. But that seven-year detour into newspapers made me a better writer, I think, and certainly a better citizen. Robin Lindley: How does your experience in journalism inform your writing now? Jess Walter: Journalism informs my writing in many ways, I think: certainly the ability to research, and to publish without fear or a kind of preciousness. You don’t come back from a newspaper assignment saying that the “muse didn’t strike.” Likewise, you learn a directness and an economy of style that translates well to fiction. As an early newspaper editor once told me, “You write beautiful descriptions. Now pick one.” But the biggest attribute that I gained from journalism, I would say, is a keen sense of curiosity, and the tools to satisfy it. I think I’m a more outward-looking novelist, with an understanding of systems and institutions, because I worked for newspapers. Robin Lindley: What sparked your career as a novelist? Are there certain writers that have influenced your work? Jess Walter: Hmm, I think of a spark as something external, but a novelist is his or her own spark. You just read and write. Every day. I’ve written pretty much every day since I was a teenager. I wrote fiction for fifteen years before I had much success at it. I wrote a nonfiction book, two unpublished novels, dozens of short stories and was a ghostwriter before I published my first novel. 
My fiction didn’t support me until my seventh book, and still, I am incredibly lucky that it supports me at all. As for influences, there are so many it’s hard to know where to start. From the top, I’d go with: Joan Didion, Kurt Vonnegut Jr., Don DeLillo and Gabriel Garcia Marquez. Robin Lindley: Some of my favorites too. You’re praised for novels that are always different. As America’s Librarian Nancy Pearl has said, “Jess never writes the same book.” How do you see the arc of your writing career? Jess Walter: Ha! Well, first let me just say that Nancy is a dream reader and a wonderful writer. But isn’t it strange that the anomaly is the person who “never writes the same book”? Shouldn’t that be the case for more writers? I would rather ask, “Why do so many writers keep digging the same hole?” As for me, when I finish a book, I’m ready to do something different. I strive to get better as a novelist, and I think I get better by trying new things. But once I get going on a project, honestly, I don’t think about any of that. I just let the story dictate its genre, style and tone. If I concentrate simply on writing the next book I want to read, the rest takes care of itself. Robin Lindley: It seems that most of your books involve moments in history. How does history play a role in your work? Did you enjoy history as a student? Jess Walter: I did, and I do. But other than The Cold Millions, I wouldn’t say that my writing is particularly tied to historical moments. In fact, I would say, like the journalist I was, I’m more drawn to the contemporary. I was at Ground Zero in the days after the terror attacks of 9/11 and wrote a dark satirical novel about our reaction to those attacks (The Zero), and I wrote a farcical family drama about the financial crisis of 2008 (The Financial Lives of the Poets.) Even this historical novel rose out of my desire to address contemporary issues like income inequality and political and social unrest. 
With Citizen Vince, I chose to write about the 1980 presidential election in part because of its significance in swinging American politics so firmly to the right over the next forty years. So I guess I would say my interest in history is really about how it impacts the present moment. Robin Lindley: Thanks for those insights. Now, to go to your new, highly praised novel, The Cold Millions, what inspired this particular book? Jess Walter: It’s difficult to distill so many years of thought and research and writing into a few impulses of inspiration, but I’ll try. Early on, I felt the political and social echoes of the last Gilded Age in our current economic climate, and I hoped to write about issues like inequality and nonviolent protest without being didactic. I also wanted to write a kind of labor Western, to collide those genres, the social novel and the adventure story, around the real free speech protests of 1909-10, and to recreate the thriving, boisterous Spokane that I found in old newspapers and postcards. I was also taken by the figure of Elizabeth Gurley Flynn, and hoped to renew interest in her amazing life, while at the same time echoing the youthful activists that I saw leading the current political fights for sensible gun and climate legislation, and against police brutality against African Americans. There were many novelists who inspired me, too, from Tolstoy to Steinbeck to E.L. Doctorow’s Ragtime to William Kennedy’s Ironweed. And finally, a big part of the novel was personal for me. I’m a first-generation college graduate from a working-class family. Both grandfathers were itinerant workers in the 1930s, and my dad worked for 40 years for Kaiser Aluminum, rising to president of his Steelworkers Union local. My dad has Alzheimer’s now, and is at the end of his life, and I wanted to honor his steadfast belief in unions. Growing up, the fairness and egalitarianism of labor was as close as my family had to a religion. 
I saw this early period of labor as a kind of origin story, filled with idealism and courage, before the unions became tainted by corruption and Communism became connected to the brutal regimes of the twentieth century. Robin Lindley: The novel is filled with history and you have a gift for evoking this age. What was your research process for the book? Did you find especially useful archives and other resources? Jess Walter: I read dozens of books from and about that period, correspondence and academic papers, pored over maps and railroad schedules, but most of my research, honestly, was done bent over microfilm, reading old newspapers. The Spokane Library was very helpful, especially its Northwest Room, and I took several trips to the Seattle Library and to the library at Washington State University. Research is incredibly helpful until it isn’t. At some point, the novelist just has to create, and to imagine. You become fluent in a period and then you can allow the characters you’ve conjured to drive the action. Robin Lindley: What are a few things you’d like readers to know about Spokane in 1909? Jess Walter: If you can imagine the railroad in 1900 as the equivalent of the internet today—connecting the world in ways it hadn’t before—you can see how Spokane was one of the fastest-growing and most thriving cities in the United States at that time. Every northern railroad line pinched together in Spokane, before spreading out to Portland, Seattle, Vancouver. The incredible wealth from the area’s mining, timber and agriculture flowed through the city. Like Seattle, it was doubling in size every six or seven years, but unlike Seattle, it was known for being an island of sophistication in an empty part of the world, with great hotels and restaurants and one of the best theater scenes in the West, including the largest stage in the world. Robin Lindley: I’m a native of Spokane but never knew of the 1909 Free Speech Movement and the labor strife then.
It’s fascinating, and now more people will know about this past thanks to your novel. How did you come upon this overlooked campaign for workers? Jess Walter: I can’t remember how I first came across the free speech action in Spokane, but I think it was in the morgue of my old newspaper. Perhaps I was grabbing files on Tom Foley (I covered his last election in 1994) when I pulled the file on Elizabeth Gurley Flynn and noted her story and set it aside as a topic for a later novel. The sheer audacity of Gurley Flynn and the ahead-of-its-time inclusivity of the IWW seemed remarkable to me. A few years later, I read that Dashiell Hammett had worked as a Pinkerton detective out of Spokane, investigating labor figures in Montana (the roots of his novel Red Harvest), and I began looking for ways to bring that period to life. For years, I gathered articles and books and mulled over how to tell the story. Robin Lindley: When I was younger, the Industrial Workers of the World, the Wobblies, were seen as bomb-throwing radicals, but you found a different story. What did you learn? Jess Walter: Well, at times, there were bomb-throwing radicals and anarchists in the IWW, but usually the violence came in reaction to the IWW. The union was radical, definitely, pushing for a complete overhaul of capitalism, but it also preached nonviolence. Some members pushed for more direct action, like sabotage and general strikes, but it was actually the IWW’s pacifism that caused it to run afoul of the U.S. government, when the union objected to our entry into World War I. There was awful violence involving the Wobblies, in Everett, in Centralia, in Butte, Montana, but almost always that violence came from the other side, from vigilantes or detectives who had infiltrated the IWW. In fact, the free speech actions in the Northwest were the first successful nonviolent protests in U.S. history, a model for civil rights activists decades later.
Robin Lindley: The Free Speech Movement occurred in 1909, a decade before the better-known Seattle General Strike. What did workers gain from the Spokane Movement? Jess Walter: They were very much connected. By 1919, the IWW’s profile in the United States had been greatly diminished, and they were seen as the most radical labor organization in the United States. The Seattle strike was groundbreaking because of its breadth, because more traditional unions took part in it: dockworkers and unions affiliated with the AFL. But city officials fighting the strike used the Wobblies as socialist bogeymen to try to turn public perception against this huge, broad social movement. Robin Lindley: A central character of The Cold Millions is Elizabeth Gurley Flynn, a young labor activist—a real person—who spoke on behalf of workers and the poor. What are a few things you’d like readers to know about her? Jess Walter: I write about Gurley Flynn at a fascinating time in her life. (She would go on to become a founding member of the ACLU, the chairwoman of the Communist Party USA, be jailed for her activism, and become a civil rights activist, among other things.) But in 1909, she was a fiery 19-year-old labor activist and suffragist who had been speaking in factories and rough work camps for three years, known as the East Side Joan of Arc and, by the establishment New York Times, as a “she-dog of anarchy.” I marveled at a pregnant 19-year-old, ten years before she could even vote, traveling west by herself to fight for workers’ rights against corrupt police and company goons. Robin Lindley: You humanize real characters in your book such as the “Rebel Girl” Gurley Flynn and the brutal Spokane Police Chief John Sullivan. How do you create the fictional presence and world of a real character? Jess Walter: There is a fine balance, I think. To make them come alive like the other characters, you have to treat them as fictional creations, inventing dialogue, motivations and actions.
But I feel a responsibility to the historical figures, as well, and so, with all of those characters, I tried to research them, and to keep the invention to a real minimum. For instance, most of the speeches that Gurley Flynn gives in the novel come from accounts of her actual speeches, in newspaper stories and books. Robin Lindley: You tell much of the story through the eyes of a couple of young Irish-American vagabonds from Montana who are drawn to Spokane. Were they based on real people? How did you choose this point of view? Jess Walter: Gig and Rye are entirely fictional characters. But their story parallels many of the hobos working at that time. And their sense of adventure comes from stories my grandfather used to tell about his own hoboing days a generation later in the 1930s. Robin Lindley: And you etch the age through a range of characters including a Pinkerton detective who sees Spokane as “a box of misery” and “a syphilitic town” that metastasized, a hired killer, an actress who performs with her cougar, a righteous lawyer, wealthy tycoons, and more. Were there historical models for these characters? Jess Walter: Other than Fred Moore, who was an actual labor lawyer who moved from Spokane to other free speech protests around the West, they are all fictional characters burnished by my research into the time. Robin Lindley: The brutality of the Spokane police, jailers, and anti-union thugs may stun some readers. What was the city like in 1909 for the poor, the dispossessed, the nonwhite? Jess Walter: About like it was everywhere. Maybe the one difference was that the city was teeming with itinerant workers because of its location as a hiring center for mining, timber and agriculture jobs. Many of these were recent immigrants from Central Europe, and they suffered through waves of abhorrent racism and xenophobia, as immigrants as varied as the Chinese and the Irish had previously, and as Native Americans and African Americans continuously faced. 
The Spokane Police, during this period, were accused of everything from brutalizing traveling workers to shaking down the city’s brothels, again, not unlike police in other cities. Robin Lindley: You also capture the arcane language and idioms of the period. How did you come to learn these expressions and obscure words? Jess Walter: It was great fun, immersing myself in the language of the newspapers, the IWW speakers and singers, the Pinkerton detectives and others. Much of it came from newspapers and Wobbly accounts of the free speech protests in Spokane. In capturing the way a 60-something-year-old Pinkerton detective might sound, I read old mysteries to find words that had disappeared from the lexicon, like “the morbs” (a morbid feeling of unease) and “lobcocked” (bothered or blocked from action) … that language, in particular, began to feel like some missing link between Western and Hardboiled literature. Robin Lindley: You present an unsparing account of Spokane history, including an account of atrocities against Native people. What did you learn about treatment of Native people? Jess Walter: This is another thing I feel like I’ve always known. I grew up on the river, near Plantes Ferry and the horse slaughter camp, where in 1858, eight hundred native ponies were ordered shot by Col. George Wright as punishment and warning to the Spokane tribe. In the 1970s, when I was a kid, people were still finding bleached horse bones along that shoreline. I live now just across from what used to be Ft. Wright, near the confluence of the Spokane River and a stream that for 120 years was called Hangman Creek, named for the spot where Wright had tribal leaders hanged when they came to beg for peace. My family lived for a few years on a ranch bordering the Spokane Indian Reservation, where the tribe was forcibly relocated. Anyone who doesn’t understand the brutal history of treatment of Native Americans in the place they live is just not paying attention. And not just Spokane.
Seattle, Yakima, Manhattan, how many of us live in cities named for the very people from whom the land was brutally taken. Robin Lindley: Your book is a tribute to human rights, the rights of assembly and free speech, and the struggle to preserve those rights, along with a recognition that all people, regardless of social station or wealth or race, deserve access to justice and equal rights. Were you thinking of those values as you wrote The Cold Millions? Jess Walter: Definitely. And I’d add one more, the old-fashioned idea of brotherhood, the kind that Gig and Rye share, and also the kind that they share with Jules and with Gurley Flynn and the leaders of the IWW. Ten years before I was born, in 1955, about one in three Americans belonged to a union. Now that number is less than one in ten. And, not coincidentally, the middle class has eroded and the gap between wealthy and poor is as high as it was in 1909. The book is an elegy for labor idealism and perhaps a suggestion for the road back. Robin Lindley: Are there other books and resources you’d recommend to help readers better understand the history behind The Cold Millions? Jess Walter: Oh, so many. The book has an Acknowledgments section that is chock-full of books that I used in my research. But I will suggest one that gives a broad sense of the labor wars of that period in the Northwest, Big Trouble by J. Anthony Lukas. Robin Lindley: It’s clear from your work that you love Spokane despite its checkered history. You’re a native of the city and you still live there. I recall that the late, great Spokane artist Harold Balazs told me that friends asked him why he never moved from Spokane to an arts mecca like New York City or LA. He said, “You bloom where you’re planted.” It seems you share that strong sense of place. Jess Walter: Ha, please point me to the American city that doesn’t have a checkered history, and I will move there.
Every city is born, as Spokane was, through some combination of brutality toward its Native people and the destruction of its natural resources. I think some people in Seattle look with condescension at Spokane because it’s poor. But equating a poor city with a bad one is rank snobbery. In fact, I would argue that there’s something more fundamentally wrong with a city where a teacher or a police officer can never dream of affording a home. I happen to like Spokane’s grubbiness, its weirdness, its rough edges. Harold’s answer to that question is terrific, like everything about Harold, but I kind of wish he’d have just said, “Go pound sand.” Robin Lindley: Outsiders may see Spokane as conservative bastion in a county that voted for Trump and is represented by a rightwing member of Congress, but the city also has growing arts, literary and higher education communities. Perhaps voting patterns don’t reflect the entire reality of the city. How do you see the social and political evolution of Spokane since 1909? Are younger people there now interested in social and political change? Jess Walter: The city itself is quite liberal, went for Biden by almost 20 points, and has a city council with a 6-1 progressive bent. Because of the Spokane Valley and its more rural areas, Spokane County did tip for Trump, by about 4 points, half the margin of 2016. But I think it’s misleading to think of Spokane as just another part of red Eastern Washington. The real divide is between urban and rural, like everywhere in the United States. And Spokane’s politics has always been far more complex than the West Side of the state wants to imagine. Even in Spokane’s more conservative periods, a Democrat, Tom Foley, represented the region and rose to Speaker of the House. And Spokane had a black mayor, Jim Chase, a decade before Seattle did. As for young people, I think, like everywhere, they are more engaged than I’ve ever seen them, and personally, I can’t wait for them to take the wheel. 
Robin Lindley: As we today face a politically divided country, a deadly pandemic, a political insurrection, and a history of systemic racism, among other issues, where do you find hope? Jess Walter: Wow, that’s a hard question. I like what Kafka says: “There is infinite hope … but not for us.” Still, deep inside, I cling to an old-fashioned kind of humanism, and the belief in what Lincoln called the better angels of our nature. But, as a novelist, you’d better keep track of the devils, too, because they make for better characters. Robin Lindley: You have a gift for breathing life into history Mr. Walter, and for blessing each of your characters with a sense of presence and humanity. Is there anything you’d like to add about your writing or your new epic novel and its resonance now? Jess Walter: Thank you! No, those were wonderful questions. Robin Lindley: Thank you for your thoughtful words and generosity Mr. Walter. And congratulations on your epic historical novel The Cold Millions and the stellar praise you’re receiving. Well deserved, indeed. Robin Lindley is a Seattle writer and attorney, and the features editor of the History News Network. His articles have appeared in many periodicals. He can be reached at robinlindley@gmail.com
https://historynewsnetwork.org/blog/42108
Richard B. Frank: Review of Hiroshima in History: The Myths of Revisionism, ed. Robert James Maddox
https://historynewsnetwork.org/blog/6772
Throwing a "crappy little country" against a wall
Michael Ledeen's philosophy of international relations is summed up in this line: "every now and again the United States has to pick up a crappy little country and throw it against a wall just to prove we are serious." ADDENDUM: I have received several emails questioning the validity of the "crappy little country" quotation. The link (shown above) discusses this controversy. The original source for the quotation appears to be this article by Jonah Goldberg. Subsequently, a revised article in 2003 by William O. Beeman indicated that the quotation was not genuine. In 2004, however, Simon Mars stated that Ledeen admitted to him that he had said it, but also that his words needed to be put in context. I have written to Professor Beeman to find out why he made the original revision in 2003, despite Mars' statement that Ledeen later admitted the quotation was accurate.
https://ehistory.osu.edu/articles/battle-actium
The Battle of Actium
The strange battle of Actium ended decades of Roman civil war and resulted in the rise of the first Roman Emperor. Antony's seemingly irrational battle tactics destroyed him, his armies and his famed wife, Cleopatra. Conjecture over Antony's reasons for abandoning the battle and chasing Cleopatra's ship has been fodder for historians, poets and movie writers for centuries. After the assassination of Julius Caesar in 44 BC, Rome had no clear leader. Mark Antony (Marcus Antonius) took over Caesar's papers and many of his legions, but Gaius Julius Caesar Octavianus was named as heir in Caesar's will. (Octavianus also possessed the ever-important name "Caesar.") Since neither of the two men could manage a clear majority of support, they formed the Second Triumvirate with Marcus Aemilius Lepidus, a well-respected yet aged general. Individually, Octavianus and Antony continued to persuade senators and generals to join their side. Eventually Lepidus, who had been assigned an unimportant role in Africa, attempted to seize Sicily by force. His troops mutinied and he was forcibly retired by Octavianus. This left Octavianus with control of the Western provinces and Antony with control of those in the East. Antony married Octavianus' sister, Octavia, and an uneasy truce began. Mark Antony and Cleopatra VII began their fateful relationship after he took over the Eastern provinces. He began to live openly with Cleopatra and eventually married her, although he didn't immediately divorce Octavia, his Roman wife. This was greatly resented by the Romans and helped erode much of Antony's support with the public and the Senate. Octavianus capitalized on the situation by reading a supposed copy of Antony's will, which gave much of his control to Cleopatra's children. Regardless of the authenticity of the will, the propaganda worked and the Senate declared war on Cleopatra (and, therefore, on Antony as well).
Prior to the battle of Actium, Mark Antony took his and Cleopatra's fleet into the Gulf of Ambracia (located on the west coast of Greece). He used towers on land and a row of ships in the water to guard the entrance to the Gulf. Octavianus set up camp on the northernmost shore of the Gulf, across from the Actium promontory (from which the battle gets its name). Over the next few months the two commanders were stalemated. A few battles were fought up and down the coast, the most decisive of which, won by Agrippa (one of Octavianus' generals), cut off Antony's lines of communication further down the coast. During this time disunity increased among Antony, his generals and his wife. Antony's generals didn't trust either Cleopatra or her armies. They also realized that as long as she was present she would act as fuel for Octavianus' propaganda. They argued that if Cleopatra went home, many in the Roman Senate, the Roman people and the Roman army would quit their support of Octavianus. In addition, the Roman generals were much more comfortable and experienced with land battles, while Cleopatra insisted that Antony had the advantage on the water and should attack by sea. Furthermore, she apparently didn't trust her control over Antony unless she was present and thus refused to leave. Mark Antony finally agreed to take Cleopatra's advice and fight the naval battle, and to simultaneously take his generals' advice and send Cleopatra home. Exactly when Cleopatra and her ships (which made up a large number of the fleet) were to leave, and whether or not Antony planned to go with them, is a matter of debate to this day. On 2 September 31 BC, Antony moved out to meet Octavianus. Antony's fleet consisted primarily of massive quinqueremes with bronze plates, while Octavianus' fleet was made up mainly of smaller Liburnian vessels. The quinqueremes had the advantage of height, from which to shoot or attack, and the advantage of the plates, which protected them from ramming.
The Liburnian ships were much more maneuverable. At the time the primary tactic in Roman naval battles was to maneuver into position to ram the opponent and thus sink their ship. Since the quinqueremes couldn't maneuver quickly enough to ram the faster Liburnian ships, and the Liburnians couldn't do much damage even if they did ram the plated quinqueremes, the battle progressed more as a land battle than a standard sea battle. Antony's ships rowed out in two wings to where Octavianus' ships were gathered at the entrance to the Gulf. Antony tried to flank Octavianus' right, but the sudden move threw his own center into confusion. When Octavianus' center took advantage of the confusion, the fighting grew heavy. All day the unusual battle progressed, with the land tactics of arrows and spears being fired back and forth without much chance of tangible gain. Late in the afternoon, Cleopatra and her squadron of 60 ships suddenly raised their sails and raced away from the center of the battle to the open ocean. Antony's reaction has baffled historians for ages. When he saw Cleopatra leaving, Antony immediately left his command ship and followed her, with 40 of his own ships behind him. Some have attributed Antony's rash departure to being caught off guard when his lover decided to leave him. Others have argued that Antony and Cleopatra had always secretly planned for him to steal away with her once her ships had the opportunity to break free. What is certain is that a quarter of Antony's fleet left without warning in the middle of the battle, leaving the remainder of his fleet to their doom. By the end of the day the Antonian forces had lost 5,000 lives and 300 ships. Octavianus no longer had an enemy capable of contending with him on the sea. A week later, when all hope of Antony's return was lost, Antony's land forces surrendered as well. A year later, as Octavianus' troops closed in on him, Antony committed suicide.
Cleopatra was captured by Octavianus, but rather than face the certain humiliation of being paraded through the streets of Rome, she had a servant smuggle an asp into her quarters and committed suicide. Less than three years after the battle, Octavianus, now called Augustus Caesar, declared himself emperor.

Selected sources:
"Actium: Rome's Fate in the Balance" by Barry Porter, Military History Magazine, Aug. 1997
"The Life and Times of Cleopatra" by Arthur Weigall, G.P. Putnam's Sons, 1924
https://ehistory.osu.edu/articles/life-mary-todd-lincoln
The Life Of Mary Todd Lincoln
Part I: The Early Years

Mary Todd Lincoln, the most criticized and misunderstood first lady, experienced more than her share of tragedy during her lifetime. From the time she was six, her life took a melancholy turn from which she never recovered. She suffered from depressive episodes and migraine headaches throughout her life, and during the White House years turned to squandering money on lavish gowns and frivolous accessories in hopes of finding relief from the void deep within. During the Civil War, both North and South called her a traitor, and seldom was a kind word printed about her by the press. If we examine her early years, her most impressionable years, we become enlightened and can find compassion for the woman who was the wife of the 16th president of the United States. Mary Ann Todd Lincoln was born the third child to Eliza Ann Parker Todd and Robert Smith Todd on December 13, 1818. Preceding her in birth was her eldest sister Elizabeth, followed by her sister Francis. The Todds lived in a quaint two-story, nine-room L-shaped house on Short Street in Lexington, KY. At that time, Lexington was a rugged frontier town that had been founded by a handful of men that included Mary Ann's grandfathers Robert Parker and Levi Todd, as well as her great-uncles Robert and John Todd. Her father, a Whig politician and storeowner, adequately provided for his family. In his early years, he'd studied to be a lawyer and was later admitted to the Kentucky bar; however, he never practiced law because there were already too many lawyers in Kentucky. Although the Todds rejected the idea of slavery, they owned one slave for every member of the family. Mary was especially fond of the slave Mammy Sally. Her anti-slavery views developed very early in her life, and she was extremely proud and pleased when she learned that Mammy Sally was integral in helping escaped slaves make it to the Ohio River.
Her anti-slavery views grew to match those of her father, who supported the Kentucky Colonization Society in its efforts to send freed slaves to Liberia. He freely discussed his dislike of slave-selling and opposed efforts to open Kentucky slave markets to out-of-state imports. He believed slavery prevented Lexington from growing commercially. Regardless, his lifestyle contradicted his beliefs: he was a slaveholder in an antislavery family in a slave state. Eliza became pregnant again soon after Mary Ann's birth, this time giving birth to a long-awaited son named Levi. Another son, Robert Parker, soon followed but didn't survive past 14 months. A daughter, Ann, was born around the time Mary Ann was three years old, and in order to avoid confusion between the two daughters, Mary Ann's name was shortened to Mary. A second son, George Rogers Clark Todd, was born in 1825, bringing the total of the Todd clan to six children. George's birth had taken its toll on Eliza and she became deathly ill. In July 1825, three doctors were summoned to the Todd house to try to save her life. Their attempts proved futile and she passed away at the age of 31, leaving Robert with six children to care for. Mary, only six years old, was crushed by the death of her mother. Before she had time to mourn the loss, her father shocked her and her siblings when he proposed marriage just six months later to Elizabeth "Betsey" Humphreys. Betsey accepted the proposal but found repeated excuses to postpone the wedding. She was in no hurry to become mother to Robert's six children. At Robert's persistent urgings, she finally wed him on November 1, 1826.

Finding Acceptance

Mary's life—once glorious, filled with hope and joy—was turning dark and dreary. The Todd household took a turn for the worse after the wedding, and rooms that were once filled with Eliza's love for her children were now filled with the rantings and ravings of a stepmother who strongly disliked her husband's children.
Outsiders witnessed Betsey's cruelty on several occasions and noted the stepmother used shame, disgrace, and embarrassment to keep her husband's children in line. Mary's older sister Elizabeth stepped forward and assumed the role of "mother" to Mary and the younger children. Even so, Betsey was becoming increasingly miserable in the Todd home and never failed to express it. And each new year brought another Todd into the world. In total, Betsey and Robert added nine more children to their brood.

The Importance of Education

Although Robert was a distant father and seldom home, he was concerned that each of his daughters receive a good education. When it came time for Mary to begin her schooling in 1827, he arranged for her to attend Shelby Female Academy, also known as "Ward's" for the reverend who was its director. She spent the subsequent five years at Shelby, where she was a model student. She studied reading, writing, grammar, arithmetic, history, geography, natural science, French and religion—which may seem like a lot, but in actuality, the boys were taught more. It was considered unacceptable at the time for women to be overly educated—lest they scare off any possible suitors. Tuition at Ward's was a mere $44 per year; French was an additional $8. Mary excelled academically and found a sense of peace and order in her otherwise chaotic world. She invested most of her time and energy in her schooling, probably because it allowed her to escape the miseries at home. In 1832, at age 14, Mary graduated from Ward's, and whereas most girls would have been satisfied with such an education, Mary was not. As her family was moving into an elegant 14-room house, she entered Madame Mentelle's for Young Ladies to continue her education. Mentelle's was run by a 62-year-old, well-traveled French woman and her husband.
Tuition increased to $120 a year, and although Mentelle's was located close to Mary's new home, Mary petitioned the school to allow her to board on the premises. Boarding privileges were usually reserved for the girls who lived a good distance away, but Mentelle's made an exception in Mary's case and permitted her to board throughout the week and return home on weekends only. Although the Todds' new home was luxurious, boasting six bedrooms, a two-room nursery and a bathtub situated on the second floor, throughout her life Mary always considered Mentelle's her real home. There, she thrived. She participated in French plays, parlor dances, and marched in local parades. She enjoyed acting and found pleasure in mimicking those around her. Of course, those being mimicked rarely found pleasure in this talent. But it did bring her attention—attention she desired much of her young life. In addition to acting, she was fluent in French and was quickly developing an interest in politics. Like her father, she was a confirmed Whig. That same year, a cholera epidemic swept through the state, and Lexington wasn't spared. Many families left the area, and of those that remained, hundreds lost their lives. The Todds made the decision to stay, and Mary wrote of that epidemic later in her life: "[There was] nothing on the streets but the drivers and horses of the dead carts with the bodies of those who had just died. Toward the last there were not even coffins. Father had all the trunks and boxes taken out of the attic to serve as coffins."

Striking Out on Her Own

Mary was considered by those who knew her to be warm-hearted, save her penchant for mimicking others. Standing only 5'2", she was described as having clear blue eyes, long lashes, light-brown hair and a beautiful complexion. She was an excellent conversationalist, and many noted her ambitious nature. She rarely kept her thoughts hidden and was not one for idle chitchat.
She spoke her mind freely in a time when women were discouraged from doing so. Her father was proud of her and desired to spend more time with her as he aged. But Mary, following in her sisters' footsteps, was anxious to leave the nest due to the dissension with Betsey. In the summer of 1836, she made the decision to trail her older sisters to Springfield, IL. Elizabeth had married former Illinois state attorney general Ninian Edwards and was happily situated in the frontier town. Francis saw the move as her chance to flee the Todd home, and she joined Elizabeth and her husband. Mary, feeling restless and wanting to experience life, chose this path as well. Mary spent the summer of 1837 at Elizabeth's beautiful house that overlooked the town. She was happily received by everyone and found the attention stimulating. She became well known for her ability to hold her own in parlor discussions over the Whigs and the Democrats. She often sat in on discussions with her brother-in-law and cousin John Todd Stuart on whether they should stand for Congress in '38. In the same rugged, unsettled town was newcomer Abraham Lincoln, whose appointment to the 9th Illinois assembly brought him to Springfield. Lincoln, a native Kentuckian as well, was settling into a law practice with Mary's cousin John Stuart. He was an awkward-looking man and was described as a non-churchgoer and a loner. He was a farmer's son whose past jobs included those of laborer, farm hand, carpenter, and ferryman before he became an attorney. Even so, he managed to earn the moniker "humble Abraham Lincoln" and win a seat in the assembly. Lincoln himself described this time in his life as the loneliest he could recall. Whereas Lincoln was lonely that summer, Mary was having the social time of her life. She was disappointed when the summer came to an end and reluctantly made her way back to Lexington. She would have stayed on, but her sister and brother-in-law couldn't afford to support both Mary and Francis.
When she returned to Lexington, she found most of her friends were married or preparing their weddings. She focused her attention on finding employment and accepted a position as an apprentice teacher at Ward's school. Shortly thereafter, fate seemed to intervene: Francis married a Springfield pharmacist and moved out of Elizabeth's home. This opened the door for Mary to return to Illinois. She hastily packed her bags and made the return to Springfield, where she would spend the next 22 years.

The Belle of Springfield

For the first time in her life, Mary acquired a close friend by the name of Mercy Levering. Mercy, much more proper and rigid than Mary, would become her most treasured confidante. Of the two, it was usually Mary's humorous nature that got them into one predicament or another. Once, the two girls decided to journey into town after heavy rainfall left the roads thick with mud. Mary devised a plan to prevent their slippers and gowns from becoming mud-soaked. They each carried wooden shingles that they placed down on the mud to accommodate each step. This worked on the journey to town, but the shingles were useless on the return, and the two girls found themselves mud-soaked from the knees down. Mary's sister Elizabeth held parties as a way for her to meet eligible bachelors. But Mary found most of them "hypocritical, uninteresting, and frivolous in their affection." She did, however, have a few suitors, including a 90-pound, 5'4" Democrat by the name of Stephen Douglas. It was thought by the town that Douglas had proposed to Mary at one point, but no one knew for certain. Later in her life, Mary confessed to a friend that Douglas had indeed proposed and she'd replied to him, "I can't consent to be your wife. I shall become Mrs. President, or I am the victim of false prophets, but it will not be as Mrs. Douglas." Other suitors included the grandson of Patrick Henry and Edwin Webb, a persistent widower.
None of these men touched her heart, and she wrote to Mercy about the latter: "I love him not, and my hand will never be given where my heart is not." Even when she met Abraham Lincoln, she was not overly impressed. But the two did strike up a friendship. Mary's sister Francis was anything but impressed with Lincoln. She considered him the "plainest man in Springfield." Lincoln would have agreed with her. He once wrote in a letter about that time in his life that his "swallowtail coats were too short, his patched trousers too shabby, and his socks rarely matched." When it came to Lincoln on the dance floor, friend James Conkling wrote in a letter to Mercy that he looked like "old Father Jupiter bending down from the clouds to see what's going on." It's said that when Lincoln met Mary, he wanted to dance with her in the worst way, to which she relayed that he did indeed "dance in the worst way." Although Mary wanted to be guided by her heart, she also had criteria concerning a potential mate. She shared with her sister that she wanted "a good man, with a head for position, fame and power, a man of mind with a hope and bright prospects rather than all the houses and gold in the world." She held true to her word. Previous suitors Stephen Douglas and Edwin Webb were both rising politicians at the time. In 1840, Lincoln and "Molly," as he now lovingly called her, slowly moved their relationship from friendship to courtship. Elizabeth, although she, too, did not approve of Lincoln, often invited him to their home, where he and Mary would sit in the parlor and talk. Elizabeth noted that "Lincoln would listen and gaze on [Mary] as if drawn by some superior power, irresistibly so." But the two being watched were dealing with their own doubts. Lincoln feared he would not make enough money to provide Mary with the life she was accustomed to, and Mary feared giving up control of her life to a husband. Elizabeth once stated, "I warned Mary that she and Mr. Lincoln were not suitable.
Mr. Edwards and myself believed they were different in nature, and education and raising. They had no feelings alike. They were so different that they could not live happily as man and wife." Elizabeth, having spent two years trying to create a rift between the two, rejoiced when, on January 1, 1841, Mary and Lincoln went their separate ways after an argument. Apparently, Lincoln was to escort Mary to a party and was late in arriving, so she left without him. He finally showed up only to find her flirting with Edwin Webb. That evening, a fuming Lincoln ended their relationship. It's said that Mary responded by stomping her foot and shouting, "Go and never, never come back!" The breakup took its toll on Lincoln, and he missed the following six days in the legislature. When he finally returned, he was described as "reduced, and emaciated in appearance and seems scarcely to possess strength enough to speak above a whisper." Lincoln became a hypochondriac, and in a letter to Mary's cousin (his law partner), he wrote, "I am now the most miserable man living. If what I feel were equally distributed to the whole human family, there would not be one cheerful face on earth. Whether I shall ever be better I can not tell; I awfully forbode I shall not. To remain as I am is impossible; I must die or be better, it appears to me." Mary suffered too. She wrote to her friend Mercy, "[Lincoln] deems me unworthy of notice, as I have not met him in the gay world for months, with the usual comfort of misery, imagine that others were as seldom gladdened by his presence as my humble self." Lincoln left Springfield to visit a friend in Louisville, KY. During his absence, Mary wrote again to Mercy that she was feeling very alone. "The last two or three months have been of interminable length…I was left much to the solitude of my own thoughts…" It was clear Lincoln missed his Molly and Molly missed her Abe.
Part II: Politics

Many outsiders looked upon the Mary Todd and Abraham Lincoln union with much skepticism. She was short and round, he was tall and lanky. She had a keen fashion sense, his socks rarely matched. She was educated, he was not. Her family had money, his had none. He was loved by all and she was disliked by many. So what kept these two very opposite individuals very much in love during their marriage? Was it simply politics?

Love Is Eternal

In 1842, around the same time her stepmother was giving birth to her 14th sibling, Mary reconciled her relationship with Lincoln at the urgings of mutual friend Eliza Francis, who petitioned the two to at least enjoy a friendship. Slowly, the tall lanky man and the round young woman rekindled the romance, and Lincoln soon proposed. It was then that patterns began to emerge in their relationship that would play out almost daily for the rest of their time together. If Mary felt neglected by her husband, she would flirt with his colleagues to garner his attention. He would respond with indifference and focus his energy on the tasks at hand. Indeed, to outsiders, the only thing the two had in common was a political agenda. On Friday, November 4, 1842, Mary and Lincoln wed at the home of her sister Elizabeth in front of about 30 guests. It was a small, impromptu ceremony that didn't include her father and stepmother among the guests, all of whom received only a day's notice of the ceremony. Even the best man was a last-minute thought, having been chosen by Lincoln the day of the ceremony. What seemed to be the only planned part of the festivities was the plain gold wedding band that was placed on Mary's finger. In it was the inscription Love Is Eternal. Reverend Dresser, an Episcopalian minister, married the two in a simple ceremony, uniting what many still believed was an awkward pairing.
Even Lincoln must have felt so, for he wrote one week later to friend Samuel Marshall: "Nothing new here, except my marrying, which to me, is a matter of profound wonder." The newlyweds settled into a room at the Globe Tavern, paying under $10 per week. They occupied the same room Mary's sister Francis had shared with her husband after they'd married. Lincoln returned to work right away, and Mary found herself with much idle time. The boredom didn't last long, for nine months later, on August 1, 1843, she gave birth to a son and named him Robert Todd after her father. It would be Mary who would name all the children—a task Lincoln would later joke about. Once he was asked to name a cannon and he amusingly replied, "…I could never name anything. Mary had to name all the children." In the fall of 1843, the Lincolns moved from the Globe and settled into a four-room cottage on South Fourth Street, paying a mere $100 rent per year. Mary's father made the journey to Springfield (sans Betsey) to meet his daughter's new husband and to see his new grandson, as well as the other grandchildren he'd yet to meet. During that visit, he showered Mary with attention and gifts, as if making up for all the years of suffering she'd endured at Betsey's hands. He gave her a $25 gold piece and deeded 80 acres of Illinois land to the newlyweds, plus promised a yearly sum of $1,100. To Lincoln, he handed over a legal case that later yielded a tidy sum of money. Her father's generosity paved the way for the Lincolns to purchase a one-story, five-room cottage located on one acre of land. The property cost $1,200 and was purchased from Reverend Dresser, the same reverend who'd performed their marriage ceremony. Although they now had a wonderful home, both recognized that their standard of living still wasn't what Mary was accustomed to. In 1846, Mary gave birth to a second son, whom she named Edward.
"Eddie" was named after Edward Dickinson Baker, who'd beaten Lincoln for the Whig nomination for Congress in 1842. Eddie, who was ill most of the time, kept Mary busy. To all outsiders, the new mother seemed to fall off the face of the earth as she took care of her home and children. She was a superb and doting parent, often engaging the boys in a variety of activities. Having been criticized so harshly when she was a child, she parented the boys in such a way that outsiders felt she gave them too much freedom. Meanwhile her husband traveled the circuit trying cases and was away from home more often than not. It was during this time that Mary's anxieties and fears seemed to escalate. She disliked staying alone at night and would often invite guests to spend the night at their home. When Lincoln was at home, he was just as attentive to the children as she was. In fact, he often solely cared for the children while Mary attended a church function or did the marketing. When it came to disciplining the children, neither Mary nor Lincoln seemed to excel. Lincoln confessed he used reason to keep the children in line over "switching."

A Lauded Hostess

Mary went on to hire several helpers but usually had a difficult time getting along with them. She was fortunate in employing one faithful helper who described Mary as "taking no sassy talk but if you are good to her, she is good to you and a friend to you." She also employed a kind black woman named Mariah Vance, whom she grew fond of. Mariah, who understood Mary and looked on her with compassion, stayed with her for years. For Mary, Mariah may have reminded her of her beloved Mammy Sally from her childhood. Mary took special care in the cleaning and did the cooking herself; however, her husband was quite the finicky eater. An apple was usually enough to fill him. Not only was he not much of an eater, but quite often he would forget to come home for dinner, and Mary would send the children to fetch him.
Her domestic skills were not lacking, and she entertained frequently in their small home. Isaac Arnold, a frequent guest of the Lincolns, expressed, "Mrs. Lincoln often entertained small numbers of friends at dinner and somewhat larger numbers at evening parties. Her table was famed for the excellence of its rare Kentucky dishes and in season was loaded with venison, wild turkeys, prairie chickens and quail and other game." Even though he praised the young Mrs. Lincoln, he would later become very critical of her during the White House years. By the mid-1850s, Lincoln's law practice had become profitable, and Mary found her small dinner parties turning into large receptions. Although the Lincolns were growing in popularity, Mary didn't conform to the role that was expected of her (and all women during that time). Instead, she spoke her mind freely, expressed her opinions without caution, and could hold her own when the talk turned to politics. There seemed to be little gray area when it came to Mary: most either liked her or disliked her; there were very few who had no opinion. In 1856, after Lincoln was defeated in his bid for the Senate, Mary sold off the 80 acres given to her by her father for $1,300. The money funded the building of a second floor on their quaint home. Thus four new bedrooms and a back stairway, as well as a double parlor on the first floor, were added. The extra room provided her with areas of the house where she could have quiet, which she relished when she was suffering from a migraine. Although she provided a good home for her children and husband, she sometimes suffered from bouts of melancholy, just as her husband did. She wrote to a friend in 1859, during a time when her husband was home, "I hope you may never feel as lonely as I sometimes do…"

Losses

As Lincoln continued to pursue a political career, Mary carefully groomed and coached her husband. In 1846, he received the Whig nomination for Congress, and in August he became one of Illinois' Congressmen.
And whereas the majority of the congressmen left their wives and children at home while they served, Mary and the children journeyed to Washington with Lincoln, which offended most of the male boarders where they'd settled, especially those who knew of Mary Lincoln and disliked her. When his congressional term was up, he sought the position of Commissioner of the General Land Office but did not win that post, even though Mary went on a letter-writing campaign to get him appointed. Instead, he was offered the post of Governor of Oregon, which he graciously declined. Mary Lincoln was not going to travel to such a barren frontier town with two children in tow. During the summer of 1849, Mary's father contracted cholera and passed away. More devastating was the death of Eddie on February 1, 1850. He died of pulmonary tuberculosis. Mary believed in predestination and was certain fate was against her. She told friend Emilie Helm, "What is to be is to be and nothing we can say, or do, or be can divert an inexorable fate, but in spite of knowing this, one feels better even after losing, if one has had a brave, whole-hearted fight to get the better of destiny." On December 21, 1850, William Wallace Lincoln was born, and almost two years later, on April 4, 1853, Thomas Lincoln, named for his paternal grandfather, was born. He earned the moniker "Tadpole" for the strange shape of his head after the difficult birth. The nickname was soon shortened to Tad. Lincoln's political career seemed to stall during this time, but it was jumpstarted in 1854 when he went up against Mary's former beau Senator Stephen Douglas. Lincoln's slogan became: "A house divided against itself cannot stand. I believe this government cannot endure permanently half slave and half free," while Mary campaigned, "[Douglas] is a very little giant" beside "my tall Kentuckian." Even so, Lincoln lost again. This time both Lincolns felt the defeat harshly.
It wasn't just Lincoln who'd lost the election; she'd lost too. Theirs was a political partnership. To soothe the sting of the loss, Mary turned to spending money on the latest wardrobe fashions. In 1859, 16-year-old Robert was leaving the nest for Harvard, and the following year Lincoln was being eyed for president. Twelve years had passed since he'd held a public office, and an excited Mary found herself surrounded by the political elite. Whereas most politicians' wives were unassuming, Mary surprised the press by taking an active role in politics. She freely expressed her opinions, once again taking the public by surprise and offending them. The election came and went and Lincoln was elected president. It's evident he saw their marriage as a political partnership as well, because he rushed home to tell her, "Mary, Mary, we are elected!" The partnership continued: while her husband might better understand the issues of the day, Mary felt she understood people and character much better than he ever could. She was instrumental in political appointments. If she did not approve of one of her husband's choices, she would solicit those close to her husband to talk him out of the appointment. Mary's Political Agenda The newly-elected president and his family took the train to Washington DC. Lincoln had preferred that Mary and the children take an alternative route due to assassination threats, but General Winfield Scott talked him out of it, rationalizing that an assassin would be less likely to take aim if Lincoln were surrounded by his family. Mary, herself, had received many anonymous letters bearing a skull and crossbones and a threat that if her husband took office, he would be assassinated. As the train journeyed to DC, it made many stops, and Mary was surprised at her notoriety. "Where's the Missus? Where's Mrs. Abe?" came the cry from the crowds if she was not at her husband's side to greet the well-wishers. 
This reinforced her belief that she had been elected as well. Ladies' Home Journal dubbed her the "Illinois Queen." First on Mary's White House agenda was to assemble a wardrobe, a task that took two weeks and captured national media attention. She tailored her wardrobe after France's Empress Eugénie (who married Napoleon III in 1853). She wanted to be known for her wardrobe, and she was, but not in a flattering light. The press poked fun at her "loud" outfits, yet Lincoln always complimented her. Rarely was a newspaper published without some mention of the new first lady. Not only did she not conform to the dress of the day, she also went against convention by making herself very visible. All of her predecessors had spent their time in the White House sequestered on the upper floors while their husbands ran the country. And although she was pleased she was getting media attention, the harsh, exaggerated words that were printed about her stung and left their mark on her already weakened self-esteem. In her eyes, the press was no different than her cruel stepmother. Second on Mary's agenda was to give the White House a desperately needed makeover using a $20,000 stipend. Every president since William Harrison had been receiving the funds, but none chose to take advantage of it. Thus, by the time the Lincolns arrived at the White House, it was in disrepair. After the war broke out and a Rebel invasion seemed imminent, General Scott urged Mary to take her children and return to Springfield. She declined and instead went on a two-week shopping trip, making stops in Philadelphia and New York and in turn angering the merchants in Washington who'd served the White House for years. Her shopping trip was costly—she spent the entire $20,000, which was supposed to last four years, in one trip. She purchased furnishings, curtains, rugs, china, anything that she felt would lend a regal atmosphere to the White House. 
At one point, Lincoln did intervene and cautioned her about her spending. He even threatened to pay the shopping expenses she incurred from his own salary if she didn't curtail her spendthrift ways. That threat seemed to take the edge off her spending; however, she had already run up several debts that she eventually turned over to White House staffers to manage. She quickly learned the art of bartering. Even though Mary was careless with spending, she managed to save $70,000 of his $100,000 salary during his presidency. After the Battle of First Bull Run, Mary became a regular at the newly-established hospitals around DC. There, she provided food and comfort to the wounded. She read to them, brought flowers, wrote letters home and worked tirelessly to raise funds for special needs. She also contributed all the White House liquor to the hospitals. And whereas most women could not stand the sight of an amputated limb, it's reported that Mary was able to tolerate the atrocities of war. Although she had a philanthropic side, it was her spending that seemed to grace the headlines, that and the fact she had three half brothers fighting for the Confederacy. For her sisters' husbands, Mary secured an appointment for each. Ninian was appointed to the Commissary Department of the U.S. Army and Frances' husband was appointed local paymaster of the volunteers. Mary possessed a defiant streak, to say the least. She was not one to be bullied by any of her husband's cabinet members. Once, she sent a friend to Edwin Stanton to try to secure an appointment, and after Stanton met with her friend, he sought out Mary and scolded her for the imposition. She promised she would not bother him again, but it's said that shortly after his berating, he received a package of newspaper clippings that pointed out his inadequacies with the Union army. Many feel Mary was the sender of the package, which would have been consistent with her sometimes passive-aggressive behavior. 
Regardless, she was the busiest of any first lady in history and her accomplishments were not trivial. She had successfully redecorated the White House, become an admired hostess, reviewed the troops alongside her husband, and held the hands of the wounded and dying. She endured frequent migraines, fevers, depression and loneliness (her husband was preoccupied with the war), and once a concussion, to make herself available to the public. She had no less drive than the men of her day. More Losses In January of 1862, when the country was finally accepting that it would not be a quick war, Mary decided to host a lavish party. Five hundred invitations were sent out; those who received one were delighted, those who didn't were bitter. Many felt that, with the solemn blanket the war had spread over the country, it was no time to be hosting a party. She hosted it anyway, and as though she were being punished, two weeks after the party her favorite son Willie died of typhoid fever. Mary fell into a deep depression. She was bedridden for weeks and never again entered the Green Room, where Willie's body had been laid out, or the room where he died. Shortly thereafter, Tad became ill, and with Mary in no condition to care for him, an intervening Dorothea Dix posted one of her nurses at the White House. Mary's grief played out in ways that weren't so unusual for the time. She had insomnia and suffered from bizarre nightmares, and although both of those symptoms seemed to be the norm for Lincoln, his grief played out in other ways. Every Thursday, he would sequester himself in Willie's room. Mary, not able to take in the sight of anything that reminded her of her favorite son, quickly removed all her son's items from the White House and sent them to relatives in Springfield. The only items she kept were his pony and his two goats. Mary was downright angry at God for taking something so special from her. 
Meanwhile, husband and wife were growing apart, and to her it seemed as though they'd been closer in spirit when he was traveling on the circuit and away most of the time during the early years of their marriage. She continually worried about his health and often asked those closest to the president for their opinion on the matter. He was visibly depressed, tired, and giving all his energy to the war. As she came out of her mourning, she found herself entertaining many male friends in her notorious Blue Room. The Blue Room was intended for entertaining while her husband met with his generals and cabinet. The press, as well as the wives of those invited to the Blue Room, felt it was inappropriate. Maybe Mary did too, but making her husband jealous was the only way she knew to bridge the gap between them. Dreams and Visions In 1864, the Lincolns needlessly worried that they would not be reelected. In the end, Lincoln managed to win every state save three. It was then a friend reminded Mary of a vision that Lincoln had had. He'd just woken from a nap when he looked into the full-length mirror in the corner of the room. There, he saw two images of his face, one much lighter than the other. No matter how he shifted, and from what angle he viewed the oddity, both images remained. Mary interpreted her husband's vision to mean he would be reelected, but would not complete his term. The vision didn't surprise her. Her life had been such that when one joyous door opened, another closed shortly thereafter. With that in mind, she immediately spent $1,000 on mourning attire. Mary was now called the Presidentess. As the war came to a close and the Lincolns traveled into the Confederacy, Lincoln had another vision, this one coming to him in a dream. In it, the White House was on fire. He shared this dream with Mary, and the very next day she sent a telegram to the White House maid: "Send a telegram, direct to City Point…and say if all is right at the house. 
Everything is left in your charge—be careful." All of this probably weighed heavily on Mary's mind and it played out in ways that made her look less becoming. The following day, a grand review was planned and she was to attend with Lincoln. For reasons beyond her control, she was late in arriving. Not only was she late, but during her carriage ride with Julia Grant to the review, the carriage hit a ditch and Mary banged her head forcefully against the roof. By the time she arrived, she openly expressed her anger and irritation to her husband, not sensitive to the fact that others were within earshot and witnessing her tantrum. Most felt a sadness for Lincoln at that point. He took her anger with calmness and dignity. Mary regretted her behavior that day and returned to the White House with a heavy heart. After a few days, she returned to her husband's side in the Confederacy and walked through the Confederate capital. The Lincolns returned to the White House and Lincoln had another of his dreams. In this one, he was wandering through the halls of the White House and came upon the East Room. There was a coffin with a corpse inside. He asked the soldier in attendance, "Who is dead in the White House?" The soldier responded, "The President." Lincoln shared the dream with Mary and it haunted her. Regardless, Lincoln expressed his desire that they both look to the future. Now that the war was over, they needed to focus on each other and nurture their relationship. At her husband's urging, the first order of the day was to make plans to attend the theatre that very evening to see Our American Cousin. Part III The Widow Years Losing her mother at the impressionable age of five created a deep void within Mary Todd Lincoln from which she never recovered. 
That, coupled with her inability to accept the deaths of sons Eddie and Willie, led to perpetual depression and anxiety that she tried to cure with frequent shopping excursions and by trying to win the love and affection of those around her. In the end, she may have assembled quite a collection of beautiful wares, but it cost her the respect of her admirers. Mary's losses also included the deaths of two half brothers: Sam, a Confederate who was mortally wounded at the Battle of Shiloh; and Aleck, a Confederate killed at the Battle of Baton Rouge. Some, knowing her fondness for her brothers, accused her of traitorous behavior during the war. Little did Mary know that although the war was coming to a close, her battles were just beginning. A National Tragedy On Good Friday, April 14, 1865, Mary and Lincoln took a carriage ride where they rekindled their relationship with intimate conversation. It was decided that later that evening they would attend a showing of Our American Cousin at Ford's Theatre, along with Senator Harris's daughter Clara and her fiancé Major Henry Rathbone. The presidential party arrived at the theatre late, but became happily situated inside the freshly-decorated presidential box as the orchestra played Hail to the Chief. When the applause died down, the play began. About an hour and a half into the performance, Mary intimately slipped her hand into her husband's and leaned over to ask him what the others in their group would think of her bold display of affection. Before she could absorb his response, a man entered the box, pointed a revolver at the back of the president's head and pulled the trigger. Lincoln slumped over. Mary's screams echoed throughout the theatre and those who witnessed the shooting never forgot the wretched moans that came from Mary over the next few moments. "Oh my God..." she uttered in disbelief, "have I given my husband to die?" 
Lincoln was quickly removed from the theatre and taken to a private home across the street. A hysterical Mary and her companion Clara followed closely behind, their gowns spattered with Major Rathbone's blood from a saber wound he'd received while trying to subdue Booth. Lincoln's unresponsive body was laid on a bed in a second-floor bedroom and Mary clung to him, begging for a response. The men in attendance were unable to tolerate Mary's hysteria, and at a time when she should have been consoled and allowed to remain at her husband's side, she was forcibly removed from the room and taken to a downstairs parlor. For the next nine hours, she anxiously awaited her husband's death. Robert, who'd been fetched to the home earlier, divided his time between consoling his mother and sitting beside his dying father. Mary was finally permitted another visit with her husband and collapsed. By the time she was revived, her husband was dead. She later wrote of his death, "I often think it would have been some solace to me and perhaps have lessened the grief, which is now breaking my heart—if my idolized had passed away, after an illness, and I had been permitted to watch over him and tend him to the last," then she could have, "...thanked him for his lifelong—almost—devotion to me and I could have asked forgiveness, for any inadvertent moment of pain, I may have caused him." Mary did not attend her husband's funeral and had no family members from Springfield come to her side during this difficult period. She became bedridden for the next 40 days and refused callers who came to offer their sympathy, which in turn created talk of her impropriety in dealing with her husband's death. Starting Over President Andrew Johnson was anxious to settle into the White House and his new role; however, he patiently waited for Mary to leave the White House. 
During her period of confinement, she was oblivious to the goings-on around her; the White House staff took advantage of her preoccupation and began looting valuable items. (The following year, the Congressional Committee on House Appropriations investigated the thefts and whether Mary had a hand in the disappearance of these items. She was cleared of any involvement.) Mary began to contemplate her future. Most suggested she return to Springfield, but to return to Springfield, where she'd enjoyed so much gaiety with her husband, was out of the question. It was also the place where she'd lost Eddie. She finally decided on Chicago, and on the same day that the Union chose to celebrate its victory in war, Mary, Robert and Tad boarded a train for Chicago. "I go hence brokenhearted with every hope in life almost crushed...Alas, all is over with me." The three settled along Lake Michigan in the Hyde Park Hotel. It was there, while walking the shores of Lake Michigan, that she allowed herself to think of her husband and grieve. For the most part, she became a recluse and allowed few people into her world, and those she did interact with concluded that Mary was still very much consumed by the events of the evening her husband was assassinated. Today, we would recognize her behavior as post-traumatic stress disorder. Robert remained active and busied himself by accepting a position as an apprentice in a law firm. Mary's yearly purse totaled $1,500 by the fall of 1865, and she and Tad moved into Clifton House, a boarding house that was home to mostly newcomers and transients. Robert refused to join the two, claiming their new accommodations were dreary; in actuality, he was trying to distance himself from his mother. Creditors began knocking on her door to collect debts incurred during her White House years. To pay off some of the debts, she sold her gowns and returned jewelry and other items to the place of purchase. 
She refinanced the remaining debt with a wealthy financier, at a very high interest rate. She hired Alex Williamson to handle her financial affairs and try to raise contributions for the Mary Lincoln fund. Through his efforts, Mary was able to pay off the vast majority of her debts—although many frowned upon her methods. Regardless, she was proud of her accomplishment. But her accomplishment was overshadowed by the fact that other wartime widows were receiving much more in contributed funds than she was. It was another blow to her already wounded self-esteem. In 1866, Simon Cameron promised to raise $20,000 for Mary, and in light of this promise she purchased a house on W. Washington Street in Chicago. The purchase, she hoped, would bring her family back together under one roof. Robert did not support her in this purchase, especially since she didn't have the funds, only a promise. Sure enough, Simon Cameron's interest in Mary waned after he'd won a senatorial nomination. Mary was frustrated by the broken vow and took it upon herself to secure the necessary funds. She sought out those individuals whose careers had been helped by her husband. Robert became irritated at his mother's "begging" and his opinion of her soon fell into alignment with that of her critics. Unable to afford the house, Mary rented it out and became a vagabond. She felt, "No place is home for me." Her public humiliation continued. In November of 1866, Lincoln's former law partner William Herndon went public with a story that it was Ann Rutledge who had been Lincoln's true love, not Mary Lincoln. He called the marriage of the Lincolns "a domestic hell...For the last 23 years of his life, Mr. Lincoln had no joy." Mary didn't respond publicly; instead she endured a living hell in solitude. It was during this time that others came to her defense and denied the claims made by Herndon, who was considered an irresponsible alcoholic. 
Robert also came to his mother's defense during this time and tried to persuade Herndon to drop the story, but he was unsuccessful. No Place is Home In 1867, Mary packed her belongings in what she termed "poor boxes" and traveled—for the first time in her life—unaccompanied. Both Robert and Tad were in Washington testifying at the trial of John Surratt. Mary made her way to the spas in Racine, Wisconsin, where she took advantage of their therapeutic effect. While there, she began to feel better and seemed to garner a clearer sense of her predicament. She formulated a plan to raise money that included selling her entire White House wardrobe. She no longer needed the clothes, as she'd taken to wearing widow's garb since her husband's death. She immediately journeyed to New York, where she planned the sale under the alias Mrs. Clarke, but it was only a matter of days until her identity was discovered and she was blasted in the press once again. The sale was a fiasco. She returned to Chicago in time to read the Chicago Journal's coverage: "The most charitable construction that Mary Lincoln's friends can put on her strange course is that she is insane." Robert's opinion of his mother seemed to move in the same direction. More specifically, he wrote to his future wife Mary Harlan, "My mother is on one subject not mentally responsible-it is very hard to deal with someone who is sane on all subjects but one." He referred to her mishandling of money. Robert was becoming increasingly embarrassed by his mother's actions. Later that year, Mary learned through a newspaper article that her late husband's estate was ready for disbursement. Neither Robert nor David Davis, who was handling the affair, bothered to tell her. She also learned that although she had been receiving a mere $130 a month to live on since Lincoln's death, Robert was receiving twice that amount. 
This infuriated her, since she'd given up her house on Washington Street after her requests to Davis for additional income to afford the house were rejected, yet her son's request for more money had been granted. He'd even received extra money to decorate his bachelor's quarters. Davis was now prepared to divide Lincoln's assets of $110,000 between Robert, Tad (with Robert as guardian), and Mary. Wishing to leave the United States and all the humiliation, both public and private, Mary and 15-year-old Tad boarded a steamer in 1868 bound for Europe—but not before attending the wedding of Robert and Mary Harlan. For the next two years, Frankfurt, Germany, became home to Mary. There, her eccentricities were seen for just that, and not insanity. She was liked and even admired abroad. In 1869, she became a grandmother, and although the relationship with Robert was strained, Mary passed advice freely on to her daughter-in-law about marriage and motherhood. "Don't mope around the house. Attend operas and concerts," she advised. Her life became leisurely, and when she wasn't sending lavish gifts to both her daughter-in-law and granddaughter, she was reading books and walking alongside the Main. She journeyed to Baden-Baden to enjoy the sulfurous baths, followed by Nice, where she enjoyed the climate. "Was there ever such a climate, such a sunshine, such air—flowers growing in the gardens, oranges on the trees, my windows open all day, looking out upon the calm, blue Mediterranean." Onward to Scotland where "We visited Abbotsford, Dryburgh Abbey, passed six days in charming Edinburgh, seeing oh so much: Glascow…all through the west of dear old Scotia, Burn's birthplace...went to Gerenoch, heaved a sigh over poor Highland Mary's grave—went out into the ocean—entered Fingal's cave—visited Glencoe—Castles unnumerable--Balmoral." Back in Frankfurt, Mary was surprised and delighted to receive a visit from her old friend Sally Orne. 
The two spent the ensuing days together reminiscing. Mary's lighthearted nature reappeared briefly, and apparently the two women made so much noise in Mary's room with their giggling and talking that "a gentleman next door knocked several times, during the night, saying ladies I should like to sleep some. We amused ourselves very much over his discomfiture, last night another sufferer rang the bell for the waiter and quiet at 2 1/2 o'clock this a.m." Sally had heard Robert's assertions that Mary was insane, and she wrote at the time of her visit, "As it has been suggested by some that Mrs. Lincoln is partially deranged, having seen her recently it may be proper for me to say to you that I have watched her closely by day and night for weeks and fail to discover any evidence of aberration of mind in her, and I believe her mind to be as clear now as it was in the days of her greatest prosperity and I do believe it is unusually prolonged grief that has given rise to such a report." Seeing Sally renewed Mary's spirit and she began petitioning for a pension once again, and after much heated debate in Washington, President Grant signed the bill providing an annual pension of $3,000 for her. When war broke out between France and Prussia, Mary and Tad journeyed to Milan, Lake Como and Florence before returning to Chicago, where they boarded with Robert and his wife. By early 1871, there was friction between the two Marys, and Mary chose to move into Clifton House. It was there Tad became very ill with what was initially diagnosed as a cold. But his lungs quickly filled with fluid and on July 15th, he died of "compression of the heart." Mary received no comfort from Robert as she grieved the loss of Tad. In fact, 10 days after Tad's death, Robert left for the Rocky Mountains, where he remained in seclusion for a month. The locale was a favored place for men who were suffering from "nervous" disorders. Robert would later express that he'd been "all used up" after his brother's death. 
Soon, Mary was no longer welcome in Robert's house—it may have been because she learned of her daughter-in-law's struggle with alcoholism. Mary, who now despised the 14th and 15th day of each month—anniversary dates of Lincoln's death and Tad's death, respectively—turned more and more to spiritualists and mediums to find comfort. In 1872, Tad's estate was ready for disbursement and Mary offered to split the estate (worth $35,570) with Robert even though the law entitled her to two-thirds. She also loaned Robert $10,000 for a real estate investment. She then traveled to Waukesha, Wisconsin and settled near the health spas. Lunacy vs Eccentricity In 1873, Mary traveled to Canada. In 1875, she desired warmth and traveled to Green Cove, Florida. As the 10th anniversary of her husband's death neared, Mary had a premonition that Robert was dying. Hastily, she left for Chicago, where she was relieved to find him in good health, but angry at her for all the ridiculous fuss. Mary's anxieties during the anniversary of her husband's death played out in unusual ways. She shopped for items she didn't need and then purchased them in large quantities. At one point, she ordered eight pairs of lace curtains for windows she didn't have, and patiently awaited their arrival. When a knock came at her door, expecting the caller to be delivering the curtains, she opened the door and was surprised to find two uniformed men and an attorney, the same attorney who'd nominated her husband for president in 1860. It was then Mary learned she was being charged with lunacy and was directed to attend a trial immediately, where a jury would deliberate her sanity. Mary told the men, "You mean to say I am crazy—I am much obliged to you but I am abundantly able to take care of myself. Where is my son Robert?" It was later Mary learned it was Robert who had taken out the warrant for her arrest as a lunatic. 
In fact, he had hired Pinkerton men to follow her throughout her travels and meetings with mediums and spiritualists. He'd also questioned her doctors, maids, waiters and store clerks, and then petitioned them to testify against her. One by one, they did so and concurred with Robert's assessment that his mother was insane. Mary had a poor defense, one appointed for her by Robert, and it was prearranged that the attorney would not provide a defense that was in her best interest. It took the all-male jury only 10 minutes to return a verdict of insanity. Her sentence was to hand over her bonds, give control of her money to a court-appointed conservator, and accept detainment in a private asylum in Batavia, Illinois. If Mary's trial had been held today, she would never have been charged with lunacy—maybe eccentricity—but not lunacy. Mary was being condemned for being ahead of her time. On May 20, 1875, she was admitted to Bellevue and from the moment she passed through its doors, she was planning her exit—not her escape, but her legal exit. She wrote letter after letter trying to secure an attorney to represent her, but this was difficult since all of her mail was censored. She finally found allies in attorney Myra Bradwell and her husband, Judge James B. Bradwell. Although a court had decreed that Myra Bradwell could not practice law because "the paramount destiny and mission of women are to fulfill the noble and benign offices of wife and mother," she set out to put Mary's case back in the media. When a Chicago Times reporter asked her if Mary Lincoln was insane, Myra replied, "Mary Lincoln is no more insane than I am." While Myra worked on the outside, Mary worked on the inside and prearranged with her sister Elizabeth that she would reside at her home in Springfield after her release. Elizabeth initially agreed, until Robert stepped in and applied pressure on her to deny Mary's request. 
He even concocted several stories to bolster his case for his mother's lunacy and was able to sway Elizabeth to his side. Myra privately met with Elizabeth and set the record straight, and Elizabeth held firm in her offer to Mary to join them. Judge Bradwell sent a letter to Bellevue threatening habeas corpus. Robert continued to pay doctors (with Mary's money) for their prognosis, which of course supported his theory that his mother was insane. Regardless, Mary was finally released to her sister and made the trip back to Springfield. Robert still held her funds and refused to send her money—not even for a new bonnet to wear to church. Freedom Mary was cheerful and sociable at her sister's home, but she continued to fight Robert for her property and money. She felt that as long as he held both, she would not be free. She knew Robert was still pursuing his quest to have her committed, so she thought to bargain with him. She made him an offer: if he would place her money in a Springfield bank, she would release to him the contents of her current will, naming him and his daughter heirs to her estate. There was a veiled threat amid her words that if he did not comply, she would disinherit him. Finally, Robert complied. On June 15, 1876, another jury found her "restored to reason and capable to manage and control her estate." Robert returned to Mary $73,000, including $60,000 in bonds. With the new ruling, Mary wasted no time in forwarding a letter to her son in which she demanded the immediate return of all her personal belongings that he had in his possession. She signed the letter Mrs. A. Lincoln. She also returned all the items that Robert had given to her, which didn't amount to much. The gift-giving had obviously been one-sided. Her funds restored, Mary decided to journey to Europe. She felt safer with an ocean separating her from her son, who she knew was still trying to have her committed. Abroad, she settled in Pau, France, where she spent the next four years. 
"I live, very much alone," she wrote in 1877, "and do not identify myself with the [French] —have a few friends and prefer to remain secluded..." She traveled extensively, visiting Rome, Naples, Sorrento and Vichy. In 1879, Ulysses and Julia Grant traveled to Pau and although they knew Mary to be residing there, they didn't visit her. The old Mary would have felt slighted and snubbed, but she looked upon their act with indifference. In 1880, after two serious falls, she wrote to her sister, "I cannot trust myself any longer away from you all. I am too ill and feeble in health." She returned to her sister's home and within a year, weighing only 100 pounds, Mary was nearly blind. She was diagnosed with kidney, eye and back problems. A New York reporter interviewed the physician who treated Mary and asked about the ailing woman's sanity. The doctor responded, "She is no more insane than you or I are and if you come with me to talk with her, you would understand that." With her medical bills rising, 64-year-old Mary petitioned Congress to increase her pension. It was increased to $5,000 and she was awarded $15,000 in back pay. She never collected any of the money. On July 15, 1882, the anniversary of Tad's death, she collapsed in her bedroom and that evening fell into a coma. On July 16, Mary Todd Lincoln died of a stroke. Mary was buried on July 19 and the Springfield mayor declared a holiday in observance. Thousands lined the streets and the First Presbyterian Church was crowded. For once, the newspapers were kind to her and restored her character in death. What would Mary's life have been like if those who rallied at her death had rallied during her life? After the service, Robert and Mary's sister led the procession to Oak Ridge Cemetery, where she was laid to rest among those who had abandoned her throughout her life. 
In 1884, Robert inherited his mother's estate, not because he was named in her will (she had destroyed the only copy), but because Illinois state law made him her natural heir.

Source: Baker, Jean H. Mary Todd Lincoln: A Biography. W.W. Norton & Co., 1986.
dad0682e2e5d55d1fe8c2cb02e17f626
https://ehistory.osu.edu/articles/siege-paris-during-franco-prussian-war
The Siege of Paris during the Franco-Prussian War
The Siege of Paris during the Franco-Prussian War Through the first half of 1870, a confrontational fever with Germany spread throughout France. On July 15, Emperor Napoleon III led his nation "into one of the most disastrous wars in her history." (1) The Franco-Prussian conflict did not officially commence until July 19, 1870. In the course of its first weeks it produced a series of demoralizing defeats for the French. The army of Napoleon III "went to war ill-equipped, badly led, trained and organized, and with inferior numbers." (2) On August 19, one French army was trapped in the fortress of Metz, and on September 1, the Empire of Napoleon III came crashing down when a second army was captured at Sedan along with the Emperor himself. Three days later the news reached Paris and the fall of the Empire was proclaimed. The Empress left for England and a provisional government took power. (3) For the next five months, the "city of light," which Parisians had proudly proclaimed "the center of the universe," was transformed. It became an army camp: French soldiers, National Guardsmen and volunteers within; Prussian forces without. Luxuries, and then basic necessities, slowly disappeared. Food became scarce, and the inhabitants resorted to edibles normally associated with other species. The government under General Trochu and leaders like Victor Hugo, Jules Favre, and Adolphe Thiers tried to manage internal as well as external pressures. Finally, on January 27, an armistice was signed. It brought temporary calm to the capital before the storm of the Paris Commune and the second siege arrived. The new government in Paris, formed after the defeat at Sedan, was composed in part of publicists, politicians, lawyers, and teachers who had opposed Louis Napoleon's coup d'etat in 1851. "The Government of National Defense" was its official title, and nearly all shades of political opinion were included, with the exception of the Bonapartists.
The actual power rested with the Legitimists, Orleanists, and other conservatives. General Trochu, military governor of Paris and an Orleanist, held the presidency. Others included Leon Gambetta, Minister of the Interior; General Le Flo, Minister for War; Jules Favre, Minister of Foreign Affairs and vice-president; Victor Hugo; Count Henri Rochefort, a journalist and political enemy of Napoleon III who had spent many years in prison; and Adolphe Thiers, the old minister of Louis Philippe, who went on diplomatic missions for the new republic. (4) Besides the day-to-day operation of the government, the three main objectives of the Government of National Defense were the procurement of a favorable peace treaty, the enlistment of aid from foreign powers, and the military preparation of Paris. The first objective got off to a bad start on September 6 when Jules Favre announced that "France would not give up an inch of her territory nor a stone of her fortresses." (5) This attitude ran counter to that of Otto von Bismarck, Chancellor of Germany, who saw the cession of territory as being as indispensable to the Prussians as it was inadmissible to the French. Bismarck demanded the immediate turnover of Alsace-Lorraine as well as Metz, Strasbourg, and Mont-Valerien (the fortress commanding Paris). Bismarck's proposals were rejected and the government was forced to defend the city and continue the war. Negotiations continued; however, nothing concrete came out of them until the end of January, when Jules Favre was sent to Versailles to discuss the terms of an armistice. By this time Paris had been bombarded, food and other essential stores were nearly exhausted, and Prussian victories throughout the rest of France were a daily occurrence. The armistice was to set up the preliminary conditions for a peace treaty to be signed.
Its terms included the surrender of all French fortifications except those serving as prisons; the laying down of weapons, with the exception of the Army, which was to act independently for the maintenance of order; the immediate exchange of prisoners; and a payment by Paris of 200,000,000 francs in war reparations within a fortnight. Also, anyone leaving the city needed a French military pass. (6) Back in September, the French government had begun pursuing the second objective, acquiring foreign aid, when Thiers was sent to England, Austria, and Russia to enlist help. He was sympathetically welcomed, but was unable to secure any support. Only America showed enthusiasm for the new French Republic; however, it was not yet ready to intervene on France's behalf. Thiers tried again in October with the same results. From that point on he was used solely as the representative of the French government in the ongoing negotiations with Bismarck. Prior to the investment of Paris, the provisional government made efforts to prepare the military forces of the city. These efforts included manpower allocation, defensive fortifications and supplies. Troops were brought back from the surrounding provinces. General Vinoy's forces, which had escaped capture at Sedan, were later consolidated with those of the provinces. Together they became the Provincial Mobile Guard. Meanwhile the National Guard furnished sufficient manpower to increase its size from 90,000 to more than 300,000 men. (7) Another aspect of the military preparation was the establishment of strong defensive fortifications. The forts in the vicinity of Paris were abandoned because it would have required too much work and time to get them ready, and the decision was made to move the defensive lines closer to the city's environs. All forests and wooded areas deemed favorable to enemy advantage were cut down. Thus were the forests of Montmorency, Bondy, Boulogne, and Vincennes treated.
The allocation of supplies was vital to the defense of Paris. Barracks, hospitals and factories for the manufacture of military hardware were established all over the city. Railway shops became cannon foundries, while tobacco factories became arsenals. The Louvre was transformed into an armament shop after the art gallery was moved for safekeeping. Balloons were constructed at the Orleans railway station. (8) Hotels, department stores, theaters, and public buildings served as hospitals. The Tuileries and the Napoleon and Empress Circuses became barracks. (9) When in action, all the forces were under the Commander-in-Chief of the Army and subject to military law. Most of these actions centered on small sorties, unassumingly called "reconnaissances." In late September 1870, the objects of the sorties were to test the tenacity of the troops and probe the Prussian circle to determine its vulnerability. As for the Prussians, once the city was surrounded and more troops were made available for the siege, the question was whether to bombard the capital or starve it into surrender. In his diary entry for October 8, Crown Prince Frederick states, "we shall certainly have to make up our minds to a bombardment of Paris... but to postpone as long as possible their actual accomplishment, for I count definitely on starving out the city." (10) The bombardment did not begin until January 4. The arrival of the shelling did not panic the Parisians. They had been expecting it since October. Precautions were taken to protect all works of art. Sandbags were placed in the windows of the Louvre, the School of Fine Arts and other important buildings, while outdoor monuments were taken underground. The bombardment lasted twenty-three days, usually from two to five hours each night. In the end, the Parisians refused to be intimidated and the psychological advantage of this tactic was lost. The siege of Paris slowly made its impact in an area critical to survival: the economy.
According to a correspondent for The Times of London, "Business for France is everywhere broken up, and one-third of the country is devastated and ruined." (11) The first sector to directly feel the enclosure was import and export activity. In order to survive, Paris needed a self-supporting economy, while also channeling most of its resources into the defense. Factories were now employed in making military necessities instead of consumer goods. As the siege dragged on, the prospects for a speedy recovery evaporated, and they vanished completely when the bombardment began and some of those factories, along with other businesses, were damaged. The Prussians might not have been purposely inclined to destroy the French economy, except in one particular area: food consumption. The government's failure to establish a census system early in the siege caused it to miscalculate its supply of comestibles, playing into the hands of the invaders. The census did not take place until December 30, when it was discovered that Paris contained a population of 2,005,709 residents, excluding the armed forces. (12) The government, however, did ask foreigners to leave, but the number who left was offset by the arrival of refugees from the provinces. This number of inhabitants, combined with the Prussian encirclement, had disastrous consequences. Early in 1870, the price of food had increased, and by the start of the Franco-Prussian conflict it was 25 percent higher. (13) Prices did not go much higher at first because the government announced that the number of cattle, sheep, and hogs within Paris was adequate. However, everyone, even the government, believed the siege would last a very short time, perhaps a maximum of two months. The situation did not change until the early days of October. A few days before October 15, butchers suddenly refused to sell more than a day's ration.
On October 15, the official rationing of meat began and continued throughout the entire siege, each portion becoming smaller and smaller. Eventually, nothing was left and Parisians resorted to other types of meat. The first substitute for the regular meat diet was horse. Parisians disdained it at first, and it took the Horse-Eating Society to inform the public of the advantages of eating horse. When it finally came down to eating them, all breeds were included, from thoroughbreds to mules. With time even this type of nourishment became rare, so other meats were introduced into the diet. Dogs, cats, and rats (14) were frequently eaten. The animals of the zoo were added to this diet, including Castor and Pollux, the two elephants that were the pride of Paris. Only the lions, tigers, and monkeys were spared; the big cats for the difficulty of approaching them, the monkeys because of "some vague Darwinian notion that they were the relatives of the people of Paris and eating them would be tantamount to cannibalism." (15) During the middle of January, the government placed bread on the ration list, setting the daily quota at 300 grams for adults and half that amount for children. Parisians then realized that they were on the verge of starvation. For the Prussians, this meant a quick solution to the conflict, as Frederick III writes in his diary entry for January 7: "There is news from Bordeaux that provisions in Paris would be exhausted about the end of January, and at best could only last until early in February. I trust this may be true." (16) The terrible ordeal suffered by Paris in 1870-1871 was not its first, according to a German newspaper story reprinted in The Times. In 1590, Henry IV stood before Paris much as Bismarck was now doing, and the city had never known anything worse. According to the story, the people of Paris forgot what meat was and had to subsist on leaves or roots dug up from under stones.
Terrible diseases broke out and in three months 12,000 people died. Bread no longer existed, while all the dogs were captured and eaten. (17) The maledictions associated with siege warfare were no strangers to Parisians; however, the peace treaty with Germany brought needed relief before the arrival of the Paris Commune, with its own set of trials and tribulations.

Notes:
1. "The French Army and Politics 1870-1970," pg. 7
2. Ibid., pg. 7
3. "The War Against Paris," pg. 1
4. "The Siege of Paris 1870-1871," pg. 6
5. Ibid., pg. 20
6. "The War Diary of the Emperor Frederick III," pg. 283
7. "The Siege of Paris 1870-1871," pg. 22
8. Balloons served to carry the mail and diplomats safely out of the city, beyond Prussian attack. Pigeons were used to carry messages. For more on this aspect of the siege, read "Airlift 1870" by John Fisher.
9. "The Siege of Paris 1870-1871," pg. 24
10. "The War Diary of the Emperor Frederick III," pg. 150
11. The Times of London, 1870 edition
12. "The Siege of Paris 1870-1871," pg. 43
13. Ibid., pg. 44
14. The price of rats became so high that not everyone could afford this delicacy, which was considered of the highest quality since rats fed on cheese and grains.
15. "The Siege of Paris 1870-1871," pg. 63
16. "The War Diary of Emperor Frederick III," pg. 253
17. The Times of London, 1870 edition

Bibliography:
Kranzberg, Melvin. The Siege of Paris, 1870-1871: A Political and Social History. Connecticut: Greenwood Press, 1950.
Tombs, Robert. The War Against Paris, 1871. Cambridge: Cambridge University Press, 1981.
Allinson, A. R. (translator and editor). The War Diary of the Emperor Frederick III, 1870-1871. Connecticut: Greenwood Press, 1926.
Horne, Alistair. The French Army and Politics, 1870 to 1970. New York: Peter Bedrick Books, 1984.
4f1e9c84a67ee402b0cd62974b121234
https://ehistory.osu.edu/battles/actium
Actium
Actium The strange battle of Actium ended decades of Roman civil war and resulted in the rise of the first Roman Emperor. The seemingly irrational battle tactics of Antony destroyed himself, his armies and his famed wife, Cleopatra. Conjecture over Antony's reasons for abandoning the battle and chasing Cleopatra's ship has been fodder for historians, poets and movie writers for centuries. After the assassination of Julius Caesar in 44 BC, Rome had no clear leader. Mark Antony (Marcus Antonius) took over Caesar's papers and many of his legions, but Gaius Julius Caesar Octavianus was named as heir in Caesar's will. (Octavianus also possessed the ever-important name "Caesar.") Since neither of the two men could manage a clear majority of support, they formed the Second Triumvirate with Marcus Aemilius Lepidus. Lepidus was a well-respected yet aged general. Individually, Octavianus and Antony continued to persuade senators and generals to join their side. Eventually Lepidus, who had been assigned an unimportant role in Africa, attempted to seize Sicily by force. His troops mutinied and he was forcibly retired by Octavianus. This left Octavianus with control of the Western provinces and Antony with control of those in the East. Antony married Octavianus' sister, Octavia, and an uneasy truce began. Mark Antony and Cleopatra VII began their fateful relationship after he took over the Eastern provinces. He began to live openly with Cleopatra and eventually married her, although he didn't immediately divorce Octavia, his Roman wife. This was greatly resented by the Romans and helped erode much of Antony's support with the public and the Senate. Octavianus capitalized on the situation by reading a supposed copy of Antony's will, which gave much of Antony's holdings to Cleopatra's children. Regardless of the authenticity of the will, the propaganda worked and the Senate declared war on Cleopatra (and, therefore, on Antony as well).
Prior to the battle of Actium, Mark Antony took his and Cleopatra's fleet into the Gulf of Ambracia (located on the west coast of Greece). He used towers on land and a row of ships in the water to guard the entrance to the Gulf. Octavianus set up camp on the northernmost shore of the Gulf, across from the Actium promontory (from which the battle gets its name). Over the next few months the two commanders were stalemated. A few battles were fought up and down the coast; the most decisive of these, fought by Agrippa (one of Octavianus' generals), cut off Antony's lines of communication further down the coast. During this time disunity increased among Antony, his generals and his wife. Antony's generals didn't trust either Cleopatra or her armies. They also realized that as long as she was present she would act as fuel for Octavianus' propaganda. They argued that if Cleopatra would go home, many of the Roman senate, the Roman people and the Roman army would quit their support of Octavianus. In addition, the Roman generals were much more comfortable and experienced with land battles, while Cleopatra insisted that Antony had the advantage on the water and should attack by sea. Furthermore, she apparently didn't trust her control over Antony unless she was present, and thus refused to leave. Mark Antony finally agreed to take Cleopatra's advice and fight the naval battle, and to simultaneously take his generals' advice and send Cleopatra home. Exactly when Cleopatra and her ships (which made up a large part of the fleet) were to leave, and whether or not Antony planned to go with them, is a matter of debate to this day. On 2 September 31 BC, Antony moved out to meet Octavianus. Antony's fleet consisted primarily of massive quinqueremes with bronze plates, while Octavianus' fleet was made up mainly of smaller Liburnian vessels. The quinqueremes had the advantage of height from which to shoot or attack, and the advantage of the plates, which protected them from ramming.
The Liburnian ships were much more maneuverable. At the time, the primary aim in Roman naval battles was to maneuver into position to ram the opponent and thus sink their ship. Since the quinqueremes couldn't maneuver quickly enough to ram the faster Liburnian ships, and the Liburnians couldn't do much damage even if they did ram the plated quinqueremes, the battle progressed more as a land battle than a standard sea battle. Antony's ships rowed out in two wings to where Octavianus' ships were gathered at the entrance to the Gulf. Antony tried to flank Octavianus' right, but the sudden move threw his own center into confusion. When Octavianus' center took advantage of the confusion, the fighting grew heavy. All day the unusual battle progressed, with the land tactics of arrows and spears being fired back and forth without much chance of tangible gain. Late in the afternoon, Cleopatra and her squadron of 60 ships suddenly raised their sails and raced away from the center of the battle to the open ocean. Antony's reaction has baffled historians for ages. When he saw Cleopatra leaving, Antony immediately left his command ship and followed her, with 40 of his own ships trailing behind. Some have attributed Antony's rash departure to being caught off guard when his lover decided to leave him. Others have argued that Antony and Cleopatra had always secretly planned for him to steal away with her once her ships had the opportunity to break free. What is certain is that a quarter of Antony's fleet left without warning in the middle of the battle, leaving the remainder to their doom. By the end of the day the Antonian forces had lost 5,000 lives and 300 ships. Octavianus no longer had an enemy capable of contending with him on the sea. A week later, when all hope of Antony's return was lost, Antony's land forces surrendered as well. A year later, as Octavianus' troops closed in on him, Antony committed suicide.
Cleopatra was captured by Octavianus, but rather than face the certain humiliation of being paraded through the streets of Rome, she had a servant smuggle an asp into her quarters and committed suicide. Less than three years after the battle, Octavianus, now called Augustus Caesar, had made himself emperor. Selected sources: "Actium: Rome's Fate in the Balance" by Barry Porter, Military History Magazine, Aug 1997; "The Life and Times of Cleopatra" by Arthur Weigall, G.P. Putnam's Sons, 1924
afcab931a851f8d3340899a89d5647d8
https://ehistory.osu.edu/battles/agincourt
Agincourt
Agincourt After the successful siege at Harfleur, Henry marched his force of about 6,000 knights, archers and men-at-arms towards Calais. During his march, the French army of 20,000 was able to position itself between Henry and Calais. Henry used a narrow front channeled by woodland to give his heavily outnumbered force a chance. The French deployed in three lines. The first line attacked and was repulsed by the English longbowmen. The second line attacked and was beaten back. The third line moved to engage but lost heart when it crossed the field covered with French dead; it soon retreated. Henry was left with control of the battlefield and a decisive victory. He soon resumed his march to Calais. French Leadership: Charles d'Albret and Jean Bouciquaut II
6bafac143c102f6026f8f22c80283a4b
https://ehistory.osu.edu/battles/auburn-catletts-station-st-stephens-church
Auburn (Catlett's Station, St. Stephen's Church)
Auburn (Catlett's Station, St. Stephen's Church) Maj. Gen. William H. French and Maj. Gen. G.K. Warren, USA Both sides had brigades in action. There were fewer than 200 casualties over the two days. On October 13, Stuart, with Fitzhugh Lee's and Lomax's brigades, skirmished with the rearguard of the Union III Corps near Auburn. Finding himself cut off by retreating Federal columns, Stuart secreted his troopers in a wooded ravine until the unsuspecting Federals moved on. The next day, as the Federal army withdrew towards Manassas Junction, Owen's and Smyth's Union brigades (of Warren's II Corps) fought a rearguard action against Stuart's cavalry and infantry of Harry Hays's division near Auburn. Stuart's cavalry boldly bluffed Warren's infantry and escaped disaster. The II Corps pushed on to Catlett's Station on the Orange & Alexandria Railroad.
6262e8e22d2bd6d90a14b8918c9ced98
https://ehistory.osu.edu/battles/beauge
Beauge
Beauge Beauge was one of the first defeats for the English during the Hundred Years War. French and Scottish forces combined to raid the English possessions in Normandy. Thomas, Duke of Clarence (Henry V's brother), attempted to intercept the allied forces. During the pursuit, Thomas' cavalry outdistanced his infantry; the French and Scottish forces decimated the English, and Thomas was killed. English Leadership: Thomas, Duke of Clarence French Leadership: John Stewart, Earl of Buchan
d6eab2cfa0b402c3b377c9dade1d080f
https://ehistory.osu.edu/biographies/abigail-fillmore
Abigail Fillmore
Abigail Fillmore Abigail Fillmore was the wife of Millard Fillmore and the first of the First Ladies to hold a job after marriage. She believed that women should have equal access to higher education and had the capacity to succeed at all intellectual pursuits. Though suffering from several physical ailments, she appeared at many public and official events with the President.

Childhood and Early Years

Abigail Powers was born on March 13, 1798 in Stillwater, Saratoga County, New York, while it was still on the fringe of civilization. She was one of seven children: five brothers and one sister. Her father, a locally prominent Baptist preacher named Lemuel Powers, died May 18, 1800, but he left a rich educational legacy to Abigail and her siblings in his large personal library of books. In 1801, Abigail's brother Cyrus Powers, 19 years her senior, left Stillwater for the western village of Sempronius, Cayuga County, New York. In April of 1804, the widowed Mrs. Powers took her remaining six children to Albany to join a departing wagon train to Sempronius, where they lived with Cyrus Powers. The family was impoverished, making Abigail the first First Lady to rise from the lower economic class. Cyrus taught school in Sempronius from 1801 to 1803 in a double-log house built on land owned by the First Baptist Church there. After a three-year term by an eccentric successor, David Powers, another brother, and then a cousin, Gershom Powers, succeeded as teachers. Abigail therefore received an excellent education from her mother and the family library at home, and from her brother in the schoolroom. During this time, Abigail developed a love of reading literature, but she also became proficient in math, government, history, philosophy and geography.

In 1814, Abigail Powers became the teacher of the Sempronius village school.
Although her first year of teaching was conducted in the same building where her predecessors had taught, unlike them, she taught in what was now a public institution. In 1812, the town council had approved funding for a public school. In 1815, a schoolhouse was built at Sayles Corners that became Sempronius' first district school, and she taught there. In 1817, after three years of working part-time, Abigail was employed full-time as a teacher. In 1819, she also began teaching at the private New Hope Academy in the nearby village of New Hope. Her oldest pupil there was 19-year-old Millard Fillmore (1800-1874), who came to the Academy to supplement his brief frontier school lessons in arithmetic, reading, spelling and writing. Fillmore's poverty and eagerness to learn mirrored Abigail's own experiences and ambitions, and she helped him learn quickly. On subjects where they both lacked knowledge, they studied together. Fillmore was abruptly separated from his family when they moved, and gradually his relationship with Abigail evolved into romantic attachment. Fillmore's goal was a career in law. As he pursued his legal studies and Abigail continued her teaching, they did not see each other for three years but kept in touch by letter. In the interim, he apprenticed to a lawyer, taught school in the city of Buffalo and began a law practice in the nearby town of East Aurora, New York, across the street from which he built a home. In the summer of 1824, Abigail moved to Lisle, New York, and became a private tutor to three of her first cousins, the daughters of her father's brother Herman Powers. Her professional reputation earned her an invitation to open a private school in Broome County, which she accepted. She returned to Sempronius and resumed her teaching career there in 1825.

Marriage and Family

After a very long courtship, Abigail married Millard Fillmore on February 5, 1826. Without a honeymoon, they settled in East Aurora.
Abigail taught in a public school there until the birth of her son, Millard Powers Fillmore, in 1828, making her the first First Lady to draw a salary from independent employment as a married woman. She maintained a lifelong interest in education. In 1829, Abigail remained in East Aurora when Millard Fillmore went to the state capital in Albany, New York to serve a term in the state legislature. During that time, they continued to correspond. Abigail began to purchase books of literature, poetry and the classics to build upon his collection of law books at home, the core of what would become their personal library of over four thousand titles. Two years later, Millard Fillmore returned to practice law in Buffalo, to which the family moved from East Aurora. Attaining prosperity at last, Fillmore bought his family a six-room house in Buffalo, where Mary Abigail Fillmore was born in 1832. Together, the Fillmores helped to establish a lending library and a college in the city. While raising her children, Abigail also read widely and continued her pursuit of education by learning French and piano. She also practiced scientific horticulture, cultivating floral species in a conservatory built onto their home. Having remained in New York during his initial term, in 1836 Abigail accompanied her husband to Washington, DC for his second term in Congress. The children were left in Buffalo with relatives, and her letters to them during their separation were balanced between academic admonishments and maternal love. Enduring long separations from her extended family often left Abigail forlorn. She learned by letter in 1838 that her mother had died, and wept through the night at her loss. In Washington, she listened to Senate and House debates, read newspapers and discussed the political issues of the day.
She became an advisor to her husband, and performed the social duties of a wife interested in furthering her husband's political career, attending important social events and leaving her calling cards at the homes of various government officials. Fillmore left Congress in 1842. Uncommon for women of her era, she also enjoyed physical activity, especially sea bathing, but despised the "waste of time" necessary for dressing and arriving at the shore. In 1842, she badly broke her ankle and then failed to let it heal properly. Bed-ridden, then housebound for months, she was unable to continue her vigorous exercise of walking. After two years on crutches, she was able to walk freely but was never again free from pain. As her husband pursued various offices in the years to come, Abigail suffered from the effects of her foot injury and various illnesses, which limited her involvement in and enjoyment of Fillmore's political success. In 1847, when he was elected New York State Comptroller, they temporarily moved to Albany, New York; their children were away in boarding school and college. By 1848, Abigail was experiencing back and leg problems and lung inflammation. In 1848, Zachary Taylor was nominated as the Whig Party's presidential candidate and Millard Fillmore was chosen as the vice-presidential candidate. Abigail spent much of the campaign confined to her room, with headaches and back and hip pain. When the Taylor-Fillmore ticket won the election in 1848, ill health kept Abigail in New York. She did not attend the March 4, 1849 Inauguration and, except for a brief April 1850 visit, remained in Buffalo, desperately lonely for her son in Boston at Harvard College, her daughter in boarding school and her husband in Washington. Her public role as the Vice-President's wife was very limited.
First Lady (July 9, 1850 – March 4, 1853)

Abigail Fillmore was vacationing with her children at the New Jersey shore when they learned that President Taylor had died on July 9, 1850 and that her husband was now President. He briefly joined them there, after which they all moved into the White House. The First Lady was 52 years old.

[Image: President Millard Fillmore]

Even after the period of official mourning for President Taylor, the social life of the Fillmore administration remained subdued. The new First Lady reduced the burden of her duties by limiting the social calendar and asking her daughter, Mary Abigail "Abbie" Fillmore, to host events when she was ill. Still, Abigail's social obligations were demanding. The First Lady held morning receptions on Tuesdays, hosted large formal dinners on Thursdays and welcomed guests to smaller dinners on Saturdays. Because her Friday evening receptions required two hours of standing at her husband's side to greet the public, Abigail would often spend the entire day in bed to rest her bad ankle. She also hosted the open-house New Year's receptions on New Year's Eve in 1850 and 1852. Receiving greater press coverage than her more socially active predecessor Sarah Polk, the First Lady received her first mention in the public press just nine days after President Taylor's death. Newspapers and journals gave heavy coverage to the regal green coach, with silver and mother-of-pearl mountings and blue silk interiors, that was presented to the First Lady as a gift from the citizens of Albany. Abigail Fillmore may have entered the larger nation's awareness because technology had advanced so rapidly in four years that the general public was able to see what she looked like.
A full-length photograph of the First Lady was mass-produced on small, hard paper cards known as cartes de visite, which were made available for sale in 1853 at the Washington, DC studio that made the original photograph. Highly conscious of her public appearances, she hired a maid who also dressed her hair, and a seamstress whose work made Abigail Fillmore the first First Lady to wear clothing created with a relatively new invention, the sewing machine. She also appeared with the President at public and official ceremonies, contrary to the prevailing ideal of a wife as a purely private person whose domain was strictly domestic. Abigail did not seem to endorse the Polk administration's ban on hard liquor or any public temperance movement. Certainly her belief in a woman's right to equal access to higher educational opportunities, and in her capacity to succeed at all intellectual pursuits, might suggest she supported the general principles of the 1848 Seneca Falls Women's Rights convention. While Abigail was an important cultural and intellectual presence in her husband's administration, her political influence also appears to have been significant. A friend quoted the President as saying he "never took any important step without her counsel and advice," and a reporter called her "remarkably well informed" about the political issues her husband faced. Millard Fillmore did not win the Whig presidential nomination in 1852. Planning an extensive tour of the American South in the weeks following their departure from the White House, the Fillmores moved to a suite at the nearby Willard Hotel. On March 4, 1853, Abigail remained near her husband throughout the ceremonies of President Franklin Pierce's inauguration, while a raw northeast wind whipped snow over the crowd. Returning chilled to the Willard Hotel, she caught a cold and the next day came down with a fever. Within days, her lingering cold developed into pneumonia.
To keep her lungs from filling with fluid, her bed was propped nearly upright, but her condition worsened. Abigail Fillmore died at the hotel on March 30, 1853 at age 55, just 26 days after leaving the White House. Both Congress and the President's Cabinet adjourned in mourning, and public offices closed as her family took her body home to Buffalo for burial. Her death was more widely reported in detail than that of any of her predecessors. In an April 4, 1853 letter, Washington Irving wrote to a friend: After her mother's death, Abbie Fillmore assumed responsibility for her father's household at their Buffalo home. She became his companion at the few public events he attended in Buffalo, but her most famous appearance was during the "Grand Excursion" of June 1854, which publicized newly created links between railway and steamboat travel. Abbie and President Fillmore were among several hundred prominent citizens who traveled from Chicago to Rock Island, Illinois by rail, then to St. Paul, Minnesota Territory and back by steamboat. Covered by the large eastern newspapers, the event especially celebrated the natural beauty of the upper Midwest. That scenic wonder was captured in June 7 accounts from Trempealeau, Wisconsin, where Abbie Fillmore made a dramatic and swift climb up a bluff on horseback, the very image of a healthy and adventurous American girl. Seven weeks later, while visiting her grandfather in East Aurora, Abbie contracted cholera and died within a day.
SOURCES
Wikipedia: Abigail Fillmore
Miller Center: Abigail Fillmore
First Lady Biography: Abigail Fillmore
Whitehouse.gov: Abigail Powers Fillmore
30756663916ffd6ea6db637245af0232
https://ehistory.osu.edu/biographies/adolf-hitler
Adolf Hitler
Adolf Hitler Austrian-born, Adolf Hitler would rise to become the leader of Germany and one of the most hated men in all of history. Born in 1889, Hitler fought in World War I. The peace imposed on Germany after that war angered him, and for the rest of his life he sought to reverse the settlement that had humiliated his adopted country. In 1919, he joined the party that he would reshape into the National Socialist German Workers' Party, and after the failed Munich Putsch of 1923 he was imprisoned. In 1930, amid the severe economic downturn that he blamed on the Jews, his party made major gains in the German legislature. He used fear and intimidation, particularly through the Brownshirts, to consolidate and maintain power. He established the SS, the Gestapo, and concentration camps, where Jews and those opposed to Hitler were sent. Hitler began the war in Europe in 1939 when German forces invaded Poland in a blitzkrieg attack. He then invaded France and Germany's neighbors to the north, but failed to subdue Great Britain, which defeated the Germans in the Battle of Britain. In 1941 he invaded the Soviet Union (Operation Barbarossa) and pushed nearly to Moscow before the Russians were able to stop him. Because of Hitler's refusal to give up any land already taken, the Germans suffered defeats at Stalingrad and the Battle of Kursk. The British and Americans also pushed him out of North Africa. In 1944, the Allies landed at Normandy in France and pushed the Germans further and further back, liberating Europe as they went. Hitler, in one last act of desperation, launched an offensive that became known as the Battle of the Bulge. While the Germans made initial gains, they were eventually halted and forced to retreat. In April 1945, with the Soviets in Berlin and the Americans pushing forward in the west, Hitler committed suicide alongside his long-time mistress and wife of one day, Eva Braun. Germany surrendered May 8, 1945.
f1c13ac09d91f1844ea20d44544726b2
https://ehistory.osu.edu/biographies/joan-of-arc
Joan of Arc (The Maid of Orleans)
Joan of Arc (The Maid of Orleans) Canonized 16 May 1920. Joan of Arc was a peasant girl who believed she was under a divine mission to save France. In 1429, she persuaded Charles VII to let her lead a force to relieve Orleans. She led the French forces to a decisive victory at Orleans which allowed France to finally gain permanent advantage over the English. She was captured in 1430 by the Burgundians at Compiegne; Charles VII did not attempt to ransom her. She was burned at the stake by the English as a heretic in Rouen on May 30, 1431. She was about 19 years old.
5078ba5a26d3ecedd7d5ea282527ef06
https://ehistory.osu.edu/biographies/john-alexander-mcclernand
John Alexander McClernand
John Alexander McClernand "A political general who vied with Grant for fame and control of the army, he was finally sent back to Illinois." John McClernand nurtured a long career as a public servant, serving as a legislator, a general, and a judge. He was born on May 30, 1812 in Breckenridge County, Kentucky but grew up in Shawneetown, Illinois. McClernand passed the bar in 1832, after which he worked as a trader for a couple of years and later established a newspaper, the Shawneetown Democrat. He was elected to the Illinois Legislature four times (1836, 1840, 1842, and 1843) and to Congress for the first time in 1843, serving four terms before leaving in 1851. He was again elected to Congress in 1861 but soon resigned to take a commission as a Brigadier General in the Union Army, even though he lacked military experience (he had served only briefly in the Black Hawk War). McClernand was given a brigade in Missouri, serving under General Ulysses S. Grant, and performed well at the engagement at Belmont, Missouri, where the Union forces surprised the Confederates and pushed them from their positions. Believing the day was won, the Union soldiers began celebrating and McClernand started a political speech. However, the Confederates ferried reinforcements across the Mississippi, rallied, and routed the attackers. McClernand cut short his harangue. In February 1862, Grant elevated McClernand to command of the 1st Division, Department of the Missouri, which he led in the advances on Forts Henry and Donelson. The U.S. Navy, under the command of Admiral Foote, took Fort Henry without any help from the Army. But at Fort Donelson, McClernand, on the right flank, was attacked by the Confederates and was being pushed back when Grant arrived just in time to take control and stop the Confederate advance. In March 1862, McClernand was promoted to Major General and commanded the 1st Division, Army of the Tennessee.
He led the division at Shiloh and Corinth and was soon back in Illinois to raise troops, a job at which he excelled. After his recruitment duties, Lincoln put him in charge of the Vicksburg operation, but Grant, who disliked McClernand (and vice versa), started the campaign without him; it opened with Sherman's defeat at Chickasaw Bayou before McClernand arrived to take command.
485d31270372db16c8f363176cc4f0e1
https://ehistory.osu.edu/biographies/john-cabell-breckinridge
John Cabell Breckinridge
John Cabell Breckinridge "Prior to the war, Breckinridge served as the US Vice President under Buchanan. During the war he served as a general in the Confederate Army and later as the Confederate Secretary of War." John Breckinridge was one of the rare political generals who was good at both. He was born in Kentucky, graduated college at 18, studied a bit more at what is now Princeton (then the College of New Jersey), and then studied the law. He set up practice in Lexington, Kentucky and soon moved in influential circles. He served in the Mexican War as major of the Third Kentucky Volunteers, but the regiment never saw action. He went from there into Kentucky politics as a state representative.
2a867c7326e3a9aecb7ee5e1a51eb95e
https://ehistory.osu.edu/biographies/john-daniel-imboden-defender-valley
John Daniel Imboden (Defender of the Valley)
John Daniel Imboden (Defender of the Valley) Confederate cavalry commander. Served for the entire duration of the War. Brigadier General John D. Imboden was born in Staunton, Virginia. He attended Washington College for two terms, but didn't graduate. He taught school for a while at the Virginia Institute for the Education of the Deaf, Dumb and Blind in Staunton. Although a competent teacher, he chose to study law and opened a practice in Staunton. He had a run at state politics, with lackluster results. Although he did serve in the state legislature, he was unsuccessful in his bid to be a representative at the Virginia Secession Convention. Imboden entered service at the start of the war, serving first as commander of the Staunton Artillery at Harper's Ferry, after its initial capture. He fought at 1st Manassas, where he was wounded by a shell fragment. He then organized the Virginia Partisan Rangers. The unit was redesignated the 62nd Virginia Mounted Infantry, which Imboden led at Cross Keys and Port Republic. He commanded a brigade of cavalry under Jeb Stuart at Gettysburg. During the Confederate withdrawal after the battle, Lee charged Imboden with escorting the train of thousands of wounded back to Virginia. Arriving at Williamsport, Imboden found the pontoon bridge destroyed, and Federal cavalry attacked the wagon train of wounded. Imboden, with the river at his back, put up a stubborn defense until General Fitz Lee's cavalry arrived and the Federals were driven off. He commanded a brigade of Ransom's Division of 2nd Corps in 1864. After a bout with typhoid in the fall of 1864, Imboden finished his wartime service performing prison duty in Aiken, South Carolina. After the war, Imboden practiced law in Richmond, Virginia, then spent his last years in the mining industry in Washington County. He died in Damascus, Virginia in August of 1895, and is buried at Richmond, Virginia.
1317ce4c5e2e7034390a635ecb509786
https://ehistory.osu.edu/biographies/john-fearless
John (The Fearless)
John (The Fearless) Years ruled: 1404 - 1419
Son of: Philip the Bold
Married to: Margaret of Bavaria (1385)
Children: Philip the Good
John earned the moniker "the Fearless" during a crusade he attempted to lead against the Turks at Nicopolis in 1396. The crusaders were defeated and John was captured. He was ransomed a year later. At the age of 33 he succeeded his father as duke. He was conspicuously absent from the battle of Agincourt. During the next few years he negotiated with Henry V, but no firm alliance was ever struck. He was assassinated in 1419 by partisans of the dauphin Charles (later King Charles VII) during a negotiation session.
85c00f97b0b4adebca856a71946b453f
https://ehistory.osu.edu/biographies/john-gaunt
John of Gaunt
John of Gaunt John of Gaunt (John was born in Ghent) was the fourth son of Edward III. He was one of the richest and most powerful lords in England. In 1373, John led a 600-mile raid from Calais to Bordeaux with about 10,000 men. Even though he cut straight through the middle of France, the raid passed without a single siege or battle. When his brother's son (Richard II) was crowned, John remained loyal and helped his nephew during his reign.
912dd0b0558cbd2a3e621fcfa750f61a
https://ehistory.osu.edu/biographies/john-newton
John Newton
John Newton "Stalwart Union major general. Army engineer. Brigade, division, and corps commander, cited for bravery and achievement in several battles. Significant postwar career as U.S. Army Engineer." John Newton was born in Norfolk, Virginia on August 22, 1822, the son of a U.S. Congressman. He was a West Point graduate, Class of 1842. Prewar, he served as an Army engineer and West Point instructor in various subjects. He also had a major part in the construction of at least half a dozen forts. At the outbreak of hostilities, he was working on fortifications in Delaware. Newton was assigned as Chief Engineer of two different departments, then worked on Washington defenses. He commanded brigades of the Army of the Potomac at West Point, Virginia; Gaines' Mill; Glendale; South Mountain and Sharpsburg. At South Mountain, he led a bayonet charge which resulted in taking the enemy position. Newton and Brigadier General John Cochrane went to Washington on December 30, 1862, met with President Lincoln, and told the President that General Burnside planned to again cross the Rappahannock. They believed Burnside did not have the confidence of his subordinate generals, and would again be defeated. They denied that they were seeking the removal of Burnside. This meeting started a chain of events that resulted in the removal of Burnside late in January of 1863. Newton commanded the Union 3rd Division, VI Corps under General John Sedgwick at Chancellorsville. His division quickly carried Marye's Heights with a bayonet charge. He assumed temporary command of I Corps at Gettysburg on 1 July 1863 when General John F. Reynolds was killed. He later commanded 2nd Division, IV Corps. This was General Sherman's old command. He served under Sherman, who regarded him highly. In the Atlanta campaign, his unit carried Rocky-face Ridge, and fought at Dalton, Adairsville, Dallas, Kennesaw Mountain, Peach Tree Creek, Jonesborough and Lovejoy's Station. 
At Peach Tree Creek, he prevented a dangerous Confederate movement against Sherman. His rapidly constructed works allowed him to turn back the Confederate thrust. He then commanded the District of Key West & Tortugas. Postwar, Newton accepted a regular commission as lieutenant colonel of engineers. He was successful in a number of difficult engineering projects, mostly in the New York area. His specialty was removing obstacles from the harbor. He served until 1886, when he retired as a brigadier general. He then served as Commissioner of Public Works for the City of New York, and still later became president of the Panama Railroad Company, a post he held until his death. Newton died in New York City on May 1, 1895. He is buried at West Point, New York.
da34b162e3c3866f8f0967f43ae757e2
https://ehistory.osu.edu/biographies/john-pegram
John Pegram
John Pegram "Surrendered to McClellan early in the war but later returned to serve under Beauregard, Bragg and Kirby Smith. He was killed in action around Petersburg." Pegram was another of the sons of the South who thought more of honor than money; otherwise he would not have chosen a military career. He graduated from West Point (1854) and was a dragoon in US service. In 1861 he went with his state (Virginia) and was promptly jumped from lieutenant to lieutenant colonel. His first command was in western Virginia. At the battle of Rich Mountain (July 11, 1861) a Union force under Rosecrans and McClellan surrounded his force. Half broke out, but Pegram was captured. His defeat was one of the stepping-stones in McClellan's successful west Virginia campaign, the campaign that catapulted him to command of the Army of the Potomac by the end of the month. Pegram wasn't imprisoned long, and when exchanged he was posted to the cavalry on the other side of the Appalachians. For a time he was chief engineer to Beauregard, then Bragg; he then rose to be Kirby Smith's Chief of Staff. He finally got back into field command with a cavalry brigade, which was detached to support Forrest's raid on Murfreesboro. He went on another raid, into Kentucky, then in November 1862 he took command of a division of cavalry in Forrest's Cavalry Corps. That took him into the battle of Chickamauga, the last action he saw in the western theater. The next month (October 1863) he received a transfer (which he'd requested) to his home state. He took command of an infantry brigade in II Corps of the Army of Northern Virginia in time for the Mine Run Campaign. The next spring he was wounded at the Wilderness, and returned in July, when the whole division had been moved out to the Shenandoah Valley under Jubal Early. He fought at the third battle of Winchester, which broke the Confederate strength in the Shenandoah.
Casualties were heavy, and he earned elevation to division command when his division commander (Stephen Ramseur) was transferred to lead another division. Pegram was never promoted to Major General, but he led the division for the rest of his life, through the bitter dregs of the Valley Campaign and on to Petersburg when the remnants of Early's men were transferred there. He married during the long, cold winter of 1864-65, in Richmond. The Confederate capital was starved of entertainments, and his wedding was a social highlight. On February 6, 1865 he was killed at the battle of Hatcher's Run, when his division and Little Billy Mahone's fought off a Union raid on Lee's flank.
de7d136f6fe920681bf9b6aef7d501e1
https://ehistory.osu.edu/biographies/john-pope
John Pope
John Pope Went from command of the Army of the Mississippi to the Army of Virginia but alienated his troops with his infamous "Address" on taking command. In less than four months he was replaced by McClellan. John Pope was a native Kentuckian, a professional soldier, and given more responsibility than his talents merited. He had been an outstanding student at West Point (Class of 1842) and was posted to the crème de la crème, the Topographical Engineers. He did well in peacetime and during the Mexican War, where he won two brevets. He was marked for high places, although his rank hardly reflected it, for he was still a captain fourteen years after graduating. Proximity to politicians has hurt few military careers: Pope commanded Lincoln's escort before the 1861 inauguration and was promptly promoted from Captain to Brigadier General of Volunteers. His organizational skills were then put to use in a string of commands in Illinois and Missouri as the Union tried to make armies out of volunteers. He did well, and was given command of the Army of the Mississippi, which was not very large but stronger than any nearby Confederate force.
8002c49822c5de119e97bda369444266
https://ehistory.osu.edu/books/1965/0009
Page 009
Page 009 prepared for any eventuality and ready to land at Da Nang or Saigon as the situation required. By the end of February, President Johnson had made the decision to commit a two-battalion Marine expeditionary brigade to Da Nang with the mission of protecting the base from enemy incursion. General Karch and members of his staff once more visited General Westmoreland on 25 February to discuss plans for a Marine landing at Da Nang. The MEB commander left Saigon two days later for Da Nang where he coordinated his plans with the South Vietnamese I Corps Commander, Major General Nguyen Chanh Thi, the virtual warlord of South Vietnam's five northern provinces. Karch later recalled: General Karch and his staff immediately departed Da Nang for Subic Bay and then Okinawa. On 27 February (26 February, Washington time), the Department of State cabled Ambassador Taylor that the Marines were to be landed and that he was to secure approval from the Government of Vietnam for this eventuality. On the afternoon of the 28th, Ambassador Taylor met with Vietnamese Prime Minister Phan Huy Quat to discuss with him the proposed American landing. The following day, 1 March, the Ambassador met with the Minister of the Vietnamese Armed Forces, General Nguyen Van Thieu and the Vietnamese Chairman of the Joint General Staff, General Tran Van Minh ("Little Minh") to discuss the details of the deployment of the 9th MEB. The two Vietnamese officers posed no objections to the proposed commitment of American combat troops. They did, however, express concern about the reaction of the Vietnamese population and requested that the American forces be brought into Da Nang "in the most inconspicuous way feasible."16 Evidently this "inconspicuous way" statement had some effect on U.S. officials in Washington. On 3 March, Ambassador Taylor received a message from Assistant Secretary of Defense John T. 
McNaughton stating that it was desirable to deploy the Army's 173d Airborne Brigade by air from Okinawa instead of the 9th MEB.17 Some Washington planners obviously believed that the light infantry of an airborne brigade landing at Da Nang airfield would be a "quieter arrival" than the more formidable appearance of a Marine brigade with its tanks, amphibian tractors, and other heavy weapons arriving in an armada of amphibious ships. General Westmoreland, supported by the American Ambassador, immediately objected to the proposed change. Both considered that the Marines were more self-sustaining. Admiral Sharp, Commander in Chief, Pacific, cabled the JCS: The objections to the MEB landing were overruled and on 7 March 1965 (6 March 1965, Washington time) the JCS sent the long-awaited signal to land the 9th MEB at once with two of its three BLTs. The Landing The days before the landing were a hectic period for General Karch and the Marines of the brigade. General Karch and his staff had completed 9th MEB Operational Plan 37D-65 for the amphibious landing of a BLT and the airlift of another battalion from Okinawa to Da Nang on 26 February. The MEB staff then conducted a command post exercise (CPX) on Okinawa. According to Major Ruel T. Scyphers, the MEB G-1, the operations order for the deployment of the MEB, "was put together following a non-stop 48 hour CPX ... we got word about 2000 [27 February] and armed with a staff manual and some borrowed clerks we put together an order and had it boxed about 0300...."19 Still on Okinawa at the beginning of March, General Karch scheduled a two-day map exercise of the Da Nang area beginning on 2 March and a briefing for Lieutenant Colonel Herbert J. Bain,* commanding officer of the 1st Battalion, 3d Marines, whose battalion was slated to fly to Da Nang. On the * Lieutenant Colonel Bain was a combat veteran of World War II. 
He had earned a battlefield commission in November 1944 and later was awarded the Silver Star for his heroic actions during the Okinawa campaign in 1945.
c798645a4cd9ee7ebc6f7470bc68504b
https://ehistory.osu.edu/books/1965/0016
Page 016
Page 016 CHAPTER 2 The 9th MEB in Vietnam The First Weeks-Estimate of the Situation-More Marines Arrive-An Expanded Mission-Chu Lai The First Weeks Despite the arrival of the 9th MEB, the Marine intervention in Vietnam was still of a limited nature. The Joint Chiefs of Staff made this very clear in their landing order of 7 March, which directed that "the U.S. Marine Force will not, repeat will not, engage in day-to-day actions against the Viet Cong." General Westmoreland gave the 9th MEB the responsibility to protect the vital Da Nang Airbase from enemy attack but declared that "overall responsibility for the defense of Da Nang area remains a RVNAF [Republic of Vietnam Armed Forces] responsibility."1 To carry out this limited mission, General Karch had nearly 5,000 Marines under his command, including McPartlin's and Bain's infantry battalions, two helicopter squadrons, and limited logistic and combat support forces. The brigade had absorbed the former Marine Unit Vietnam (MUV), or Task Unit 79.3.5, better remembered as SHUFLY. On 9 March the MUV became Marine Aircraft Group (MAG) 16. Colonel John H. King, Jr., the former MUV commander and a veteran Marine aviator who had commanded a helicopter squadron in Korea, assumed command of MAG-16. The 9th MEB air-ground team at Da Nang faced a difficult logistical situation. General Karch later recalled: The 3d Service Battalion and Force Service Regiment on Okinawa provided the personnel for the Brigade Logistic Support Group (BLSG). According to Colonel Oddy, "When the time came to embark the Brigade, we split the Service Battalion 50/50, and supported by personnel from the Force Service Regiment, we were ready to launch the fledgling BLSG."* Original plans called for a BLSG in excess of 1,000 men, but because the Joint Chiefs imposed a personnel ceiling on the number of Marines who could be brought into Vietnam, the group had been cut to 660 men.
Colonel Oddy recalled in 1976, "The personnel ceiling resulted in an extremely austere staff group that made service and support a big question mark . . . ."3 General Karch remarked that there were several contingency plans which fitted the situation in Vietnam better than the one that was used.4 The only representatives of the brigade logistic group who participated in the first phase of the landing were the executive officer, Major Pat Morgan, and 11 other Marines. They arrived on 10 March by air with elements of BLT 1/3 and assumed control of the entire logistic operation, but the advance echelon could accomplish very little "except to console the MEB that supplies were on the way."5 Despite the activation of the BLSG on 12 March and the arrival of its commanding officer, Lieutenant Colonel George H. Smith, six days later, the first two weeks for the MEB were a logistic nightmare. The entire brigade subsisted on the 15 days of rations that had landed with McPartlin's battalion and an * Colonel Oddy wrote in 1976: ". . . this was a time when unrestricted officers with infantry MOSs could be assigned command of service units. This was fortunate for me as I had previous command experience with infantry platoons, companies and battalions and it seemed unlikely I would command one of the infantry regiments of the Division. Command of a large service organization and the opportunity to formulate from scratch a larger task organized service and support group was certainly a major high point in my career." Col Robert J. Oddy, Comments on draft MS, dtd 25Oct76 (Vietnam Comment File).
5cb3220424aa540b974483dc8f26896d
https://ehistory.osu.edu/books/1965/0019
Page 019
Page 019 USMC Photo A184119 HAWK missiles move to new positions on Hill 327 from the airfield. The missiles are from Battery B, 1st LAAM Battalion. airfield perimeter. The battalion was prepared to support these posts with a strong reaction force which could deploy rapidly to any sector of the airfield. The inherent difficulty of the unit's defensive assignment was that the battalion could not establish listening posts or conduct defensive reconnaissance patrols beyond the confines of the airfield. Although McPartlin's battalion ran patrols into the hills to the west, his Marines encountered no Viet Cong. In fact, the first American casualties were inflicted by another Marine when two men from a three-man listening post left their positions to investigate a suspicious movement to their front. The two men apparently lost their way in the dark and came upon their remaining partner from the rear. He turned and opened fire, mortally wounding his two comrades. Initially, the Marine infantrymen suffered more from the heat and humidity than from the combat situation. In order to reduce the number of heat prostration casualties, General Karch restricted defensive patrols and heavy work to the cooler hours of the early morning and late afternoon. Although acclimatization required only time, other problems were not so easily solved. Relations with the South Vietnamese often were difficult. For example, Bain's relief of some ARVN forces at the airfield on 13 March was delayed when the Vietnamese refused to leave their positions. The Marines had to make further liaison with the ARVN headquarters before completing the relief the next day. McPartlin's battalion recorded a similar experience. The Marines attempted to establish a mutual check point with the Popular Forces (PF), Vietnam's home defense militia.* The PFs showed * Popular Forces were Vietnamese who were recruited and served in their local villages and hamlets.
Regional Forces (RFs) were Vietnamese forces recruited within a province and assigned to the province chief. Although comparison with the U. S. institutions is not exact, one could say that PFs were county- or parish-type troops while RFs were state forces.
f22b8987e2890d72305d74e7c847d1d9
https://ehistory.osu.edu/books/1965/0024
Page 024
Page 024 USMC Photo A184344 Marines from the 3d Battalion, 4th Marines have just debarked from helicopters at Phu Bai. The troops in the background, ready to embark in the same helicopters for the return trip to Da Nang, are from the 2d Battalion, 3d Marines. While the units of the 1st Brigade and the two squadrons from the United States arrived in the Western Pacific, plans were completed for the deployment to Vietnam of the Marine reinforcements authorized by the President from Okinawa and Japan. Colonel Edwin B. Wheeler's Regimental Landing Team (RLT) 3 made up the ground component; it was composed of the RLT headquarters and two battalion landing teams: BLT 2/3 from his own regiment, commanded by Lieutenant Colonel David A. Clement, and BLT 3/4 from the newly arrived 4th Marines, commanded by Lieutenant Colonel Donald R. Jones. Air elements of the reinforcements consisted of Lieutenant Colonel Paul L. Hitchcock's Marine Air Support Squadron (MASS) 2 and Lieutenant Colonel William C. McGraw, Jr.'s VMFA-531. The 3d Marine Expeditionary Brigade headquarters, which General Collins had activated on 14 March under Lieutenant Colonel Edward Cook after the landing of the 9th MEB, was to control the movement. On 4 April, the 1st Brigade commander, Brigadier General Marion E. Carl, a World War II flying ace who had downed 18 Japanese aircraft, assumed command of the 3d MEB. He left Okinawa the following day to join Admiral Wulzen on board the Mount McKinley at Subic Bay. As the Mount McKinley weighed anchor for the South China Sea, the Task Force 76 and 3d MEB staffs completed embarkation and landing plans. RLT 3, BLT 3/4, and MASS-2 would sail from Okinawa on board five tank landing ships. VMFA-531 would fly its aircraft to Da Nang, while its heavy support equipment would follow in amphibious shipping. Lieutenant Colonel Clement's BLT 2/3 was already on board the ships of Navy Task Group 76.6, having completed the JUNGLE DRUM III exercise in Thailand.
On 4 April, while underway for the Philippines, the task group received instructions to move to a position 50 miles off the coast of Da Nang. The amphibious squadron arrived there the next day and awaited further landing instructions. Landing plans of the 3d MEB directed Lieutenant Colonel Clement, a holder of the Silver Star from the Korean War, to land his battalion over RED Beach 2 while the supplies and heavy equipment of the battalion landing team were unloaded at the LST landing on the Tiensha Peninsula, across the Da