https://www.smithsonianmag.com/history/lost-city-cambodia-180958508/
The Lost City of Cambodia
Jean-Baptiste Chevance senses that we’re closing in on our target. Paused in a jungle clearing in northwestern Cambodia, the French archaeologist studies his GPS and mops the sweat from his forehead with a bandanna. The temperature is pushing 95, and the equatorial sun beats down through the forest canopy. For two hours, Chevance, known to everyone as JB, has been leading me, along with a two-man Cambodian research team, on a grueling trek. We’ve ripped our arms and faces on six-foot shrubs studded with thorns, been savaged by red biting ants, and stumbled over vines that stretch at ankle height across the forest floor. Chevance checks the coordinates. “You can see that the vegetation here is very green, and the plants are different from the ones we have seen,” he says. “That’s an indication of a permanent water source.” Seconds later, as if on cue, the ground beneath our feet gives way, and we sink into a three-foot-deep muddy pool. Chevance, a lanky 41-year-old dressed in olive drab and toting a black backpack, smiles triumphantly. We are quite possibly the first human beings to set foot in this square-shaped, man-made reservoir in more than 1,000 years. Yet this isn’t merely an overgrown pond we’ve stumbled into. It’s proof of an advanced engineering system that propelled and sustained a vanished civilization. The vast urban center that Chevance is now exploring was first described more than a century ago, but it had been lost to the jungle until researchers led by him and an Australian colleague, Damian Evans, rediscovered it in 2012. It lies on this overgrown 1,300-foot plateau, known as Phnom Kulen (Mountain of the Lychee fruit), northeast of Siem Reap. 
Numerous excavations as well as high-tech laser surveys conducted from helicopters have revealed that the lost city was far more sophisticated than anyone had ever imagined—a sprawling network of temples, palaces, ordinary dwellings and waterworks infrastructure. “We knew this might be out there,” says Chevance, as we roar back down a jungle trail toward his house in a rural village on the plateau. “But this gave us the evidence we were hoping for.” Phnom Kulen is only some 25 miles north of a metropolis that reached its zenith three centuries later—the greatest city of the Khmer Empire, and possibly the most glorious religious center in the history of mankind: Angkor, derived from the Sanskrit word nagara, or holy city, site of the famed temple Angkor Wat. But first there arose Phnom Kulen, the birthplace of the great Khmer civilization that dominated most of Southeast Asia from the 9th to the 15th centuries. The Khmer Empire would find its highest expression at Angkor. But the defining elements of Kulen—sacred temples, reflecting the influence of Hinduism, decorated with images of regional deities and the Hindu god Vishnu, and a brilliantly engineered water-supply system to support this early Khmer capital—would later be mirrored and enlarged at Angkor. By the 12th century, at Angkor, adherence to Buddhism would also put its own stamp on the temples there. ********** Nothing ignites an archaeologist’s imagination like the prospect of a lost city. In the late 19th century, French explorers and scholars, pursuing fragmentary clues about the existence of Phnom Kulen, hacked their way through the jungles of Southeast Asia. Inscriptions found on temple doors and walls made mention of a splendid hilltop capital called Mahendraparvata (the mountain of the great Indra, king of the gods), and its warrior-priest monarch, Jayavarman II, who organized several independent principalities into a single kingdom in the beginning of the ninth century. 
This story is a selection from the April issue of Smithsonian magazine. Another French archaeologist, Philippe Stern, trekked to the top of the Phnom Kulen plateau in 1936, and in five weeks of excavations he and his co-workers uncovered the ruins of 17 Hindu temples, fallen carved lintels, statues of the Hindu god Vishnu, and remnants of a great stone pyramid. Stern believed that he had located Mahendraparvata. But the temples of Angkor, built on a more accessible flat plain and visible on a larger scale, were more compelling to archaeologists, and the excavations at Phnom Kulen never advanced much beyond Stern’s initial dig. Then came decades of neglect and horror. In 1965, at the height of the Vietnam War, Norodom Sihanouk allowed the North Vietnamese to set up bases inside Cambodia to attack the U.S.-backed South Vietnamese Army. Four years later, President Nixon escalated a secret bombing campaign of Cambodia, killing tens of thousands and helping to turn a ragtag group of Communist guerrillas into the fanatical Khmer Rouge. This radicalized army marched into Cambodia’s capital, Phnom Penh, in April 1975, declared the Year Zero, emptied out cities and herded millions into rice-growing communes. About two million people—nearly one-quarter of the population—were executed or died of starvation and disease before the Vietnamese toppled the Khmer Rouge in 1979. Phnom Kulen became the last sanctuary of the Khmer Rouge, and their leader, Pol Pot, known as Brother Number One. The last of the guerrillas didn’t surrender and descend from the plateau until 1998—Pol Pot died that year near the Thai border, not far from Phnom Kulen—leaving behind a traumatized population and a landscape strewn with unexploded ordnance. Chevance reached Phnom Kulen in 2000, while conducting research for advanced degrees in Khmer archaeology. 
“There were no bridges, no roads; it was just after the end of the war,” Chevance says as we eat steamed rice and pork with members of his staff, all of us seated on the wood-plank floor of a traditional stilted house, their headquarters in Anlong Thom, a village on the plateau. “I was one of the first Westerners to go back to this village since the war began,” Chevance says. “People were, like, ‘Wow.’ And I had a coup de foudre—the feeling of falling in love—for the people, the landscape, the architecture, the ruins, the forest.” It wasn’t until 2012, though, that Chevance marshaled high-tech evidence for a lost city, after he teamed up with Evans, who is based in Siem Reap with the French School of Asian Studies. Evans had become fascinated by Lidar (for Light Detection and Ranging), which uses lasers to probe a landscape, including concealed structures. Mounted on a helicopter, the laser continually aims pulses toward the ground below, so many that a large number streak through the spaces between the leaves and branches, and are reflected back to the aircraft and registered by a GPS unit. By calculating the precise distances between the airborne laser and myriad points on the earth’s surface, computer software can generate a three-dimensional digital image of what lies below. Lidar had recently revealed details of the Mayan ruins of Caracol in Belize’s rainforest, and exposed La Ciudad Blanca, or The White City, a legendary settlement in the Honduran jungle that had eluded ground searches for centuries. The jungles of Kulen presented a problem, however: Rampant illegal logging of valuable hardwoods had stripped away much of the primary forest, allowing dense new undergrowth to fill in the gaps. It was unclear whether the lasers could locate enough holes in the canopy to penetrate to the forest floor. Despite skepticism, Evans, with help from Chevance, raised enough money to survey more than 90,000 acres in both Phnom Kulen and Angkor. 
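The ranging arithmetic behind that description is simple in principle: each pulse's round-trip time gives a distance, and the aircraft's GPS position plus the beam's direction turn that distance into a point on the ground. A minimal sketch of the idea, with hypothetical names and values (this is an illustration of the principle, not the survey team's actual software):

```python
C = 299_792_458.0  # speed of light in m/s

def pulse_range(round_trip_s):
    """Distance from the sensor to the reflecting surface.

    The pulse travels out and back, so halve the round-trip time.
    """
    return C * round_trip_s / 2.0

def ground_point(aircraft_xyz, unit_direction, round_trip_s):
    """Locate one reflection as a 3-D point.

    aircraft_xyz comes from the GPS record; unit_direction combines the
    scanner's mirror angle with the aircraft's orientation. Millions of
    such points, aggregated, form the terrain image described above.
    """
    r = pulse_range(round_trip_s)
    return tuple(p + r * d for p, d in zip(aircraft_xyz, unit_direction))

# A pulse returning after about 5.3 microseconds reflected from roughly
# 794 meters away, consistent with a helicopter flying at 2,600 feet:
print(round(pulse_range(5.3e-6), 1))
```

Pulses that slip through gaps in the canopy return later than those that strike leaves, which is how the method recovers the forest floor beneath vegetation.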
“The whole thing was pulled together with chewing gum and duct tape,” Evans says. In April 2012, Evans joined Lidar technicians as they flew in a helicopter at 2,600 feet in a crosshatch pattern over Phnom Kulen. About two months after the overflights, Evans, awaiting the processing of visual data they had collected, switched on his desktop. He stared “in astonishment,” he says, as the ghostly legendary kingdom resolved before his eyes into an intricate cityscape: remnants of boulevards, reservoirs, ponds, dams, dikes, irrigation canals, agricultural plots, low-density settlement complexes and orderly rows of temples. They were all clustered around what the archaeologists realized must be a royal palace, a vast structure surrounded by a network of earthen dikes—the ninth-century fortress of King Jayavarman II. “To suspect that a city is there, somewhere underneath the forest, and then to see the entire structure revealed with such clarity and precision was extraordinary,” Evans told me. “It was amazing.” Now the two archaeologists are using the Lidar images to understand how Mahendraparvata developed as a royal capital. The early water-management system they now saw in detail demonstrates how water was diverted to areas on the plateau that lacked a steady flow, and how various structures controlled supplies during rainless periods. “They employed a complex series of diversions, dikes and dams. Those dams are huge, and they required huge manpower,” Chevance says. At the dawn of the Khmer Empire, he goes on, “They were already showing an engineering capacity that translated into wealth and stability and political power.” The Lidar imagery also has revealed the presence of dozens of ten-foot-high, 30-foot-wide mounds in symmetrical rows on the jungle floor. Chevance and Evans at first speculated that they were burial sites—but, in succeeding excavations, they found no bones, ashes, urns, sarcophagi or other artifacts to support that hypothesis. 
“They were archaeologically sterile,” says Evans. “They are a mystery, and they may remain a mystery. We may never know what those things are.” Lidar surveys of Angkor also detected several mounds that are virtually identical to those at Phnom Kulen—just one of many startling similarities of the two cities. Indeed, as the archaeologists studied the images of Mahendraparvata, they realized with a flash of insight that they were looking at the template for Angkor. ********** Chevance and I set out on dirt bikes, bouncing over rickety wooden bridges that cross silt-laden streams, groaning up steep hills and plunging down switchback trails hemmed in by dense stands of cashew trees (grown illegally in this reserve). In one large clearing we come across the discarded remnants of huge mahogany trees that have been felled with a chain saw, cut into pieces and dragged out in ox carts. Chevance suspects the culprit is an affluent resident in the village of Anlong Thom, but says that fingering him will be pointless. “We will send a report to a government minister, but nothing will change,” he says. “The rangers are on the take.” At the highest point on the plateau, Chevance leads me on foot up a slope to a monumental five-tiered platform made of sandstone and laterite (a rusty-red rock): the mountaintop pyramid of Rong Chen. The name translates as Garden of the Chinese, and refers to a local myth in which Chinese seafarers smashed their ship against the mountaintop at a time when an ocean supposedly surrounded the peak. It was here, in A.D. 802, according to an inscription in Sanskrit and ancient Khmer found in an 11th-century temple in eastern Thailand, that Jayavarman II had himself consecrated king of the Khmer Empire, at that time a dominion probably a bit smaller than contemporary Cambodia. And it was here, too, that the king created a cult of divinely ordained royal authority. 
More than 1,200 years later, in 2008, Chevance had arrived at the mountaintop with a team of 120 locally hired laborers. Government experts demined the area; then the team began digging. The excavation suggested that it was the centerpiece of a royal metropolis—a conviction later confirmed by the Lidar overflights. “You don’t build a pyramid temple in the middle of nowhere,” Chevance tells me. “It’s an archaeological type that belongs to a capital city.” Today Rong Chen is a darkly numinous place, where the glories of an ancient Khmer civilization collide with the terrors of a modern one. Unexploded mines still lie buried here—the result of Khmer Rouge efforts to protect their mountain redoubt from assault. “We saw a few mines at the last moment when we were doing the excavations,” Chevance tells me, warning me not to venture too far from the pyramid. “Most of the villages on Phnom Kulen were mined. The road between the villages was mined.” The hilltop camp afforded the Communist fighters a sanctuary near the strategic city of Siem Reap, then in government hands, and served as the base from which the Khmer Rouge carried out acts of sabotage—including blocking a spillway that carried water from Phnom Kulen into the city. “They prevented water from reaching Siem Reap, and the Cambodian Army knew that.” The result, Chevance says, was that the mountain was bombed. “You can still find B-52 bomb craters here.” Chevance and I get back on our dirt bikes and bounce down a path to the best-preserved remnant of Jayavarman II’s capital: an 80-foot-high tower, Prasat O Paong (Temple of the Tree of the Small River), standing alone in a jungle clearing. The facade of the Hindu temple glows a burnished red in the setting sun, and intricate brickwork reaches to the apex of the tapered column. 
Ceramics inside this and other temples excavated on Phnom Kulen prove that they remained pilgrimage sites as late as the 11th century—an indicator that the structures continued to influence the rest of the Khmer Empire long after Jayavarman II moved his capital from Phnom Kulen to the Angkor plain and the city’s original population had disappeared. ********** Angkor—which Chevance and Evans describe as “an engineered landscape on a scale perhaps without parallel in the preindustrial world”—is a place that inspires superlatives. Achieving its apogee in the late 12th and early 13th centuries, the site, at its peak, was an urban center extending over nearly 400 square miles. Chevance leads me up the near-vertical stone steps of Pre Rup, a soaring tenth-century structure with a platform made of laterite and sandstone. It represents a transition point, a synthesis of the two extraordinary temples we explored on the plateau, Prasat O Paong and Rong Chen. “It is a pyramid with three levels,” Chevance tells me, as we clamber among the deserted ruins in the heat. “On top you also have five towers similar to the ones we saw on the mountain. It is a combination of two architectural styles.” As has now become clear, thanks to Lidar, Phnom Kulen, faintly visible on the horizon 25 miles away, influenced far more than the later city’s sacred architecture. To support Angkor’s expanding population, which may have reached one million, engineers developed a water-distribution system that mirrored the one used on the plateau. They collected water from the Siem Reap River, a tributary of the Mekong that flows from the plateau, in two enormous reservoirs, then built an intricate series of irrigation channels, dams and dikes that distributed water evenly across the plain. Although Angkor’s soil is sandy and not highly fertile, the masterful engineering allowed farmers to produce several rice crops annually, among the highest yields in Asia. 
“The secret to their success was their ability to even out the peaks and troughs seasonally and annually, to stabilize water and therefore maximize food production,” Damian Evans tells me. Angkor was at its height during the reign of Jayavarman VII (circa 1181-1220), regarded by scholars as the greatest king of the Khmer Empire. Two days after my arrival in Angkor, I’m standing with Evans on the highest platform of the king’s masterpiece, the temple known as the Bayon. Evans gestures across a stunning tableau of sandstone terraces, pillars and towers, as well as galleries carved with bas-reliefs depicting warriors marching into battle. “No king who came afterward ever built on this scale again,” says Evans. Jayavarman VII, who made Mahayana Buddhism the Khmer Empire’s state religion, grafted what are commonly believed to be his own features onto a serenely smiling Buddhist divinity. Its massive stone face beams in dozens of iterations throughout this complex, radiating compassion and kindness across the four corners of the empire. It is here, in the heart of Jayavarman VII’s capital, that the histories of Angkor and Mahendraparvata converge most powerfully. “You are looking at cities that are widely separated in space and time,” Evans tells me. “But each has an urban core defined by a grid of streets and a central state temple—the Bayon here, Rong Chen there—at the center.” Yet the Lidar data show that the cities followed divergent paths. While Mahendraparvata was a masterpiece of urban planning, with temples and dwellings carefully laid out by Jayavarman II around wide boulevards—a Khmer version of Haussmann’s Paris—Angkor developed haphazardly. Densely populated neighborhoods of wooden houses squeezed against the edges of the Bayon. 
Evans describes Angkor as a “messy aggregation of centuries of development, with features superimposed one on top of another.” Beneath the jungle canopy south of the city, Evans’ Lidar surveys have detected huge spirals inscribed into the landscape, covering one square mile, reminiscent of the ancient geoglyphs discovered in the Nazca Desert of southern Peru. Like the mystery mounds, the spirals contained no artifacts, no clues about their function. “They could have a meaning encoded in them that may never be known,” Evans says. ********** The sheer ambition of the Khmer kings, their re-engineering of a jungled landscape into an urban one, sowed the seeds of destruction. New research has provided a clearer picture of the sequence of events that may have doomed Mahendraparvata. The Lidar data revealed that its population didn’t engage in terraced rice farming in their mountain metropolis—which meant that they almost certainly relied on slash-and-burn agriculture. That would have depleted the soil rapidly, and probably contributed to the decline and fall of the city. The evidence backs up research conducted by Chevance and a colleague, who analyzed soil samples taken from a reservoir on Phnom Kulen. Evidence showed that vast amounts of soil and sand “got washed down the valley, indicating deforestation,” says Chevance. Soil from a later date contained a high concentration of jungle vegetation, which suggests that the land had been abandoned and taken over again by the tropical forest. In the case of Mahendraparvata, this process likely occurred more rapidly than at Angkor—a major population center for about 600 years—where decline came more slowly. Over time, the artificially engineered landscape almost certainly led to topsoil degradation, deforestation and other changes that drastically reduced the capacity to feed the population and made Angkor increasingly difficult to manage. 
Leaders of the rival kingdom of Ayutthaya, in what is now Thailand, sacked Angkor in 1431. It was abandoned and left to decay, doomed to the same fate as its predecessor, Mahendraparvata. “There are in the kingdom of Cambodia the ruins of an ancient city, which some say was constructed by Romans or by Alexander the Great,” the Spanish explorer Marcelo de Ribadeneyra wrote when he chanced upon Angkor nearly two centuries later. “It is a marvelous fact that none of the natives can live in these ruins, which are the resorts of wild beasts.” “There are still many questions to answer,” Chevance tells me. “We know more about temples and kings than everyday life.” When it comes to the inhabitants of Mahendraparvata, Chevance adds, a fundamental question underlies his work: “How did they live?” Answering that query will be difficult, because few traces of ordinary Khmer life remain: While temples —built for the ages—endure, Mahendraparvata’s population constructed their dwelling places out of wood, which rotted away long ago. Even the royal palace, which probably employed thousands of people, has been reduced to a few crumbling platforms, pavements, gutters, dikes and roof tiles. Last year, as part of the Cambodian Archaeological Lidar Initiative, Evans and Chevance conducted a new series of helicopter surveys of Phnom Kulen to take in “the entire mountain range,” says Evans—more than 100 square miles encompassing archaeological sites, rock quarries and traces of ancient cities. The CALI project also included overflights to investigate ancient provincial centers of military and industrial significance, as well as the Khmer capital of Sambor Prei Kuk, 100 miles south of Angkor. The city endured from the seventh to the ninth centuries, declining just as Angkor was on the rise. In total, the CALI campaign covered more than 700 square miles. 
Ten ground teams worked alongside the aerial survey teams in remote areas, and in extreme heat, refueling choppers, conferring with local authorities, collecting precision GPS data at ground stations, and persuading local people to stop burning off forest, so that flights relying on aerial sensors would not have the ground obscured by smoke. The result of this ambitious effort, funded by the European Research Council, was a “unique archive,” says Evans, of the ways that human beings transformed the natural environment and shaped Khmer history over 2,000 years. The results will be published in a peer-reviewed journal later this year. Further surveys are planned using drones and satellites. Evans’ teams are currently on the ground across Cambodia, investigating surface remains shown by Lidar. This ambitious effort, he believes, eventually will reveal the entire mosaic of Southeast Asia’s greatest civilization, only now beginning to come into focus. Ultimately, he believes, what will emerge is a dazzling, nuanced understanding of a “complex hierarchy with an unmatched scale.” Chiara Goia is an Italian documentary and fine arts photographer who has won the Sony World Photography Award in Arts and Entertainment, and the Canon prize for emerging photographers. Her work has appeared in such publications as National Geographic, TIME and Vanity Fair. Joshua Hammer is a contributing writer to Smithsonian magazine and the author of several books, including The Bad-Ass Librarians of Timbuktu: And Their Race to Save the World's Most Precious Manuscripts and The Falcon Thief: A True Tale of Adventure, Treachery, and the Hunt for the Perfect Bird.
https://www.smithsonianmag.com/history/lost-history-yellowstone-180976518/
After 14 summers excavating in Yellowstone National Park, Doug MacDonald has a simple rule of thumb. “Pretty much anywhere you’d want to pitch a tent, there are artifacts,” he says, holding up a 3,000-year-old obsidian projectile point that his team has just dug out of the ground. “Like us, Native Americans liked to camp on flat ground, close to water, with a beautiful view.” We’re standing on a rise near the Yellowstone River, or the Elk River as most Native American tribes called it. A thin wet snow is falling in late June, and a few scattered bison are grazing in the sagebrush across the river. Apart from the road running through it, the valley probably looks much as it did 30 centuries ago, when someone chipped away at this small piece of black glassy stone until it was lethally sharp and symmetrical, then fastened it to a straightened shaft of wood and hurled it at bison with a spear-throwing tool, or atlatl. This article is a selection from the January/February issue of Smithsonian magazine. “The big myth about Yellowstone is that it’s a pristine wilderness untouched by humanity,” says MacDonald. “Native Americans were hunting and gathering here for at least 11,000 years. They were pushed out by the government after the park was established. The Army was brought in to keep them out, and the public was told that Native Americans were never here in the first place because they were afraid of the geysers.” MacDonald is slim, clean-cut, in his early 50s. Originally from central Maine, he is a professor of anthropology at the University of Montana and the author of a recent book, Before Yellowstone: Native American Archaeology in the National Park. Drawing on his own extensive discoveries in the field, the work of previous archaeologists, the historical record and Native American oral traditions, MacDonald provides an essential account of Yellowstone’s human past. 
Tobin Roop, chief of cultural resources at Yellowstone, says, “As an archaeologist, working in partnership with the park, MacDonald has really opened up our understanding of the nuances and complexities of the prehistory.” MacDonald sees his work, in part, as a moral necessity. “This is a story that was deliberately covered up and it needs to be told,” he says. “Most visitors to the park have no idea that hunter-gatherers were an integral part of this landscape for thousands of years.” In the last three decades, the National Park Service has made substantial efforts to research and explain the Native American history and prehistory of Yellowstone, but the virgin-wilderness myth is still promoted in the brochure that every visitor receives at the park entrance: “When you watch animals in Yellowstone, you glimpse the world as it was before humans.” Asked if he considers that sentence absurd, or offensive to Native Americans, MacDonald answers with a wry smile. “Let’s just say the marketing hasn’t caught up with the research,” he says. “Humans have been in Yellowstone since the time of mammoths and mastodons.” Shane Doyle, a research associate at Montana State University and a member of the Apsaalooke (Crow) Nation, burst out laughing when I read him that sentence from the brochure. But his laughter had an edge to it. “The park is a slap in the face to Native people,” he said. “There is almost no mention of the dispossession and violence that happened. We have essentially been erased from the park, and that leads to a lot of hard feelings, although we do love to go to Yellowstone and reminisce about our ancestors living there in a good way.” * * * On the road between the Norris Geyser Basin and Mammoth Hot Springs is a massive outcrop of dark volcanic rock known as Obsidian Cliff, closed to the public to prevent pilfering. This was the most important source in North America for high-quality obsidian, a type of volcanic glass that forms when lava cools rapidly. 
It yields the sharpest edge of any natural substance on earth, ten times sharper than a razor blade, and Native Americans prized it for making knives, hide-scraping tools, projectile points for spears and atlatl darts, and, after the invention of the bow and arrow 1,500 years ago, for arrowheads. For the first people who explored the high geothermal Yellowstone plateau—the first to see Old Faithful and the other scenic wonders—Obsidian Cliff was a crucial discovery and perhaps the best reason to keep coming back. In that era, after the rapid melting of half-mile-thick glaciers that had covered the landscape, Yellowstone was a daunting place to visit. Winters were longer and harsher than they are today, and summers were wet and soggy with flooded valleys, dangerous rivers and a superabundance of mosquitoes. MacDonald made one of the most exciting finds of his career in 2013 on the South Arm of Yellowstone Lake: a broken obsidian projectile point with a flake removed from its base in a telltale fashion. It was a Clovis point, approximately 11,000 years old and made by the earliest visitors to Yellowstone. The Clovis people (named after Clovis, New Mexico, where their distinctive, fluted points were first discovered in 1929) were hardy, fur-clad, highly successful hunters. Their prey included woolly mammoths, mastodons and other animals that would become extinct, including a bison twice the size of our modern species. The Clovis point that MacDonald’s team spotted on the beach is one of only two ever found in the park, suggesting that the Clovis people were infrequent visitors. They preferred the lower elevation plains of present-day Wyoming and Montana, where the weather was milder and large herds of megafauna supported them for 1,000 years or more. MacDonald thinks a few bands of Clovis people lived in the valleys below the Yellowstone plateau. They would come up occasionally in the summer to harvest plants and hunt and get more obsidian. 
“Native Americans were the first hard-rock miners in Wyoming and it was arduous work,” says MacDonald. “We’ve found more than 50 quarry sites on Obsidian Cliff, and some of them are chest-deep pits where they dug down to get to the good obsidian, probably using the scapular blade of an elk. Obsidian comes in a cobble [sizable lump]. You have to dig that out of the ground, then break it apart and start knapping the smaller pieces. We found literally millions of obsidian flakes on the cliff, and we see them all over the park, wherever people were sitting in camp making tools.” Each obsidian flow has its own distinctive chemical signature, which can be identified by X-ray fluorescence, a technique developed in the 1960s. Artifacts made of Yellowstone obsidian from Obsidian Cliff have been found all over the Rockies and the Great Plains, in Alberta, and as far east as Wisconsin, Michigan and Ontario. Clearly it was a valuable commodity and widely traded. On the Scioto River south of Columbus, Ohio, archaeologists identified 300 pounds of Yellowstone obsidian in mounds built by the Hopewell people 2,000 years ago. It’s possible the obsidian was traded there by intermediaries, but MacDonald and some other archaeologists believe that groups of Hopewell made the 4,000-mile round trip, by foot and canoe, to bring back the precious stone. “In 2009, we found a very large ceremonial knife, typical of the Hopewell culture and unlike anything from this region, on a terrace above Yellowstone Lake,” he says. “How did it get there? It’s not far-fetched to think that it was lost by Hopewell people on a trip to Obsidian Cliff. They would have left in early spring and followed the rivers, just like Lewis and Clark, except 2,000 years earlier.” Another tantalizing relic, found inside a Hopewell mound in Ohio, is a copper sculpture of a bighorn ram’s horn. Then as now, there were no bighorn sheep in the Midwest or the Great Plains. 
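The sourcing technique mentioned above, matching an artifact's distinctive chemical signature to a known obsidian flow, amounts to a nearest-match lookup in trace-element space. A minimal sketch of that idea; the source names are real flows in the region, but every concentration below is invented for illustration, not a measured value:

```python
import math

# Hypothetical reference signatures (parts per million of zirconium,
# strontium and rubidium, as an X-ray fluorescence reading might report).
SOURCES = {
    "Obsidian Cliff": {"Zr": 180.0, "Sr": 48.0, "Rb": 240.0},
    "Bear Gulch":     {"Zr": 300.0, "Sr": 75.0, "Rb": 150.0},
}

def closest_source(artifact_ppm):
    """Return the source whose signature lies nearest the artifact's.

    Uses plain Euclidean distance over the shared elements; real
    sourcing work uses richer statistics, but the principle is this.
    """
    def dist(signature):
        return math.sqrt(sum((artifact_ppm[e] - signature[e]) ** 2
                             for e in signature))
    return min(SOURCES, key=lambda name: dist(SOURCES[name]))

print(closest_source({"Zr": 185.0, "Sr": 50.0, "Rb": 235.0}))
# prints: Obsidian Cliff
```

Because each flow's signature is stable, a single reading can tie a flake found in an Ohio mound back to a specific cliff 2,000 miles away.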
But if Hopewell people were making epic journeys west to get obsidian, they would have seen bighorns in the Northern Rockies, and the animals were particularly abundant in Yellowstone. * * * Twenty miles long and 14 miles wide, Yellowstone Lake is the largest natural high-elevation lake in North America. MacDonald describes the five summers he spent on the remote, roadless southern and eastern shores of the lake with a small crew of graduate students as “the most exciting and also the most frightening experience of my career.” Today we are standing on the northern shore, which is accessible by road. A cold wind is blowing, and the water looks like a choppy sea with spray flying off the whitecaps. “We had to use canoes to get there and load them with all our gear,” he recalls. “The water gets really rough in bad weather, much worse than you see today, and we nearly got swamped a few times. One of our crew got hypothermia. We had to build an illegal fire to save his life. Another time my guys were stalked on the beach by a cougar.” Grizzlies are his biggest fear. MacDonald always carries bear spray in Yellowstone, never walks alone and is careful to make plenty of noise in the woods. One night at the lake, he recalls, he and his crew were eating steaks around a campfire when they saw a young grizzly bear staring at them from 200 yards. That night they heard his roars and barks echoing across the lake; they surmised that the bear was frustrated because a bigger grizzly was keeping him away from an elk carcass a quarter-mile distant. “The next day he attacked our camp,” says MacDonald. “He peed in my tent, pooped everywhere, destroyed the fire pit, licked the grill, just trashed everything. We stayed up all night making noise, and thankfully it worked. He didn’t come back. I still have that tent and it still reeks of bear pee.” They also had trouble from bison and bull elk that occupied their excavation sites and declined to leave. 
They endured torrential rains and ferocious electric storms. Once they had to evacuate in canoes because of a forest fire. “We all had the feeling that the gods wanted us out of there, and we kept finding amazing stuff. There were basically sites everywhere.” Among their discoveries were a 6,000-year-old hearth, a Late Prehistoric stone circle (or tepee base) lying intact under a foot of dirt, and a wide variety of stone tools and projectile points. Excavating a small boulder with obsidian flakes littered around its base, they knew that someone, man or woman, boy or girl, had sat there making tools 3,000 years ago. “I think both genders knapped stone tools, because they were in such constant use and demand,” says MacDonald. MacDonald’s team found evidence of continual human occupation on the lakeshore for 9,500 years, starting with the Cody Culture people, whose square-stemmed projectile points and asymmetrical knives were first discovered in Cody, Wyoming. More than 70 Cody points and knives have been found in Yellowstone, with the greatest concentration at the lake. “The climate was getting hotter and drier and it was cool up here in summer. As the bison migrated up to the higher elevations, Cody people almost certainly followed them.” Over the following millennia, as the climate warmed, the modern bison evolved and human populations rose in the Great Plains and Rockies. Yellowstone became a favored summer destination, drawing people from hundreds of miles away, and the lakeshore was an ideal place to camp. There is no evidence of conflict among the different tribal groups; MacDonald thinks they probably traded and visited with one another. The peak of Native American activity in Yellowstone was in the Late Archaic period, 3,000 to 1,500 years ago, but even in the 19th century it was still heavily used, with as many as ten tribes living around the lake, including Crow, Blackfeet, Flathead, Shoshone, Nez Perce and Bannock. 
Today, as sedentary people, we equate “living” in a place with long-term or even permanent settlement. But for hunter-gatherers who follow animal migrations, avoid climate extremes and harvest different plants as they ripen in different areas, the word has a different meaning. They live in a place for part of the year, then leave and come back, generation after generation. One Shoshone group known as the Sheepeaters seldom left the current park boundaries, because they were able to harvest bighorn sheep year-round. But most Native Americans in Yellowstone moved down to lower, warmer elevations in winter, and returned to the high plateau in the spring. A few brave souls returned in late winter to walk on the frozen lake and hunt bears hibernating on the islands. “They were probably getting the spiritual power of the animal, and demonstrating their courage, by entering the dens,” says MacDonald. “People have hunted bears that way in Siberia, Northern Europe, anywhere there’s bears. Some people still do. You can see the videos on YouTube. Young adult males are the only ones stupid enough to do it, and I imagine that was the case here too.” * * * When MacDonald was a freshman at Brown University, in Providence, Rhode Island, he studied political economy, international development and finance, and envisioned a career at the World Bank or the International Monetary Fund. Then he spent a couple of summers in central Mexico with friends who liked visiting archaeological sites, often traveling on third-class rural “chicken buses” to get there. “Some of those sites were amazing, and when I got back to Brown, I started taking archaeology classes,” he says. “One of them was taught by Richard Gould, who is kind of a famous guy, and it was about hunter-gatherers. It made me realize that I didn’t want to spend my life at the World Bank. 
I wanted to work on the archaeology of hunter-gatherers instead.” MacDonald has never killed his own meat and knows little about edible and medicinal plants, but he believes that hunting and gathering is the most successful way of living that humanity has ever devised. “We’re proud of our technological advances, but in historical terms our society has lasted a split second,” he says. “We lived as hunter-gatherers for three million years. We moved around in extended family groups that took care of each other. It was egalitarian because there was no wealth. It was a healthy way for humans to live and we were well adapted for it by evolution.” He came to Yellowstone because it’s the ideal place to study the archaeology of hunter-gatherers. It has never been farmed or logged, and most of its archaeological sites are intact. Morally, however, it’s a difficult place for him to work, because he “greatly laments” the removal of hunter-gatherers from the land and wishes they could come back. “There’s an irony to this,” he says. “We kicked Native Americans out of Yellowstone to make a park. Now we’re trying to find out how they lived here.” In the oral traditions of the Crow, Shoshone, Blackfeet, Flathead, Bannock, Nez Perce and other tribes with ancient associations to Yellowstone, there is a rich store of material about the country they knew as “land of the geysers,” “land of the burning ground,” “the place of hot water,” “land of vapors” or “many smoke.” Much of this knowledge was gathered into a 2004 book, Restoring a Presence, by Peter Nabokov and Lawrence Loendorf, whose research was funded by the National Park Service. Archaeological research supports and complements the tribal oral histories, and also reaches back further in time. 
In the view of Elaine Hale, who was the archaeologist at Yellowstone for 25 years and has co-written a history of archaeology in the park, MacDonald “dives deeper than the rest.” Asked to elaborate, she says, “He uses a wider range of scientific techniques and equipment, like ground-penetrating radar and pollen analysis. He’s unique in the heart and thoughtfulness he brings to his work. He shares, promotes, communicates. He’s inspired so many students by bringing them to the park, including a lot of Native American students. For prehistoric archaeology in Yellowstone, no one is more well versed, and he’s reframed the whole approach.” It was by measuring the decay of radioactive carbon in charcoal buried in the ground that MacDonald was able to date the lakeshore hearth as 6,000 years old, to within about 30 years. By testing blood and fat residues on 9,000-year-old stone knives and spear points, he found out that Cody people in Yellowstone primarily hunted bison and bear, but also elk, deer, rabbit and other species. Microscopic remains of plants sifted from ancient campsites reveal what Native Americans were gathering thousands of years ago. Camas and bitterroot, both of which contain protein and grow in alpine meadows, were presumably vital to survival. Traces also have been detected of goosefoot, sunflower, sagebrush, wild onion, prickly pear cactus, balsamroot and various grasses, although hundreds of other species were probably gathered as well. In their campfires they were burning pine, spruce, ash, aspen, sagebrush and mistletoe. At a site above the Yellowstone River, MacDonald’s crews excavated three stone circles marking the location of tepees. The circles were 400 years old and they inspired MacDonald to imagine a day in the existence of the family who had lived here. “I thought about them in late October,” he says.
“The father, uncle and son are hunting in the hills above the river, the women collecting driftwood from the riverbanks, everyone is nervously watching black storm clouds come over the mountains and realizing that it’s time to hurry home.” In MacDonald’s imagining, the father has killed a deer with his bow, and now, with the help of his brother and son, he quickly butchers it. They use large obsidian knives hafted by rabbit cordage to bone handles. The meat, which they pack into leather bags, will provide food to the extended family for a few days, and the hide will be made into leggings for the coming winter. Meanwhile, mother and her baby, grandmother, aunt and daughter walk along the river in a howling wind, followed by three wolf-like dogs. They surprise a rabbit, which daughter shoots with her bow. She skins the animal with an obsidian blade while the baby wails on her mother’s back from the bitter wind and driving snowflakes. In the last ten days, this extended family band has raised and lowered its tepee five times. They are moving quickly off the high Yellowstone plateau toward their first winter camp by the river. Now, as the storm rages with full force, they raise the tepee again, father and son tying the poles together at the top while the women adjust the hides. Grandmother and aunt push rocks over the bottom edges of the hides, to block the wind and snow. The entire process takes about an hour. Everyone has cold feet and numb hands except the baby in its cradle board. They enter the tepee and manage to get a fire going with the dry willow and sagebrush that the women packed in a bag. They lay down their gear and sleeping hides of bear and bison on the floor of the tepee, which is broad enough to accommodate all six adults and three children. The women unpack the rabbit meat and a variety of wild herbs and vegetables. They will eat well this evening and stay warm as the first winter storm of the year rages outside. 
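The radiocarbon measurement that dated the lakeshore hearth rests on a simple decay law: the fraction of carbon-14 surviving in buried charcoal fixes its age. Here is a minimal sketch; by convention, laboratories compute a raw age using the Libby mean life of 8,033 years and then calibrate that result against tree-ring records, a step omitted here.

```python
import math

# Conventional radiocarbon age from the surviving C-14 fraction.
# Labs use the Libby mean life of 8,033 years (equivalent to a
# 5,568-year half-life); calibration against tree-ring data then
# corrects for the true half-life and for past variation in
# atmospheric C-14.
LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age(fraction_remaining):
    """Age in radiocarbon years BP: t = -8033 * ln(N/N0)."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_remaining)

# Charcoal retaining about 47 percent of its original C-14
# works out to roughly 6,000 radiocarbon years:
print(round(radiocarbon_age(0.474)))  # ≈ 6,000
```

The quoted precision of about 30 years reflects how accurately the surviving C-14 fraction can be measured, not any looseness in the decay law itself.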
Four hundred years later, MacDonald’s crew excavated the fire pit in this tepee circle. They found tiny pieces of charcoal from the sagebrush in the fire, pieces of rabbit bone and plants from a stew, a stone scraping tool used to process deer hide into leggings, and a small pile of obsidian flakes. “I imagine that daughter made herself a new arrow point to replace the one she used to kill the rabbit,” says MacDonald. “They kept the fire going all night with sagebrush, and the sparks went up through the intercrossed poles high above them.” A particular challenge for archaeologists in Yellowstone is the acidic soil, which has dissolved away most organic material in the archaeological record. They can’t determine what clothing looked like, for example, and they’ve found the remains of only a few human beings. One was a woman buried with a dog 2,000 years ago near the current location of the Fishing Bridge visitor center. When human remains are discovered, the park service calls in elders and council members from the 26 Native American tribes associated with Yellowstone, who decide the best course of action. The woman and her dog were reburied inside the park with a traditional ceremony. MacDonald thinks that the steep, forbidding mountains above the plateau are the real terra incognita for archaeologists. Yellowstone has 40 mountain peaks above 10,000 feet, and we know from Native American testimonies that they were important religious sites. People went there to pray and seek visions by fasting. For shelter from the wind, they built small structures of stacked rocks known as fasting beds. A few of these have been found in Yellowstone, on peaks with panoramic views, and MacDonald is confident that archaeologists will locate more. There is no truth to the idea that Native Americans were afraid of the geysers and thermal features. 
Archaeologists have excavated hundreds of campsites near the geysers, and the Shoshone would soak the horns of bighorn sheep in the bubbling hot springs before reshaping them into beautiful and deadly bows. In general, Yellowstone’s geysers, mud pots, hot springs and fumaroles were regarded as places of great spiritual power. From interviews with Plenty Coups, Hunts to Die and other 19th-century Crow warriors, we know that a famous Crow shaman called the Fringe (born in 1820, he died from smallpox in the 1860s) would come to the big geysers in Yellowstone to heal wounded people and seek visions. According to Hunts to Die, in his interview with the photographer-ethnographer Edward Curtis, the spirits in the geysers were afraid of people, rather than the other way around. But if you approached the spouting water in a pure and humble manner, some Native Americans believed, the spirits would reveal themselves and you could harness their powers. * * * Muted sunlight, filtering down through a thin layer of clouds, works a kind of magic at the Grand Canyon of the Yellowstone River. It saturates the colors on the canyon walls—yellows, reds, dark brown, orange, pink, white—and makes them glow with such intensity that the rocks appear to be lit from within. This is my first time seeing this famous canyon with its thundering waterfalls. While I struggle to make visual sense of it—how can the colors glow so brightly in this gray light?—MacDonald tells me about the artist Thomas Moran, whose 1872 painting of this scene, when displayed to legislators in Washington, D.C., was instrumental in getting Yellowstone designated as America’s first national park. But MacDonald’s main reason for bringing me to this famed American vista was to point out that “this was part of the original Crow reservation.” Shane Doyle, the Crow scholar at Montana State, later outlined the history.
“The original Crow reservation in 1851 was over 30 million acres, and it included the entire eastern half of what would be Yellowstone. In 1868, prompted by a gold rush, that was reduced to eight million acres, and we lost all our land in Wyoming. We had no conflict with white settlers, we scouted for the U.S. Army, we tried to be allies to the whites, and we got treated like all the other tribes. Our reservation now is about two million acres.” In 1872, when President Ulysses S. Grant signed 2.2 million acres of Wyoming, Montana and Idaho into existence as Yellowstone National Park, several different tribal groups were camped around Yellowstone Lake and along the Madison and Yellowstone rivers. The Crow still legally owned a strip of land in Montana along the Yellowstone River. Sheepeaters were hunting and gathering in the more remote areas and managed to stay inside the park for another seven years. When the national park proposal was being debated in Washington, there had been little discussion about the “Indian” presence in Yellowstone and none about the land’s cultural importance to the tribes. They belonged on reservations, it was thought, where they could be instructed in English, Christianity, sedentary agriculture, individualism, capitalism and other Euro-American values. The park was created to protect the scenic wonders and wildlife from white hunters, prospectors, loggers and settlers. To encourage tourism, park officials and local promoters played down the presence of Native Americans and circulated the falsehood that they were afraid of the geysers. Anthropologist Matthew Sanger, a curator at the Smithsonian National Museum of the American Indian, stresses that conflicts with Native Americans were ongoing in the West at that time; Custer's defeat at the Little Big Horn was in 1876. “Creating a massive park in tribal lands was a distinct political act and it happened under a president who was fervently against Native peoples,” he says. 
“The park also represents the idea in Western philosophy that people are separate from nature, whereas Native American philosophy sees them as deeply intertwined.” On August 24, 1877, a party of nine visitors from Radersburg, Montana, were camped near Fountain Geyser, having made a glorious tour of the park. At 5 in the morning, as they were preparing breakfast, a group of Nez Perce warriors came into their camp, asking if they had seen soldiers and demanding food. Then more warriors appeared in the distance. The Radersburg party nervously packed up their wagons and started down the Firehole River, where they encountered some 800 Nez Perce and 2,000 horses. The nine tourists, having come to Yellowstone as sightseers, now found themselves in the thick of an armed conflict between the Nez Perce and the U.S. Army. Faced with the prospect of becoming farmers on a reservation, these Nez Perce had chosen to flee their homelands in Oregon. They were being pursued by the Army, with skirmishes and battles along the way. Angry young warriors had killed a number of whites. The Nez Perce were hoping to find refuge with the Crows in the buffalo country of Wyoming and Montana, or with Sitting Bull in Canada, where they could continue to live their traditional life of hunting and gathering. Contrary to what was reported in the newspapers at the time and has been taught to American schoolchildren ever since, the leader of the Nez Perce flight was not Chief Joseph. Joseph was a simple camp chief who made no military decisions and took charge of the Nez Perce only during their final surrender. As the great procession of warriors, elders, women, children, dogs and horses passed through Yellowstone, they were led by a half-white buffalo hunter known as Poker Joe. Against his instructions, a group of young warriors ended up looting the Radersburg party’s wagons and attacking the tourists. 
In the park today road signs identify where the Nez Perce went next—across the Yellowstone River in the Hayden Valley, then to Yellowstone Lake, and up over what’s now called Dead Indian Pass in the northeastern corner of the park. Their old friends the Crows turned them away, so the Nez Perce went north toward Canada but were surrounded by the U.S. military in the Bears Paw Mountains of northern Montana. Joseph, the last chief standing, took over and, according to legend, he made a famous surrender speech: “From where the sun now stands, I will fight no more forever.” But that was not the end of armed conflict inside the new park. The following year, 1878, a group of Bannock and Shoshone warriors fled into Yellowstone after a violent uprising in Idaho. The same U.S. Cavalry general who had forced the Nez Perce to surrender, Nelson Miles, defeated them within 20 miles of Dead Indian Pass. To counteract the bad publicity generated by these two “Indian wars,” as they were described, park officials launched marketing campaigns that sought to erase the history of Native American presence in the park. Starting in 1886, the U.S. Cavalry patrolled the park for 32 years, to make tourists feel safer and discourage Native Americans from hunting and gathering in their old haunts. In MacDonald’s opinion, the existence of Yellowstone National Park, and the United States of America, came at a “terrible cost” to Native Americans, and the least we can do today is acknowledge the truth. “When people look at Yellowstone, they should see a landscape rich with Native American history, not a pristine wilderness. They’re driving on roads that were Native American trails. They’re camping where people camped for thousands of years.” MacDonald has no Native American blood, but he regards the people who lived in Yellowstone for 11,000 years as something like ancestors. “We’re all descended from hunter-gatherers who lived in similar ways to the people here,” he says. 
“They were really successful at surviving in difficult conditions. We know this because we’re alive. If they hadn’t been so resourceful and successful, none of us would be here today.” He would like to see more signs and exhibits about the park’s original inhabitants, first and foremost at Obsidian Cliff, but the park service is more concerned about protecting the site from possible looting. Shane Doyle has been advocating for a tepee village inside the park, where tribal college students could teach park visitors about the Native American history. “So far I’ve got nowhere,” Doyle says. “It might take a really long time, but I’m hopeful we’ll get there in the end. Surely, they can’t just keep pretending we were never there.”
Editor's note: An earlier version of this story said that two members of the Radersburg tourist party were killed by the Nez Perce in 1877. Two tourists were shot in the head, but they all survived the attack.
74868f026a8814a4015c9b7b0f834683
https://www.smithsonianmag.com/history/lovers-shanxi-saved-chinas-ancient-architectural-treasures-before-lost-forever-180961424/
The Couple Who Saved China’s Ancient Architectural Treasures Before They Were Lost Forever
The Couple Who Saved China’s Ancient Architectural Treasures Before They Were Lost Forever Architectural preservation is rarely so thrilling as it was in 1930s China. As the country teetered on the edge of war and revolution, a handful of obsessive scholars were making adventurous expeditions into the country’s vast rural hinterland, searching for the forgotten treasures of ancient Chinese architecture. At the time, there were no official records of historic structures that survived in the provinces. The semi-feudal countryside had become a dangerous and unpredictable place: Travelers venturing only a few miles from major cities had to brave muddy roads, lice-infested inns, dubious food and the risk of meeting bandits, rebels and warlord armies. But although these intellectuals traveled by mule cart, rickshaw or even on foot, their rewards were great. Within the remotest valleys of China lay exquisitely carved temples staffed by shaven-headed monks much as they had been for centuries, their roofs filled with bats, their candlelit corridors lined with dust-covered masterpieces. Chinese Architecture: Art and Artifacts The two leaders of this small but dedicated group have taken on a mythic status in China today: the architect Liang Sicheng and his brilliant poet wife, Lin Huiyin. This prodigiously talented couple, who are now revered in much the same way as Diego Rivera and Frida Kahlo in Mexico, were part of a new generation of Western-educated thinkers who came of age in the 1920s. Born into aristocratic, progressive families, they had both studied at the University of Pennsylvania and other Ivy League schools in the United States, and had traveled widely in Europe. Overseas, they were made immediately aware of the dearth of studies on China’s rich architectural tradition. 
So on their return to Beijing, the cosmopolitan pair became pioneers of the discipline, espousing the Western idea that historic structures are best studied by firsthand observation on field trips. This was a radical idea in China, where scholars had always researched the past through manuscripts in the safety of their libraries, or at most, made unsystematic studies of the imperial palaces in Beijing. But with flamboyant bravado, Liang and Lin—along with a half dozen or so other young scholars in the grandly named Institute for Research in Chinese Architecture—used the only information available, following stray leads in ancient texts, chasing up rumors and clues found in cave murals, even, in one case, an old folkloric song. It was, Liang later wrote, “like a blind man riding a blind horse.” Despite the difficulties, the couple would go on to make a string of extraordinary discoveries in the 1930s, documenting almost 2,000 exquisitely carved temples, pagodas and monasteries that were on the verge of being lost forever. Photographs show the pair scrambling among stone Buddhas and across tiled roofs, Liang Sicheng the gaunt, bespectacled and reserved aesthete, scion of an illustrious family of political reformers (on par with being a Roosevelt or Kennedy in the U.S.), Lin Huiyin the more extroverted and effervescent artist, often wearing daring white sailor slacks in the Western fashion. The beautiful Lin was already legendary for the romantic passions she had inspired, leaving a trail of lovelorn writers and philosophers, including the renowned Indian poet Rabindranath Tagore, who once composed a poem in praise of her charms. (“The blue of the sky / fell in love with the green of the earth. 
/ The breeze between them sighs, ‘Alas!’”)
This article is a selection from the January/February issue of Smithsonian magazine.
“Liang and Lin founded the entire field of Chinese historical architecture,” says Nancy Steinhardt, professor of East Asian art at the University of Pennsylvania. “They were the first to actually go out and find these ancient structures. But the importance of their field trips goes beyond that: So many of the temples were later lost—during the war with Japan, the revolutionary civil war and the Communist attacks on tradition like the Cultural Revolution—that their photos and studies are now invaluable documents.” The romantic pair, whose letters are suffused with a love of poetry and literature, returned most often to the province of Shanxi (“west of the mountains”). Its untouched landscape was the ultimate time capsule from imperial China. An arid plateau 350 miles from Beijing, cut off by mountains, rivers and deserts, Shanxi had avoided China’s most destructive wars for over 1,000 years. There had been spells of fabulous prosperity as late as the 19th century, when its merchants and bankers managed the financial life of the last dynasty, the Qing. But by the 1930s, it had drifted into impoverished oblivion—and poverty, as the axiom goes, is the preservationist’s friend. Shanxi, it was found, resembled a living museum, where an astonishing number of ancient structures had survived. One of the most significant excursions to Shanxi occurred in 1934, when Liang and Lin were joined by two young American friends, John King Fairbank and his wife, Wilma. The couples had met through friends, and the Fairbanks became regular guests at the salons hosted by Liang and Lin for Chinese philosophers, artists and writers. It was an influential friendship: John, a lanky, sandy-haired South Dakotan, would go on to become the founding figure in Sinology in the United States, and an adviser to the U.S. 
government on Chinese policy from World War II to the 1970s. (The prestigious Fairbank Center for Chinese Studies at Harvard University bears his name.) Wilma was a fine arts major from Radcliffe, a feisty New Englander in the mold of Katharine Hepburn, who would later become an authority on Chinese art in her own right, and play a key role in saving Liang and Lin’s work from oblivion. But in the summer of 1934, the Fairbanks were two wide-eyed newlyweds in Beijing, where John was researching his PhD in Chinese history, and they eagerly agreed to meet the Liangs in Shanxi. The four spent several weeks making forays from an idyllic mountain retreat called Fenyang, before they decided to find the remote temple of Guangsheng. Today, the details of this 1934 journey can be reconstructed from an intimate photographic diary made by Wilma Fairbank and from her memoir. The prospect of the 70 miles of travel had at first seemed “trivial,” Wilma noted, but it became a weeklong expedition. Summer rains had turned the road to “gumbo,” so the antique Model T Ford they had hired gave out after ten miles. They transferred their luggage to mule carts, but were soon forced by the soldiers of the local warlord Yan Xishan, who were building a railroad line along the only roads, to take the back trails, which were traversable only by rickshaw. (John was particularly uncomfortable being pulled by human beings, and sympathized when the surly drivers complained, “We have been doing ox and horse work.”) When the tracks became “bottomless jelly,” the four were forced to walk, led after dark by a child carrying a lantern. Liang Sicheng battled on through the mire, despite his near-lame leg, the result of a youthful motorbike accident. The inns en route were dismal, so they looked for alternative arrangements, sleeping one night in an empty Ming dynasty mansion, others in the homes of lonely missionaries. 
All along the route they were surrounded by peasants who stared in wonder at Liang and Lin, unable to conceive of Chinese gentry taking an interest in their rural world. Often, the histrionic Lin Huiyin would fall into “black moods” and complain vociferously about every setback, which astonished the stiff-upper-lipped, WASPish Wilma Fairbank. But while the diva poet could be “unbearable,” Wilma conceded, “when she was rested she responded to beautiful views and humorous encounters with utter delight.” The discomforts were instantly forgotten when the exhausted party finally spotted the graceful proportions of the Guangsheng Temple one dusk. The monks allowed the Fairbanks to sleep in the moonlit courtyard, while the Liangs set up their cots beneath ancient statues. Next morning, the Liangs marveled at the temple’s inventive structural flourishes created by a nameless ancient architect, and found a fascinating mural of a theatrical performance from A.D. 1326. They climbed a steep hill to the Upper Temple, where a pagoda was encrusted in colored glazed tiles. Behind the enormous Buddha’s head was a secret staircase, and when they reached the 13th story, they were rewarded with sweeping views of the countryside, as serene as a Ming watercolor. The years of field trips would ultimately represent an interlude of dreamlike contentment for Liang and Lin, as their lives were caught in the wheels of Chinese history. All explorations in northern China were halted by the Japanese invasion in 1937, which forced the couple to flee Beijing with their two young children to ever-harsher and more distant refuges. (The Fairbanks had left one year earlier, but John returned as a U.S. intelligence officer during World War II and Wilma soon after.) 
There was a moment of hope after the Japanese surrender, when Liang and Lin were welcomed back to Beijing as leading intellectuals, and Liang, as “the father of modern Chinese architecture,” returned to the United States to teach at Yale in 1946 and work with Le Corbusier on the design of the United Nations Plaza in New York. But then came the Communist triumph in 1949. Liang and Lin initially supported the revolution, but soon found themselves out of step with Mao Zedong’s desire to eradicate China’s “feudal” heritage. Most famously, the pair argued passionately for the preservation of Beijing, then the world’s largest and most intact walled city, considered by many as beautiful as Paris. Tragically, Mao ordered its 25 miles of fortress walls and many of its monuments destroyed—which one U.S. scholar has denounced as “among the greatest acts of urban vandalism in history.” The rest of their lives have a tragic aura. Lin Huiyin, who had always been frail, succumbed to a long battle with tuberculosis in 1955, and Liang, despite his international renown, was trapped in 1966 by the anti-intellectual mania of the Cultural Revolution. The frenzied attack on Chinese tradition meant that Liang was forced to wear a black placard around his neck declaring him a “reactionary academic authority.” Beaten and mocked by Red Guards, stripped of his honors and his position, Liang died brokenhearted in a one-room garret in 1972, convinced that he and his wife’s life’s work had been wasted. Miraculously, he was wrong, thanks to the dramatic volte-face of China’s modern history. After the death of Mao in 1976, Liang Sicheng was among the first wave of persecuted intellectuals to be rehabilitated. Lin Huiyin’s poetry was republished to widespread acclaim, and Liang’s portrait even appeared on a postage stamp in 1992. 
In the 1980s, Wilma Fairbank managed to track down the pair’s drawings and photographs from the 1930s, and reunite them with a manuscript Liang had been working on during World War II. The posthumous volume, An Illustrated History of Chinese Architecture, became an enduring testament to the couple’s work. Today, the younger generations of Chinese are fascinated by these visionary figures, whose dramatic lives have turned them into “cultural icons, almost with demigod status,” says Steinhardt of the University of Pennsylvania. The dashing pair have been the subjects of TV documentaries, and Lin Huiyin’s love life has been pored over in biographies and soap operas. She is regularly voted the most beautiful woman in Chinese history and will be played in an upcoming feature film by the sultry actress Zhang Ziyi, of Crouching Tiger, Hidden Dragon fame. “For Chinese women, Lin Huiyin seems to have it all,” says Annie Zhou, Lin’s great-granddaughter, who was raised in the United States. “She’s smart, beautiful and independent. But there’s also a nostalgia for her world in the 1920s and ’30s, which was the intellectual peak of modern Chinese history.” “Since when did historical preservationists get to be so sexy?” muses Maya Lin, the famous American artist and architect, who happens to be Lin Huiyin’s niece. Talking in her loft-studio in downtown Manhattan, Maya pointed through enormous windows at the cast-iron district of SoHo, which was saved by activists in New York in the 1960s and ’70s. “They’ve become folk heroes in China for having stood up for preservation, like Jane Jacobs here in New York, and they are celebrities in certain academic circles in the United States.” She recalls being cornered by elderly (male) professors at Yale who raved about meeting her aunt, their eyes lighting up when they spoke of her. “Most people in China know more about Liang and Lin’s personalities and love lives than their work. 
But from an architectural point of view, they are hugely important. If it weren’t for them, we would have no record of so many ancient Chinese styles, which simply disappeared.” Since China’s embrace of capitalism in the 1980s, a growing number of Chinese are realizing the wisdom of Liang and Lin’s preservation message. As Beijing’s wretched pollution and traffic gridlock have reached world headlines, Liang’s 1950 plan to save the historic city has taken on a prophetic value. “I realize now how terrible it is for a person to be so far ahead of his time,” says Hu Jingcao, the Beijing filmmaker who directed the documentary Liang and Lin in 2010. “Liang saw things 50 years before everyone else. Now we say, Let’s plan our cities, let’s keep them beautiful! Let’s make them work for people, not just cars. But for him, the idea only led to frustration and suffering.” The situation is more encouraging in Liang and Lin’s favorite destination, Shanxi. The isolated province still contains around 70 percent of China’s structures older than the 14th century—and the couple’s magnum opus on Chinese architecture can be used as a unique guidebook. I had heard that the most evocative temples survive there, although they take some effort to reach. The backwaters of Shanxi remain rustic, their inhabitants unused to foreigners, and getting around is still an adventure, even if run-ins with warlords have been phased out. A renewed search for the temples would provide a rare view back to the 1930s, when China was poised on the knife-edge of history, before its slide into cataclysmic wars and Maoist self-destruction. Of course, historic quests in modern China require some planning. It’s one of the ironies of history that the province containing the greatest concentration of antiquities has also become one of the most polluted spots on the planet. 
Since the 1980s, coal-rich Shanxi has sold its black soul to mining, its hills pockmarked with smelters churning out electricity for the country’s insatiable factories. Sixteen of the world’s 20 most polluted cities are in China, according to a recent study by the World Bank. Three of the worst are in Shanxi. I had to wonder where Liang and Lin would choose as a base today. As the plane approached Taiyuan, the provincial capital, and dove beneath rust-colored layers of murk, the air in the cabin suddenly filled with the smell of burning rubber. This once-picturesque outpost, where Liang and Lin clambered among the temple eaves, has become one of China’s many anonymous “second-tier” cities, ringed by shabby skyscrapers. Other Shanxi favorites have suffered in the development craze. In the grottoes of Yungang, whose caves full of giant carved Buddhas were silent and eerie when Lin sketched them in 1931, riotous tour groups are now funneled through an enormous new imperial-style entrance, across artificial lakes and into faux palaces, creating a carnival atmosphere. But luckily, there is still a place where Liang and Lin would feel happy —Pingyao, China’s last intact walled town, and one of its most evocative historic sites. When the pair was traveling in the 1930s, dozens and dozens of these impressive fortress towns were scattered across the Shanxi plains. In fact, according to the imperial encyclopedia of the 14th century, there were 4,478 walled towns in China at one time. But one by one their defenses were knocked down after the revolution as symbols of the feudal past. Pingyao survived only because authorities in the poor district lacked the resources to topple its formidable fortifications, which are up to 39 feet thick, 33 feet high and topped with 72 watchtowers. 
The crenelated bastions, dating from 1370, also enclosed a thriving ancient town, its laneways lined with lavish mansions, temples and banks dating from the 18th century, when Pingyao was the Qing dynasty’s financial capital. A dusty highway now leads to Pingyao’s enormous fortress gates, but once inside, all vehicular traffic is forced to stop. It is an instant step back to the elusive dream of Old China. On my own visit, arriving at night, I was at first disconcerted by the lack of street lighting. In the near-darkness, I edged along narrow cobbled alleys, past noodle shops where the cooks were bent over bubbling caldrons. Street vendors roasted kebabs on charcoal grills. Soon my eyes adjusted to the dark, and I spotted rows of lanterns illuminating ornate facades with gold calligraphy, all historic establishments dating from the 16th to 18th centuries, including exotic spice merchants and martial arts agencies that had once provided protection for banks. One half-expects silk-robed kung fu warriors to appear, tripping lightly across the terra-cotta tile roofs à la Ang Lee. Liang and Lin’s spirits hover over the remote town today. Having survived the Red Guards, Pingyao became the site of an intense conservation battle in 1980, when the local government decided to “rejuvenate” the town by blasting six roads through its heart for car traffic. One of China’s most respected urban historians, Ruan Yisan of Shanghai’s Tongji University—who met Lin Huiyin in the early 1950s and attended lectures given by Liang Sicheng—arrived to halt the steamrollers. He was given one month by the state governor to devise an alternative proposal. Ruan took up residence in Pingyao with 11 of his best students and got to work, braving lice, rock-hard kang beds with coal burners beneath them for warmth, and continual bouts of dysentery. Finally, Ruan’s plan was accepted, the roads were diverted and the old town of Pingyao was saved. 
His efforts were rewarded when Unesco declared the entire town a World Heritage site in 1997. Only today is it being discovered by foreign travelers. The town’s first upscale hotel, Jing’s Residence, is housed inside the magnificent 18th-century home of a wealthy silk merchant. After an exacting renovation, it was opened in 2009 by a coal baroness named Yang Jing, who first visited Pingyao 22 years ago while running an export business. Local craftsmen employed both ancient and contemporary designs in the interior, and the chef specializes in modern twists on traditional dishes, such as the local corned beef served with cat’s ear-shaped noodles. Many Chinese are now visiting Pingyao, and although Prof. Ruan Yisan is 82 years old, he returns every summer to monitor its condition and lead teams on renovation projects. I met him over a banquet in an elegant courtyard, where he was addressing fresh-faced volunteers from France, Shanghai and Beijing for a project that would now be led by his grandson. “I learned from Liang Sicheng’s mistakes,” he declared, waving his chopsticks theatrically. “He went straight into conflict with Chairman Mao. It was a fight he couldn’t win.” Instead, Ruan said, he preferred to convince government officials that heritage preservation is in their own interest, helping them improve the economy by promoting tourism. But, as ever, tourism is a delicate balancing act. For the moment, Pingyao looks much as it did when Liang and Lin were traveling, but its population is declining and its hundreds of ornate wooden structures are fragile. “The larger public buildings, where admission can be charged, are very well maintained,” Ruan explained. 
“The problem is now the dozens of residential houses that make up the actual texture of Pingyao, many of which are in urgent need of repair.” He has started the Ruan Yisan Heritage Foundation to continue his efforts to preserve the town, and he believes a preservation spirit is spreading in Chinese society—if gradually. The hotelier Yang Jing agrees: “At first, most Chinese people found Pingyao too dirty,” she said. “They certainly didn’t understand the idea of a ‘historic hotel,’ and would immediately ask to change to a bigger room, then leave after one night. They wanted somewhere like a Hilton, with a big shiny bathroom.” She added with a smile: “But it has been slowly changing. People are tired of Chinese cities that all look the same.” Poring over Liang and Lin’s Illustrated History, I drew up a map of the couple’s greatest discoveries. While Shanxi is little visited by travelers, its rural villages seem to have fallen off the charts entirely. Nobody in Pingyao had even heard of the temples I spoke about, although they were included on detailed road charts. So I was forced to cajole wary drivers to take me to visit the most sacred, forgotten spots. Some, like the so-called Muta, China’s tallest wooden pagoda dating from 1056, were easy to locate: The highway south of Datong runs alongside it, so it still rises gracefully over semi-suburban farmland. Others, like the Guangsheng Temple, which Liang and Lin visited with the Fairbanks in 1934, involved a more concerted effort. It lies in the hills near Linfen, now one of the most toxic of Shanxi’s coal outposts. (In 2007, Linfen had the honor of being declared “the world’s most polluted city.”) Much of the landscape is now completely disguised by industry: Mountains are stripped bare, highways are clogged with coal trucks. Back in 1934, Lin Huiyin had written, “When we arrived in Shanxi, the azure of the sky was nearly transparent, and the flowing clouds were mesmerizing.... 
The beauty of such scenery pierced my heart and even hurt a little.” Today, there are no hints of azure. A gritty mist hangs over everything, concealing all views beyond a few hundred yards. It’s a haunted landscape where you never hear birds or see insects. Here, the silent spring has already arrived. Finally, the veil of pollution lifts as the road rises into the pine-covered hills. The Lower Temple of Guangsheng is still announced by a bubbling emerald spring, as it was in 1934, and although many of the features were vandalized by Japanese troops and Red Guards, the ancient mural of the theatrical performance remains. A monk, one of 20 who now live there, explained that the Upper Temple was more intact. (“The Red Guards were too lazy to climb there!”) I counted 436 steps up to the hill crest, where the lovely 13-story pagoda was still gleaming with colored glazed tiles. Another monk was meditating cross-legged, as a cassette recorder played Om Mani Padme Hum. I was determined to find the “secret” stairway. After making endless inquiries, I convinced a guard to wake the abbot from his afternoon nap and got a key. He led me into the pagoda and opened a grille to the second level, now followed by a couple of other curious monks. It was pitch black, so I used the light from my iPhone to peer behind an enormous grinning Buddha. Sure enough, there were worn stone steps leading up. Wilma described the staircase’s unique design: “We groped our way up in single file. At the top of the first flight, we were startled to find that there were no landings. When you bumped your head against a blank wall you knew you had come to the end of one flight of stairs. You had to turn around there and step over empty space onto the first step of the next flight.” I eagerly pressed ahead—but was soon blocked by another padlocked grille, whose key, the guard remembered, was kept by a government official in the faraway capital, no doubt in his desk drawer. 
Still, as I crouched in the darkness, I could see that the ancient architect really had built no landings, for reasons we will never know. Liang and Lin’s greatest triumph came three years later. Their dream had always been to find a wooden temple from the golden age of Chinese art, the glorious Tang dynasty (A.D. 618-907). It had always rankled that Japan claimed the oldest structures in the East, although there were references to far more ancient temples in China. But after years of searching, the likelihood of finding a wooden building that had survived 11 centuries of wars, periodic religious persecutions, vandalism, decay and accidents had begun to seem fantastical. (“After all, a spark of incense could bring down an entire temple,” Liang fretted.) In June 1937, Liang and Lin set off hopefully into the sacred Buddhist mountain range of Wutai Shan, traveling by mule along serpentine tracks into the most verdant pocket of Shanxi, this time accompanied by a young scholar named Mo Zongjiang. The group hoped that, while the most famous Tang structures had probably been rebuilt many times over, those on the less-visited fringes might have endured in obscurity. The actual discovery must have had a cinematic quality. On the third day, they spotted a low temple on a crest, surrounded by pine trees and caught in the last rays of sun. It was called Foguang Si, the Temple of Buddha’s Light. As the monks led them through the courtyard to the East Hall, Liang and Lin’s excitement mounted: A glance at the eaves revealed its antiquity. “But could it be older than the oldest wooden structure we had yet found?” Liang later wrote breathlessly. Today, Wutai Shan’s otherworldly beauty is heightened by a blissful lack of pollution. From winding country roads that seemed to climb forever, I looked down at immense views of the valleys, then stared up in grateful acknowledgment of the blue sky. 
The summer air was cool and pure, and I noticed that many of the velvety green mountains were topped with their own mysterious monasteries. The logistics of travel were also reminiscent of an earlier age. Inside the rattling bus, pilgrims huddled over their nameless food items, each sending a pungent culinary odor into the exotic mix. We arrived at the only town in the mountain range, a Chinese version of the Wild West, where the hotels seem to actually pride themselves on provincial inefficiency. I took a room whose walls were covered in three types of mold. In the muddy street below, dogs ran in and out of stores offering cheap incense and “Auspicious Artifacts Wholesale.” I quickly learned that the sight of foreigners is rare enough to provoke stares and requests for photographs. And ordering in the restaurants is an adventure all its own, although one menu provided heroic English translations, evidently plucked from online dictionaries: Tiger Eggs with Burning Flesh, After the Noise Subspace, Delicious Larry, Elbow Sauce. Back at my hotel, guests smoked in the hallways in their undershirts; on the street below, a rooster crowed from 3 a.m. until dawn. I could sympathize with Lin Huiyin, who complained in one letter to Wilma Fairbank that travel in rural China alternated between “heaven and hell.” (“We rejoice over all the beauty and color in art and humanity,” she wrote of the road, “and are more than often appalled and dismayed by dirt and smells of places we have to eat and sleep.”) In the morning, I haggled with a driver to take me the last 23 miles to the Temple of Buddha’s Light. It is another small miracle that the Red Guards never made it to this lost valley, leaving the temple in much the same condition as when Liang and Lin stumbled here dust-covered on their mule litters. I found it, just as they had, bathed in crystalline sunshine among the pine trees. Across an immaculately swept courtyard, near-vertical stone stairs led up to the East Hall. 
At the top, I turned around and saw that the view across the mountain ranges had been totally untouched by the modern age. In 1937, when monks heaved open the enormous wooden portals, the pair was struck by a powerful stench: The temple’s roof was covered by thousands of bats, looking, according to Liang, “like a thick spread of caviar.” The travelers gazed in rapture as they took in the Tang murals and statues that rose “like an enchanted deified forest.” But most exciting were the designs of the roof, whose intricate trusses were in the distinctive Tang style: Here was a concrete example of a style hitherto known only from paintings and literary descriptions, and whose manner of construction historians could previously only guess at. Liang and Lin crawled over a layer of decaying bat corpses beneath the ceiling. They were so excited to document details such as the “crescent-moon beam,” they didn’t notice the hundreds of insect bites until later. Their most euphoric moment came when Lin Huiyin spotted lines of ink calligraphy on a rafter, and the date “The 11th year of Ta-chung, Tang Dynasty”—A.D. 857 by the Western calendar, confirming that this was the oldest wooden building ever found in China. (An older temple would be found nearby in the 1950s, but it was far more humble.) Liang raved: “The importance and unexpectedness of our find made this the happiest hours of my years of hunting for ancient architecture.” Today, the bats have been cleared out, but the temple still has a powerful ammonia reek—the new residents being feral cats. Liang and Lin’s discovery also had a certain ominous poignancy. When they returned to civilization, they read their first newspaper in weeks—learning to their horror that while they were enraptured in the Temple of Buddha’s Light, on July 7 the Japanese Army had attacked Beijing. It was the beginning of a long nightmare for China, and decades of personal hardship for Liang and Lin. 
In the agonizing years to come, they would return to this moment in Shanxi as the time of their greatest happiness. “Liang and Lin’s generation really suffered in China,” says Hu Jingcao, director of the eight-part Chinese TV series on Liang and Lin. “In the 1920s and ’30s, they led such beautiful lives, but then they were plunged into such misery.” Liang Sicheng outlived Lin by 17 years, and saw many of his dreams shattered as Beijing and many historical sites were destroyed by thoughtless development and rampaging Maoist cadres. “How could anyone succeed at that time?” asked Hu Jingcao. In the depths of the Sino-Japanese war in 1941, lying in her sickbed, Lin Huiyin had written a poem for an airman friend killed in combat:

Let’s not talk about who wronged you.
It was the age, hopeless, unweighable.
China has yet to move forward; dark night
Waits its daybreak.

It could stand as an elegy for herself and her husband. ********** Back in Beijing, I had one last pilgrimage to make. Liang and Lin’s courtyard home of the 1930s has become a contested symbol of the pair’s complex legacy. As the world knows, the Chinese capital is one of the world’s great planning disasters. Even the better-educated taxi drivers talk with nostalgia of the plan Liang Sicheng once offered that would have made it a green, livable city. (He even wanted to turn the top of the walls into a pedestrian park, anticipating the High Line in New York by six decades.) 
According to activist He Shuzhong, founder of the Beijing Cultural Heritage Protection Center, the public’s new fascination with Liang and Lin reflects a growing unease that development has gone too far in destroying the past: “They had a vision of Beijing as a human-scale city,” he said, “which is now nothing but a dream.” From the relative calm of the Peninsula Hotel near the Forbidden City, I walked for 20 minutes along an avenue of gleaming skyscrapers toward the roaring din of the Second Ring Road, built on the outline of the city walls destroyed by Mao. (On the evening before the wrecking balls arrived, Liang sat on the walls and wept.) Hidden behind a noodle bar was the entrance to one of the few remaining hutongs, or narrow laneways, that once made Beijing such an enchanting historical bastion. (The American city planner Edmund Bacon, who spent a year working in China in the 1930s, described Old Beijing as “possibly the greatest single work of man on the face of the earth.”) Number 24 Bei Zong Bu was where Liang and Lin spent some of their happiest days, hosting salons for their haute-bohemian friends, which included the Fairbanks—discussing the latest news in European art and Chinese literature, and the gossip from Harvard Square. The future challenges for Chinese preservationists are inscribed in the story of this site. In 2007, the ten families who occupied the mansion were moved out, and plans were made to redevelop the area. But an instant outcry led Liang and Lin’s house, although damaged, to be declared an “immovable cultural relic.” Then, in the lull before Chinese New Year in 2012, a construction company with links to the government simply moved in and destroyed the house overnight. When the company was slapped with a token $80,000 fine, outrage flooded social media sites, and even some state-owned newspapers condemned the destruction. 
Preservationists were at least heartened by the outcry and described it as China’s “Penn Station moment,” referring to the destruction of the New York landmark in 1963 that galvanized the U.S. preservation movement. When I arrived at the address, it was blocked off by a high wall of corrugated iron. Two security guards eyed me suspiciously as I poked my head inside to see a construction site, where a half-built courtyard house, modeled on the ancient original, stood surrounded by rubble. In a typically surreal Chinese gesture, Liang and Lin’s home is now being recreated from plans and photographs as a simulacrum, although no official announcements have been made about its future status as a memorial. Despite powerful obstacles, preservationists remain cautiously optimistic about the future. “Yes, many Chinese people are still indifferent to their heritage,” admits He Shuzhong. “The general public, government officials, even some university professors only want neighborhoods to be bigger, brighter, with more designer stores! But I think the worst period of destruction is over. The protests over Liang and Lin’s house show that people are valuing their heritage in a way they weren’t five years ago.” How public concern can be translated into government policy in authoritarian China remains to be seen—the sheer amount of money behind new developments and the levels of corruption often seem unstoppable—but the growing number of supporters shows that historic preservation may soon be based on more than just hope. ********** On my return to Manhattan, Maya Lin recalled that it wasn’t until she was 21 that her father told her about her celebrated aunt. He admitted that his “adoration” of his older sister, Lin Huiyin, had made him invert the traditional Chinese favoritism for sons, and place all his hopes and attention on her. “My whole life was framed by my father’s respect for Lin Huiyin,” she marveled. 
The artist showed me a model for a postmodern bell tower she is designing for Shantou University, in Guangdong province, China. Whereas Liang Sicheng and Lin Huiyin never had the opportunity to design any great buildings of their own, the newly rich China has become one of the world’s hotbeds of innovative contemporary architecture. “You could say that Lin’s passion for art and architecture flows through me,” Maya said. “Now I’m doing what she wanted to.” Stefen Chow is a Beijing-based photographer with extensive mountaineering experience. His work frequently appears in the Wall Street Journal and Fortune magazine, and he is at work on a long-term project called Poverty Line, which examines the daily food choices of the poor in countries around the world. Tony Perrottet is a contributing writer for Smithsonian magazine, a regular contributor to the New York Times and WSJ Magazine, and the author of six books including ¡Cuba Libre!: Che, Fidel and the Improbable Revolution that Changed World History, The Naked Olympics: The True Story of the Ancient Games and Napoleon's Privates: 2500 Years of History Unzipped.
1d3f5066f85253f15de98bd96adf6c1a
https://www.smithsonianmag.com/history/lunch-atop-a-skyscraper-photograph-the-story-behind-the-famous-shot-43931148/?no-ist
NYC
NYC On September 20, 1932, high above 41st Street in Manhattan, 11 ironworkers took part in a daring publicity stunt. The men were accustomed to walking along the girders of the RCA building (now called the GE building) they were constructing in Rockefeller Center. On this particular day, though, they humored a photographer, who was drumming up excitement about the project’s near completion. Some of the tradesmen tossed a football; a few pretended to nap. But, most famously, all 11 ate lunch on a steel beam, their feet dangling 850 feet above the city’s streets. You’ve seen the photograph before—and probably some of the playful parodies it has spawned too. My brother had a poster in his childhood bedroom with actors, such as Tom Cruise and Leonardo DiCaprio, photoshopped in place of the steelworkers. The portrait has become an icon of 20th century American photography. But how much do you know about it? For the Irish filmmaker Seán Ó Cualáin, the mystery surrounding the photograph is a large part of its appeal. “There are so many unknowns,” he says. Who was the photographer? And who are the men? “They could be anybody,” says Ó Cualáin. “We can all place ourselves on that beam. I think that is why the photograph works.” Ó Cualáin did not plan to tell the story of the photograph, but that’s exactly what he has done in his latest documentary, Men at Lunch, which debuted earlier this month at the Toronto International Film Festival. “It was a happy accident,” says Ó Cualáin. He and his brother, Eamonn, the film’s producer, were in a pub in Galway, when they noticed a copy of the photograph hanging in a corner. Beside the photograph was a note from the son of a local immigrant who left Ireland for New York in the 1920s: "This is my dad on the far right and my uncle-in-law on the far left." They asked the bartender about the note, and "like all good Irish barmen," says Ó Cualáin, he put them in contact with Pat Glynn, the Bostonite who penned it, that very night. 
The filmmakers’ curiosity led them on a journey from the supposed relatives of a couple of the men pictured to the Rockefeller Center photography archives in New York City and a storage facility in Pennsylvania where the licensing company Corbis holds the original glass plate negative. In the process, the Ó Cualáin brothers confirmed that the photograph is real, and not a darkroom trick, as has been speculated. They turned up three possible photographers and, for the first time ever, unquestionably identified two of the men on the beam. Click on the highlighted portions of the famous photograph, below, to learn more about its long-held secrets. The notes have been prepared based on conversations with Seán Ó Cualáin and Ken Johnston, director of historical photography at Corbis. The photograph is part of Corbis’ prestigious Bettmann Archive. Lunch atop a Skyscraper (PDF) Lunch atop a Skyscraper (Text) Megan Gambino is an editor and writer for Smithsonian.com and founded “Document Deep Dive.” Previously, she worked for Outside magazine in New Mexico.
0b788d9f2d0f568f302daf11d8198b43
https://www.smithsonianmag.com/history/madam-cj-walker-netflix-close-up-180974152/
Madam C.J. Walker Gets a Netflix Close-Up
Madam C.J. Walker Gets a Netflix Close-Up Madam C.J. Walker, born Sarah Breedlove in Louisiana in 1867, was the most successful black wellness mogul of her day. Now a new Netflix series will show how this enterprising daughter of freed slaves empowered generations of black women to prosper. Breedlove was in her 30s when she began treating her bald spots with beeswax, copper sulfate and sulfur. She found it so effective she sold it to other black women door-to-door in Denver. In 1908, having married a Colorado journalist named Charles J. Walker, she launched a beauty school in Pittsburgh with the profits from “Mrs. Walker’s Wonderful Hair Grower.” Soon she was training a legion of women who earned a living selling her products. Indeed, her most significant legacy might be “the opportunities she provided for other black women to become economically autonomous through selling her products,” says Crystal Marie Moten, a curator at the National Museum of American History. Walker died in 1919, and even after making substantial donations to organizations in the black community, her fortune was estimated at $600,000 to $700,000—or $8.9 million to $10.7 million today. Read more about Madam C.J. Walker in this story from a Smithsonian magazine collaboration with Wondery. This article is a selection from the March 2020 issue of Smithsonian magazine Ted Scheinman is a senior editor for Smithsonian magazine. He is the author of Camp Austen: My Life as an Accidental Jane Austen Superfan
16da43171a8516503eaf71367767609d
https://www.smithsonianmag.com/history/madam-montessori-68331163/
Madam Montessori
Madam Montessori Six-year-old Shari and her 5-year-old classmate Ugochi are adding 1,756 and 1,268. They’ve penciled the numbers neatly into their notebooks, but the method they’re using to come up with the answer—3,024—isn’t something you’d see in most American schools, let alone kindergartens. Each little girl loads a wooden tray with gold beads. Sprawled on a mat on the floor, they combine six of Shari’s beads and eight of Ugochi’s. “Nine units, ten units!” Ugochi counts triumphantly. With that, she scoops up ten beads and skips across the room to a cabinet, where she trades them in for a “10 bar”—ten beads wired together. Now the girls count in unison: “five 10s, six 10s, seven, eight, nine, ten 10s!” Then, pigtails flying, they run to trade in the 10s for a 100. The 21 other children in the class at the public Matthew Henson Elementary School in Landover, Maryland, seem equally energetic as they follow their own independent agendas. Five-year-old Taiwo lays out wooden letters that spell “May is back. I am happy.” Nearby, two 4-year-old boys stack pink blocks, watch them topple, then stack them again, this time with the larger ones on the bottom. A 3-year-old uses a cotton swab to polish a tiny silver pitcher—a task that refines motor skills—while a 5-year-old gets herself a bowl of cereal, eats it at the snack table, then cleans up everything. Nearly a century ago, a young Italian physician imagined that children would learn better in a classroom like this one—a place where they could choose among lessons carefully designed to encourage their development. Since then, the views of Maria Montessori, who died 50 years ago this year, have met with both worldwide acclaim and yawning indifference. Her method, which she developed with the children of Rome’s worst slum, is now more commonly applied to the oft-pampered offspring of the well-heeled. 
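The exchange the girls perform, trading ten unit beads for a "10 bar" and ten 10 bars for a 100, is ordinary base-10 carrying made tangible. As a minimal sketch (the function names here are illustrative, not anything from the classroom), the bead procedure can be simulated like this:

```python
def to_beads(n):
    """Split a number (up to 9,999) into piles of 1,000-cubes, 100-squares, 10-bars and unit beads."""
    return {1000: n // 1000, 100: n // 100 % 10, 10: n // 10 % 10, 1: n % 10}

def bead_add(a, b):
    """Add two numbers the golden-bead way: pool the piles, then exchange upward."""
    beads_a, beads_b = to_beads(a), to_beads(b)
    piles = {place: beads_a[place] + beads_b[place] for place in (1, 10, 100, 1000)}
    # The "carry": any ten pieces of one size are traded for one piece of the next size up,
    # just as Ugochi trades ten unit beads for a 10 bar at the cabinet.
    for place in (1, 10, 100):
        piles[place * 10] += piles[place] // 10
        piles[place] %= 10
    return sum(place * count for place, count in piles.items())
```

Running `bead_add(1756, 1268)` performs exactly the exchanges described above and arrives at the girls' answer, 3,024.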
Montessorians embrace Maria and her ideology with a fervor that often borders on the cultlike, while critics say Montessori classes are either too lax and individualized or, paradoxically, too rigidly structured. “Her ideas were so radical,” says Mary Hayes, general secretary of the Association Montessori Internationale (AMI). “We’re still trying to convince the world that this is the best way for children to grow.” Teacher Rosemary Beam Alcott sits on the floor with Ugochi and Shari, who show her their notebooks. “Did you exchange your 10 ones for a 10 bar? Did you carry? Did you write it down? How many 100s do you have?” “None,” Ugochi replies. “That’s great!” says Alcott. She turns to Taiwo. “May is back. I am happy. Me is flowers,” the child and teacher read together. “It doesn’t make sense,” Alcott says. Taiwo giggles. Back to the mathematicians. “Ugochi, please show me a 3 going in the right direction.” Ugochi erases, and writes again. “Good job! OK, put the beads away. I’m going to give you another problem.” Back to Taiwo, whose letters now read, “May is back. I am happy the flowers smell good.” “Wow!” exclaims Alcott. “What a wonderful story.” Now a 5-year-old boy brings her his work. Using pieces from a wooden puzzle, he has traced the states around Texas on a piece of paper, colored them, copied labels and pasted them onto his new map. “Louisiana, Arkansas, Oklahoma, New Mexico,” reads Alcott. “Very good!” Montessori’s own life was fraught with conflict and controversy. Born in 1870, of genteel origins, she fought doggedly for the right to study medicine, becoming Italy’s first female physician. Yet she abandoned medicine to embrace education, a profession she had once scorned. An outspoken advocate of women’s rights, for years she hid the fact that she was the mother of an illegitimate child. Little Mario was sent to a wet nurse in the country and later to boarding school. 
It wasn’t until he was 15, and Montessori’s own mother had died, that she publicly acknowledged her son and brought him to live with her. Yet whatever her personal travails, Montessori’s educational vision has not only survived into a new century, it is thriving as never before. Many of her once-radical ideas—including the notions that children learn through hands-on activity, that the preschool years are a time of critical brain development and that parents should be partners in their children’s education—are now accepted wisdom. “She made a lasting contribution,” says David Elkind, professor of child development at Tufts University and author of The Hurried Child. “She recognized that there was an education particularly appropriate to young children, that it wasn’t just a smaller-sized second grade.” Indeed, a half century after her death, Montessori methods are used increasingly in public schools like Henson, in Prince George’s County, Maryland, where 400 children are on a waiting list for Montessori classes. The county adopted Montessori in 1986 as part of a school desegregation program, and parents have fought hard to keep it. Doris Woolridge, who has three daughters, including Shari, in Montessori classes at Henson, believes the system can hold its own, even in this era of increased emphasis on standardized exams. “To see a 5-year-old adding into the thousands—I’m just amazed,” says Woolridge, an attorney for the District of Columbia. “I saw them working with the beads, and they learned so quickly.” Among other things, Woolridge approves of the Montessori idea of multiage classrooms. “The younger kids mimic the older kids,” she says, “and the older ones help lead the class.” Perhaps none of Maria Montessori’s ideas sound as revolutionary now as they once did, but in her time she was a breaker of barriers. Born in the Italian province of Ancona, she grew up in a time when teaching was one of the few professions open to educated women. 
Her father, an accountant, urged her to take that path, but her mother supported Maria’s insistence, at age 12, that she attend a technical school to study mathematics. In her teens, Maria further tested her father’s patience by considering becoming an engineer. She gave that up only because she decided to be a doctor. University officials finally surrendered to her persistence, but Maria’s fellow medical students shunned her, and she was allowed to perform dissections only at night, alone, because it was unthinkable that men and women would view a naked body together. In 1896, at age 25, Maria completed her medical degree. “So here I am: famous!” she wrote to a friend. “It is not very difficult, as you see. I am not famous because of my skill or my intelligence, but for my courage and indifference towards everything.” Fame, however earned, had its privileges. Later that year, Montessori was asked to represent Italy at an international women’s congress in Berlin. The press swooned over the charming, bright-eyed young doctor who called for equal pay for women. “The little speech of Signorina Montessori,” wrote one Italian journalist, “with its musical cadence and the graceful gestures of her elegantly gloved hands, would have been a triumph even without her medical degree or her timely spirit of emancipation—the triumph of Italian feminine grace.” Back home in Rome, Montessori began caring for private patients and doing research at the University of Rome’s psychiatric clinic. At the asylum, she came in contact with children labeled “deficient and insane,” though most were more likely autistic or retarded. Locked all day in barren rooms, they would scuffle over crumbs of bread on the floor. Observing them, Montessori realized that the children were starved not for food but for stimulation. That set her to reading widely, in philosophy, anthropology and educational theory. Mental deficiency, she decided, was often a pedagogical problem. 
Experimenting with various materials, she developed a sensory-rich environment, designing letters, beads and puzzles that children could manipulate, and simple tasks such as mat weaving that prepared them for more challenging ones. After working with Montessori for two years, some of the “deficient” children were able to read, write and pass standard public-school tests. If retarded children could conquer such exams, Montessori wondered, what results would her methods have on normal youngsters in traditional classroom settings? She visited schools and found students “like butterflies mounted on pins,” she wrote, “fastened each to his place, the desk, spreading the useless wings of barren and meaningless knowledge which they have acquired.” Montessori’s own barely formed vision combined Jean-Jacques Rousseau’s philosophy of the nobility of the child with a more pragmatic view that work—and through it the mastery of the child’s immediate environment—was the key to individual development. To do that, she maintained, each child must be free to pursue what interests him most at his own pace but in a specially prepared environment. Montessori’s chance to act on her philosophy came in 1906 when a group of real estate investors asked her to organize a program for the children in Rome’s downtrodden San Lorenzo district so that the children, whose parents were off working all day, would not deface building walls. The investors gave Montessori a room in one of the buildings and 50 preschoolers, ages 2 to 6. Her medical colleagues were amazed that she would involve herself in something as mundane as day care, but Montessori was undeterred. She asked society women to contribute money for toys and materials and hired the daughter of the building’s porter to assist her. The Casa dei Bambini, or Children’s House, opened January 6, 1907. At first, Montessori just observed. 
She noticed that the children came to prefer her teaching materials to toys and would spend hours putting wooden cylinders into holes or arranging cubes to build a tower. As they worked, they became calmer and happier. As the months passed, Montessori modified materials and added new activities, including gardening, gymnastics, making and serving lunch, and caring for pets and plants. Children who misbehaved were given nothing to do. The children soon started asking Montessori to teach them to read and write. So she devised sandpaper letters that they could touch and trace, pronouncing the sounds as they did so. One day during recess, a 5-year-old boy cried excitedly, “I can write!” and wrote the word mano—hand—with chalk on the pavement. Other children began writing, too, and news of the miraculous 4- and 5-year-olds who taught themselves to write traveled quickly. Acolytes from around the world flocked to Rome to sit at Montessori’s knee, and soon Montessori schools were popping up in Switzerland, England, the United States, India, China, Mexico, Syria and New Zealand. Alexander Graham Bell, who had started his career as a teacher of the deaf, was fascinated by Montessori and in 1912 established a Montessori class in his Washington, D.C., home for his two grandchildren and a half-dozen neighborhood kids. A Montessori class, taught in a glass-walled classroom, would be one of the most popular exhibitions at the 1915 Panama–Pacific International Exposition in San Francisco. But success proved more than even Montessori could handle. Though she had resigned her university chair to concentrate on the schools, she found herself overwhelmed by the demands for lectures, training and interviews. She complained bitterly about books describing her program and insisted that only she was qualified to train teachers. 
The fact that she had patented her teaching materials irked more than a few critics, one of whom decried the act as “sordid commercialism.” Other educators also raised questions. Most prominent among them was William Heard Kilpatrick, a disciple of John Dewey, who dismissed Montessori’s methods as too formal and restrictive, failing to spark children’s imaginations sufficiently. By the 1920s, interest in Montessori had waned in the United States. A Montessori revival began in the late 1950s, led by Nancy Rambusch, a mother frustrated by the lack of choices for her children’s education. After going to Europe for Montessori training, she started a school in Greenwich, Connecticut. Others followed. Today, there are some 5,000 Montessori schools in the United States, some affiliated with AMI, others with the American Montessori Society, founded by Rambusch. Some schools using Montessori methods are not certified at all, and some that claim to use them do anything but. The little research that exists on the benefits of the method indicates that Montessori students do well in the long term, but more research is needed. “We have to verify that we’re in tune with brain development, and that our kids are prepared at all levels,” says Jonathan Wolff, a Montessori teacher and consultant in Encinitas, California. Lilian Katz, professor emerita of early childhood education at the University of Illinois, says the criticisms of Montessori’s methods—obsession with the “correct” use of blocks and beads, the lack of emphasis on fantasy and creativity—are valid but don’t compromise the value of the program. “It’s pretty solid,” says Katz. “The strategies the teachers use are very clear. Children seem to respond well.” With pinched budgets, little time for recess or music, and increased emphasis on standardized tests, these are tough times in education. But Maria Montessori’s legacy has never been more valued, even as it adapts to meet the needs of a new century. 
For some teachers, says Paul Epstein, head of the Chiaravalle Montessori School in Evanston, Illinois, “the materials have become the method. But you can do Montessori with a bucket of sticks and stones or any set of objects if you know the principles of learning.” Epstein’s middle school students don’t play with blocks. Instead, they’re doing something Maria never imagined, but doubtless would have liked. Last year, they ran the school’s snack bar, a hands-on task designed to help them with skills they will need as adults: common sense and time management. Says Epstein with a smile: “They’re learning to be entrepreneurs.”
3bddfb5f21c7e5d517b40084856159d8
https://www.smithsonianmag.com/history/man-who-brought-swastika-germany-and-how-nazis-stole-it-180962812/
The Man Who Brought the Swastika to Germany, and How the Nazis Stole It
The Man Who Brought the Swastika to Germany, and How the Nazis Stole It When archaeologist Heinrich Schliemann traveled to Ithaca, Greece, in 1868, one goal was foremost in his mind: discovering the ancient city of Troy using Homer’s Iliad. The epic poem was widely believed to be no more than a myth, but Schliemann was convinced otherwise. For him, it was a map to the hidden location of ancient cities. Over the next several years, the German businessman, who made his fortune in trading raw materials for ammunition production, tramped around the Mediterranean. Schliemann took Homer’s advice on everything from local customs to treating physical maladies. Trained at the Sorbonne, he used Homer’s verses to identify what he thought were the epic’s real-world locations. “One of his greatest strengths is that he had a genuine historical interest. What he wanted was to uncover the Homeric world, to know whether it existed, whether the Trojan war happened,” writes classics scholar D.F. Easton. “But here also is a weakness. He was not very good at separating fact from interpretation.” It wasn’t until 1871 that Schliemann achieved his dream. The discovery catapulted him to fame, and with his fame came a burst of interest in all that he uncovered. The intrepid archaeologist found his Homeric city, but he also found something else: the swastika, a symbol that would be manipulated to shape world history. Schliemann found his epic city—and the swastika—on the Aegean coast of Turkey. There, he continued the excavations started by British archaeologist Frank Calvert at a site known as Hisarlik mound. Schliemann’s methods were brutal—he used crowbars and battering rams to excavate—but effective. He quickly realized the site held seven different layers from societies going back thousands of years. Schliemann had found Troy—and the remains of civilizations coming before and after it. 
And on shards of pottery and sculpture throughout the layers, he found at least 1,800 variations on the same symbol: spindle-whorls, or swastikas. He would go on to see the swastika everywhere, from Tibet to Paraguay to the Gold Coast of Africa. And as Schliemann’s exploits grew more famous, and archaeological discoveries became a way of creating a narrative of national identity, the swastika grew more prominent. It exploded in popularity as a symbol of good fortune, appearing on Coca-Cola products, Boy Scouts’ and Girls’ Club materials and even American military uniforms, reports the BBC. But as it rose to fame, the swastika became tied into a much more volatile movement: a wave of nationalism spreading across Germany. “The antiquities unearthed by Dr. Schliemann at Troy acquire for us a double interest,” wrote British linguist Archibald Sayce in 1896. “They carry us back to the later stone ages of the Aryan race.” Initially, “Aryan” was a term used to delineate the Indo-European language group, not a racial classification. Scholars in the burgeoning field of linguistics had noticed similarities among the German, Romance and Sanskrit languages. The rising interest in eugenics and racial hygiene, however, led some to corrupt Aryan into a descriptor for an ancient, master racial identity with a clear throughline to contemporary Germany. As the Washington Post reported in a story about the rise of Nazism several years before the start of World War II, “[Aryanism]… was an intellectual dispute between bewhiskered scholars as to the existence of a pure and undefiled Aryan race at one stage of the earth’s history.” In the 19th century, French aristocrat Arthur de Gobineau and others made the connection between the mythical Aryans and the Germans, who were the superior descendants of the early people, now destined to lead the world towards greater advancement by conquering their neighbors. 
The findings of Schliemann’s dig in Turkey, then, suddenly had a deeper, ideological meaning. For the nationalists, the “purely Aryan symbol” Schliemann uncovered was no longer an archaeological mystery—it was a stand-in for their superiority. German nationalist groups like the Reichshammerbund (a 1912 anti-Semitic group) and the Bavarian Freikorps (paramilitarists who wanted to overthrow the Weimar Republic in Germany) used the swastika to reflect their “newly discovered” identity as the master race. It didn’t matter that it traditionally meant good fortune, or that it was found everywhere from monuments to the Greek goddess Artemis to representations of Brahma and Buddha and at Native American sites, or that no one was truly certain of its origins. “When Heinrich Schliemann discovered swastika-like decorations on pottery fragments in all archaeological levels at Troy, it was seen as evidence for a racial continuity and proof that the inhabitants of the site had been Aryan all along,” writes anthropologist Gwendolyn Leick. “The link between the swastika and Indo-European origin, once forged was impossible to discard. It allowed the projection of nationalist feelings and associations onto a universal symbol, which hence served as a distinguishing boundary marker between non-Aryan, or rather non-German, and German identity.” As the swastika became more and more intertwined with German nationalism, Adolf Hitler’s influence grew—and he adopted the hooked cross as the Nazi party symbol in 1920. “He was attracted to it because it was already being used in other nationalist, racialist groups,” says Steven Heller, author of The Swastika: Symbol Beyond Redemption? and Iron Fists: Branding the 20th-Century Totalitarian State. 
“I think he also understood instinctually that there had to be a symbol as powerful as the hammer and sickle, which was their nearest enemy.” To further enshrine the swastika as a symbol of Nazi power, Joseph Goebbels (Hitler’s minister of propaganda) issued a decree on May 19, 1933, that prevented unauthorized commercial use of the hooked cross. The symbol also featured prominently in Leni Riefenstahl’s propagandist film Triumph of the Will, writes historian Malcolm Quinn. “When Hitler is absent… his place is taken by the swastika, which, like the image of the Führer, becomes a switching station for personal and national identities.” The symbol appeared on uniforms and flags, and even as a marching formation at rallies. Efforts to ban the display of the swastika and other Nazi iconography in the post-war years—including current German criminal laws that prohibit the public use of the swastika and the Nazi salute—seem to have only further enshrined the evil regime it was co-opted by. Today the symbol remains a weapon of white supremacist groups around the world. In recent months, its prevalence has spiked around the U.S., with swastikas appearing around New York City, Portland, Pennsylvania, California and elsewhere. It seems the harder authority figures attempt to quash it, the greater its power to intimidate. For Heller, this is an intractable problem. “I think you can’t win,” Heller says. “Either you try to extinguish it, and if that’s the case you’ve got to brainwash an awful lot of people, or you let it continue, and it will brainwash a lot of people. As long as it captures people’s imaginations, as long as it represents evil, as long as that symbol retains its charge, it’s going to be very hard to cleanse it.” Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. 
She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
6c0712989bd434438c487c71cd22d968
https://www.smithsonianmag.com/history/man-who-sold-eiffel-tower-twice-180958370/
The Man Who Sold the Eiffel Tower. Twice.
The Man Who Sold the Eiffel Tower. Twice. The air was as crisp as a hundred-dollar bill on April 27, 1936. A southwesterly breeze filled the bright white sails of the pleasure boats sailing across the San Francisco Bay. Through the cabin window of a ferryboat, a man studied the horizon. His tired eyes were hooded, his dark hair swept backwards, his hands and feet locked in iron chains. Behind a curtain of grey mist, he caught his first dreadful glimpse of Alcatraz Island. “Count” Victor Lustig, 46 years old at the time, was America’s most dangerous con man. In a lengthy criminal career, his sleight-of-hand tricks and get-rich-quick schemes had rocked Jazz-Era America and the rest of the world. In Paris, he had sold the Eiffel Tower in an audacious confidence game—not once, but twice. Finally, in 1935, Lustig was captured after masterminding a counterfeit banknote operation so vast that it threatened to shake confidence in the American economy. A judge in New York sentenced him to 20 years on Alcatraz. Lustig was unlike any other inmate to arrive on the Rock. He dressed like a matinee idol, possessed a hypnotic charm, spoke five languages fluently and evaded the law like a figure from fiction. In fact, the Milwaukee Journal described him as ‘a story book character’. One Secret Service agent wrote that Lustig was “as elusive as a puff of cigarette smoke and as charming as a young girl’s dream,” while the New York Times editorialized: “He was not the hand-kissing type of bogus Count—too keen for that. Instead of theatrical, he was always the reserved, dignified noble man.” The fake title was just the tip of Lustig’s deceptions. He used 47 aliases and carried dozens of fake passports. He created a web of lies so thick that even today his true identity remains shrouded in mystery. 
On his Alcatraz paperwork, prison officials called him “Robert V. Miller,” which was just another of his pseudonyms. The con man had always claimed to hail from a long line of aristocrats who owned European castles, yet newly discovered documents reveal more humble beginnings. In prison interviews, he told investigators that he was born in the Austro-Hungarian town of Hostinné on January 4, 1890. The village is arranged around a Baroque clock tower in the shadow of the Krkonoše mountains (it is now a part of the Czech Republic). During his crime spree, Lustig had boasted that his father, Ludwig, was the burgomaster, or mayor, of the town. But in recently uncovered prison papers, he describes his father and mother as the “poorest peasant people” who raised him in a grim house made from stone. Lustig claimed he stole to survive, but only from the greedy and dishonest. More textured accounts of Lustig’s childhood can be found in various true crime magazines of the time, informed by his criminal associates and investigators. In the early 1900s, as a teenager, Lustig scampered up the criminal ladder, progressing from panhandler to pickpocket, to burglar, to street hustler. According to True Detective Mysteries magazine he perfected every card trick known: “palming, slipping cards from the deck, dealing from the bottom,” and by the time he reached adulthood, Lustig could make a deck of cards “do everything but talk.” First-class passengers aboard transatlantic ships became his first victims. The newly rich were easy pickings. When Lustig arrived in the United States at the end of World War I, the “Roaring Twenties” were in full swing and money was changing hands at a fevered pace. Lustig quickly became known to detectives in 40 American cities as ‘the Scarred,’ thanks to a livid, two-and-a-half-inch gash along his left cheekbone, a souvenir from a love rival in Paris. Yet Lustig was considered a “smoothie” who had never held a gun, and enjoyed mounting butterflies. 
Records show that he was just five feet seven inches tall and weighed 140 pounds. His most successful scam was the “Rumanian money box.” It was a small box fashioned from cedar wood, with complicated rollers and brass dials. Lustig claimed the contraption could copy banknotes using “Radium.” The big show he gave to victims was sometimes aided by a sidekick named “Dapper” Dan Collins, described by the New York Times as a former ‘circus lion tamer and death-defying bicycle rider.’ Lustig’s repertoire also included fake horse race schemes, feigned seizures during business meetings, and bogus real estate investments. These capers made him a public enemy and a millionaire. America in the 1920s was infested with such confidence rackets, operated by smooth-talking immigrants like Charles Ponzi, namesake of the “Ponzi scheme.” These European con artists were professionals who called their victims ‘marks’ instead of suckers, and who acted not like thugs, but gentlemen. According to the crime magazine True Detective, Lustig was a man who “society took by one hand, the underworld by the other…a flesh-and-blood Jekyll-Hyde.” Yet he treated all women with respect. On November 3, 1919, he married a pretty Kansan named Roberta Noret. A memoir by Lustig’s late daughter recalls how Lustig raised a secret family on whom he lavished his ill-gotten gains. The rest he spent on gambling, and on his lover, Billie Mae Scheible, the buxom owner of a million-dollar prostitution racket. Then, in 1925, he embarked upon what swindling experts call “the big store.” Lustig arrived in Paris in May of that year, according to the memoir of U.S. Secret Service agent James Johnson. There, Lustig commissioned stationery carrying the official French government seal. Next, he presented himself at the front desk of the Hôtel de Crillon, a stone palace on the Place de la Concorde. 
From there, pretending to be a French government official, Lustig wrote to the top people in the French scrap metal industry, inviting them to the hotel for a meeting. “Because of engineering faults, costly repairs, and political problems I cannot discuss, the tearing down of the Eiffel Tower has become mandatory,” he reportedly told them in a quiet hotel room. The tower would be sold to the highest bidder, he announced. His audience was captivated, and their bids flowed in. It was a scam Lustig pulled off more than once, sources said. Amazingly, the con man liked to boast of his criminal achievements, and even penned a list of rules for would-be swindlers. They’re still circulated today:

LUSTIG’S TEN COMMANDMENTS OF THE CON

1. Be a patient listener (it is this, not fast talking, that gets a con-man his coups).
2. Never look bored.
3. Wait for the other person to reveal any political opinions, then agree with them.
4. Let the other person reveal religious views, then have the same ones.
5. Hint at sex talk, but don’t follow it up unless the other fellow shows a strong interest.
6. Never discuss illness, unless some special concern is shown.
7. Never pry into a person’s personal circumstances (they’ll tell you all eventually).
8. Never boast. Just let your importance be quietly obvious.
9. Never be untidy.
10. Never get drunk.

As with many career criminals, it was greed that led to Lustig’s demise. On December 11, 1928, businessman Thomas Kearns invited Lustig to his Massachusetts home to discuss an investment. Lustig crept upstairs and stole $16,000 from a drawer. Such a barefaced theft was out of character for the con man, and Kearns screamed to the police. Next, Lustig had the audacity to trick a Texas sheriff with his moneybox, and later gave him counterfeit cash, which attracted the attention of the Secret Service. 
“Victor Lustig was [a] top man in the modern world of crime,” wrote another agent, Frank Seckler. “He was the only one I ever heard of who swindled the law.” Yet it was Secret Service agent Peter A. Rubano who vowed to put Lustig behind bars. Rubano was a heavy-set Italian-American with a double chin, sad eyes, and endless ambition. Born and raised in the Bronx, Rubano had made his name by trapping the notorious gangster Ignazio “The Wolf” Lupo. Rubano delighted in seeing his name in the newspapers, and he would dedicate many years to catching Lustig. When the Austrian entered the counterfeit banknote business in 1930, he fell into Rubano’s crosshairs. Teaming up with gangland forger William Watts, Lustig created banknotes so flawless they fooled even bank tellers. “Lustig-Watts notes were the supernotes of the era,” says Joseph Boling, chief judge of the American Numismatic Association, a specialist in authenticating notes. Lustig daringly chose to copy $100 bills, those scrutinized most by bank tellers, and became “like some other government, issuing money in rivalry with the United States Treasury,” a judge later commented. It was feared that a run of fake bills this large could wobble international confidence in the dollar. Catching the count became a cat-and-mouse game for Rubano and the Secret Service. Lustig traveled with a trunk of disguises and could transform easily into a rabbi, a priest, a bellhop or a porter. Dressed like a baggage man, he could escape any hotel in a pinch—and even take his luggage with him. But the net was closing in. Lustig finally felt a tug on the velvet collar of his Chesterfield coat on a New York street corner on May 10, 1935. A voice ordered: “Hands in the air.” Lustig studied the circle of men surrounding him, and noticed Agent Rubano, who led him away in handcuffs. It was a victory for the Secret Service. But not for long. 
On the Sunday before Labor Day, September 1, 1935, Lustig escaped from the ‘inescapable’ Federal Detention Center in Manhattan. He fashioned a rope from bed sheets, cut through his bars, and swung from the window like an urban Tarzan. When a group of onlookers stopped and pointed, the prisoner took a rag from his pocket and pretended to be a window cleaner. Landing on his feet, Lustig gave his audience a polite bow, and then sprinted away ‘like a deer.’ Police dashed to his cell. They discovered a handwritten note on his pillow, an extract from Victor Hugo’s Les Miserables: He allowed himself to be led in a promise; Jean Valjean had his promise. Even to a convict, especially to a convict. It may give the convict confidence and guide him on the right path. Law was not made by God and Man can be wrong. Lustig evaded the law until the Saturday night of September 28, 1935. In Pittsburgh, the dashing crook ducked into a waiting car on the city’s north side. Watching from a hiding position, FBI agent G. K. Firestone gave the signal to Pittsburgh Secret Service agent Fred Gruber. The two federal officers leapt into their car and gave chase. For nine blocks their vehicles rode neck-and-neck, engines roaring. When Lustig’s driver refused to stop, the agents rammed their car into his, locking their wheels together. Sparks flew. The cars crashed to a halt. The agents pulled their service weapons and threw open the doors. According to the Pittsburgh Post-Gazette, Lustig told his captors: “Well, boys, here I am.” Count Victor Lustig was hauled before the judge in New York in November 1935. “His pale, lean face was a study and his tapering white hands rested on the bar before the bench,” observed a reporter from the New York Herald-Tribune. 
Just before sentencing, another journalist overheard a Secret Service agent tell Lustig: “Count, you’re the smoothest con man that ever lived.” As soon as he stepped onto Alcatraz Island, prison guards searched Lustig’s body for concealed watch springs and razor blades and hosed him down with freezing seawater. They marched him along the main corridor between the cells—known as ‘Broadway’—in his birthday suit. There was a chorus of howls, whistles, and the clanging of metal cups against bars. “He is somewhat superficially humiliated,” Lustig’s prison record said, referring to him as ‘Miller’, “he asserts that he was accused of everything in the category of crime, including the burning of Chicago.” Whatever his true identity, the cold weather took its toll on prisoner #300. By December 7, 1946, Lustig had made a staggering 1,192 medical requests and filled 507 prescriptions. The prison guards believed he was faking, that his illness was part of an escape plan. They even found torn bed sheets in his cell, signs of his expert rope making. According to medical reports, Lustig was “inclined to magnify physical complaints... [and] constantly complaining of real and imaginary ills.” He was transferred to a secure medical facility in Springfield, Missouri, where doctors soon realized he was not faking. There, he died from complications arising from pneumonia. Somehow, Lustig’s family kept his death a secret for two years, until August 31, 1949. But Lustig’s Houdini-like departure from earth was not even his greatest deception. In March of 2015, a historian named Tomáš Anděl, from Lustig’s home town of Hostinné, began a tireless search for biographical information about the town’s most famous citizen. He searched through records rescued from Nazi bonfires, pored over electoral rolls and historical documents. 
“He must have attended school in Hostinné,” Anděl reasoned in the Hostinné Bulletin, “yet he is not even mentioned in the list of pupils attending the local primary school.” After much searching, Anděl concluded, there is not a scrap of evidence that Lustig was ever born. We may never know the true identity of Count Victor Lustig. But we do know for certain that the world’s most flamboyant con man died at 8:30pm on March 11, 1947. On his death certificate a clerk wrote this for his occupation: ‘Apprentice salesman.’

Adapted from ‘Handsome Devil’ by Jeff Maysh
https://www.smithsonianmag.com/history/march-anniversaries-4-292528/
March Anniversaries
50 Years Ago: Corps Values
In an executive order signed March 1, 1961, President John F. Kennedy founds the Peace Corps to supply aid and promote international understanding. Calling for volunteers—members receive basic living expenses but no salary—to teach, build schools, help eradicate malaria and improve agriculture, Kennedy predicts, “It will not be easy.” Kennedy brother-in-law Sargent Shriver heads the program, and in August, the first volunteers head to Ghana. By 2010, more than 200,000 corps members will have served in 139 countries.

80 Years Ago: Oh Say Can You Sing
On March 3, 1931, more than 100 years after Francis Scott Key wrote the words, Congress votes to make “The Star-Spangled Banner” the national anthem of the United States. Key’s poem, memorializing the flag flown during the 1814 battle of Fort McHenry and set to the tune of “To Anacreon in Heaven,” had been the anthem of the military for decades when some five million signers petitioned Congress to make it the nation’s as well. Despite arguments that the song is virtually unsingable, President Hoover signs the act that day.

140 Years Ago: Presumptuous Fellow
British-American journalist Henry Stanley sets out from Bagamoyo (in modern Tanzania) March 21, 1871, in search of explorer David Livingstone, whose expedition to find the source of the Nile has not been heard from in four years. Backed by the New York Herald, Stanley leads 192 men on an eight-month journey into Africa’s interior, confronting crocodiles and curious locals before finally greeting the ailing explorer, as he recounts, “Dr. Livingstone, I presume?” (Whether he really said it is not known.) His account is published in 1872. Livingstone dies in 1873, still looking for the Nile’s source.

150 Years Ago: Better Angels
With the Capitol dome still unfinished behind him, Abraham Lincoln is sworn in as the nation’s 16th president on March 4, 1861. Chief Justice Roger Taney, author of the Dred Scott decision denying citizenship to black Americans and barring the government from banning slavery in its territories, administers the oath. Lincoln, addressing the secession of seven Southern states after his election, proclaims the Union perpetual and efforts to dissolve it unlawful. He promises to “preserve, protect and defend” it and appeals to the “better angels of our nature” for peace. But by April, the Civil War is on.

170 Years Ago: Amistad Verdict
Joseph Cinque and the more than 30 other surviving Africans jailed since 1839 for mutiny on the Cuban schooner Amistad are freed March 9, 1841, when the U.S. Supreme Court rules they had been illegally enslaved. It is a victory for abolitionists and for John Quincy Adams, who argues that President Van Buren has no right to return them to Cuba. The survivors return to Africa in 1842.
https://www.smithsonianmag.com/history/marquis-lafayette-sails-again-180954590/?all
The Marquis de Lafayette Sails Again
The sun was sparkling off the Bay of Biscay and a light breeze barely ruffled the sails as the three-masted frigate l’Hermione headed out from La Rochelle for sea trials one morning last October. It was a beautiful day, dammit! This would be one of the new ship’s first times out in open water, and the captain, a Breton sea dog named Yann Cariou, was eager to see what it and its crew of 18 seasoned sailors and 54 volunteers could do. The balmy weather would test neither. Cariou fired up the two 400-horsepower Italian engines and motored north looking for wind. At dinner in the galley, he made a show of peeking under the tables, as if he were playing a children’s game. “No wind here,” he says with mock gravity. But there was good news, meaning bad news, on the radar. A big storm off Iceland was generating nasty low-pressure systems as far south as Brittany, so that’s where we headed. Many people had waited a long time for this moment. The French spent 17 years and $28 million replicating the Hermione down to the last detail, from its gilded-lion figurehead to the fleur-de-lis painted on its stern. When the original Hermione was built in 1779, it was the pride of a newly re-energized French Navy: a 216-foot, 32-gun barracuda that could take a real bite out of the arrogant English, who not only ruled the waves but concocted an in-your-face anthem about it—“Rule, Britannia!”—in 1740. With a sleek, copper-bottomed hull, the Hermione could out-sail almost any ship it couldn’t out-shoot. Even the English recognized the Hermione’s excellence when they captured its sister ship, the Concorde. They promptly reverse-engineered their prize, drawing detailed schematics to help recreate the vessel for their own fleet. This proved a stroke of luck 200 years later when France decided it was tired of being the only great seagoing nation without a replicated tall ship of its own.
“In the 1980s, we restored the shipyards at Rochefort, where l’Hermione was built, and made them a cultural monument,” says Benedict Donnelly, who heads France’s Hermione project, the Association Hermione-La Fayette, supported by public funds and private donations. “But then in the ’90s we said, we’re missing something. A recreated tall ship. France is really the poor relation among nations in this department. The Hermione was the jewel of the navy from a glorious moment in French maritime history—which hasn’t always been glorious, thanks to our friends the English. Happily, our English friends had captured the Hermione’s sister ship and left us the plans.” There’s another reason that the Hermione sails again—it possesses a particular transatlantic back story and cachet. In March 1780, the Hermione set out from Rochefort bound for Boston. Its speed and agility suited it ideally to the task of carrying Gilbert du Motier, Marquis de Lafayette, back to America. He was charged with giving George Washington the nation-saving news that France would soon be sending an infusion of arms, ships and men. That life support was due in no small part to Lafayette’s tireless cheerleading. His earlier efforts had helped nudge King Louis XVI into recognizing the United States and signing a defensive alliance with it in 1778 (just how big a nudge is open to debate, since French policy was already strongly inclined in this direction for reasons of pure realpolitik). Now, Lafayette, the public face of France in the United States, was returning to deliver the goods. Surely Lafayette’s name could work the same fund-raising magic for a recreated Hermione, this time in the America-to-France direction. The connection with Lafayette has brought in U.S. donors under the auspices of the Friends of Hermione-Lafayette in America, a nonprofit that has helped to raise roughly one-quarter the $4.5 million it is costing to send the replicated Hermione from Rochefort voyaging to America and back. 
Donnelly, whose own background seems tailor-made for overseeing the Hermione project since 1992—his mother is French and his American father participated in the D-Day invasion at Normandy—says that was never a consideration. “Choosing to rebuild Lafayette’s boat was not a question of marketing,” he insists. Still, a project that has often been as cash-strapped as Washington’s Continentals has benefited from a brisk American tail wind. After crossing the Atlantic this month, the ship will dock in many of the ports that figured in the Revolution, to welcome the curious aboard to discover a ship lost to history and the young marquis who is a misunderstood American icon. And in Manhattan, the New-York Historical Society is mounting the exhibition “Lafayette’s Hermione: Voyage 2015,” on view May 29 through August 16. Pretty much everyone in the United States has heard of Lafayette. Scores of towns around the U.S. are named for him, from Fayetteville, North Carolina, to Fayette, Maine, to Lafayette, Oregon (to this list must be added every town named La Grange, after Lafayette’s manse, the Château de la Grange-Bleneau). But the man himself has been swallowed up in a hazy myth surrounding his general helpfulness. He turns out to be more interesting than his myth, not to mention a good deal quirkier. “Americans don’t in the least know who Lafayette was. The story has been lost in the telling,” says Laura Auricchio, author of a new biography, The Marquis: Lafayette Reconsidered. The Marquis de Lafayette who first arrived on U.S. soil in South Carolina on June 13, 1777, was an unformed, untested youth of 19. In a way, he had nowhere else to go. He had been orphaned young—his father was killed when the English crushed the French at Minden in 1759, during the Seven Years’ War. The early death of his parents left him a very rich young man.
In 1774, Lafayette, then 16, was married off to 14-year-old Adrienne de Noailles, who came from one of France’s best-born and most powerful families. The marriage made the provincial Lafayette an instant player at court, but his door pass did him little good. For one thing, he was a lousy dancer. Lafayette himself confessed in his memoirs that he made a clumsy courtier, undone “by the gaucheness of my manners which...never yielded to the graces of the court or to the charms of supper in the capital.” The match with Adrienne also brought Lafayette a lieutenant’s commission in the Noailles Dragoons, and with it the promise of an army career. But here, too, he hit an unexpected wall. A broad military reorganization in 1775 affected many of France’s existing regiments, Lafayette’s among them. He and many others like him suddenly found themselves sidelined with little hope of advancement. It was in this context that Lafayette took up America’s fight for freedom. So did many of his frustrated compatriots, whose motives ran the gamut from high-minded to mercenary. “I am well nigh harassed to death with applications of officers to go out to America,” wrote the American diplomat Silas Deane, who worked alongside Benjamin Franklin in Paris to drum up French aid. Deane and Franklin were pretty picky, and many who asked to fight were turned away. In Lafayette, however, they recognized a pearl of great value—that is to say, great promotional value. In his signed agreement accepting Lafayette’s services and commissioning him an (unpaid) major general, Deane enumerates an unusual list of qualifications for a commanding officer: “high birth, alliances, the great dignities which his family holds at this court, his considerable estates in this realm...and above all, his zeal for the liberty of our provinces.” Thus recommended, the marquis first set sail for America in April 1777. 
Lafayette never fully understood that his real job was to help get France into the war, not to fight it himself. Politically, he could be obtuse. “He was an ingénu and quite naive,” says Auricchio. “The opposite of someone like Talleyrand.” I met with the historian Laurence Chatel de Brancion—who with co-author Patrick Villiers published the French-language biography La Fayette: Rêver la gloire (Dreaming of Glory) in 2013—at her grand apartment near Parc Monceau in Paris. On her father’s side of the family (an ancestor helped found Newport, Rhode Island), Chatel de Brancion is a member of the Daughters of the American Revolution. Through the French branch of the DAR, she oversaw a donation to the Hermione re-creation project. But when it comes to Lafayette the man, she takes the cold-eyed view often found on her side of the Atlantic. The man often called a “citizen of two worlds” turns out to be a hero in only one of them. “Lafayette is just an image. He’s the portrait of the terrible inconsequence of the French elite of that period,” Chatel de Brancion tells me. “Franklin used Lafayette, purely and simply. He said, ‘Cover this guy with glory, don’t let him go too near the fighting, and send him back to France full of enthusiasm.’” Moreover, she adds dryly, “Everything the U.S. thanks Lafayette for, it should be thanking Franklin for.” Maybe so, but nobody will deny that Lafayette played his assigned part perfectly. After an initial chilly reception, he stepped quickly into the role of America’s BFF—Best French Friend. This required a lot more than just showing up. Many of the Frenchmen Silas Deane sent over managed to make themselves deeply unpopular with their haughty manners and their prickly sense of entitlement (Deane later took considerable heat for this). “These people think of nothing but their incessant intrigues and backbitings,” wrote the German-born French officer Johann de Kalb, the brilliant soldier who came over with Lafayette on the 1777 voyage. 
“Lafayette is the sole exception....He is an excellent young man.” The very qualities that made Lafayette a dud at Versailles made him a hit in Boston, Philadelphia and Valley Forge. He was straightforward and enthusiastic. He said what he meant, and then he said it again, and then he said it again. His stubborn optimism in the face of hardship rivaled Candide’s. He was, well, a lot like us. “He had a certain self-deprecating charm, and the ability to make fun of himself, which is not the French style of humor,” says Auricchio. Crucially, Lafayette won over George Washington, a commander-in-chief with a marked distaste for intimacy and a hostility to the French officer class. In explaining how Lafayette broke the ice, Chatel de Brancion makes much of the fact that Lafayette fought in the blue uniform of a major general in the Continental Army. “We’ve lost the subtlety of that gesture today. Washington was honored that a foreign aristocrat would fight in that uniform—it did him, Washington, enormous credit.” But clothing alone can’t explain the unusually affectionate bond that sprang up between the two men. Lafayette spent much of the war at Washington’s side and at one point pretty much moved into his house. He named his own son George Washington. By all accounts, the relationship was a bright spot in both their lives. It has withstood the full Freudian treatment over the years; history has yet to find a dark underside to it. It didn’t hurt that Lafayette happened to be the truest of true believers. Auricchio quotes a French comrade who tries to convince Lafayette to stop being such a sap by believing Americans “are unified by the love of virtue, of liberty...that they are simple, good hospitable people who prefer beneficence to all our vain pleasures.” But that is what he believed, and nothing could convince him otherwise. Lafayette’s American bubble remained unburst to the end. 
It must be said that battlefield heroics contribute little to Lafayette’s legacy, even though he sought to win glory through force of arms at every opportunity. Whether by circumstance or design—Chatel de Brancion says some of both—Lafayette was rarely put in a position to risk serious harm. Lafayette’s physical courage was beyond question, but his ardor often outweighed his military judgment. Moreover, as Franklin counseled, it was prudent to protect such a valuable political chess piece. No one wanted Lafayette to meet the fate of his friend de Kalb (DeKalb Avenue, Brooklyn). He was shot and bayoneted repeatedly at the Battle of Camden, dying of his wounds three days later. Lafayette’s brush with death came at the disastrous Battle of Brandywine on September 11, 1777, when a musket ball passed through the fleshy part of his lower leg. In this, as in so many things, Lafayette had luck on his side. The wound did him little harm (he was treated by Washington’s personal physician) and made him an instant hero. Another exploit burnished Lafayette’s reputation as a fighting man. On May 20, 1778, Lafayette and his small detachment of Pennsylvania militiamen, at their camp outside Philadelphia, found that they were surrounded by 5,000 redcoats advancing from several directions. Lafayette’s coolness in organizing a retreat in which only nine of his men were killed is nothing short of “miraculous,” writes Auricchio. In January 1779, with a lull in the fighting, Lafayette sailed back to France, where he continued to knock himself out seeking crucial additional assistance on America’s behalf. (“It is fortunate for the king that Lafayette does not take it into his head to strip Versailles of its furniture, to send to his dear Americans,” the Count de Maurepas remarked in the royal council.) What Lafayette wanted above all was to return to America in a French uniform at the head of the French expeditionary force forming in early 1780. 
Instead, the job was given to the battle-hardened Count de Rochambeau. Lafayette’s mission to Washington aboard the Hermione was given to him as a consolation prize. Capt. Yann Cariou finally found the rough weather he was looking for. Two days after setting out from La Rochelle, he moored the Hermione in a bay off the Crozon peninsula near the northwestern tip of France, almost within sight of where he was born on the Pointe du Raz. These are notoriously angry waters, and they lived up to their billing. All hands welcomed the foul, blustery morning that greeted us the following day. We sailed out of the bay under a sharp breeze, the Hermione skimming along at ten knots and Mozart’s Symphony No. 25 in G minor cranking on the PA. Everybody was elated. The volunteer crew of men and women mainly in their 20s—French, Swedish, Belgian, German and one American—strained to hoist more sail, eight or ten of them on each line (there were no winches in 1779; the Swedish bosun noted that if a time machine sent him back to the original Hermione, he would make sure to bring a portable winch with him). As instructed, everybody grunted, “Oh! hisse!” in cadence as they pulled. It’s French for “heave ho,” pronounced oh eese; the bosun tells me that you get demonstrably better pulling power if you sing out while you pull. Before long the wind picked up to Force 8, a gale. The Hermione was slicing through the high swells at 12 or so knots, very fast and close to its top speed. Captain Cariou was smiling broadly as the swells knocked the ship from side to side. “I’m astonished at what she can do,” says Cariou shaking his head appreciatively. Before he took over as the Hermione’s skipper in 2012, Cariou served as captain of the 167-foot Belem, the French merchant marine’s three-masted training barque. The sluggish Belem was built in 1896 to haul sugar from the West Indies back to France. Cariou was amazed by the difference. “The hull is perfect! 
She pushes very little water ahead of her, and she chews up very little wake behind.” The swells had picked up now, and the wind was whistling through the rigging. Some 60 feet up, the crew in yellow slickers was working fast to reef the mainsail while balancing shakily on a slender rope. Looking up I feared for them all, but particularly for the lone American, Adam Hodges-LeClaire from Lincoln, Massachusetts. Adam is a college student obsessed with Revolutionary War history to the point that he sews his own period clothes. He wore nothing else on board, including skimpy leather shoes loosely tied with cord—not the best for keeping a foothold on a madly swaying line. “Please don’t say I’m crazy,” Adam asks me politely. “Say I’m...passionate.” Several sailors got seasick. “If you can’t handle this, you’re in the wrong business,” says Charlène Gicquel, the pint-size first mate from the English Channel port of Cancale who came over with Cariou from the Belem. “But then,” she adds, “we’re all masochists.” This was the same kind of weather that the Hermione ran into near the beginning of its 38-day journey across the Atlantic in 1780. The ship’s captain, Louis-René-Madeleine Le Vassor, Comte de Latouche-Tréville, noted the worsening conditions in his log. March 26: “Hermione pitching violently.” March 30: “Wind turns to northwest with strong swells. I note with concern that the ship is straining.” Poor Lafayette. He was an unhappy sailor even in a calm sea—“I believe we sadden each other, [the sea] and I,” he wrote during his first trip over. Rough water made him violently ill. Laurence Chatel de Brancion envisions Lafayette most likely on deck during the gale, hugging the Hermione’s main mast. That’s what the German charlatan Franz Anton Mesmer recommended as a cure for seasickness. Lafayette was mesmerized—that’s where we get the word—by Mesmer’s crackpot theory of animal magnetism (in fairness, so was half of Europe). 
Even after Mesmer’s claims had been thoroughly debunked (by Benjamin Franklin, among others) Lafayette may never have stopped believing. “When it came to matters scientific, Lafayette’s enthusiasm sometimes trumped his good sense,” Auricchio writes with some delicacy. The destinies of Lafayette and the Hermione diverged after Lafayette debarked in Boston on April 28, 1780; he then traveled overland to join Washington at his headquarters in Morristown, New Jersey. The Hermione’s 34-year-old Capt. Latouche-Tréville sailed off to win great renown of his own against the English. Little more than a month after dropping off Lafayette, Latouche-Tréville sighted the 32-gun English frigate Iris off Long Island. The two warships pounded each other at murderously close range for an hour and a half. Finally, the Iris withdrew, apparently in no shape to continue. The Hermione was badly damaged, and counted 10 dead and 37 wounded. The two captains subsequently argued in the press about who had actually won. But for the current Hermione’s captain, Yann Cariou, the question doesn’t even arise: “We won,” he tells me with a look that made me drop any follow-up questions. Latouche-Tréville continued to reel off naval victories, often against great odds, in the Hermione and in other ships, during the American Revolution and the Napoleonic Wars. On two occasions in 1801, he bloodied the nose of the invincible Lord Nelson. He was supposed to command at Trafalgar, but, alas for France, he died the year before the battle. “If we had had him at Trafalgar, everything would have been different,” insists Cariou, sounding like a die-hard Brooklyn Dodgers fan replaying some of the World Series they lost to the Yankees before 1955. Lafayette, for his part, wrote to his wife shortly after debarking the Hermione in Massachusetts. “It is to the roar of cannon that I arrive or depart; the principal residents mount their horses to accompany me,” Lafayette reported. 
“In short, my love, my reception here is greater than anything that I could describe to you.” Did all this adulation go to his head? Yes, it did. An exasperated John Adams, no great fan, wrote in his diary that Lafayette “would be thought the unum necessarium in everything.” Upon joining Washington in Morristown, Lafayette began agitating for a joint invasion of New York, where the British were strongly entrenched. Rochambeau had to slap him down, more than once. “He forgets that there is still a left flank in a landing, which the whole English Navy will exterminate,” he wrote to another officer. Rochambeau, along with Washington and the Count de Grasse, commander of the French fleet, opted for bottling up Cornwallis in Yorktown, allowing France to deploy the weight of both its army and its navy in support of Washington’s Continental Army. The outcome speaks for itself. Yorktown briefly reunited Lafayette and the Hermione for the last time: He led 1,200 light infantry to keep Cornwallis busy in Virginia while the French tightened the noose around Yorktown from the sea; the Hermione was part of that noose. The way Laurence Chatel de Brancion sees it, Rochambeau never really got the credit he was due. History dies hard. “The French still think the Americans should be grateful, because without us, they would never have won the war, which is true,” says Bruno Gravellier, a former naval officer who is the superintendent aboard the Hermione. “It was a long time ago, but I still get a sense of friction between the U.S. and the French sides of the association.” The remainder of Lafayette’s long life—he died in 1834 at age 76—belongs to the history of France. He unfailingly demonstrated a willingness to rise above the factionalism that gripped France as it headed toward its own revolution. It sounds good and helps make Lafayette an emotionally sympathetic character, seen from here. But, like many of Lafayette’s best qualities, it earned him little credit in his native land. 
An aristocratic liberal in the late 1700s and early 1800s was like a Rockefeller Republican today—a chimerical creature unbeloved by those whose differences he tries to split. Even Thomas Jefferson, in 1789, warned Lafayette against attempting to “trim between two sides,” but Lafayette didn’t listen. When thinking of Lafayette, Americans will always see the fiery youth at Washington’s side, doing his damnedest for our country. Everything else is commentary, and maybe that’s a fair way for an American to look at him. In the turbulent history of France after Lafayette’s return from America—a period that saw the French Revolution, the rise and fall of Napoleon and the restoration of the monarchy—Lafayette, a son of the Enlightenment and the American Revolution, in public life or private, steadfastly articulated his devotion to one principle: the pursuit of liberty. Yet the French retain a different image. On July 17, 1791, a large crowd demonstrated on the Champ de Mars in Paris. Lafayette, commander in chief of the new National Guard, brought his troops to maintain order. A thrown rock, a dragoon down, and suddenly the troops opened fire, killing perhaps 100. There were twists and turns to come, but the massacre did incalculable damage to Lafayette’s reputation. “He was catastrophic,” is Chatel de Brancion’s unappealable verdict. Lafayette did remain in the French Army until 1792 and later served as a deputy in the French legislature after the fall of Napoleon in 1815. As the Hermione at last enters the Gironde estuary, headed for Bordeaux at the end of a week of sea trials, we are suddenly surrounded by dozens of small motor craft and sailboats. The vessels weave in and out, their occupants waving, and blasting their air horns. It’s heady stuff, and it inflated all our spirits. This must have been something like what Lafayette witnessed as the Hermione sailed into Boston Harbor in 1780.
He must have been fairly drunk on it, too, given what Jefferson called his “canine appetite for fame.” But maybe he can be forgiven. In such a moment, you don’t ask yourself what you’ve done to deserve such fanfare. You just smile broadly and think, All this? For me? Joshua Levine is a Paris-based freelance journalist. He has written for Forbes and the Financial Times, and is the author of The Rise and Fall of the House of Barneys.
https://www.smithsonianmag.com/history/martha-washington-life-elusive-historians-180976983/
Why Martha Washington’s Life Is So Elusive to Historians
Ask any American what Martha Washington looked like, and you’ll hear of a kindly, plump grandmother, her neck modestly covered and her gray hair poking out of a round, frilled mob-cap, as she was portrayed in Gilbert Stuart’s 1796 portrait. Her husband explained her straightforward style in a 1790 letter: Martha’s “wishes coincide with my own as to simplicity of dress, and everything which can tend to support propriety of character without partaking of the follies of luxury and ostentation.” Martha, then the first lady, was 65 when she sat for that famous portrait, but in earlier paintings, she is slim, her neckline plunging, décolletage on full display, her dark hair offset with a fashionable bonnet. (Make no mistake about it: Martha was considered attractive.) Her wardrobe—including custom-made slippers in purple satin with silver trimmings, which she paired with a silk dress with deep yellow brocade and rich lace on her wedding day—indicates a fashionista who embraced bold colors and sumptuous fabrics that conveyed her lofty social and economic standing. And it wasn’t just Martha, or Lady Washington as she was called: The couple’s ledgers are full of extravagant clothing purchases, for George as well. I made use of those sources in my biography of George Washington, You Never Forget Your First, but I felt frustrated by the limited descriptions of Martha that we find in letters, and which focus almost exclusively on her role as wife, mother and enslaver. Biographers have tended to value her simply as a witness to a great man. Artists painted her according to the standards of the time, with details one would expect to see from any woman in her position—nothing particular to this woman.
Indeed, Martha might be pleased by how little we know about her inner life; after George died, she burned all the letters from their 40-year marriage, although a few have been discovered stuck in the back of a desk drawer. Historians are limited by the archives, and by ourselves. Biographers study documents to tell the story of a person’s life, using clothes and accessories to add color to their accounts. But what if we’re missing something obvious because we don’t know what to look for? Of Martha’s few surviving dresses, I’ve spent the most time looking at this one, and when I imagine Martha, I picture her in this dress. She wore it during the 1780s, a period I think of as the Washingtons’ second chance at a normal life. They were no longer royal subjects or colonists, but citizens; George was world-famous and finally satisfied with life; Martha was happily raising the young children of her late, ne’er-do-well son, John Parke Custis, as well as her nieces and nephews. They had experienced loss, triumph, life outside of Virginia, and believed, erroneously, that their life of public service had ended with the American Revolution. By the end of the decade, of course, they would become the first first family. But was I seeing her clearly? The catalog entry for the dress listed the pattern I remembered, with flowers, butterflies and ladybugs—and other parts I didn’t remember. I suddenly found it odd that the 58 insects on the dress included beetles, ants and spiders, but I didn’t know the reasons behind these images. Assuming Martha chose the pattern, it reveals something important. Zara Anishanslin, a historian of material culture who has spent time at the Washingtons’ home at Mount Vernon as a researcher and fellow, posed an intriguing theory to me. “Martha was a naturalist,” Anishanslin explained. 
Or rather, Martha would have been a naturalist, had she been born a man, or in a different era; she had very few ways of expressing her passion for the natural world, which makes it easy to overlook. As Anishanslin spoke, I was riveted—in part because, after reading every Martha Washington biography, this was the only new, original insight I’d ever come across about her, and I wondered what the best medium would be to convey this forgotten element of Martha’s life. An academic history would hardly be the best medium to spotlight objects attesting to Martha’s passion for nature; a museum exhibition would be better. If I were curating such an exhibition, I would place the dress in the largest of three glass cases, front and center. In another case, I would display the 12 seashell-patterned cushions Martha made with the help of enslaved women at Mount Vernon. In the third, I’d display 12 Months of Flowers, one of the only books from her first marriage, to Daniel Parke Custis, that she kept for personal use. The arrangement would be the first chance to see Martha’s husbands used as accessories to enhance our understanding of her. I’d call the exhibition “Don’t Be Fooled by the Bonnet.” This article is a selection from the March issue of Smithsonian magazine. Alexis Coe is the New York Times bestselling author of You Never Forget Your First: A Biography of George Washington. Hugh Talman is a staff photographer at the Smithsonian's National Museum of American History.
0d704e6ea90d08d5d0e360f01086b572
https://www.smithsonianmag.com/history/massive-new-database-connect-billions-historic-records-tell-full-story-american-slavery-180973721/
A Massive New Database Will Connect Billions of Historic Records to Tell the Full Story of American Slavery
A Massive New Database Will Connect Billions of Historic Records to Tell the Full Story of American Slavery In 1834, a 22-year-old Yoruba man who would come to be known as Manuel Vidau was captured as a prisoner of war and sold to slave traders in Lagos, today the largest city in Nigeria. A Spanish ship transported him to Cuba, where he was sold to a white man who forced him to roll 400 cigars a day (if his pace slowed, he recalled, he would be “stripped, tied down and flogged with the cow hide”). A decade later, however, Vidau secured permission from a new owner to hire himself out, and with his earnings he bought a share in a lottery ticket—and won. That allowed him finally to buy his freedom. He married a fellow former slave, Maria Picard, and they adopted a young relative whose parents had died of cholera. Vidau supported his wife and son by continuing to roll cigars, eventually making enough money to cover their passage to England. Vidau’s stroke of fortune is known today only because he had a chance encounter with a representative of the British and Foreign Anti-Slavery Society. The organization recorded his story in its journal, which was later shelved in a university library, digitized and eventually collected in an online database called “Freedom Narratives.” Enslaved people like Vidau—torn away from their communities of origin, deprived of the ability to write about themselves and treated as cargo or property in official documents—often left little of themselves to the historic record. Still, even a few facts can shape the outline of a life of sorrow, adversity, perseverance and triumph. “One of the biggest challenges in slave studies is this idea that people were unknowable, that the slave trade destroyed individuality,” says Daryle Williams, a historian at the University of Maryland. “But the slave trade didn’t erase people. We have all kinds of information that’s knowable—property records, records related to births, deaths and marriages. 
There are billions of records. It just takes a lot of time to go look at them, and to trace the arc of an individual life.” Williams, a specialist in the African diaspora of Brazil, is one of the principal investigators of a massive new online database called “Enslaved: Peoples of the Historic Slave Trade,” which will launch in 2020. It aims to serve as a clearinghouse for information about enslaved people and their captors. Headquartered at Matrix, the Center for Digital Humanities & Social Sciences at Michigan State University, and funded by a founding $1.5 million grant from the Mellon Foundation, Enslaved will serve as a hub for many smaller digitization projects, Freedom Narratives among them. For the first time, says Williams, anyone from academic historians to amateur genealogists will be able to trace individuals, families, ethnic groups and populations through dozens, hundreds or even thousands of archives, making connections that will enrich our understanding of slavery. “This tool,” Williams says, “will have the potential to show that even in the context of this horrible crime, there are still threads that hold people’s lives together.” * * * The study of the historic slave trade depends on numbers—the 12.5 million people kidnapped from Africa and shipped to the New World between 1525 and 1866, the 10.7 million who survived the two-month voyage, the 3.9 million enslaved in the United States just before the Civil War. These figures are horrifying, but at the same time their very enormousness can have a numbing effect, which is why contemporary historians are increasingly turning to biography. “Individual stories make a difference,” says Leslie Harris, a historian at Northwestern University, who writes about and teaches the history of slavery. 
“We do need to know the vast numbers that we’re talking about, that this was the largest forced migration in history, but when you begin to talk about these big concepts in terms of individual lives, you can better understand what these things mean.” The challenge, says Harris, who is not affiliated with the Enslaved project, has been to move beyond the well-told stories of once-enslaved activists like Harriet Tubman and Frederick Douglass. The “linked open data” at the core of the Enslaved archive offers broader possibilities. “This project is so important,” Harris says. “It could help us gain a greater understanding of how people weren’t just swept up in history, but how they spoke back to power, how they fought for their families.” It’s always been easiest to assemble a vivid picture about people whose lives are well documented, whether in letters, newspapers or official records held in libraries and archives. For that reason, the doings of white people from the upper classes have long made up the core of what Americans and Europeans tend to think of as history. “For too long, it has been difficult, painstaking and often impossible to write histories of all but a relatively few Americans of African descent, because documents have not been organized in a way that allows that,” explains Walter Hawthorne, a historian at Michigan State and one of the Enslaved project’s principal investigators. “Documentation often exists, but it has not been well preserved, well cataloged and made searchable.” Historians, of course, have long made good use of various records, from plantation inventories and escaped slave advertisements to personal narratives collected by obscure abolition societies. But those details are housed at far-flung institutions, and not consistently organized. Jane Landers, a historian at Vanderbilt University, set out in 2003 to change that. 
Since that time, the project called the “Slave Societies Digital Archive” has digitized some 700,000 pages of religious and other documents from colonial Brazil, Colombia, Cuba, Florida and Angola. Unlike in the English colonies, where enslaved people were treated almost exclusively as property, in Spanish and Portuguese America, they “were considered fully human, with souls to be saved,” Landers says. Their life events were faithfully recorded, often by the Catholic church. The earliest of these archives date to the 16th century. “We keep finding surprises,” Landers says. “We have found records for long-abandoned Franciscan missions in the middle of nowhere in Brazil, for cities that no longer exist in Cuba or in Haiti. Wonderful scholars before me have used some of these ecclesiastical records and incorporated them into studies, but nobody had really studied them at length, or made a point of collecting them.” By partnering with Enslaved, the Slave Societies Digital Archive can link its work with other collections. Emory University, for example, has digitized records of nearly 36,000 historic slaving voyages and details of 91,491 Africans liberated by naval courts, which will also be included in “Enslaved.” The Matrix team at Michigan State hosts an open access archive about enslaved people in Louisiana, which includes names, ethnicities and occupations of individuals listed in government records. And Harvard’s Hutchins Center for African and African American Research, led by Henry Louis Gates Jr., is contributing a selection of its collected biographies of people of African descent. “What we want to do is take a portion of everyone’s data and put it in one big pot,” says Dean Rehberger, the director of Matrix and another of Enslaved’s principal investigators.
“Then we can see if the same person appears in more than one, and we can build up these fragments and put them together.” It turns out there’s a surprisingly simple way to turn life histories, ship manifests, census records and other information into machine-readable data: the semantic triple, which involves entering information in three-part sentences, each with a subject, a predicate and an object. “It’s something like, ‘Maria Picard was born in 1822,’ or ‘Maria Picard married Manuel Vidau,’” explains Rehberger. Such three-part units of information can be mined from any biography, list, article or directory, and then linked to other information units in a vast network. Thanks to modern computing power, so-called “triplestores” now exist with hundreds of billions of entries on every topic imaginable. The Michigan State team has spent two years building their own vast network of triples. But the project, they realize, may never be complete. The historic slave trade lasted nearly 350 years and touched millions of lives, and there remain undiscovered or little-known troves of information around the world. Even a family Bible could hold a valuable data point. So in addition to acting as a database for existing slavery information, Enslaved will also offer a publishing platform for data, with a peer-review process modeled after scholarly journals. “Historians tend to just go out and collect what they want, whatever they need for their particular thing,” Rehberger says. “But what if you actually went to a physical archive thinking in larger terms, that this is something that could be of value to others? We want people to see that publishing data is an important part of humanities research, just like it is in the sciences. 
And isn’t it interesting to think that digital humanities is going to be led, transformed even, by slave studies?” * * * After Manuel Vidau and his wife, Maria Picard, set sail for England, they hoped to be able to return to Lagos and reunite with family they had last seen decades before. It isn’t known if they ever made it home. But perhaps, in some ship’s manifest or census record waiting to be digitized and connected, there lies a clue to the fate of this ordinary man who made a life for himself against all odds.

A preservationist races to save the poignant domestic legacy of the nation’s slaveholding past

When Jobie Hill first stepped over the threshold of a slave house, her experience was visceral. “You notice the size, the amount of light, the ventilation,” she says, “and you can imagine what it would have been like for you, personally, to live there.” Hill, an Iowa architect specializing in historic preservation, has spent the past seven years visiting former slave dwellings. At each location, she records GPS coordinates, makes photos and sketches a site plan. She adds these drawings to a digital database, called “Saving Slave Houses,” which currently includes 145 sites across the United States. When possible, she includes descriptions of the homes from the enslaved African-Americans who lived in them. To locate the slave houses, Hill relies largely on a government survey from the 1930s that included about 500 of them. There’s an urgency to her work because most of these buildings remain in private hands and aren’t protected sites. Often, property owners don’t even know their sheds, cottages or outbuildings were slave quarters until Hill gets in touch. While many slave houses are in disrepair, Hill says the fact that they’re still standing at all, more than 150 years after emancipation, is often a testament to the skill and ingenuity with which enslaved people built them.
As Hill says, “These were not just helpless, hopeless people.”

Editor's note, December 18, 2019: An earlier version of this story mistakenly swapped the photo captions for the letter by Cleto Congo and the 1767 slave inventory.

This article is a selection from the January/February 2020 issue of Smithsonian magazine.

Amy Crawford is a Michigan-based freelance journalist writing about cities, science, the environment, art and education. A longtime Smithsonian contributor, her work also appears in CityLab and the Boston Globe.
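For technically minded readers, the “semantic triple” format Dean Rehberger describes above can be sketched in a few lines of code. This is a hypothetical illustration only: the records, field names and helper function are invented for the example and are not drawn from the actual Enslaved database.

```python
from collections import namedtuple

# A semantic triple, as Rehberger describes it: a three-part
# sentence with a subject, a predicate and an object.
Triple = namedtuple("Triple", ["subject", "predicate", "object"])

# Hypothetical facts, imagined as mined from two different archives.
store = [
    Triple("Maria Picard", "born_in", "1822"),
    Triple("Maria Picard", "married", "Manuel Vidau"),
    Triple("Manuel Vidau", "freed_in", "Cuba"),
]

def about(name):
    """Return every triple in which a person appears, as subject or object."""
    return [t for t in store if name in (t.subject, t.object)]

# Linking across datasets: facts about Manuel Vidau include Maria's
# marriage record, even though it originated as a fact about her.
for fact in about("Manuel Vidau"):
    print(fact.subject, fact.predicate, fact.object)
```

Real triplestores index billions of such statements, but the linking principle is the same: any two records that share a subject or object can be joined into one life story.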
6d380682d6537eeafb0f80f95da64f61
https://www.smithsonianmag.com/history/merchant-marine-were-unsung-heroes-world-war-ii-180959253/
The Merchant Marine Were the Unsung Heroes of World War II
The Merchant Marine Were the Unsung Heroes of World War II “The sailor from the merchant ships was in those days known to America as a bum,” the former mariner and author Felix Reisenberg wrote. “He was associated with rotgut whiskey, waterfront brawls and quickie strikes that held up big passenger ships at New York, New Orleans and San Francisco . . .” The era was the earliest stage of the United States’ involvement in World War II, and Nazi Germany was already bringing the war right to the nation’s shores—with shocking results. U-boats devastated merchant shipping off the U.S. East Coast and Gulf Coast, attacking vessels within sight of beaches in Virginia, North Carolina and Florida, and at the mouth of the Mississippi River. America was too undermanned and ill-equipped to defend its own shoreline. U-boats used the glow of American coastal cities to silhouette merchant ships for torpedo strikes, like ducks in a carnival shooting gallery. On those ships were not military personnel but merchant mariners—civilian volunteers with the U.S. Merchant Marine, hauling vital war cargo for the Allies. Merchant mariners were the supply line that provided virtually everything Allied armies needed in order to survive and fight on foreign battlefields. The seamen had no military standing or government benefits, but they possessed an unusual variety of courage and gave their lives for their country as valiantly as those in the armed forces did. Surviving a U-boat attack often meant running a gauntlet of dangers, including fire, explosions, icy water, sharks, flaming oil slicks and long odysseys in open lifeboats. “You were taking a chance, that’s for sure,” recalled Jack Rowe, a merchant mariner from tiny Gwynn’s Island in Mathews County, Virginia. “But a lot of people were taking chances.
You couldn’t just say, ‘Why me?’” Standing lookout on a merchant ship was nerve-racking, especially around dawn and dusk, when the colors of the sea and sky merged into a gray haze, and any ripple of motion or flash of color might be the plume of a torpedo. “Occasionally a man will get the jitters and will be noticed walking the deck at night when he should be asleep,” recalled mariner Raymond Edwards. Once a torpedo struck, every moment became precious and every decision irreversible. “Even two seconds could mean the difference between life and death for any member of the crew. Running in the wrong direction might cut a sailor off from all means of escape. Jumping overboard at the wrong spot or at the wrong instant might easily cost a life. If a sailor is lucky enough to be alive after a torpedo hits his ship, it takes quick thinking and fast action to get him off the ship and into a lifeboat. Many are saved by sheer luck.” The U-boat war was particularly unforgiving to merchant mariners. The Merchant Marine suffered a higher casualty rate than any branch of the military, losing 9,300 men, with most of the losses occurring in 1942, when most merchant ships sailed U.S. waters with little or no protection from the U.S. Navy. In March 1942 alone, 27 ships from six Allied nations were sunk off U.S. shores. Statistically, America’s coastal waters were the most dangerous, the scene of half the world’s sinkings. The experience of being torpedoed was so common that the president of the Boston Seaman’s Club founded a “40-Fathom Club” for those who had survived it. “I hope the membership won’t become too large,” he added, but it grew larger every day as rescue ships brought oil-soaked survivors to the docks at Halifax, Boston, New York, Norfolk, Morehead City, Miami, and Havana. Many of the mariners who survived torpedo attacks went right back to sea, often sailing through the same perilous waters, only to be torpedoed again. One mariner was torpedoed ten times.
Despite their sacrifices, the members of the 40-Fathom Club were viewed by the American public with some ambivalence. Mariners were in such demand that shipping companies had lowered their standards and filled out crews with drunks, idlers, thieves, brawlers, and card sharps. The Merchant Marine’s image was further eroded by the presence of Communists in the maritime unions, although most mariners had no interest in radical politics. But they were deplored by some Navy leaders for refusing to bend to military discipline. Other critics complained the mariners’ wartime bonuses raised their pay higher than that of military men— ignoring the facts that mariners received no government benefits, paid income taxes, and earned money only when their ships were at sea. If their ships were torpedoed, they stopped getting paid the moment they hit the water. They were off the clock when swimming for their lives. And their civilian status would shut them out of a lifetime’s worth of military benefits including health care, money for college and low-interest loans. Not everyone piled on the Merchant Marine. President Franklin D. Roosevelt praised mariners in speeches, and his wife, Eleanor, credited them with “supreme courage” and suggested they be issued uniforms. Helen Lawrenson, a writer for Collier’s magazine, waded into a dingy seamen’s bar in Greenwich Village and was charmed by a group of mariners who went by the names of Low Life McCormick, No Pants Jones, Screwball McCarthy, Foghorn Russell, Soapbox Smitty, Riff Raff, and Whiskey Bill. Ten of the twelve mariners she met had been torpedoed at least once, and one of the other two complained, “I feel so out of place. I’m a wallflower, a nobody.” Lawrenson wrote that the mariners cut decidedly unromantic figures, guzzling “vast and formidable quantities of beer” while belting out sea ditties with raw lyrics. 
Beneath the surface, however, she found them intensely patriotic, casually fearless, and wise to the workings of the world. “They were the best informed, the most widely traveled, and the most truly sophisticated men I have ever met,” she concluded. The New York Times characterized merchant mariners as the war’s unsung heroes: “No one steps up to the bar to buy them drinks. No moist-eyed old ladies turn to them in the subway to murmur ‘God bless you.’ The cop on the beat, gentle with the tipsy soldier or the unsteady gob [Navy man], is apt to put his nightstick to the britches of a merchant sailor who has tippled heavily in the town’s bars to celebrate his rescue from the sea.” Most of the mariners who sailed against the U-boats are gone now. The few thousand who remain have come to regard Memorial Day as a celebration that has never fully included them. But it’s still not too late to remember, belatedly, how much we owe them.

From THE MATHEWS MEN: Seven Brothers and the War Against Hitler's U-boats by William Geroux, published by Viking, an imprint of Penguin Publishing Group, a division of Penguin Random House LLC. Copyright © 2016 by William Geroux.
014447e65468cd38457eaa84741d29e8
https://www.smithsonianmag.com/history/meriwether-lewis-mysterious-death-144006713/
Meriwether Lewis’ Mysterious Death
Meriwether Lewis’ Mysterious Death Captain Meriwether Lewis—William Clark’s expedition partner on the Corps of Discovery’s historic trek to the Pacific, Thomas Jefferson’s confidant, governor of the Upper Louisiana Territory and all-around American hero—was only 35 when he died of gunshot wounds sustained along a perilous Tennessee trail called the Natchez Trace. A broken column, symbol of a life cut short, marks his grave. But exactly what transpired at a remote inn 200 years ago this Saturday? Most historians agree that he committed suicide; others are convinced he was murdered. Now Lewis’s descendants and some scholars are campaigning to exhume his body, which is buried on national parkland not far from Hohenwald, Tenn. “This controversy has existed since his death,” says Tom McSwain, Lewis’s great-great-great-great nephew who helped start a Web site, “Solve the Mystery,” that lays out family members’ point of view. “When there’s so much uncertainty and doubt, we must have more evidence. History is about finding the truth,” he adds. The National Park Service is currently reviewing the exhumation request. The intrigue surrounding the famous explorer’s untimely death has spawned a cottage industry of books and articles, with experts from a variety of fields, including forensics and mental health, weighing in. Scholars have reconstructed lunar cycles to prove that the innkeeper’s wife couldn’t have seen what she said she saw that moonless night. Black powder pistols have been test-fired, forgeries claimed and mitochondrial DNA extracted from living relatives. Yet even now, precious little is known about the events of October 10, 1809, after Lewis – armed with several pistols, a rifle and a tomahawk – stopped at a log cabin lodging house known as Grinder’s Stand. He and Clark had finished their expedition three years earlier; Lewis, who was by then governor of the large swath of land that constituted the Upper Louisiana Territory, was on his way to Washington, D.C.
to settle financial matters. By some accounts, Lewis arrived at the inn with servants; by others, he arrived alone. That night, Mrs. Grinder, the innkeeper’s wife, heard several shots. She later said she saw a wounded Lewis crawling around, begging for water, but was too afraid to help him. He died, apparently of bullet wounds to the head and abdomen, shortly before sunrise the next day. One of his traveling companions, who arrived later, buried him nearby. His friends assumed it was suicide. Before he left St. Louis, Lewis had given several associates the power to distribute his possessions in the event of his death; while traveling, he composed a will. Lewis had reportedly attempted to take his own life several times a few weeks earlier and was known to suffer from what Jefferson called “sensible depressions of mind.” Clark had also observed his companion’s melancholy states. “I fear the weight of his mind has overcome him,” he wrote after receiving word of Lewis’s fate. At the time of his death Lewis’s depressive tendencies were compounded by other problems: he was having financial troubles and likely suffered from alcoholism and other illnesses, possibly syphilis or malaria, the latter of which was known to cause bouts of dementia. Surprisingly, he may also have felt like something of a failure. Though the Corps of Discovery had traversed thousands of miles of wilderness with few casualties, Lewis and Clark did not find the Northwest Passage to the Pacific, the mission’s primary goal; the system of trading posts that they’d established began to fall apart before the explorers returned home. And now Lewis, the consummate adventurer, suddenly found himself stuck in a desk job. “At the end of his life he was a horrible drunk, terribly depressed, who could never even finish his [expedition] journals,” says Paul Douglas Newman, a professor of history who teaches “Lewis and Clark and The Early American Republic” at the University of Pittsburgh. 
An American icon, Lewis was also a human being, and the expedition “was the pinnacle of Lewis’s life,” Newman says. “He came back and he just could not readjust. On the mission it was ‘how do we stay alive and collect information?’ Then suddenly you’re heroes. There’s a certain amount of stress to reentering the world. It was like coming back from the moon.” Interestingly, John Guice, one of the most prominent critics of the suicide theory, uses a very different astronaut comparison. Lewis was indeed “like a man coming back from the moon,” Guice notes. But rather than feeling alienated, he would have been busy enjoying a level of Buzz Aldrin-like celebrity. “He had so much to live for,” says Guice, professor emeritus of history at The University of Southern Mississippi and the editor of By His Own Hand? The Mysterious Death of Meriwether Lewis. “This was the apex of a hero’s career. He was the governor of a huge territory. There were songs and poems written about him. This wasn’t just anybody who kicked the bucket.” Besides, how could an expert marksman botch his own suicide and be forced to shoot himself twice? Guice believes that bandits roaming the notoriously dangerous Natchez Trace killed Lewis. Other murder theories range from the scandalous (the innkeeper discovered Lewis in flagrante with Mrs. Grinder) to the conspiratorial (a corrupt Army general named James Wilkinson hatched an assassination plot.) Though Lewis’s mother is said to have believed he was murdered, that idea didn’t have much traction until the 1840s, when a commission of Tennesseans set out to honor Lewis by erecting a marker over his grave. While examining the remains, committee members wrote that “it was more probable that he died at the hands of an assassin.” Unfortunately, they failed to say why. But the science of autopsies has come a long way since then, says James Starrs, a George Washington University Law School professor and forensics expert who is pressing for an exhumation. 
For one thing, with mitochondrial DNA samples he’s already taken from several of Lewis’ female descendants, scientists can confirm that the body really is Lewis’s (corpses were not uncommon on the Natchez Trace). If the skeleton is his, and intact, they can analyze gunpowder residue to see if he was shot at close range and examine fracture patterns in the skull. They could also potentially learn about his nutritional health, what drugs he was using and if he was suffering from syphilis. Historians would hold such details dear, Starrs says: “Nobody even knows how tall Meriwether Lewis was. We could do the DNA to find out the color of his hair.” Some scholars aren’t so sure that an exhumation will clarify matters. “Maybe there is an answer beneath the monument to help us understand,” says James Holmberg, curator of Special Collections at the Filson Historical Society in Louisville, Ky., who has published work on Lewis’s life and death. “But I don’t know if it would change anybody’s mind one way or the other.” The details of the case are so sketchy that “it’s like trying to grab a shadow,” Holmberg says. “You try to reach out but you can never get a hold of it.” Even minor features of the story fluctuate. In some versions, Seaman, Lewis’s loyal Newfoundland who guarded his master against bears on the long journey West, remained by his grave, refusing to eat or drink. In other accounts, the dog was never there at all. However Lewis died, his death had a considerable effect on the young country. A year and a half after the shooting, ornithologist Alexander Wilson, a friend of Lewis’s, interviewed Mrs. Grinder, becoming one of the first among many people who have investigated the case. He gave the Grinders money to maintain Lewis’s grave and visited the site himself. There, reflecting on the adventure-loving young man who had mapped “the gloomy and savage wilderness which I was just entering alone,” Wilson broke down and wept. 
A frequent contributor to Smithsonian, Abigail Tucker is the author of The Lion in the Living Room: How House Cats Tamed Us and Took Over the World. More information is available at her website: abigailtucker.com
2312ec9f9989117bbd5a51530fa9ad78
https://www.smithsonianmag.com/history/michael-jackson-donald-trump-and-other-famous-americans-who-escaped-brushes-death-180961707/
Michael Jackson, Donald Trump and Other Famous Americans Who Escaped Brushes With Death
Michael Jackson, Donald Trump and Other Famous Americans Who Escaped Brushes With Death Donald J. Trump, long before he became President-Elect of the United States, would call October 10, 1989, “a day that changed my life.” As he tells the story of that day, the then 43-year-old real estate developer was bidding goodbye to three of his executives who were about to catch a chartered helicopter to Atlantic City. “For an instant, as they were walking out, I thought of going with them,” Trump wrote in his 1990 book, Trump: Surviving at the Top. “I fly down to Atlantic City at least once a week, and I knew that if I made the forty-five-minute helicopter trip then, we could continue talking business on the way. But there was just too much to do in the office that day. As quickly as the idea had popped into my mind, I decided not to go.” Later that afternoon he received the news: All three executives, as well as their pilot and copilot, were dead. The helicopter’s rotors had broken off in midair and it had crashed into a wooded median on the Garden State Parkway in New Jersey. Looking back, Trump would reflect that the crash taught him about the fragility of life. “It doesn’t matter who you are, how good you are at what you do, how many beautiful buildings you put up, or how many people know your name," he wrote in his book. "No one on earth can be totally secure, because nothing can completely protect you from life’s tragedies and the relentless passage of time.” In the midst of last year’s presidential election campaign, reporters from Buzzfeed and Mother Jones resurrected accusations from Trump biographers that he intentionally inflated (or imagined) his part in the day's tragedy. Accounts differ, but some say he wouldn't have left New York because he had a meeting later that day. Others say he would never have considered boarding the ill-fated flight, as the only helicopters he'd fly on were his own. 
Stories of close calls with tragedy are the fodder of many an autobiography or personal tale. Here are 12 others who dodged death:

The future photographer Ansel Adams was just four years old when he was awakened by a thundering noise, felt his bed being jerked around the room, and watched as one chimney of his family’s house plummeted past his window. It was the beginning of the famous San Francisco Earthquake of 1906. After the initial jolt, young Adams went outside to explore. In his autobiography, he remembered being “very curious, wanting to be everywhere at once. There were many minor aftershocks, and I could hear them coming. It was fun for me, but not for anyone else.” Fun, that is, until an especially strong aftershock flung him against a garden wall, badly breaking his nose. His nose remained off-kilter for the rest of his life. Although the earthquake itself lasted only about a minute, the fires it caused burned on for three days. An estimated 3,000 residents died and more than 500 city blocks were destroyed. “From our house I saw vast curtains of smoke by day and walls of flame by night,” Adams recalled. “Refugees poured into our district, setting up their pitiful camps in the dunes with what they had carried from their burning or fire-threatened homes.” Despite his early encounter with the fury of nature, Adams grew up to become one of the natural world’s greatest chroniclers and advocates. He died in 1984 at the age of 82.

In December 1944, the future president Gerald Ford was an assistant navigator aboard the light aircraft carrier U.S.S. Monterey in the Philippine Sea when the ship ran into a deadly storm aptly named Typhoon Cobra. Powerful winds and high waves caused three of the Navy destroyers in the group to capsize. According to historian Douglas Brinkley, more than 800 sailors were lost, including six from Ford’s own ship. One victim might well have been Ford himself.
As he remembered the incident in his 1979 autobiography, A Time to Heal, he had just returned to his bunk after four hours on watch during the storm, when he began to smell smoke and went back to investigate. “As I stepped on the flight deck, the ship suddenly rolled about 25 degrees,” he wrote. “I lost my footing, fell to the deck flat on my face and started sliding toward the port side as if I were on a toboggan slide.” Ford’s slide was finally halted by a two-inch-high steel ridge that ran along the deck to keep the flight crew’s tools from falling into the sea. “I was lucky; I could have easily gone overboard.” Ford’s troubles weren’t over, though. He soon realized he was right about the fire. The storm had torn planes on the hangar deck loose from their moorings, and as they collided, some of their gas tanks ruptured. Then stray sparks set the gasoline on fire. Meanwhile, the typhoon raged on. Although the Navy told the crew to abandon ship, the captain asked for time to fight the blaze. Seven tense hours later, as Ford recalled it, the fire had been extinguished and the badly damaged ship headed for the island of Saipan. “Years later, when I became President, I remembered that fire at the height of the typhoon and I considered it a marvelous metaphor for the ship of state,” he wrote. Ford would live on to serve 25 years in Congress and as president of the United States from 1974 to 1977, following Richard Nixon’s resignation. He died in 2006 at the age of 93.

The famous R&B vocal group the Four Tops—known for such hits as “Reach Out, I'll Be There”—was scheduled to catch Pan Am Flight 103 from London in December 1988. However, a recording commitment forced them to remain in London and take a later plane. Less than 40 minutes after takeoff, the flight was brought down by a bomb planted on board. It crashed in the town of Lockerbie, Scotland, killing all 259 passengers and crew and another 11 people on the ground.
The terrorist act was later attributed to the Libyan government of Muammar Gaddafi. In October 2016, Duke Fakir, the last surviving member of the group, told British reporters that the group would have boarded the flight but for a BBC producer who insisted they record a pair of upcoming television appearances in two separate sessions rather than one. “I was glad, so, so glad that we didn’t do it in one session,” Fakir said. The Four Tops were not the only celebrities who might have been on board. Sex Pistols singer John Lydon, aka Johnny Rotten, was also set to be on the flight, as was the actress Kim Cattrall. Lydon missed the flight because his wife was slow in packing; Cattrall also missed boarding when she went to buy a teapot to bring home to her mother. The Four Tops, with some changes in personnel, continue to perform to this day. They were inducted into the Rock & Roll Hall of Fame in 1990.

A decade after 9/11, Michael Jackson’s older brother Jermaine made headlines when he asserted that, if not for a late night on September 10, the pop star would have been at the World Trade Center on the morning of the terrorist attack. “Thankfully, none of us had had a clue that Michael was due at a meeting that morning at the top of one of the Twin Towers,” Jermaine wrote in his 2011 book, You Are Not Alone Michael: Through a Brother’s Eyes. “We only discovered this when Mother phoned his hotel to make sure he was okay. She, Rebbie [Jackson] and a few others had left him there around 3 a.m. ‘Mother, I’m okay, thanks to you,’ he told her. ‘You kept me up talking so late that I overslept and missed my appointment.’” One colorful tale that emerged in the aftermath of 9/11 had Jackson fleeing New York in a rental car with Elizabeth Taylor and Marlon Brando—the trio gorging on fast food en route and making it as far as the Midwest. Alas, that account has never been verified. Jackson would live for another eight years after 9/11. 
In 2016, seven years after his death, he topped the Forbes list of highest-paid dead celebrities, with earnings for the year estimated at $825 million.

The future U.S. senator and presidential candidate John McCain was a 31-year-old naval aviator in 1967. One July morning, as he was about to take off from the aircraft carrier U.S.S. Forrestal, then in the Tonkin Gulf off Vietnam, a stray missile from another plane hit either his fuel tank or that of the plane next to him (historical accounts differ). Burning jet fuel spewed across the deck, along with one or more bombs from the damaged plane. McCain escaped his plane—only to step into another inferno. “Small pieces of hot shrapnel from the exploded bomb tore into my legs and chest,” he remembered in his 1999 memoir, Faith of My Fathers. “All around me was mayhem. Planes were burning….Body parts, pieces of the ship, and scraps of planes were dropping onto the deck.” The crew fought for more than a day to get the fire under control. The death toll would eventually reach 132 men, with two others missing and assumed dead. The Forrestal took two years to repair. Just three months later, McCain faced death again. On a bombing run over Hanoi, his plane was hit by a Russian missile he described as “the size of a telephone pole.” McCain managed to eject from the plane but was badly injured. Captured by the North Vietnamese, he spent the next five years as a prisoner of war. After his release in 1973, McCain continued to serve in the Navy until 1981. He was elected to the House in 1982, to the Senate in 1986, and ran as the Republican candidate for president in 2008. Today he is the senior U.S. senator from Arizona.

Dan Quayle was a young Indiana congressman in 1978, when a friend and fellow House member, California Democrat Leo Ryan, invited him on a trip to Guyana. 
The purpose of Ryan’s trip was to investigate abuse allegations against the American-born cult leader Jim Jones, who had moved his followers from California to the South American country a year earlier. Because he had two young children and a third on the way, Quayle wrote in his 1994 memoir, Standing Firm, “I begged off this one, even though Leo asked me two or three times.” That proved fortunate for Quayle. At the end of his visit to Jonestown, Ryan, three journalists and one cult defector were shot dead on an airstrip as they attempted to leave. Eleven other people were wounded in the attack by Peoples Temple gunmen. Later that day, on Jones’s orders, more than 900 members of the cult were either murdered or killed themselves by willingly drinking cyanide-laced punch. Jones died of a gunshot wound. Dan Quayle went on to serve in the U.S. Senate and as vice president of the United States from 1989 to 1993.

In 1844, John Tyler, the tenth president of the United States, was part of a large group of dignitaries who came aboard the new, state-of-the-art warship U.S.S. Princeton for a cruise on the Potomac River. The festivities included a demonstration of the ship’s powerful guns, said to be capable of hurling a 200-pound cannonball a distance of five miles. The guns fired several times without incident. Then, in another test firing, one of them exploded, sending shrapnel across the ship’s deck. Eight people were killed, including Tyler’s secretary of state and secretary of the navy. At least 20 were injured. Missouri Senator Thomas Hart Benton, great-great uncle of the famous American painter, was knocked unconscious by the explosion. 
When he came to, he remembered “seeing the gun itself split open—two seamen, the blood oozing from their ears and nostrils, rising and reeling near me—Commodore Stockton, hat gone, and face blackened, standing bolt upright, staring fixedly upon the shattered gun.” Luckily for Tyler, who would otherwise have been in the line of fire, he had lingered below deck, supposedly to hear his son-in-law perform a song. Tyler left the presidency in 1845 and died in 1862 at the age of 71.

The future Academy Award-nominated director King Vidor grew up in Galveston, Texas, where, as a boy of five, he survived the legendary Galveston Hurricane of 1900, still considered the deadliest natural disaster in U.S. history. While estimates vary, as many as 12,000 people may have died in the storm. Before anyone realized the full fury of what was to come, Vidor’s mother took him and two young friends to the beach to see the spectacular waves. Vidor described the scene in a magazine story published years later: “I could see the waves crash against the streetcar trestle, then shoot into the air as high as the telephone poles….I was only five then, but I remember now that it seemed as if we were in a bowl looking up toward the level of the sea. As we stood there in the sandy street…I wanted to take my mother's hand and hurry her away. I felt as if the sea was going to break over the edge of the bowl and come pouring down upon us.” The Vidors took shelter in the house where the other two boys were visiting. As the first floor filled with seawater, they moved up to the second, eventually crowding into one small room with more than 30 other people. In the morning, they left Galveston by boat and headed to the Texas mainland, passing countless floating corpses along the way. Vidor would grow up to become a celebrated filmmaker, with a career that spanned both silent movies and talkies. 
Among his better-known works are The Big Parade, Stella Dallas, Duel in the Sun, The Fountainhead, and several scenes in The Wizard of Oz. He died in 1982 at age 88.

The tough-guy actor Edward G. Robinson and his family were traveling in Europe in 1939 when word came that the German army was preparing to invade Poland—an act that signaled the beginning of World War II. Like many other Americans, they decided to get packing. As Robinson tells the story in his 1958 autobiography, My Father, My Son, the ship they had in mind was the British ocean liner Athenia. “But something went wrong, the boat was crowded or left early,” he wrote. “Anyway, I remember the best we could do was to get a single cabin on an American ship, the S.S. Washington.” Their accommodations on the Washington may have been cramped, but the Robinsons would have been even less comfortable on the Athenia. On September 3, 1939, it was struck by a torpedo from a German U-boat off the coast of Ireland, becoming the first British ship sunk by the Germans in World War II. Of the roughly 1,400 passengers and crew on board, a reported 112 died, including 28 Americans. The rest were rescued, in part because the ship took 14 hours to sink. Fearful that the incident would mobilize the then-neutral U.S., Nazi propagandists denied any involvement and tried to blame it on the British. The S.S. Washington arrived safely in New York with a passenger list that not only included the Robinson family but Sara Delano Roosevelt, mother of the president, and one of his sons, James. Robinson went on to make some of his best movies, including Double Indemnity, Key Largo, and The Stranger. He died in 1973 at the age of 79.

Greg Daugherty is a magazine editor and writer as well as a frequent contributor to Smithsonian.com. His books include You Can Write for Magazines.
d09cadee2e8098c75bd8e5f0937c2730
https://www.smithsonianmag.com/history/minstrel-show-1964-worlds-fair-180951239/
The Story Behind the Failed Minstrel Show at the 1964 World’s Fair
The Story Behind the Failed Minstrel Show at the 1964 World’s Fair Two weeks after opening day of the 1964 New York World’s Fair, a minstrel show like no other debuted on the Flushing Meadows fairgrounds. America, Be Seated!, the Louisiana Pavilion’s self-styled “modern minstrel show,” ditched the blackface and featured an integrated cast of white and black actors singing and dancing in harmony. According to a World’s Fair press release, the “all-stops-out slapstick pageant of American history” would combine the “happy flavor of minstrel shows…with original music and modern comedy skits.” The concept sounds like a contradiction in terms: Minstrelsy, a relic of 19th-century theater, disappeared from the American stage in the early 1900s, and its defining component, blackface, was rooted in racism. Blackface minstrel shows originated in the 1830s as a popular form of musical entertainment: white actors, made up with burnt cork or greasepaint, performed sentimental songs and comedy bits with exaggerated mannerisms based on black stereotypes. This genre went into decline after the Civil War as vaudeville took over the nation’s theaters, but blackface made the leap from stage to screen, appearing in films such as The Jazz Singer (1927) and Swing Time (1936), and to radio, heard in the long-running serial “Amos ‘n’ Andy.” But the “updated” minstrel show at the 1964 World’s Fair defied the bigoted origins of the genre to become, ironically, the event’s most progressive attraction.

Historically, world’s fairs were all about progress. These international expositions, staged in cities around the globe from the 1850s to the 1960s, unveiled dazzling inventions, such as the sewing machine (1855) and the elevated train (1893), along with utopian visions of the future, such as General Motors’ “Futurama” at the 1939 New York World’s Fair, which depicted a network of expressways connecting the United States. 
That year’s World’s Fair, also in Flushing Meadows, Queens, is regarded as one of the most influential of the 20th century, renowned for its streamlined art deco style and technological innovations. The 1964-65 World’s Fair, on the other hand, was a study in corporate excess. Boasting an 80-foot-tall tire Ferris wheel (sponsored by U.S. Rubber), Disney-produced animatronics (including the debut of “It’s a Small World”), and a tasteless display of Michelangelo’s Pieta (set in a niche with flickering blue lights, behind bulletproof glass, accessible only by moving walkway), the Fair was not nearly as rarefied as its theme of “Peace Through Understanding” let on. The New York Times’ Ada Louise Huxtable called the Fair’s architecture kitschy and “grotesque.” “There are few new ideas here,” she wrote. “At a time when the possibilities for genuine innovations have never been greater, there is little real imagination…” Historian Robert Rydell has described the 1964 Fair as a “large, rambling, unfocused exposition” that ended the era of American world’s fairs. Much of the blame has been laid on Robert Moses, president of the World’s Fair and mid-20th-century “master builder” of New York City. Moses pledged that the event would cater to “middle roaders,” meaning the ordinary middle-class folks “in slacks and…in their best bibs and tuckers” who came in search of a wholesome good time. The Fair, he vowed, would have no point of view on art or culture or politics. But his incessant diatribes against “avant garde critics and leftwing commentators” amounted to a platform of lily-white conservatism, conforming to his own septuagenarian tastes. In 1962, the Urban League accused the World’s Fair Corporation of racially discriminatory hiring practices, forcing Moses, who dismissed the charges as “nonsense,” to begrudgingly adopt an equal-employment policy. 
Moses was never a friend to minorities—his slum clearance policies displaced thousands of low-income New Yorkers, overwhelmingly black and Hispanic—and the picture he wanted to present at the Fair was one of blissful ignorance rather than integration. It was about the “warmth, humanity and happiness visible these summer days on Flushing Meadow,” he wrote in October 1964. “That’s the Fair. That’s New York after three hundred years. That’s America.” Trite as it was, America, Be Seated! challenged that credo of complacency. The musical was the brainchild of Mike Todd, Jr. (son of film producer Mike Todd), who saw it as a bona fide theatrical work rather than a carnival amusement. Todd Jr. predicted that the show would ride its World’s Fair success to productions elsewhere in the country. “It could go anywhere,” he told the New York Times. Much to his chagrin, the show went nowhere: it closed after two days with a paltry $300 in receipts. But a May 3, 1964, cast performance on “The Ed Sullivan Show”—the only known recorded performance of the musical—offers clues to what America, Be Seated! looked like and why it didn’t catch on. (An archival copy of the episode is available for viewing at the Paley Center for Media in New York City. We were unable to locate any images of the show.) The cast appeared on “Ed Sullivan” to promote the musical’s World’s Fair debut in grand Louisiana showboat style: ladies in ruffled bodices and flouncy A-line skirts; men in ruffled tailcoats, plaid lapels, and two-tone shoes; and everyone in straw porkpie hats. Four of the show’s fifteen performers were black, and three of these were featured soloists as well as stars in their own right—Lola Falana and Mae Barnes on the swinging “That’s How a Woman Gets Her Man,” and Louis Gossett, Jr. 
on the man’s response, “Don’t Let a Woman Get You, Man.” One song, “Gotta Sing the Way I Feel Today,” was unabashedly mawkish, with lyrics like “Share this wonderful feeling in the air.” But the title number addressed what would have been on every viewer’s mind: race. Between verses, the interlocutor (Ronny Graham) downplayed the issue: Now, somebody said our minstrel show should not be done for sport That we should have a message of significant import And so we have a message, a most essential one Please listen very carefully Our message is…have fun! The song’s chorus, however—“America, be seated, here’s a modern minstrel show”—repeatedly brought race to the fore. To invoke minstrelsy was to invoke race and, in 1964, racial strife. Even Flushing Meadows had a part to play in the battle for civil rights: on the Fair’s opening day, April 22, members of the Congress of Racial Equality (CORE) disrupted subway traffic to the fairgrounds and picketed in front of park and pavilion entrances. President Lyndon B. Johnson was on hand to deliver the opening address, and during his speech, protestors shouted “Freedom Now” and “Jim Crow Must Go!” These demonstrations took advantage of World’s Fair media coverage to draw attention to the cause. They were directed not at the Fair but at the American public. “For every new car that is shown at the World’s Fair, we will submit a cattle prod,” said CORE leader James Farmer. “For every piece of bright chrome that is on display, we will show the charred remains of an Alabama church. 
And for the grand and great steel Unisphere [the centerpiece of the Fair], we’ll submit our bodies from all over the country as witnesses against the Northern ghetto and Southern brutality.” When Farmer blocked the door to the New York City pavilion, he called it a “‘symbolic act,’ in the same way…that Negroes have been blocked from good jobs, houses and schools in the city.” The New York Times reported that “most of the opening-day crowd seemed to pay little attention,” however, and those that did responded with obscenities and comments like “Ship ’em back to Africa” and “Get the gas ovens ready.” Of the 750 demonstrators, less than half were arrested, mostly on charges of disorderly conduct that were later dropped, and seven people sustained minor injuries. Both sides were eager to avoid the violence that continued to rage in the South. Less than eight months prior, four black girls were killed in the bombing of a Birmingham church. In January 1964, Louis Allen, a black Mississippi man who had witnessed the murder of a voting-rights activist, was shot to death in his driveway. In March, race riots in Jacksonville, Florida, claimed the life of a 35-year-old black mother, Johnnie Mae Chappell. And after the Student Nonviolent Coordinating Committee announced plans for its “Freedom Summer,” the Ku Klux Klan began to mobilize in Mississippi, burning crosses throughout the state on April 24. The specter of racial unrest would have loomed large in fairgoers’ minds when they heard the term “integrated” and saw blacks and whites together on stage in America, Be Seated! Judging by reviews of the musical’s previews in Boston and New Haven, Connecticut, America, Be Seated! attempted to confront the issue of race head-on. Critical response was mixed, but all of the reviewers commented on the politics of the production. 
Frederick Guidry of the Christian Science Monitor called the show a “lighthearted call for people all over the United States to find refuge from racial tension in a relaxed acceptance of the American ideal of equality.” These earlier performances contained segments too edgy for “Ed Sullivan.” In the preview Guidry saw, the opening number contained an overt allusion to the civil rights movement—“We haven’t a lot of time to read / But can we picket, yes indeed!”—which was noticeably absent from the “Ed Sullivan” version. “The struggle for full equality,” Guidry wrote, “is never very far from a lyric or a joke.” One comedy bit saw a white director asking a black actor to play to slave stereotype; the actor responded, “I’m chairman of the local chapter of CORE, and you’re going to call me Rastus?” The show’s boldest jokes, however, came from black comedian Timmie Rogers. According to Boston Globe critic Kevin Kelly, Rogers “razz[ed] his own race with a humorous fury that might even bring a smile to the NAACP. Rogers, for example, explained that Negroes have a new cosmetic to keep up with the white man’s desire to be tanned. It’s called Clorox.” The comedian also referred to a new white youth organization called SPONGE, or the Society for the Prevention of Negroes Getting Everything. Remarkably, the musical received support from the NAACP. The organization, understandably turned off by the minstrel show label, was critical of the production at first, but after seeing a Boston preview NAACP officials reversed their stance, praising the revue as an “asset for integration.” William H. Booth, president of the Jamaica, Queens, NAACP branch said: “I have no serious objections. There is nothing in this show detrimental to or ridiculing Negroes. In fact, it is a satire on the old-style minstrel show.” The organization expressed concerns over Timmie Rogers’ jokes about Clorox skin bleach and cannibalism in the Congo, but the comedian agreed to cut them. 
Boston NAACP president Kenneth Guscott stated that “while the NAACP is flatly against minstrel shows, this one is an integrated production in the true sense that it shows how Negroes feel about discriminatory stereotypes.” Another NAACP official called America, Be Seated! a “spoof on Negro stereotypes.” The critical consensus was that despite its minstrel show marketing—and Variety’s optimistic prediction that it could be “the forerunner of a revival of minstrelsy”—America, Be Seated! actually hewed closer to the vaudeville tradition. Without blackface, it only had the music and three-part structure of traditional minstrelsy. In the end, that miscategorization may have spelled the show’s rapid doom. Variety reported that the “‘minstrel’ connotation” proved to be “b.o. [box office] poison” at the New Haven premiere and that Mike Todd subsequently dropped it from the show’s publicity. But the lyrics of the opening number remained unchanged for the “Ed Sullivan” appearance, which in any case “proved no b.o. tonic.” Tepid turnout for the Fair as a whole didn’t help the musical’s prospects. The 1964-65 Fair drew a total of 52 million visitors in two seasons—well short of its projected 70 million—and closed with $30 million in debt. Mike Todd Jr., whose chief claim to fame (aside from his parentage) was a movie theater gimmick called “Smell-o-Vision,” blamed philistines for the musical’s failure. He told the New York Amsterdam News that “presenting it at the Louisiana Pavilion was like trying to bring legitimate theatre into a night club. It couldn’t compete with the drinks.” In an interview with the Boston Globe, he complained about the Fair’s consumerist atmosphere. “All I could see was kids with hats on,” he said. “World Fair hats…the kind with a feather in it that always gets lost on the way home. That’s what the people were buying. 
Hats, not shows.” As Timmie Rogers put it, they “never had a chance.” Fifty years later, a handful of reviews and a set on “Ed Sullivan” are all we have to judge the merits of America, Be Seated! It was a corny show, to be sure, but not much cornier than anything else at the World’s Fair, which promised good, old-fashioned, apolitical fun. Even though Todd Jr. inflated the musical’s long-term prospects, there’s no doubt that America, Be Seated! offered something exceptional: a reappropriation of a taboo style. It meant well. But for whatever reason, fairgoers weren’t interested in seeing a “modern minstrel show.” Vicky Gan, a former editorial intern at Smithsonian Magazine, currently works for the Washingtonian.
ad4f8195e298b3f5a89de3f498eb5edf
https://www.smithsonianmag.com/history/modern-craft-cocktail-movement-got-its-start-during-prohibition-180971265/
The Modern Craft Cocktail Movement Got Its Start During Prohibition
The Modern Craft Cocktail Movement Got Its Start During Prohibition With America in the middle of a flourishing craft beer and craft spirits movement, it’s easy to forget that Prohibition was once the law of the land. One hundred years ago, on January 17, 1920, Prohibition went into effect, one year after Nebraska became the 36th of the country’s 48 states to ratify the 18th Amendment. The law forbade the production of beverages that contained more than one-half of 1 percent alcohol. Breweries, wineries and distilleries across America were shuttered. Most never reopened. Prohibition may be long dead, but the speakeasies and cocktails it spawned are still with us. Much of the era’s bootleg liquor was stomach-turning. The need to make this bad alcohol drinkable – and to provide buyers a discreet place to consume it – created a phenomenon that lives on in today’s craft cocktail movement and faux speakeasies. For better or worse, Prohibition changed the way Americans drank, and its cultural impact has never really gone away. During Prohibition, the primary source of drinking alcohol was industrial alcohol – the kind used for making ink, perfumes and campstove fuel. About 3 gallons of faux gin or whiskey could be made from 1 gallon of industrial alcohol. The authors of the Volstead Act, the law enacted to carry out the 18th Amendment, had anticipated this: It required that industrial alcohol be denatured, which means that it’s been adulterated with chemicals that make it unfit to drink. Bootleggers quickly adapted and figured out ways to remove or neutralize these adulterants. The process changed the flavor of the finished product – and not for the better. Poor quality notwithstanding, around one-third of the 150 million gallons of industrial alcohol produced in 1925 was thought to have been diverted to the illegal alcohol trade. The next most common source of alcohol in Prohibition was alcohol cooked up in illegal stills, producing what came to be called moonshine. 
By the end of Prohibition, the Prohibition Bureau was seizing nearly a quarter-million illegal stills each year. The homemade alcohol of this era was harsh. It was almost never barrel-aged and most moonshiners would try to mimic flavors by mixing in some suspect ingredients. They found they could simulate bourbon by adding dead rats or rotten meat to the moonshine and letting it sit for a few days. They made gin by adding juniper oil to raw alcohol, while they mixed in creosote, an antiseptic made from wood tar, to recreate scotch’s smoky flavor. With few alternatives, these dubious versions of familiar spirits were nonetheless in high demand. Bootleggers much preferred to trade in spirits rather than beer or wine because a bottle of bootleg gin or whiskey could fetch a far higher price than a bottle of beer or wine. Prior to Prohibition, distilled spirits accounted for less than 40 percent of the alcohol consumed in America. By the end of the “noble experiment” distilled spirits made up more than 75 percent of alcohol sales. To make the hard liquor palatable, drinkers and bartenders mixed in various ingredients that were flavored and often sweet. Gin was one of the most popular beverages of the era because it was usually the simplest, cheapest and fastest beverage to produce: Take some alcohol, thin it with water, add glycerin and juniper oil, and voila – gin! For this reason, many of the cocktails created during Prohibition used gin. Popular creations of the era included the Bee’s Knees, a gin-based drink that used honey to fend off funky flavors, and the Last Word, which mixed gin with Chartreuse and maraschino cherry liqueur and is said to have been created at the Detroit Athletic Club in 1922. Rum was another popular Prohibition tipple, with huge amounts smuggled into the country from Caribbean nations via small boats captained by “rum-runners.” The Mary Pickford was a cocktail invented in the 1920s that used rum and red grapefruit juice. 
The cocktail trend became an important part of home entertaining as well. With beer and wine less available, people hosted dinner parties featuring creative cocktails. Some even dispensed with the dinner part altogether, hosting newly fashionable cocktail parties. Cocktails became synonymous with America the way wine was synonymous with France and Italy.

Beginning in the late 1980s, enterprising bartenders and restaurateurs sought to recreate the atmosphere of the Prohibition-era speakeasy, with creative cocktails served in dimly lit lounges. The modern craft cocktail movement in America probably dates to the reopening of the legendary Rainbow Room at New York’s Rockefeller Center in 1988. The new bartender, Dale DeGroff, created a cocktail list filled with classics from the Prohibition era, along with new recipes based on timeless ingredients and techniques. Around the same time, across town at the Odeon, bar owner Toby Cecchini created “Sex and the City” favorite the Cosmopolitan – a vodka martini with cranberry juice, lime juice and triple sec. A movement was born: Bartenders became superstars and cocktail menus expanded with new drinks featuring exotic ingredients, like the Lost in Translation – a take on the Manhattan using Japanese whiskey, craft vermouth and mushroom-flavored sugar syrup – or the Dry Dock, a gin fizz made with cardamom bitters, lavender-scented simple syrup and grapefruit. In 1999, legendary bartender Sasha Petraske opened Milk & Honey as an alternative to noisy bars with poorly made cocktails. Petraske wanted a quiet bar with world-class drinks, where, according to the code for patrons, there would be “no hooting, hollering, shouting, or other loud behavior,” “gentlemen will not introduce themselves to ladies” and “gentlemen will remove their hats.” Petraske insisted on the highest quality liquors and mixers. Even the ice was customized for each cocktail. 
Many of what are now clichés in craft cocktail bars – big, hard ice cubes, bartenders with Edwardian facial hair and neckties, rules for entry and service – originated at Milk & Honey. A lot of the early bars that subscribed to the craft cocktail ethos emulated the speakeasies of the Prohibition era. The idea was to make them seem special and exclusive, and some of the new “speakeasies” incorporated gimmicks like requiring customers to enter behind bookcases or through phone booths. They’re meant to be places where customers can come to appreciate the drink – not the band, not the food, not the pickup scene. Luckily, today’s drinker doesn’t have to worry about rotgut liquor: The craft distilling industry provides tasty spirits that can be either enjoyed in cocktails or simply sipped neat. Jeffrey Miller is an Associate Professor and Program Coordinator of Hospitality Management at Colorado State University.
580ca69fa937f367a430bd16e5f08352
https://www.smithsonianmag.com/history/mongolia-melts-climate-change-looters-close-in-180968764/
As Mongolia Melts, Looters Close In On Priceless Artifacts
As Mongolia Melts, Looters Close In On Priceless Artifacts The history and archaeology of Mongolia, most famously the sites associated with the largest land empire in the history of the world under Genghis Khan, are of global importance. But they’re facing unprecedented threats as climate change and looting impact ancient sites and collections. Climate change and looting may seem to be unrelated issues. But deteriorating climate and environmental conditions result in decreased grazing potential and loss of profits for the region’s many nomadic herders. Paired with a general economic decline, herders and other Mongolians are having to supplement their incomes, turning to alternative ways of making money. For some, it’s searching for ancient treasures to sell on the illegal antiquities market. The vast Mongolian landscape, whether it be plains, deserts or mountains, is dotted with man-made stone mounds marking the burials of ancient peoples. The practice started sometime in the Neolithic period (roughly 6,000-8,000 years ago) with simple stone mounds the size of a kitchen table. These usually contain a human body and a few animal bones. Over time, the burials became larger (some over 1,300 feet long) and more complex, incorporating thousands of horse sacrifices, tools, chariots, tapestries, family complexes, and eventually treasure (such as gold, jewelry and gems). For Mongolians, these remains are the lasting reminders of their ancient past and a physical tie to their priceless cultural heritage. Mongolia has reasonably good laws regarding the protection of cultural heritage. But poor understanding of the laws, and the nearly impossible task of enforcing them in such a large space with relatively few people and meager budgets, keep those laws from being effective. And laws can’t protect Mongolia’s cultural heritage from climate change. The looting of archaeological sites in Mongolia has been happening for a very long time. 
Regional archaeologists have shared anecdotes of finding skeletons with break-in tools made from deer antlers in shafts of 2,000-year-old royal tombs in central Mongolia. These unlucky would-be thieves risked the unstable sands collapsing in the shafts above them for a chance at riches, not long after the royal leaders had been buried there. But many recent pits dug directly into burial sites around Mongolia, some more than 3,000 years old, suggest modern-day looting is on the rise.

For the untrained looter, any rock feature has the potential to contain valuable goods, and so grave after grave is torn apart. Many of these will contain no more than human and animal bones. Archaeologists’ interest in these burials lies in the information they contain for research, but this is worthless on the black antiquities market. But to steer looters away from these burials would be to teach them which ones to target for treasure, and so this strategy is avoided.

Archaeologists working in northern Mongolia in 2017 found hundreds of looted sites, including an 800-year-old cemetery consisting of at least 40 burials. Each and every one of them had been completely destroyed by looters looking for treasure. Human remains and miscellaneous artefacts such as bows, arrows, quivers, and clothing were left scattered on the surface. Having survived over 800 years underground, these priceless bows, arrows, cloth fragments and bones likely have less than a year on the surface before they’re gone forever. This is not to mention the loss of whatever goods (gold, silver, gems) the looters decided were valuable enough to keep.

Archaeological teams are currently working against climate change, looters, and each other for the chance to unearth rare mummies in the region that are known to pique public interest within Mongolia and abroad. 
A 2017 exhibit at the National Museum of Mongolia featured two mummies and their impressive burial goods—one of which had been rescued from the hands of looters by archaeologists and local police. Though they appeared not to have been particularly high-ranking individuals, their belongings displayed incredible variety, artistry and detail.

The result of natural processes rather than intentional mummification as in ancient Egypt, some of these mummies are preserved by very dry environments protected in caves and rock shelters. Others are ice mummies, interred in burials that were constructed in such a way that water seeped in and froze—creating a unique preservation environment. Both preservation environments produce artifacts that rarely survive such long periods of time. This includes human tissues like skin and hair, clothing and tapestries, wooden artifacts, and the remains of plants and animals associated with the burial.

As looters zero in on these sites, and climate change melts ice and alters the environmental conditions in other, as yet unknown, ways, archaeologists are racing to locate and preserve these finds. But with little infrastructure, small budgets and almost no specialised training in how to handle such remains, there’s some concern about the long-term preservation of even those remains archaeologists are able to rescue. Efforts to provide training opportunities, international collaborations with mummy experts, and improved infrastructure and facilities are underway, but these collections are so fragile there is little time to spare.

The situation in Mongolia could help us to understand and find new solutions to dealing with changes in climate and the economic drivers behind looting. Humans around the world in many different times have faced and had to adapt to climate change, economic strife and technological innovations. 
There is a truth represented by the material record of the “things” left by ancient peoples, and in Mongolia the study of this record has led to an understanding of the impact of early food production and horse domestication, the emergence of new social and political structures, and the dominance of a nomadic empire.

Julia Kate Clark, Endeavor Fellow, Flinders University; Director, NOMAD Science, Flinders University.
https://www.smithsonianmag.com/history/motorized-scooter-boom-hit-century-dockless-scooters-180971989/
The Motorized Scooter Boom That Hit a Century Before Dockless Scooters
Peter Minton was riding his motorized scooter on Rockaway Beach Boulevard when the patrolman served him with a summons to appear in traffic court. The reason: the 16-year-old was operating the vehicle without a driver’s license. Minton wasn’t zipping along on a Lime, Bird, Skip or Spin. Instead, the news item dates back to July 1939, when the motorized scooter was first booming in the U.S.

Long before Silicon Valley companies swarmed American cities with their cheap rideshare scooters, the Autoped disrupted it all first when it hit the pavement around 1915. The Online Bike Museum explains that the Autoped, the first mass-produced motorized scooter in the U.S., was “[e]ssentially an enlarged child’s scooter with an engine mounted over the front wheel.” Though some reports claimed it could reach speeds of 35 miles per hour, the steering column operated the clutch and brake, which the museum noted made the ride “unsteady” when it pushed 20 mph. Later, a battery-operated version of the Autoped was made available when the Eveready Battery Company bought the outfit.

The concept of the scooter stretches back at least a century before, to 1817 and Baron Karl von Drais de Sauerbrun of Germany. After he debuted his early two-wheeled, human-powered ride, the velocipede concept was quickly spun off into bicycles, tricycles and kick scooters. Give or take a few decades, the transportation was being motorized, too, with rear treadle drives popping up in Scotland around the 1840s, according to the Encyclopedia Britannica. Come the turn of the 20th century, battery-powered machines were also entering the fold; Ogden Bolton Jr. was issued a U.S. patent for his battery-powered bicycle in 1895. But the Autoped (and its first-generation predecessor, the Motoped) can be seen as “the true ancestors of the modern motor scooter,” according to the museum. 
It came at a time when there were scarcely any safety regulations for motorized vehicles on the road. While Connecticut created the first statewide traffic law to regulate motor vehicles in 1901 and New York introduced drunk driving laws roughly a decade later, by the time the Autoped rolled out, traffic lights were still 15 years away from being introduced. The patent for the design of the “self-propelled vehicle” went to inventor Arthur Hugo Cecil Gibson, though it appears that Joseph F. Merkel, the designer behind the Flying Merkel motorcycle, helped significantly in the creation of the final product. The rides were manufactured through the Autoped Company of America, first incorporated in 1913, which set up shop in Long Island City in Queens, New York, in the fall of 1915. At first, the cycling press of the day wrote off the Autoped as “a ‘freak’ vehicle,” according to New York State Museum senior historian emeritus Geoffrey N. Stein. But the Autoped hung around longer than expected, perhaps because it intrigued a wide tent of users. As its advertisement copy makes clear, it was marketing broadly: “The Autoped is an ideal short distance conveyance for business or professional men or women to and from their places of business; for women to go shopping or calling; for physicians to make their regular daily calls or to answer hurry calls; for the older children to go about quickly for outing or school; for servants when they are sent on errands; for grocers, druggists and other merchants for quick delivery purposes; for commercial salesman to call on the trade; for employees to ride to and from work; for collectors; repairmen; messengers, and for anybody else who wants to save money, time and energy in going about. All will enjoy the comfort and pleasure of AUTOPEDING.” Just as their modern-day equivalents have come under fire for being toys of the wealthy elite, the Autoped’s marketing certainly carried a bit of a class element. 
An advertisement that ran in Puck magazine in 1916—“Look out for the Autoped girl”—pictured an illustration of a fashionable, well-to-do white woman in a fabulous hat, a fur wrapped around her neck. The copy was clearly after a specific demographic: “If you were the sort of person who did your gift shopping in the 1916 equivalent of the Neiman Marcus Christmas catalog (Hammacher Schlemmer, maybe), an Autoped was on your list,” explains Hemmings Daily, the blog of the classic car marketplace.

But the Autoped wasn’t just a plaything of the rich. Just like the bicycle before it, the advent of the motorized scooter promoted a level of freedom and mobility for women that gave the messaging “Look out for the Autoped girl” more heft. Over at Mashable, Chris Wild recounts the story of the “suffragette on a scooter,” Lady Florence Norman, who rode her Autoped to work in central London. Meanwhile, Amelia Earhart, the famous aviatrix, appeared in multiple photographs with the Autoped around California, even after it stopped being manufactured around 1921. With Earhart on it, it’s easy to imagine why the caption to one of those photographs reads: “In the near future, we are told, no one will walk at all.”

Businesses also gave the Autoped a try. The best example might be the New York Postal Service, which used the slim rides to deliver mail. To the frustration of police, delinquents saw their own window of opportunity in the nimble machines, repurposing them as getaway vehicles. “Groups of rowdy youth were soon terrorizing the boroughs of Brooklyn, Queens and Manhattan,” writes the Online Bike Museum, highlighting the intriguingly named Long Island Bogtrotters. Led by the “legendary” Fat Burns, the museum notes, the group even staged a Yonkers Grand Prix with the machines, “[t]he first and last” of such an event. Still, like the ubiquitous packs of tourists traveling via Segway today, the majority of the machines were used for recreation. 
Stein features one gleeful picture of two women taking part in an impromptu Autoped race on the sand in Long Island that had been snapped for a 1916 Motorcycle Illustrated issue. California businesses, the historian noted, had purchased 50 machines by 1917 so they could be “rented out at the beach resorts next season.”

But just as dockless scooters today struggle to recuperate costs—billions have been invested in the eco-friendly startups, yet a profitable business model remains, to put it diplomatically, a work in progress—the Autoped’s lifespan was ultimately cut short by its bottom line. Erwin Tragatsch, author of The Illustrated Encyclopedia of Motorcycles, tells Stein that “like all other scooters of that period, the Autoped was not a commercial success.” Experts he spoke to suggested the problem may have had to do with demand for the device, which was more expensive than a bicycle but didn’t offer the seated comfort of a motorcycle. The Autoped was, perhaps, just a little ahead of its time with what it was offering.

After the Great Depression hit, the Cushman company, which got its start making engines in the early 1900s, picked up where its predecessor left off, finding new utility in the ride among those pinching pennies. Stuck with a surplus of Husky engines as the Depression lingered, the company got creative. In 1936, it debuted the Cushman Auto-Glide. “A byproduct of the 1929 catastrophe, the scooter was lauded for being thrifty,” Cycle World magazine later wrote, citing its price point and modest fuel needs. One brochure went as far as to claim driving an Auto-Glide was “NO COST AT ALL,” adding, “Why, it’s actually cheaper than walking.”

Ultimately, the Auto-Glide and its competitors were dogged by the same kinds of regulations that sent Peter Minton to traffic court in 1939. The years of “driving dangerously” of the early 1900s were changing as lawmakers attempted to get ahold of the early age of the automobile. 
“Little attention has yet been paid to the right of any man to drive a car,” the New York Times had bemoaned in 1907, suggesting that “Something akin to the French system, which is the ideal plan of licensing drivers, furnishing them with official cards with the penalty of revoking the license in addition to a jail sentence for a second or third serious offense,” was needed in the U.S. By the 1930s, the framework of such a system had arrived. “It says much that Cushman faced serious financial problems again when the U.S. government introduced more stringent traffic laws for young riders,” Josh Sims comments in Scootermania, which chronicles the evolution of the ride.

It’s easy to see how the times we’re living in now echo back to the first scooter boom. “Today’s startups are promoting their products by following the same playbook as cars: get them on the street, and figure out how to regulate them afterward. That strategy also propelled Uber and Lyft to multi-billion dollar valuations,” Michael J. Coren wrote for Quartz in 2018. But it remains unclear how the vehicles will fare as lawmakers once more play catch-up to regulate the rides this time around.

Jacqueline Mansky is a freelance writer and editor living in Los Angeles. She was previously the assistant web editor, humanities, for Smithsonian magazine.
https://www.smithsonianmag.com/history/mr-lincoln-goes-to-hollywood-82330187/?no-ist
The Assassination of Abraham Lincoln
In Lincoln, the Steven Spielberg movie opening this month, President Abraham Lincoln has a talk with U.S. Representative Thaddeus Stevens that should be studied in civics classes today. The scene goes down easy, thanks to the moviemakers’ art, but the point Lincoln makes is tough. Stevens, as Tommy Lee Jones plays him, is the meanest man in Congress, but also that body’s fiercest opponent of slavery. Because Lincoln’s primary purpose has been to hold the Union together, and he has been approaching abolition in a roundabout, politic way, Stevens by 1865 has come to regard him as “the capitulating compromiser, the dawdler.” The congressman wore with aplomb, and wears in the movie, a ridiculous black hairpiece—it’s round, so he doesn’t have to worry about which part goes in front. A contemporary said of Stevens and Lincoln that “no two men, perhaps, so entirely different in character, ever threw off more spontaneous jokes.” Stevens’ wit, however, was biting. “He could convulse the House,” wrote biographer Fawn M. Brodie, “by saying, ‘I yield to the gentleman for a few feeble remarks.’” Many of his declarations were too funky for the Congressional Globe (predecessor of the Congressional Record), which did, however, preserve this one: “There was a gentleman from the far West sitting next to me, but he went away and the seat seems just as clean as it was before.” Lincoln’s wit was indirect, friendly—Doris Kearns Goodwin quotes him as describing laughter as “the joyous, universal evergreen of life” in her book Team of Rivals: The Political Genius of Abraham Lincoln, on which the movie is partly based. But it was also purposeful. Stevens was a man of unmitigated principle. Lincoln got some great things done. What Lincoln, played most convincingly by Daniel Day-Lewis, says to Stevens in the movie, in effect, is this: A compass will point you true north. But it won’t show you the swamps between you and there. 
If you don’t avoid the swamps, what’s the use of knowing true north? That’s a key moment in the movie. It is also something that I wish more people would take to heart—people I talk with about politics, especially people I agree with. Today, as in 1865, people tend to be sure they are right, and maybe they are—Stevens was, courageously. What people don’t always want to take on board is that people who disagree with them may be just as resolutely sure they are right. That’s one reason the road to progress, or regression, in a democracy is seldom straight, entirely open or, strictly speaking, democratic. If Lincoln’s truth is marching on, it should inspire people to acknowledge that doing right is a tricky proposition. “I did not want to make a movie about a monument,” Spielberg told me. “I wanted the audience to get into the working process of the president.” Lincoln came out against slavery in a speech in 1854, but in that same speech he declared that denouncing slaveholders wouldn’t convert them. He compared them to drunkards, writes Goodwin: Though the cause be “naked truth itself, transformed to the heaviest lance, harder than steel” [Lincoln said], the sanctimonious reformer could no more pierce the heart of the drinker or the slaveowner than “penetrate the hard shell of a tortoise with a rye straw. Such is man, and so must he be understood by those who would lead him.” In order to “win a man to your cause,” Lincoln explained, you must first reach his heart, “the great high road to his reason.” As it happened, the fight for and against slave-owning would take the lowest of roads: four years of insanely wasteful war, which killed (by the most recent reliable estimate) some 750,000 people, almost 2.5 percent of the U.S. population at the time, or the equivalent of 7.5 million people today. But winning the war wasn’t enough to end slavery. 
Lincoln, the movie, shows how Lincoln went about avoiding swamps and reaching people’s hearts, or anyway their interests, so all the bloodshed would not be in vain. *** When Goodwin saw the movie, she says, “I felt like I was watching Lincoln!” She speaks with authority, because for eight years, “I awakened with Lincoln every morning and thought about him every night,” while working on Team of Rivals. “I still miss him,” she adds. “He’s the most interesting person I know.” Goodwin points to a whole 20-foot-long wall of books about Lincoln, in one of the four book-lined libraries in her home in Concord, Massachusetts, which she shares with husband Richard Goodwin, and his mementos from his days as speechwriter and adviser to Presidents Kennedy and Johnson—he wrote the “We Shall Overcome” speech that Johnson delivered on national television, in 1965, in heartfelt support of the Voting Rights Act. She worked with Johnson, too, and wrote a book about him. “Lincoln’s ethical and human side still outranks all the other presidents,” she says. “I had always thought of him as a statesman—but I came to realize he was our greatest politician.” The movie project began with Goodwin’s book, before she had written much of it. When she and Spielberg met, in 1999, he asked her what she was working on, and she said Lincoln. “At that moment,” says Spielberg, “I was impulsively seized with the chutzpah to ask her to let me reserve the motion-picture rights.” To which effrontery she responded, in so many words: Cool. Her original plan had been to write about Mary and Abe Lincoln, as she had about Franklin and Eleanor Roosevelt. “But I realized that he spent more time with members of his cabinet,” she says. 
And so Goodwin’s book became an infectiously loving portrait of Lincoln’s empathy, his magnanimity and his shrewdness, as shown in his bringing together a cabinet of political enemies, some more conservative than he, others more radical, and maneuvering them into doing what needed to be done. Prominent among those worthies was Secretary of the Treasury Salmon Chase. Goodwin notes that when that august-looking widower and his daughter Kate, the willowy belle of Washington society, “made an entrance, a hush invariably fell over the room, as if a king and his queen stood in the doorway.” And yet, wrote Navy Secretary Gideon Welles, Chase was “destitute of wit.” He could be funny inadvertently. Goodwin cites his confiding to a friend that he “was tormented by his own name. He fervently wished to change its ‘awkward, fishy’ sound to something more elegant. ‘How wd. this name do (Spencer de Cheyce or Spencer Payne Cheyce,)’ he inquired.” Not only was Chase fatuous, but like Stevens he regarded Lincoln as too conservative, too sympathetic to the South, too cautious about pressing abolition. But Chase was capable, so Lincoln gave him the dead-serious job of keeping the Union and its war effort financially afloat. Chase did so, earnestly and admirably. He also put his own picture on the upper left-hand corner of the first federally issued paper money. Chase was so sure he should have been president, he kept trying—even though Lincoln bypassed loyal supporters to appoint him chief justice of the United States—to undermine Lincoln politically so he could succeed him after one term. Lincoln was aware of Chase’s treachery, but he didn’t take it personally, because the country needed Chase where he was. Lincoln’s lack of self-importance extended even further with that pluperfect horse’s ass Gen. George B. McClellan. 
In 1861, McClellan was using his command of the Army of the Potomac to enhance his self-esteem (“You have no idea how the men brighten up now, when I go among them”) rather than to engage the enemy. In letters home he was mocking Lincoln as “the original gorilla.” Lincoln kept urging McClellan to fight. In reading Goodwin’s book, I tried to identify which of its many lively scenes would be in the movie. Of a night when Lincoln, Secretary of State William Seward and Lincoln’s secretary John Hay went to McClellan’s house, she writes: Told that the general was at a wedding, the three waited in the parlor for an hour. When McClellan arrived home, the porter told him the president was waiting, but McClellan passed by the parlor room and climbed the stairs to his private quarters. After another half hour, Lincoln again sent word that he was waiting, only to be informed that the general had gone to sleep. Young John Hay was enraged....To Hay’s surprise, Lincoln “seemed not to have noticed it specially, saying it was better at this time not to be making points of etiquette & personal dignity.” He would hold McClellan’s horse, he once said, if a victory could be achieved. Finally relieved of his command in November 1862, McClellan ran against Lincoln in the 1864 election, on a platform of ending the war on terms congenial to the Confederacy, and lost handily. It’s too bad Lincoln could not have snatched McClellan’s horse from under him, so to speak. But after the election, notes Tony Kushner, who wrote the screenplay, “Lincoln knew that unless slavery was gone, the war wasn’t really going to end.” So although the movie is based in part on Goodwin’s book, Kushner says, Lincoln didn’t begin to coalesce until Spielberg said, “Why don’t we make a movie about passing the 13th Amendment?” *** Kushner’s own most prominent work is the greatly acclaimed play Angels in America: angels, Mormons, Valium, Roy Cohn, people dying of AIDS. 
So it’s not as though he sticks to the tried and true. But he says his first reaction to Spielberg’s amendment notion was: This is the first serious movie about Lincoln in seventy-odd years! We can’t base it on that! In January 1865, Lincoln has just been re-elected and the war is nearly won. The Emancipation Proclamation, laid down by the president under what he claimed to be special wartime powers, abolishes slavery only within areas “in rebellion” against the Union and perhaps not permanently even there. So while Lincoln’s administration has got a harpoon into slavery, the monster could still, “with one ‘flop’ of his tail, send us all into eternity.” That turn of metaphor is quoted in Goodwin’s book. But the battle for the 13th Amendment, which outlawed slavery nationwide and permanently, is confined to 5 of her 754 pages. “I don’t like biopics that trot you through years and years of a very rich and complicated life,” Kushner says. “I had thought I would go from September 1863 to the assassination, focusing on the relationship of Lincoln and Salmon Chase. Three times I started, got to a hundred or so pages, and never got farther than January 1864. You could make a very long miniseries out of any week Lincoln occupied the White House.” He sent Goodwin draft after draft of the script, which at one point was up to 500 pages. “Tony originally had Kate in,” says Goodwin, “and if the film had been 25 hours long....” Then Spielberg brought up the 13th Amendment, which the Chases had nothing to do with. In the course of six years working on the script, Kushner did a great deal of original research, which kept spreading. 
For example: “I was looking for a play Lincoln might have seen in early March of ’65...[and] I found a Romeo and Juliet starring Avonia Jones, from Richmond, who was rumored to be a Confederate sympathizer—she left the country immediately after the war, went to England and became an acting teacher, and one of her pupils was Belle Boyd, a famous Confederate spy. And the guy who was supposed to be in Romeo and Juliet with her was replaced at the last moment by John Wilkes Booth—who was plotting then to kidnap Lincoln. I thought, ‘I’ve discovered another member of the conspiracy!’” Avonia didn’t fit in Lincoln, so she too had to go—but the Nashville lawyer W.N. Bilbo, another one of the obscure figures Kushner found, survived. And as played by James Spader, Bilbo, who appears nowhere in Team of Rivals, nearly steals the show as a political operative who helps round up votes for the amendment, offering jobs and flashing greenbacks to conceivably swayable Democrats and border-state Republicans. If another director went to a major studio with a drama of legislation, he’d be told to run it over to PBS. Even there, it might be greeted with tight smiles. But although “people accuse Steven of going for the lowest common denominator and that kind of thing,” says Kushner, “he is willing to take big chances.” And nobody has ever accused Spielberg of not knowing where the story is, or how to move it along. Spielberg had talked to Liam Neeson, who starred in his Schindler’s List, about playing Lincoln. Neeson had the height. “But this is Daniel’s role,” Spielberg says. “This is not one of my absent-father movies. But Lincoln could be in the same room with you, and he would go absent on you, he would not be there, he would be in process, working something out. I don’t know anybody who could have shown that except Daniel.” On the set everyone addressed Day-Lewis as “Mr. Lincoln” or “Mr. President.” “That was my idea,” Spielberg says. 
“I addressed all the actors pretty much by the roles they were playing. When actors stepped off the set they could be whoever they felt they needed to be, but physically on the set I wanted everybody to be in an authentic mood.” He never did that in any of his 49 other directorial efforts. (“I couldn’t address Daniel at all,” says Kushner. “I would send him texts. I called myself ‘Your metaphysical conundrum,’ because as the writer of the movie, I shouldn’t exist.”) Henry Fonda in Young Mister Lincoln (1939) might as well be a youngish Henry Fonda, or perhaps Mister Roberts, with nose enhancement. Walter Huston in Abraham Lincoln (1930) wears a startling amount of lipstick in the early scenes, and later when waxing either witty or profound he sounds a little like W.C. Fields. Day-Lewis is made to resemble Lincoln more than enough for a good poster shot, but the character’s consistency is beyond verisimilitude. Lincoln, 6-foot-4, was taller than everyone around him by a greater degree than is Day-Lewis, who is 6-foot-1 1/2. I can’t help thinking that Lincoln’s voice was even less mellow (it was described as high-pitched and thin, and his singing was more recitational than melodious) than the workable, vaguely accented tenor that Day-Lewis has devised. At first acquaintance Lincoln came off gawkier, goofier, uglier than Day-Lewis could very well emulate. If we could reconstitute Lincoln himself, like the T. Rex in Jurassic Park, his looks and carriage might put us off. Day-Lewis gives us a Lincoln with layers, angles, depths and sparks. He tosses in some authentic-looking flat-footed strides, and at one point stretches unpresidentially across the floor he’s lying on to stoke the fire. 
More crucially, he conveys Lincoln’s ability to lead not by logic or force but by such devices as timing (knowing when a time is ripe), amusement (he not only got away with laughing at his own stories, sometimes for reasons unclear, but also improved his hold on the audience thereby) and at least making people think he was getting into where they were coming from. We know that Lincoln was a great writer and highly quotable in conversation, but Lincoln captures him as a verbal tactician. Seward (ably played by David Strathairn) is outraged. He’s yelling at Lincoln for doing something he swore he wouldn’t, something Seward is convinced will be disastrous. Lincoln, unruffled, muses about looking into the seeds of time and seeing which grains will grow, and then says something else that I, and quite possibly Seward, didn’t catch, and then something about time being a great thickener of things. There’s a beat. Seward says he supposes. Another beat. Then he says he has no idea what Lincoln’s talking about. Here’s a more complicated and masterly example. The whole cabinet is yelling at Lincoln. The Confederacy is about to fall, he’s already proclaimed emancipation, why risk his popularity now by pushing for this amendment? Well, he says affably, he’s not so sure the Emancipation Proclamation will still be binding after the war. He doesn’t recall his attorney general at the time being too excited about it being legal, only that it wasn’t criminal. His tone becomes subtly more backwoodsy, and he makes a squeezy motion with his hands. Then his eyes light up as he recalls defending, back in Illinois, a Mrs. Goings, charged with murdering her violent husband in a heated moment. Melissa Goings is another figure who doesn’t appear in Team of Rivals, but her case is on the record. In 1857, the newly widowed 70-year-old stood accused of bludgeoning her 77-year-old husband with a piece of firewood. 
In the most common version of the story, Lincoln, sensing hostility in the judge but sympathy among the townspeople, called for a recess, during which his client disappeared. Back in court, the bailiff accused Lincoln of encouraging her to bolt, and he professed his innocence: “I did not run her off. She wanted to know where she could get a good drink of water, and I told her there was mighty good water in Tennessee.” She was never found, and her bail—$1,000—was forgiven. In the movie, the cabinet members start laughing as Lincoln reminisces, even though they may be trying to parse precisely what the story has to do with the 13th Amendment. Then he shifts into a crisp, logical explication of the proclamation’s insufficiency. In summary he strikes a personal note; he felt the war demanded it, therefore his oath demanded it, and he hoped it was legal. Shifting gears without a hitch, he tells them what he wants from them: to stand behind him. He gives them another laugh—he compares himself to the windy preacher who, once embarked on a sermon, is too lazy to stop—then he puts his foot down: He’s going to sign the 13th Amendment. His lips press so firmly together they tremble just slightly. Lincoln’s telling of the Goings case varies slightly from the historical record, but in fact there is an account of Lincoln departing from the record himself, in telling the story differently from the way he does in the movie. “The rule was,” says Kushner, “that we wouldn’t alter anything in a meaningful way from what happened.” Conversations are clearly invented, but I haven’t found anything in the movie that is contradicted by history, except that Grant looks too dressy at Appomattox. (Lee does, for a change, look authentically corpulent at that point in his life.) Lincoln provides no golden interracial glow. The n-word crops up often enough to help establish the crudeness, acceptedness and breadth of anti-black sentiment in those days. 
A couple of incidental pop-ups aside, there are three African-American characters, all of them based reliably on history. One is a White House servant and another one, in a nice twist involving Stevens, comes in almost at the end. The third is Elizabeth Keckley, Mary Lincoln’s dressmaker and confidante. Before the amendment comes to a vote, after much lobbying and palm-greasing, there’s an astringent little scene in which she asks Lincoln whether he will accept her people as equals. He doesn’t know her, or her people, he replies. But since they are presumably “bare, forked animals” like everyone else, he says, he will get used to them. Lincoln was certainly acquainted with Keckley (and presumably with King Lear, whence “bare, forked animals” comes), but in the context of the times, he may have thought of black people as unknowable. At any rate the climate of opinion in 1865, even among progressive people in the North, was not such as to make racial equality an easy sell. In fact, if the public got the notion the 13th Amendment was a step toward establishing black people as social equals, or even toward giving them the vote, the measure would have been doomed. That’s where Lincoln’s scene with Thaddeus Stevens comes in. *** Stevens is the only white character in the movie who expressly holds it self-evident that every man is created equal. In debate, he vituperates with relish—You fatuous nincompoop, you unnatural noise!—at foes of the amendment. But one of those, Rep. Fernando Wood of New York, thinks he has outslicked Stevens. He has pressed him to state whether he believes the amendment’s true purpose is to establish black people as just as good as whites in all respects. You can see Stevens itching to say, “Why yes, of course,” and then to snicker at the anti-amendment forces’ unrighteous outrage. But that would be playing into their hands; borderline yea-votes would be scared off. 
Instead he says, well, the purpose of the amendment— And looks up into the gallery, where Mrs. Lincoln sits with Mrs. Keckley. The first lady has become a fan of the amendment, but not of literal equality, nor certainly of Stevens, whom she sees as a demented radical. The purpose of the amendment, he says again, is—equality before the law. And nowhere else. Mary is delighted; Keckley stiffens and goes outside. (She may be Mary’s confidante, but that doesn’t mean Mary is hers.) Stevens looks up and sees Mary alone. Mary smiles down at him. He smiles back, thinly. No “joyous, universal evergreen” in that exchange, but it will have to do. Stevens has evidently taken Lincoln’s point about avoiding swamps. His radical allies are appalled. One asks whether he’s lost his soul; Stevens replies, mildly, that he just wants the amendment to pass. And to the accusation that there’s nothing he won’t say toward that end, he says: Seems not. Later, after the amendment passes, Stevens pays semi-sardonic tribute to Lincoln, along the lines of something the congressman actually once said: that the greatest measure of the century “was passed by corruption, aided and abetted by the purest man in America.” That is the kind of purity we “bare, forked animals” can demand of political leaders today, assuming they’re good enough at it. Of course, Lincoln got shot for it (I won’t spoil for you the movie’s masterstroke, its handling of the assassination), and with that erasure of Lincoln’s genuine adherence to “malice toward none,” Stevens and the other radical Republicans helped make Reconstruction as humiliating as possible for the white South. 
For instance, Kushner notes, a true-north Congress declined to give Southern burial societies any assistance in finding or identifying remains of the Confederate dead, thereby contributing to a swamp in which equality even before the law bogged down for a century, until nonviolent tricksters worthy of Lincoln provoked President Johnson, nearly as good a politician as Lincoln, to push through the civil rights acts of the 1960s. How about the present? Goodwin points out that the 13th Amendment was passed during a post-election rump session of Congress, when a number of representatives, knowing they weren’t coming back anyway, could be prevailed upon to vote their consciences. “We have a rump session coming up now,” she observes.
https://www.smithsonianmag.com/history/murder-wasnt-very-pretty-the-rise-and-fall-of-dc-stephenson-18935042/?no-ist=
“Murder Wasn’t Very Pretty”: The Rise and Fall of D.C. Stephenson
On March 16, 1925, in the muted morning light of a hotel room in Hammond, Indiana, 29-year-old Madge Oberholtzer reached into the pocket of the man sleeping next to her. She found the grip of his revolver and slid it out, inch by inch, praying he wouldn’t stir. The man was D.C. Stephenson, political power broker and Grand Dragon of the Ku Klux Klan in 23 Northern states. With shaking hands she aimed the gun between his closed eyes. What passed for a lucid thought came to mind: She would disgrace her family if she were to commit murder; instead, she would kill herself. She crept into an adjoining room and faced a full-length mirror. Beneath her dress chunks of her were missing. Bite marks covered her face, neck, breasts, back, legs and ankles, a macabre pattern of polka dots etched along her skin. She was bleeding from the mouth; he had even chewed her tongue. Her hand was steadier this time, lifting the gun to her temple, when she heard a step outside the door and the squeak of a turning knob. It was one of Stephenson’s associates. She buried the gun into the fold of her dress and slipped it back into the sleeping man’s pocket. She would find another way to kill herself, if he didn’t kill her first. It was the beginning of the end, in different ways, for both Madge Oberholtzer and D.C. Stephenson, although the politician had long believed himself infallible. “I am the law in Indiana,” he famously declared, and with reason. At age 33, Stephenson was one of the most powerful men in the state, having controlled the governor’s election and the movements of several state legislators, influencing bills on nutrition, stream pollution, fire insurance, highways and even oleomargarine, all of which would line his pockets with graft. His hand-picked candidate for mayor of Indianapolis seemed certain to win election, and Stephenson himself dreamed of running for the U.S. Senate, even president. 
Stephenson’s political success was directly tied to his leadership within the Klan, which by 1925 had a quarter-million members in Indiana alone, accounting for more than 30 percent of the state’s white male population. At the height of its popularity, the Klan was a mainstream organization whose roster included lawyers, doctors, college professors, ministers and politicians at every level, most of them middle- and upper-middle-class white Protestants who performed community service and supported Prohibition. The Klan exploited nativist fears of foreign ethnic groups and religions, Catholicism in particular. (Prejudice against African-Americans was not as much of a motivating factor to join the Klan in Indiana as it was in the South.) “Out in Indiana everybody seems to belong,” reported the New York Times in 1923. “Easterners have been surprised at the ready conquest by the Klan of a state which seemed of all our forty-eight the least imperiled by any kind of menace.” The rise of Davis Curtis Stephenson seemed equally perplexing, especially since no one—not even those who professed to be his closest friends—knew much about him. “I’m a nobody from nowhere, really—but I’ve got the biggest brains,” he boasted. “I’m going to be the biggest man in the United States!” Stephenson told them his father was a wealthy businessman from South Bend who had sent him to college, but he quit to work in the coal business in Evansville, in the southwest tip of the state. When America entered World War I in 1917, Stephenson said, he volunteered for the Army and was decorated for fighting the Germans in France. Upon his return, he learned that he was a millionaire; stocks he had purchased before the war had skyrocketed in value. He did well wholesaling coal and running an automobile-accessory business, and joined the Klan in 1921. Knights in Atlanta were impressed with his leadership ability and appointed him to head the organization in the Hoosier State. 
In reality, Stephenson was born in 1891 in Houston, Texas, the son of a sharecropper. The family moved to Maysville, Oklahoma, where he attended school in a Methodist Church. He was an avid reader, especially interested in politics and history, and graduated from the eighth grade at age 16. That was the end of his formal education. He got a job with a Socialist newspaper and studied the party’s leaders, particularly Oscar Ameringer, who would go on to advocate for African-American enfranchisement and help elect an anti-Klan governor. Stephenson admired Ameringer’s style, the way he sold his politics as if he were a vaudeville pitchman, and he would later implement the Socialist’s techniques at rallies for the Klan. In 1915, the blond, blue-eyed Stephenson courted a local girl named Nettie Hamilton, placing her picture in the newspaper under the headline: “THE MOST BEAUTIFUL GIRL IN OKLAHOMA.” They married and moved to Madill, where he worked at the local newspaper. But Stephenson, in what would become a pattern, got into a fight with his publisher after a bout of drinking and lost his job. He abandoned his pregnant wife and drifted to Cushing. In 1917, Nettie tracked him down and filed for a divorce, after which Stephenson volunteered for the Army. Instead of fighting bravely on the battlefields of Europe, as he liked to boast, he was sent to Boone, Iowa, to work as a recruiter. After the war he took a job as a traveling salesman, and in Akron, Ohio, met his next wife, Violet Carroll. The couple moved to Evansville, Indiana, where Stephenson worked as a stock salesman for the Citizens Coal Company, and where a newly revitalized Ku Klux Klan was taking root. Despite his intensely private nature—“It’s no one’s business where I was born or who my folks were,” he once snapped—Stephenson made friends easily, developing a gregarious, slap-shoulder bonhomie, careful to never patronize or condescend. Despite his limited education, his speech was fluent and polished. 
When a local Klan organizer asked him to get involved, Stephenson initially demurred. “They kept after me,” he told the New York World, “and explained to me that the Klan was not an organization which took Negroes out, cut off their noses, and threw them into the fire.… I was told that the Klan was a strictly patriotic organization.… They finally convinced me the Klan was a good thing and I joined.” As Stephenson’s career took off, his marriage began to founder. He drank heavily and succumbed to wild rages, once blackening his wife’s eye and another time scratching her face and kicking her. After their divorce in 1922, Stephenson began dating his 22-year-old secretary, frequently bringing her on work trips to Ohio, where he was establishing new offices for the Klan. During one such excursion the couple was parked in Stephenson’s Cadillac, lights off, on a country road in the outskirts of Columbus. Deputy Sheriff Charles M. Hoff stopped to investigate. “What are you doing there with your pants unbuttoned?” he asked. Stephenson grabbed the girl’s left hand and thrust it toward the window. “My God, would you insult this girl?” he said. “Did you see that ring, that diamond ring? I am going to marry this girl; we are engaged.” He added that he was “an official” and “couldn’t afford to have all this notoriety and publicity.” He pleaded guilty to a parking citation and indecent exposure. Stephenson soon had another brush with notoriety. Joseph Cleary, a security officer for the Deshler Hotel in Columbus, was called to check on a report of a disturbance in Stephenson’s room on the upper floor. Cleary found a shattered mirror, smashed chairs, empty bottles of booze strewn about the floor. The hotel’s manicurist reported that when she arrived for Stephenson’s appointment, “there were three full quarts of whiskey and when I told him that I didn’t want any, he came over and grabbed me. 
He said that he would give me a hundred dollars if I would allow him to have intercourse with me. Of course, he was more rude than I care to be in expressing it… I told him that I was not in the habit of being insulted by anyone like that, and he said… ‘You will or I’ll kill you.’” She fled and ran into two of his associates outside, who tried to console her. “Don’t pay any attention to him,” one said. “He is a good fellow; he is drunk; he is all right when he is sober. You go downstairs and don’t bother about it.” Stephenson met Madge Oberholtzer on January 12, 1925, at the inauguration gala for Governor Ed Jackson, who, with Stephenson’s help, had earned a reputation as the candidate most loathed by the “papists.” She was there at the invitation of a member of the inaugural committee, and busied herself making name tags and running errands. During dinner she sat across from Stephenson, who inquired about her background with flattering persistence. She grew up in Indianapolis, where her father worked as a postal clerk and her family belonged to the Irvington Methodist Church. She was, a friend would later say, “an independent soul, yet timid. I don’t think anybody disliked Madge, but she didn’t make a great effort to make people like her, either.” She studied English, mathematics, zoology and logic at Butler College in Irvington, but dropped out, without explanation, at the end of her junior year. Currently she was the manager of the Indiana Young People’s Reading Circle, a special section of the Indiana Department of Public Instruction. She’d heard the rumors, though, that the Reading Circle program—and her job—were about to be eliminated due to budget cuts. She was 28 years old and still living with her parents. Stephenson asked her to dance. The two began seeing each other frequently. 
She acted as his aide during the 1925 session of the General Assembly, carrying messages from his office down to his friends, and helped him write a nutrition book, One Hundred Years of Health. Using her Reading Circle connections, she planned to help sell the books to schools throughout the state. Around 10 p.m. on March 15, 1925, Oberholtzer returned home from an evening with a friend. Her mother told her that Stephenson’s secretary had called and said he was leaving for Chicago and needed to see her at once. Oberholtzer changed into a black velvet dress and was met at her front door by one of Stephenson’s bodyguards. Eight hours later, her mother was on the phone with lawyer Asa J. Smith, frantic that Madge had never come home. Two days later, while her parents were conferring with Smith at his office, a car pulled up outside the Oberholtzer home. Eunice Schultz, a boarder, heard someone groaning and saw Oberholtzer being carried upstairs by a large man, who said the girl had been hurt in a car accident. Schultz called the family doctor, John Kingsbury, who hurried to Oberholtzer’s bedside. “She was in a state of shock,” Kingsbury later recalled. “Her body was cold.” She told him that she didn’t expect, or want, to get well—that she wanted to die. He pressed her until she told him the whole story. When she’d arrived at Stephenson’s, she said, she realized that he was drunker than she’d ever seen him. He forced her to start drinking and ordered her to accompany him to Chicago. Someone shoved her into a car, drove her to Union Station, and dragged her onto a train, where she was pushed into a lower berth in a private compartment with Stephenson. She was “bitten, chewed and pummeled,” she said. They never reached Chicago, stopping at Hammond, Indiana, where they checked into a hotel. She was lowered onto a bed next to Stephenson, who soon fell asleep. Later that morning, she asked him for money to buy a hat and some makeup. 
Instead, she went to a drugstore and bought a box of mercury bichloride tablets. Back at the hotel, she intended to take the entire box but could choke down only three. When Stephenson discovered what she had done, he panicked and ordered his driver to take them back to Indianapolis. He forced her to drink ginger ale and milk, which she vomited all over the inside of the car. He worried she might die in the back seat. All the while she cried and screamed and begged to be thrown from the car and left on the side of the road. “You will stay right here until you marry me,” she recalled him saying. “You must forget this, what is done has been done, I am the law and the power.” She died on April 14, nearly a month later, with her parents and nurse by her bedside. The official cause was mercury poisoning. Marion County prosecutor William Remy—one of the few officials Stephenson could not control—had him charged with rape, kidnapping, conspiracy and second-degree murder. His former political cronies, including Governor Jackson, swiftly abandoned him, and the Indiana Kourier called him an “enemy of the order.” Stephenson’s lawyers argued that Klan forces loyal to a political rival had set him up and questioned whether he could be held responsible for what was ultimately a suicide. “If this so-called dying declaration declares anything, it is a dying declaration of suicide, not homicide,” defense attorney Ephraim Inman said. “… Has everybody lost his head? Pray, are we all insane?” The citizens of Indiana also expressed some skepticism about Oberholtzer’s deathbed statement. “That was a gruesome trial,” one woman recalled. “This girl might have been a party girl, I supposed she was or she wouldn’t have been on that train, but even back in those days you know, murder wasn’t very pretty.” On November 14, 1925, Stephenson was convicted and sentenced to life imprisonment. 
By 1928, the Indiana Klan, once the strongest in the Invisible Empire, had collapsed, with membership totaling only 4,000, down from a high of half a million. Stephenson was paroled in 1950 on the condition that he take a job in Illinois and settle in that state. Instead he went to Minnesota, where he was arrested and sent back to prison in Indiana. Six years later he was discharged by Governor George Craig, who reasoned, “I don’t see why Stephenson won’t be able to cope with life. He’s mentally all right.” Stephenson moved to Seymour, Indiana, where he married his third wife, Martha Dickinson. They separated in 1962, after Stephenson was arrested and accused of trying to force a 16-year-old girl into his car. The judge issued a $300 fine, which Stephenson paid out of pocket. Next he wandered to Jonesboro, Tennessee, where he met a widowed Sunday school teacher named Martha Murray Sutton. She was 55; he was 74. They wed, although he had never officially divorced the previous Martha. He suffered a heart attack on June 28, 1966, while bringing her a basket of fruit. She held him as he died. “I knew nothing of his background,” his widow said. “Except that I loved him very much and we were married. He was a very wonderful person.” Sources: Books: M. William Lutholtz, Grand Dragon: D.C. Stephenson and the Ku Klux Klan in Indiana. West Lafayette, IN: Purdue University Press, 1991; Richard K. Tucker, The Dragon and the Cross: The Rise and Fall of the Ku Klux Klan in Middle America. Hamden, CT: The Shoe String Press, 1991; David H. Bennett, The Party of Fear: From Nativist Movements to the New Right in American History. Chapel Hill: University of North Carolina Press, 1988. 
Articles: “Stephenson Fights Murder Testimony.” New York Times, November 6, 1925; “Indiana Swayed Entirely By Klan.” New York Times, November 7, 1923; “Holds Ex-Klansman on Assault Charge.” New York Times, April 4, 1925; “Stephenson Held for Death of Girl.” New York Times, April 21, 1925; “Finds Ex-Klan Head Murdered Woman.” New York Times, November 15, 1925. Karen Abbott is a contributing writer for history for Smithsonian.com and the author of the books Sin in the Second City and American Rose. Her forthcoming book, Liar, Temptress, Soldier, Spy, will be published by HarperCollins in September.
https://www.smithsonianmag.com/history/museum-director-defied-nazis-180974905/
The Museum Director Who Defied the Nazis
When Nazi tanks rolled into Paris in the early morning of June 14, 1940, most Parisians had already left the city in a mass exodus to the south. All the museums were closed except the Musée de l’Homme, or Museum of Mankind, which tacked a freshly placed French translation of Rudyard Kipling’s poem “If” to its doors: If you can keep your head when all about you are losing theirs...you’ll be a Man, my son! It was a defiant gesture, a dangerous message and even a sly call to arms: Unbeknown to the invading army, the man behind the sign, the museum’s director, would become a moving force in the nation’s secret counteroffensive network. With his bald pate, round eyeglasses and winged collar, Paul Rivet, an anthropologist then in his 60s, may seem an unlikely hero. Yet a recent wave of scholarship has revealed the true extent of his bravery and ingenuity in helping undermine not only the Nazis but also their French collaborators. This work, significantly, grew out of his long academic career, in which he boldly criticized racist ideas promoted by many anthropologists (and adopted by the Nazis). But by the summer of 1940, his fight was no longer an intellectual exercise. Rivet and his band of museum protégés—young scholars who didn’t hesitate to take up arms and risk their lives—went on to organize one of the earliest groups in the French underground. It was this group, in fact, that helped to give the movement a name: the Resistance. The story of the Musée de l’Homme group would end tragically, in betrayal, but historians agree that it showed the French people, many of whom were at first resigned to the occupation, that it was possible to oppose the Nazis—in spirit but also in action, by stealing their military plans, helping their prisoners escape and generally frustrating them. 
The Musée de l’Homme group “fed and watered the Resistance to come,” Julien Blanc, a historian, wrote in the first detailed study of that group, published in French in 2010. A physician by training, Rivet became interested in anthropology in 1901, when he joined a five-year scientific expedition to Ecuador to measure the curvature of the Earth. While acting as the group’s official doctor, Rivet became intrigued by the linguistic and cultural diversity of the Amerindian peoples he encountered and began to study them. Anthropology at that time divided humanity into “races,” largely on the basis of measuring skeletons—particularly skulls. Like most of his peers, Rivet accepted that races existed and that they were biologically distinguishable from each other, but he strongly rejected the concept of a racial hierarchy, in which some races were regarded as superior to others. He believed that people of different races were also products of long adaptations to their unique physical, social and cultural environments. After he returned to Paris from Ecuador, Rivet and like-minded colleagues reoriented French anthropology along those lines, to consider races as different but equal. During the First World War, Rivet served as a medical officer at the First Battle of the Marne in 1914 and later in Serbia, and received medals for bravery, including the Croix de Guerre, for his unit’s role in setting up medical services behind the front lines. A decade later, he took over the old Museum of Ethnography on the Chaillot Hill, with its panoramic view of the Seine and the Eiffel Tower on the opposite bank, and set about modernizing it. As German anthropology embraced a notion of an Aryan super race, and anti-Semitic elements in French academic circles followed suit, Rivet co-founded an antiracist journal, Races et Racisme, in 1937. 
The building that housed the old Museum of Ethnography was razed, a new building went up on the same site, and Rivet moved his renamed and modernized Musée de l’Homme into it. Here, a visitor still encountered the skulls and skeletons of different races for comparison, but now she also strolled through galleries organized by region, in which each region’s indigenous population was presented with its tools, art and symbols. Overall, the emphasis was on the similarities between peoples, rather than the differences. As Hitler’s menace loomed over Europe, Rivet inaugurated the new Musée de l’Homme before an audience of France’s artistic, intellectual and political elite. By way of explaining the museum’s name, Rivet would later say: “Humanity is one indivisible whole throughout space and time.” * * * His message was as political as it was scientific. Unlike many of his contemporaries—indeed, unlike many scientists today—Rivet had always considered politics and science to be inseparable, having seen how “scientific” notions of supremacy could lead to violent injustice. He had been an outspoken critic of the anti-Semitism that led to the conviction of French artillery officer Alfred Dreyfus for treason in 1894, and was a prominent member of France’s antifascist movement in the 1930s. Visiting Berlin in 1933, he was shocked to find how deeply Nazi ideology had penetrated German society. In a letter to a close colleague, the German-American anthropologist Franz Boas, who had performed curatorial work for the Smithsonian Institution, he wrote that “a real regime of terror is reigning in Germany and this regime seems to cause no reaction at all” among many Germans. Rivet started recruiting exiled German Jews and Eastern Europeans to give them a haven in which to work. According to his biographer, Christine Laurière, he also wanted to increase the representation of Eastern European cultures at the museum, viewing them as a bulwark against fascism. 
Among his recruits were 26-year-old Boris Vildé, a Russian-born linguist and ethnologist who specialized in the Finno-Ugric peoples of northeastern Europe, and 30-year-old Anatole Lewitsky, a tall, aristocratic-born Russian who had studied shamanism in Siberia, and whom Rivet discovered driving a Parisian taxi. They haunted the museum’s basement, which housed the scientific research departments, addressing Rivet as cher Docteur. When war broke out, Vildé and Lewitsky—by now naturalized French citizens—were called up for military service. Injured and captured by the Germans, Vildé was interned in a prison camp in the Jura Mountains, between France and Switzerland, from which he managed to escape. Rivet later recalled his reappearance at the museum, on July 5, 1940: “We were sharing a frugal meal, right here, when our friend appeared, leaning on a cane, thin, exhausted. Without a word he sat down among us; he had returned to the bosom of his spiritual family.” Lewitsky also returned that summer, having escaped German captivity. Vildé, Lewitsky and Yvonne Oddon, the museum librarian and Lewitsky’s lover, now launched a campaign of organized disobedience against the invaders—and against the collaborationist French government. With Vildé leading the cell, they recruited friends and colleagues across Paris, and within a few months “the little group had transformed itself into a veritable spider’s web covering the whole of France,” writes Tatiana Benfoughal, Vildé’s biographer. Rivet, too well-known to take an active role, facilitated everything they did: He put them in touch with Parisian intellectuals who he believed would be sympathetic to their cause, he translated speeches by Churchill and Roosevelt for them, and above all he provided them with a base and logistical support in the form of the museum, which he vowed at the outbreak of war to keep open. 
Vildé, under cover of his work for the museum, traveled through the occupied and free zones of France, recruiting dissidents, gathering military intelligence and organizing escape routes for Allied prisoners of war held in Nazi-run camps—by boat from the fishing ports of Brittany, for example. At one point he claimed he could draw on a 12,000-strong force and an impressive stockpile of arms. It was undoubtedly an exaggeration, but he understood the power of words as well as Joseph Goebbels, Hitler’s propaganda minister. With Rivet’s blessing, the group used a mimeograph machine in the basement to produce a clandestine newspaper, Résistance. Oddon proposed the name, recalling that in the 18th century, Huguenot women imprisoned for their Protestantism carved RESISTER into the stones of their prison. An editorial from the first issue, dated December 15, 1940, proclaimed: “Resistance! That’s the cry that goes up from your hearts, in your distress at the disaster that has befallen our nation.” Copies were distributed around the city. It was dangerous work—if caught by the Gestapo, the résistants risked being tortured and executed—so they inhabited a secretive, nocturnal world of code names and passwords. When one of them wanted to speak to Oddon about resistance matters, they would appear in the library and announce: “I have come for my English lesson.” Rivet carried on the fight in public, giving hugely popular, standing-room-only lectures on the folly of scientific racism. In July 1940 he wrote the first of three open letters to France’s collaborationist leader, Marshal Philippe Pétain, in which he warned, “Marshal, the country is not with you.” In November 1940, he learned from the radio that the Vichy government had stripped him of his museum post; three months later, tipped off that the Gestapo was coming for him, he fled to Colombia. 
Just hours later, the Gestapo searched the museum in vain for plans of the German U-boat base at Saint-Nazaire on the Brittany coast—plans that Vildé’s people had stolen. The plans reached the British, and their forces bombed the base in 1942. In Bogotá, Rivet headed up the local committee of Gen. Charles de Gaulle’s government in exile, providing intelligence, contacts and logistical support to comrades in the Resistance back home. The Gestapo arrested Vildé on March 26, 1941, after he was betrayed by two other Russian émigrés working at the museum, whom Rivet had recruited, and by a French double agent. Either the Gestapo or the Abwehr—a German intelligence organization—rounded up his fellow résistants around the same time. After they’d spent almost a year in prison, a German military tribunal found them guilty. Despite petitions from influential French figures including the poet Paul Valéry and the exiled Rivet, a firing squad executed Vildé, Lewitsky and five others at Fort Mont-Valérien, a fortress outside Paris, in February 1942. The tribunal commuted Oddon’s sentence, and she spent time in various prisons before being deported to the Ravensbrück concentration camp in Germany in late 1944. Laurière, Rivet’s biographer, has unearthed just one letter that Rivet wrote to a friend at the time. It acknowledged the fate of Vildé and Lewitsky: “Those two corpses haunt me like an obsession.” Another young ethnologist, Germaine Tillion, took over Vildé’s role as the head of the museum’s resistance cell. She too was betrayed and deported to Ravensbrück. Both Tillion and Oddon survived the camp, and Tillion would publish a groundbreaking ethnographic study based on her captivity, Ravensbrück. 
Rivet returned from exile in October 1944 following the liberation of Paris; de Gaulle awarded him the Resistance medal in recognition of “the remarkable acts of faith and of courage that, in France, in the empire and abroad, have contributed to the resistance of the French people against the enemy and against its accomplices.” Oddon, Tillion, Vildé and Lewitsky were awarded the same medal—the last two posthumously. Rivet resumed his old post at the museum. Today, the lobby at the Musée de l’Homme hosts a small permanent exhibition dedicated to Rivet, Vildé, Tillion and their band. If you climb the stairs and turn right, you look out through a large window onto the Eiffel Tower, from which a swastika once flew. Turn left, and you arrive at the research library named for Yvonne Oddon, where the résistants came for their English lessons. This article is a selection from the June 2020 issue of Smithsonian magazine Laura Spinney is a journalist and novelist based in France. She is the author of Pale Rider: The Spanish Flu of 1918 and How It Changed the World.
https://www.smithsonianmag.com/history/muslims-were-banned-americas-early-16th-century-180962059/
Muslims Were Banned From the Americas as Early as the 16th Century
On Christmas Day, 1522, 20 enslaved Muslim Africans used machetes to attack their Christian masters on the island of Hispaniola, then governed by the son of Christopher Columbus. The assailants, condemned to the grinding toil of a Caribbean sugar plantation, killed several Spaniards and freed a dozen enslaved Native Americans in what was the first recorded slave revolt in the New World. The uprising was quickly suppressed, but it prompted the newly crowned Charles V of Spain to exclude from the Americas “slaves suspected of Islamic leanings.” He blamed the revolt on their radical ideology rather than the harsh realities of living a life of slavery. By the time of the Hispaniola revolt, Spanish authorities had already forbidden travel by any infidel, whether Muslim, Jewish, or Protestant, to Spain’s New World colonies, which at the time included the land that is now the United States. They subjected any potential emigrant with a suspicious background to intense vetting. A person had to prove not just that they were Christian, but that there was no Muslim or Jewish blood among their ancestors. Exceptions were granted solely by the king. Catholic Europe was locked in a fierce struggle with the Ottoman Empire, and Muslims were uniformly labeled as possible security risks. After the uprising, the ban applied even to those enslaved in the New World, writes historian Sylviane Diouf in a study of the African diaspora. “The decree had little effect,” adds historian Toby Green in Inquisition: The Reign of Fear. Bribes and forged papers could get Jews to the New World with its greater opportunities. Slave traders largely ignored the order because West African Muslims often were more literate and skilled in trades, and therefore more valuable, than their non-Muslim counterparts. 
Ottoman and North African captives from the Mediterranean region, usually called Turks and Moors, respectively, were needed to row Caribbean galleys or perform menial duties for their Spanish overlords in towns and on plantations. In the strategic port of Cartagena, in what is now Colombia, an estimated half of the city’s slave population were transported there illegally and many were Muslim. In 1586, the English privateer Sir Francis Drake besieged and captured the town, instructing his men to treat Frenchmen, Turks, and black Africans with respect. A Spanish source tells us “especially Moors deserted to the Englishman, as did the blacks of the city.” Presumably they were promised their freedom, although Drake was a notorious slave trader. A Spanish prisoner later related that 300 Indians—mostly women—as well as 200 Africans, Turks, and Moors who were servants or slaves boarded the English fleet. En route to the English colony on Roanoke Island, Drake and his fleet raided the small Spanish settlement of St. Augustine, on Florida’s Atlantic Coast, and stripped it of its doors, locks and other valuable hardware. With the pirated slaves and stolen goods aboard, Drake intended to bolster Roanoke, situated on the Outer Banks of North Carolina and the first English effort at settling the New World. “All the Negroes, male and female, the enemy had with him, and certain other equipment which had taken…were to be left at the fort and settlement which they say exists on the coast,” a Spanish report states. Drake sought to help his friend, Sir Walter Raleigh, who had settled Roanoke the year prior with more than 100 men and the goal of establishing a base for privateering and extracting the wealth that made Spain the richest and most powerful nation on Earth. Among them was a German metallurgist named Joachim Gans, the first Jewish-born person known to have set foot on American soil. 
Jews were forbidden to live in or even visit England then—the ban lasted from 1290 to 1657—but Raleigh needed scientific expertise that could not be found among the Englishmen of his day. He won for Gans today’s equivalent of an H-1B visa so that the accomplished scientist could travel to Roanoke and report on any valuable metals found there. Gans built a workshop there and conducted extensive experiments. Shortly after Drake’s fleet arrived off the Carolina coast, a fierce hurricane pummeled the island and scattered the ships. The English colonists abruptly chose to abandon their battered fort and return home with the fleet. Had the weather been more fortunate, the fragile settlement on Roanoke might have emerged as a remarkably mixed community of Christian, Jewish and Muslim Europeans and Africans, as well as Indians from both South and North America. The Drake fleet returned safely to England, and Elizabeth I returned 100 Ottoman slaves to Istanbul in a bid to win favor with the anti-Spanish sultan. The fate of the Moors, Africans and the Indians, however, remains an enduring mystery. There is no record of them reaching England. “Drake thought he was going to find a flourishing colony on Roanoke, so he brought a labor supply,” says New York University historian Karen Kupperman. She and other historians believe that many of the men and women captured in Cartagena were put ashore after the storm. Drake was always eager to make a profit from human or material cargo, and not inclined to liberate a valuable commodity, but there was little market in England for enslaved persons. To make room for the Roanoke colonists, he may well have dumped the remaining men and women on the Carolina coast and sailed away. Some of the refugees may have drowned in the hurricane. Less than a year later, a second wave of English settlers sailed to Roanoke—the famous Lost Colonists—but they made no mention of meeting hundreds of refugees. 
The Cartagena captives might have scattered among the local Native American population to avoid detection by the slave raiders who prowled the North American coast in the 16th century. The new colonists were themselves abandoned in the New World and never heard from again—including Virginia Dare, the first English child born in America. The Jamestown settlement that followed adopted a policy similar to that of the Spanish with regard to Muslims. Christian baptism was a requirement for entering the country, even for enslaved Africans, who first arrived in Virginia in 1619. In 1682, the Virginia colony went a step further, ordering that all “Negroes, Moors, mulattoes or Indians who and whose parentage and native countries are not Christian” automatically be deemed slaves. Of course, suppressing “Islamic leanings” did little to halt slave insurrections in either Spanish or British America. Escaped slaves in Panama in the 16th century founded their own communities and fought a long guerrilla war against Spain. The Haitian slave revolt at the turn of the 19th century was instigated by and for Christianized Africans, although whites depicted those seeking their freedom as irreligious savages. Nat Turner’s rebellion in Virginia in 1831 stemmed in part from his visions of Christ granting him authority to battle evil. The real threat to peace and security, of course, was the system of slavery itself and a Christianity that countenanced it. The problem wasn’t the faith of the immigrants, but the injustice that they encountered on their arrival in a new land. Andrew Lawler is author of The Secret Token: Myth, Obsession, and the Search for the Lost Colony of Roanoke. He is also a contributing writer for Science magazine and has written for The New York Times, The Washington Post, Smithsonian, National Geographic, and other publications. Website: andrewlawler.com
https://www.smithsonianmag.com/history/mystery-roanoke-endures-yet-another-cruel-twist-180962837/
The Mystery of Roanoke Endures Yet Another Cruel Twist
It seemed too good to be true. And it was. Nearly 20 years ago, excavators digging on North Carolina’s remote Hatteras Island uncovered a worn ring emblazoned with a prancing lion. A local jeweler declared it gold—but it came to be seen as more than mere buried treasure when a British heraldry expert linked it to the Kendall family involved in the 1580s Roanoke voyages organized by Sir Walter Raleigh during Elizabeth I’s reign. The 1998 discovery electrified archaeologists and historians. The artifact seemed a rare remnant of the first English attempt to settle the New World that might also shed light on what happened to 115 men, women, and children who settled the coast, only to vanish in what became known as the Lost Colony of Roanoke. Now it turns out that researchers had it wrong from the start. A team led by archaeologist Charles Ewen recently subjected the ring to a lab test at East Carolina University. The X-ray fluorescence device, shaped like a cross between a ray gun and a hair dryer, reveals an object’s precise elemental composition without destroying any part of it. Ewen was stunned when he saw the results. “It’s all brass,” he said. “There’s no gold at all.” North Carolina state conservator Erik Farrell, who conducted the analysis at an ECU facility, found high levels of copper in the ring, along with some zinc and traces of silver, lead, tin and nickel. The ratios, Farrell said, “are typical of brass” from early modern times. He found no evidence that the ring had gilding on its surface, throwing years of speculation and research into serious doubt. “Everyone wants it to be something that a Lost Colonist dropped in the sand,” added Ewen. He said it is more likely that the ring was a common mass-produced item traded to Native Americans long after the failed settlement attempt. 
Not all archaeologists agree, however, and the surprise results are sure to reignite the debate over the fate of the Lost Colony. The settlers arrived from England in the summer of 1587, led by John White. They rebuilt an outpost on Roanoke Island, 50 miles north of Hatteras, abandoned by a previous band of colonists. White’s group included his daughter Eleanor, who soon gave birth to Virginia Dare, the first child born of English parents in the New World. White quickly departed for England to gather supplies and additional colonists, but his return was delayed by the outbreak of war with Spain. When he finally managed to land on Roanoke Island three years later, the settlement was deserted. The only clue was the word “Croatoan” carved on a post, the name of a tribe allied with the English and the island now called Hatteras. ECU archaeologist David Phelps, now deceased, found the ring while excavating a Native American village there and took it to a jeweler named Frank Riddick in nearby Nags Head. Phelps reported that the jeweler tested the ring and determined it was 18-carat gold. Riddick, who now runs a fishing charter company called Fishy Bizness, recalled recently that he didn’t conduct an acid-scratch test typically used to verify the presence and quality of the precious metal. “Since this wasn’t about buying or selling, we didn’t do that,” he said. “I just told him that I thought it was gold.” Phelps apparently didn’t want to subject the object to potential damage. A senior member of London’s College of Arms subsequently noted that the seal on the signet ring was of a lion passant, and suggested that it might relate to the Kendall family of Devon and Cornwall. A Master Kendall was part of the first colonization attempt in 1585, while another Kendall visited Croatoan when a fleet led by Sir Francis Drake stopped by in 1586. Though this link was never confirmed, the object was nicknamed the Kendall ring. 
Since Phelps thought the ring was made of a precious material and likely belonged to the Elizabethan era, he argued it was an important clue. “That doesn’t mean the Lost Colony was here,” he told a reporter at the dig site after the ring’s discovery. “But this begins to authenticate that.” Some archaeologists, however, were skeptical of the artifact’s connection to Roanoke, given that it was found with other artifacts dating to between 1670 and 1720—about a century after the Elizabethan voyages. This was also an era in which brass rings showed up at Native American sites up and down the East Coast. But Mark Horton, an archaeologist at the University of Bristol in the United Kingdom, says that Ewen’s results don’t necessarily preclude that it belonged to a Roanoke colonist. “The fact that the ring is brass actually makes it more similar to other British examples,” he said, noting that the ring could have been made in the 1580s. “I would argue that it was kept as an heirloom, passed down, and then discarded.” Horton is currently digging at the Hatteras site where the ring was discovered. The excavations, sponsored by the Croatoan Archaeological Society, have so far uncovered several artifacts that may have been made during Elizabethan times, including the handle of a rapier and bits of metal from clothing. If the Lost Colonists left Roanoke for Croatoan in the late 1580s, argues Horton, they might have brought along their most precious objects. Over a couple of generations they may have assimilated with the Algonquian-speaking Croatoan people and their English heirlooms would have eventually worn out. “Oh, there’s granddad’s old sword in the corner rusting away,” said Horton. “Why are we keeping that?” His theory is also based on archaeological finds that show that Native Americans on Hatteras manufactured lead shot and used guns to hunt deer and birds by the 1650s. Prior to this, their diet was based heavily on fish and shellfish. 
The technological sophistication, Horton suggests, hints at the presence of Europeans before the second wave of English arrived in the area in the late 1600s. That, too, could point to the presence of assimilated colonists and their descendants. That theory is a stretch, says archaeologist Charles Heath, who worked with Phelps and was present when the ring was found. “Such items would have been used, modified, traded, re-traded, lost, discarded or curated by their native owners—and subsequent native owners—for many years,” he argued. In the end, he said, “a stray 16th-century artifact found here and there on the Outer Banks will not make for a Lost Colony found.” Horton acknowledges that rather than Roanoke colony possessions brought along by assimilating English, the Croatoan people could have acquired the goods from Jamestown, the later Virginia colony to the north, instead. Gunflints, coins, and glass beads found at the site almost certainly came from the newer English settlement. But he is confident that the current excavations will soon reveal additional evidence. Meanwhile, the hunt for the Lost Colony continues. Another group of archaeologists working about 50 miles west of Roanoke Island at the head of Albemarle Sound say that they have pottery and metal artifacts likely associated with the Lost Colony. The digs by the First Colony Foundation were sparked by the 2012 discovery of a patch concealing the image of a fort on a map painted by John White. But like the finds at Hatteras, the objects might be associated with the second wave of English settlement. Last fall, a dig by the National Park Service at Fort Raleigh on Roanoke Island—thought to be the site of the original settlement—yielded no trace of the colonists. But earlier in 2016, archaeologists did find a handful of fragments of an apothecary jar that almost certainly date from the 16th century. 
That the gold Kendall ring is likely a cheap brass trade item won’t derail the quest to find out what took place on the Outer Banks more than four centuries ago. As for Ewen, he hopes that the analysis of the ring will help put researchers back on track in their search for scarce clues to the Roanoke settlers. “Science actually does work,” he said—“if you give it time.” Andrew Lawler is author of The Secret Token: Myth, Obsession, and the Search for the Lost Colony of Roanoke. He is also a contributing writer for Science magazine and has written for The New York Times, The Washington Post, Smithsonian, National Geographic, and other publications. Website: andrewlawler.com
https://www.smithsonianmag.com/history/nan-madol-the-city-built-on-coral-reefs-147288758/
Nan Madol: The City Built on Coral Reefs
We zigzag slowly in our skiff around the shallow coral heads surrounding Pohnpei. The island, a little smaller than New York City, is part of the Federated States of Micronesia. It is nestled in a vast tapestry of coral reefs. Beyond the breakers, the Pacific stretches 5,578 miles to California. A stingray dashes in front of us, flying underwater like a butterfly alongside our bow. Our destination is Nan Madol, near the southern side of the island, the only ancient city ever built atop a coral reef. Its imposing yet graceful ruins are made of stones and columns so heavy that no one has figured out how it was built. Besides the elegance of the walls and platforms, there is no carving, no art – nothing except legend to remember the people, called the Saudeleur, who ruled the island for more than a millennium. They were deeply religious and sometimes cruel, and modern Pohnpeians view the ruins as a sacred and scary place where spirits own the night. Abandoned centuries ago and now mostly covered with jungle, Nan Madol may soon be getting a makeover. Before I explore it, I stop to discuss its future with the man who holds sway over this part of Pohnpei. We nuzzle up to land and jump onto the remnants of a sea wall. I follow Rufino Mauricio, Pohnpei’s only archaeologist, along a path and up a hill to what appears to be a warehouse, painted white with a corrugated metal roof. It’s known here as the Tin Palace. There is a small house tacked on the end, with flowering bushes here and there. A gaggle of dogs welcome us noisily. This is the residence of the Nahnmwarki of Madolenihmw, the primus inter pares among the five traditional paramount chiefs who preside over a delightfully complex social structure that underpins Pohnpei's vibrant native culture. Aside from Easter Island, Nan Madol is the main archaeological site in Oceania that is made up of huge rocks. 
But while Easter Island gets 50,000 visitors a year, Nan Madol sees fewer than 1,000. Before I left on this trip, Jeff Morgan, director of the Global Heritage Fund of Palo Alto, California, had told me he wanted to fund a rehabilitation program. But before anything can be done, ownership issues that blocked previous rehabilitation efforts would have to be resolved—the state government and the Nahnmwarki both claim sovereignty over the ruins. A resolution would pave the way for Nan Madol to become a Unesco World Heritage site, increasing the flow of visitors and grants. “Nan Madol is one of the most significant sites not yet on the World Heritage List,” says Richard Engelhart, an archaeologist and former Unesco adviser for Asia and the Pacific. Mauricio and I are a bit nervous: an audience with the Nahnmwarki is best arranged through Pohnpei’s governor, John Ehsa. A day earlier, Ehsa had pledged to support the Global Heritage Fund’s idea and promised to arrange an audience with the Nahnmwarki so that I could interview him about the plan—but then Ehsa didn’t come through on his promise. Ehsa had noted that a previous attempt to clean up the ruins had foundered because the Japanese donors had not followed proper protocol with the Nahnmwarki. Sadly, neither do I. It’s unthinkable to arrive without a tribute, but the bottle of Tasmanian wine I brought for the occasion slipped out of my hand and shattered on the rocks as I got off the boat. Mauricio, who holds a lesser traditional title, is mortified: he didn’t know we were stopping to see the chief on our way to the ruins, so he is empty-handed too. Arriving empty-handed without an appointment is the height of rudeness, he grumbles. Mauricio, who, as I am, is dripping with sweat in Pohnpei’s steamy equatorial heat, informs the chief’s wife of our arrival. The Nahnmwarki agrees to see us and we walk back to the other end of the building so we can make our entry from the visitors’ side. 
Mauricio, who earned a PhD from the University of Oregon with a thesis on Nan Madol, kneels. He addresses the chief, a former teacher and school bus driver, who finishes buttoning up a russet aloha shirt and tan shorts and sits at the head of a small staircase. He has short, thick hair and, like most people in Pohnpei, his teeth are stained by betel nut, which he chews during our meeting, occasionally walking over to the door to spit. Through Mauricio, who translates, I inquire: Would the Nahnmwarki be interested in setting aside old grievances and cooperating with the state and other stakeholders in order to take advantage of this opportunity? “I would love to see Nan Madol rehabilitated, but it has to be under my supervision,” he replies, later adding, “All funding should go through the Madolenihmw municipal government, not the Pohnpei state government.” The municipal government is the heir to the Nahnmwarki’s rule. On the way back, Mauricio, who is director of the national archives, says thoughtfully, “It’s a reasonable request. Certainly, the national government [of the Federated States of Micronesia] would have no objection.” Back on the skiff, Augustine Kohler, the state historical preservation officer and himself the son of another of Pohnpei’s five Nahnmwarkis, says, “It could work.” We head for the ruins in the boat to take a look at what kind of rehabilitation would be appropriate. On the way, Mauricio explains that Nan Madol is composed of 92 artificial islands spread over 200 acres abutting Pohnpei’s mangrove-covered shore. Most of it was built from the 13th to the 17th centuries by the Saudeleurs, descendants of two brothers of unknown provenance who founded a religious community in the sixth century focused on the adoration of the sea. On their third attempt to build their political, religious and residential center, they settled on this patch of coral flats. 
They and their successors brought from the other side of the island columns of black lava rock up to 20 feet long that are naturally pentagonal or hexagonal and straight. They used them in a log cabin formation to build outer walls as well as foundations filled in with lumps of coral to create elevated platforms where traditional thatched structures were used as lodgings. Even with all the sunshine in the world washing over the thick green jungle and aquamarine water beyond, the unadorned black architecture is intimidating. The tyrannical last Saudeleur ruler was overthrown by an outsider named Isohkelekel who instituted the system of multiple chiefs that remains today. The Nahnmwarki of Madolenihmw is directly descended from him. Because of this bloodline, most Pohnpeians feel he is the legitimate supervisor of the ruins. As we approach the first building, Mauricio observes, “We don’t know how they brought the columns here and we don’t know how they lifted them up to build the walls. Most Pohnpeians are content to believe they used magic to fly them.” The easiest way to see Nan Madol is to take a cab from Kolonia, the little capital of Pohnpei, park on an unmarked spot and walk for nearly a mile through a primitive jungle path. When you arrive, only a channel separates you from the main building, the Nandawas. Representatives of the Nahnmwarki with a boat are on hand to collect $3 and take you across. The odds are good that you will have the place to yourself. Having your own boat at high tide allows you to go much farther. We glide through the channel, the outboard purring. The islands are covered with almost impenetrable jungle. A large component of the rehabilitation effort, if it happens, will be to clear brush to make the buildings accessible. The other component would be dredging the main channels so the ruins are accessible to boats at all times. Many of the outer walls, usually just a few feet high, are intact. 
Mauricio points out the little island of Idehd, where priests fed turtle innards to an eel, the sea deity, kept in a well, before sharing among themselves the rest of the turtle as a sacrament. To this day eels are considered holy and never eaten. Then we pass Peikapw, where Isohkelekel resided after he overthrew the last Saudeleur. He eventually committed suicide there after discovering how old he looked when he saw his reflection in a pool, according to the oral history. After he died, Nan Madol was largely abandoned, though religious ceremonies were occasionally held there until the late 19th century. As we continue, the channel gets narrower and shallower. We turn back to explore the city’s outer walls, still strong, and continue to the islet of Pahnwi, whose wall of huge, flat-sided stone rises 58 feet and encloses a tomb. Our final stop is Nandowas, by far the most elaborate building. It’s the royal mortuary, with two sets of 25-foot-high walls whose gracefully up-swept corners cover an area greater than a football field. One cornerstone is estimated to weigh 50 tons. I step down into the moss-encrusted tomb. Eight columns form the basis of a roof that lets in shards of sunlight. I’m glad I’m not alone. The bodies of kings were placed here and later buried elsewhere. On the way back, Mauricio remarks that, given Pohnpei’s population at the time was less than 30,000, the building of Nan Madol represented a much larger effort than the pyramids were for the Egyptians. The total weight of the black rocks moved is estimated at 750,000 metric tons, an average of 1,850 tons a year over four centuries. “Not bad for people who had no pulleys, no levers and no metal,” said Mauricio. Waving at the brush, he adds, “We need to clear all this out in at least some of the islands so we can appreciate the extraordinary effort that was put into this construction.”
https://www.smithsonianmag.com/history/nasa-landing-moon-many-african-americans-sought-economic-justice-instead-180972622/
While NASA Was Landing on the Moon, Many African Americans Sought Economic Justice Instead
In anticipation of astronaut Neil Armstrong’s first step on the moon, an estimated 8,000 New Yorkers gathered in Central Park, eager to celebrate the moment. The New York Times ran a photograph of the crowd glued to the networks’ broadcasts on three giant screens and described the event as “a cross between a carnival and a vigil.” Celebrants came dressed in white, as encouraged by the city’s parks department. Waiting for the big show, they listened to the Musician’s Union orchestra play space-themed music and watched student artists dance in a “Moon Bubble,” illuminated by ultra-violet light. That same day, about 50 blocks north, another estimated 50,000 people, predominantly African American, assembled in Harlem for a soul-music showcase in Mount Morris Park headlined by Stevie Wonder, whose “My Cherie Amour” was climbing the Billboard charts. The parks department sponsored this event, too, but the audience was less interested in what was happening in the sky overhead. As the Times reported, “The single mention of the [lunar module] touching down brought boos from the audience.” The reception in Harlem reflects a broader truth about the Apollo 11 mission and how many black communities viewed it. NASA’s moonshot was costly; author Charles Fishman called it “the largest non-military effort in human history” in a recent interview with NPR. Black publications like the New York Amsterdam News and civil rights activists like Ralph Abernathy argued that such funds—$25.4 billion, in 1973 dollars—would be better spent alleviating the poverty facing millions of African Americans. Spoken word artist Gil Scott-Heron's memorable poem “Whitey on the Moon” catalogued a host of genuine hazards and deprivations earthbound African Americans endured while Armstrong and Buzz Aldrin hopped about on the moonscape. 
“No hot water, no toilets, no lights, while whitey's on the moon” he rapped, adding that “all that money I made last year” went to the race to beat the Soviets to the moon. In 1969, according to the United States census, the poverty rate for African Americans was 31.1 percent, compared to 9.5 percent for whites, and a full 62 percent of blacks on farms were living in poverty. The day before the Apollo launch, Abernathy, head of the Southern Christian Leadership Conference, led a march of 25 poor families to the Kennedy Space Center to protest what he called America’s “distorted sense of national priorities.” In perhaps the most vivid illustration of the gulf between America’s highest technological achievements and the abject poverty of millions of rural blacks, on the day of the launch, newspapers around the country described the scene: The protesters, with farm wagons drawn by four mules, marched across a field to meet the NASA administrator and other agency personnel, with Apollo 11’s 36-story Saturn V rocket on the launch pad in the background. Abernathy and the poor black families who marched with him (totaling as many as 150 people) told NASA administrator Thomas O. Paine the money spent on the impending launch could be better spent feeding people on Earth. According to the Orlando Sentinel, Paine responded by saying, “Poverty is such a great problem that it makes the Apollo program look like child’s play.” “If it were possible for us not to push that button and solve the problems you are talking about, we would not push that button,” Paine added. During the 20-minute encounter, Abernathy urged Paine to put NASA technologies in service to the poor. While Paine questioned what NASA could immediately do to combat hunger, he agreed the moon mission could inspire the country to band together to tackle its other problems. He told Abernathy, "I want you to hitch your wagon to our rocket and tell the people the NASA program is an example of what this country can do." 
While the protest highlighted African Americans’ displeasure with the government’s prioritization of the moon landing, the high cost of space exploration was actually a point of contention across American society. As Roger Launius, former chief historian for NASA and former senior official at the Smithsonian’s National Air and Space Museum, wrote in a 2003 report, “consistently throughout the 1960s, a majority of Americans did not believe Apollo was worth the cost.” Only when it was all-systems-go in July 1969 did one poll show the barest majority supporting the launch, he writes. But the black community was especially willing to point out the hypocrisy of spending on the future while neglecting the present. A July 27, 1969, New York Times headline announced: “Blacks and Apollo: Most Could Have Cared Less,” and historian David Nye notes that “most black newspapers carried editorials and cartoons attacking the space program.” The Times quoted Victoria Mares, the head of a poverty program in Saginaw, Michigan, who compared the government’s spending on Apollo to “a man who has a large family—they have no shoes, no clothing, no food, and the rent is overdue. But when he gets paid, he runs out and buys himself a set—another set—of electric trains.” Roy Wilkins, the executive director of the NAACP, the article states, “called the moon shot, ‘a cause for shame.’” The Times notes that the New York Amsterdam News, one of the nation’s leading black papers, the day after the moon landing, lamented, “Yesterday, the moon. Tomorrow, maybe us.” The Times article on “Blacks and Apollo” also quoted Sylvia Drew Ivie (then Sylvia Drew), an attorney for the NAACP Legal Defense and Educational Fund, who said, “If America fails to end discrimination, hunger, and malnutrition, then we must conclude that America is not committed to ending discrimination, hunger, and malnutrition. 
Walking on the moon proves that we do what we want to do as a nation.” Today, Ivie is the assistant to the president of the Charles R. Drew University of Medicine and Science, which is named for her father, the pioneering African American surgeon. Reached by phone at her home in Los Angeles, Ivie says she is “less single-minded today than I was then, but the problems I was worried about then are still with us.” At that time, she said, “My whole focus was solving problems on this planet…I wasn’t so interested in the wonder of scientific exploration.” Apollo did, though, inspire a generation of minorities and women to reach for the stars. Mae Jemison, the first African American woman in space, said in a recent video interview, “I was like every other kid. I loved space, stars, and dinosaurs.” But with Apollo, she said, “I was really, really irritated that there were no woman astronauts…There are a lot of people who felt left out. They didn’t see themselves so they didn’t see the connection back to them.” Jemison, in the same video, credits Nichelle Nichols, the African American actress who played Lieutenant Uhura on “Star Trek,” with “help[ing] me to say, yes, this is something reasonable to think about.” Nichols herself stated in a 2011 NPR interview that she had considered leaving the show after its first season for a role on Broadway, but that it was Martin Luther King who convinced her to stay for the symbol she represented to the country. Nichols later played a major role in NASA recruitment, stating in a 1977 recruitment film, “I’m speaking to the whole family of humankind, minorities and women alike. If you qualify and would like to be an astronaut, now is the time.” While some African Americans did indeed work on the Apollo mission, they were largely relegated to the shadows—in 1969, Jet criticized NASA for “the poorest minority hiring records [sic] among U.S. 
agencies.” Today, thanks largely to the 2016 Oscar-nominated film Hidden Figures, more Americans know about the role of Katherine Johnson and other African American women “computers” in the space race. NASA’s website calls Johnson’s calculations “critical to the success of the Apollo Moon landing.” Forty years after Abernathy confronted Administrator Paine at Kennedy Space Center, an African American president appointed an African American astronaut, General Charles Bolden, to head NASA. Likewise, one of today’s greatest public champions for space research and exploration is an African American man, astrophysicist Neil deGrasse Tyson, the director of New York’s Hayden Planetarium. Asked by a listener on his radio program, StarTalk, to state the most significant thing the Apollo program achieved (with the exception of landing on the moon), Tyson emphasized its role in inspiring the nation’s environmental movement: the founding of Earth Day, the creation of NOAA and the EPA, the passage of the comprehensive Clean Air and Water Acts, the banning of leaded gas and DDT, and the introduction of the catalytic converter. “Though we went to the moon to explore the moon,” he said, “upon getting there and looking back, in fact, we would discover Earth for the first time.” Ivie appreciates the greater diversity at NASA today. Her cousin, Frederick Drew Gregory, was among the first African American astronauts in space. But she believes the United States could have walked on the moon and pulled Americans out of poverty at the same time. “It wasn’t that we didn’t have enough money to do both [in 1969], we just didn’t have a desire to do both...And I think we are still lacking that will, though there is more interest in it today.” She pointed out, “In Watts, when we had the revolt in ‘65, we had one grocery store. This is 2019. 
We still have one grocery store in Watts.” As for the digital age, which Fishman says Apollo ushered in, and the environmental consciousness that Tyson attributes to the moon landing, Ivie is noncommittal. “I think it’s splendid to have someone African American be the teacher on public television about all these things. I think that’s really fantastic,” she says. “What it says is, the Earth and the stars are as mysterious and wonderful to us as they are to every other group, and we can learn about them and we can learn from them. We’re all members of the planet Earth together. That’s a huge message… But it doesn’t help us get a grocery store in Watts.” Bryan Greene lives in Washington, D.C. and has written about music, history, and race and society for Poverty & Race. He is a consulting producer on the 2021 music documentary, "Summer of Soul."
a50610bfbd439899609b4d39474b3c82
https://www.smithsonianmag.com/history/newly-discovered-diary-tells-harrowing-story-deadly-halifax-explosion-180964066/
A Newly Discovered Diary Tells the Harrowing Story of the Deadly Halifax Explosion
A Newly Discovered Diary Tells the Harrowing Story of the Deadly Halifax Explosion “We turn out of our hammocks at 6.30am and lash up and stow in the usual way,” a Royal Navy sailor named Frank Baker wrote in his diary on December 6, 1917. “We fall in on the upper deck at 7am and disperse to cleaning stations, busying ourselves scrubbing decks etc. until 8am when we ‘cease fire’ for breakfast.” Baker was pulling wartime duty as a ship inspector in the harbor of Halifax, Nova Scotia, on the lookout for spies, contraband and saboteurs. But there were no ships to be inspected that day, so after breakfast he and his crewmates aboard HMCS Acadia went back to their cleaning stations. “We...had just drawn soap and powder and the necessary utensils for cleaning paint work,” he wrote, “when the most awful explosion I ever heard or want to hear again occurred.” What Frank Baker heard was the biggest explosion of the pre-atomic age, a catastrophe of almost biblical proportions. The 918 words he wrote for December 6 make up the only eyewitness account known to be written on the day of what is now called the Halifax Explosion. After World War I, his diary sat unread for decades. Now, it has been included in an exhibit on the explosion’s centennial at the Dartmouth Heritage Museum, across the harbor from Halifax. It is published here for the first time. “The first thud shook the ship from stem to stern and the second one seemed to spin us all around, landing some [crew members] under the gun carriage and others flying in all directions all over the deck,” Baker wrote. Sailors 150 miles out to sea heard the blast. On land, people felt the jolt 300 miles away. The shock wave demolished almost everything within a half-mile. 
“Our first impression was that we were being attacked by submarines, and we all rushed for the upper deck, where we saw a veritable mountain of smoke of a yellowish hue and huge pieces of iron were flying all around us.” Unseen by Baker, two ships had collided in the Narrows, a strait linking a wide basin with the harbor proper, which opens into the Atlantic to the southeast. An outbound Belgian relief ship, the Imo, had strayed off course. An inbound French freighter, the Mont-Blanc, couldn’t get out of its way. The Imo speared the Mont-Blanc at an angle near its bow. The freighter carried 2,925 tons of high explosives, including 246 tons of benzol, a highly flammable motor fuel, in drums lashed to its deck. Some of the drums toppled and ruptured. Spilled benzol caught fire. The Mont-Blanc’s crew, unable to contain the flames, abandoned ship. The ghost vessel burned and drifted for about 15 minutes, coming to rest against a pier along the Halifax shore. Thousands of people on their way to work, already working at harborside jobs, or at home in Halifax and Dartmouth, stopped in their tracks to watch. Then the Mont-Blanc blew. “A shower of shrapnel passed over the Forecastle, shattering the glass in the engine room and chart room to smithereens, which came crashing down into the alleyways,” Baker wrote. “...The fires all burst out on to the floor of the stokehold [the engine room’s coal storage] and it was a marvel that the stokers were not burned to death, but all of them escaped injury as did all the other of the ship’s company. “A tug was alongside us at the time and part of her side was torn completely out and three of the crew were injured, one of them getting a piece of flesh weighing nearly 2 pounds torn off his leg. 
A hail of shrapnel descended about 20 yards from the ship, this came with such force that had it struck us we should certainly have all been lost.” The Mont-Blanc had disintegrated, showering iron fragments and black tar across Halifax; the shaft of its anchor, weighing 1,140 pounds, spiked into the earth more than two miles away. The explosion tore a hole in the harbor bottom, unleashing a tidal wave that tossed ships as if they were bathtub toys and washed away a Mi’kmaq fishing settlement that had been at the northwestern end of the basin for centuries. A volcanic plume of gray smoke, sparkling fragments and flame rose miles into the sky before billowing outward. “This was the last of the explosion, the whole of which had taken place inside of five minutes,...” Baker wrote. “Then came a lull of a few minutes and when the smoke had cleared sufficiently, we saw clearly what had happened....One ship had been hurled wholesale for a distance of about 400 yards, dashing it close to the shore, a total wreck with dead bodies battered and smashed lying all around in disorder. “Fires broke out on ships all around and hundreds of small crafts had been blown to hell and the sea presented an awful scene of debris and wreckage. Our doctor attended to the wounded men on the tug as quickly as possible and we laid them on stretchers in a motor boat and took them to hospital. The scene ashore was even worse. “The N.W. part of Halifax was in total ruins and fires were springing up all over the city. Part of the railway was completely demolished and everywhere were dead and dying among the ruins. When we arrived at the hospital, the windows were all blown out and the wards were two feet deep in water owing to all the pipes having burst. 
We had to return to our ship as quickly as possible, as we are Guard Ship and responsible for the safety of the other vessels in harbour.” Back on the Acadia, Baker beheld a desolate scene: “What a few hours before had been beautiful vessels, were now terrible wrecks, their crews all dead and bodies, arms, etc. were floating around in the water.” That afternoon the Acadia’s crew was called upon to quell a mutiny aboard the Eole, a French ship running relief for the Belgians. After doing so, they returned to their ship. “We quickly got hurried tea and proceeded ashore,” Baker wrote. “Here the scene was absolutely indescribable. “The town was literally ablaze, the dry dock and dockyard buildings completely demolished and everywhere wounded and dead. The theatres and suitable buildings were all turned into hospitals or shelters for the accommodation of the homeless. Naval and Military pickets were patrolling the streets endeavouring to keep order. Poor little kiddies homeless, their parents having perished, were crying piteously and anxious relatives were inquiring for their dear ones.” Virtually no family was untouched. By then, most of the nearly 2,000 known fatalities from the blast had occurred—though many bodies were unidentifiable. Some 9,000 were injured, many of them children—wounded in the face and eyes as they gazed out windows at the burning Mont-Blanc. Some 6,000 people were left homeless, and many thousands had to bed down in badly damaged houses. The coming morning would bring a blizzard and deep cold. Ashore, “we visited the part where the fires were at their worst, and it is beyond me to describe the absolute terror of the situation,” Baker wrote. “For miles around nothing but a flaming inferno, charred bodies being dragged from the debris and those poor devils who were left still lingering were piled into motor wagons and conveyed to one of the improvised hospitals. 
We returned to our ship at 11pm sick at heart with the appalling misery with which the city abounded. The glare from the fires lighting the harbour up like day, on the other side of the bay, the little town of Dartmouth was also in flames on sea and land nothing but misery, death and destruction....I cannot help but marvel that we escaped.” But Baker survived, and he served until March 1919. Then he settled in Kettering, about 80 miles north of London, with his diary, which spans October 9, 1917, to January 14, 1918. In 1924, he married Jessie Liddington, from the nearby village of Pytchley; they had four sons. Eventually, he became head of a chain of butcher shops and meat-supply facilities. After retiring, in 1973, he moved to Australia, where two of his sons and many of his grandchildren were living. Two years later, he learned he had cancer. At that point, he passed the diary and some photographs from his time aboard the Acadia to his son “without any explanation,” the son, Rex, told me. After his father died, in 1977, “I put them away and forgot about them for over 30 years.” Only after Rex retired—he’s 72 now, and living in Busselton, a seaside town south of Perth—did he pull the diary from the bureau drawer where he’d stowed it. Once he read it, he suspected that it might have historical significance, so in January 2016 he contacted Bonnie Elliott, director of the Dartmouth Heritage Museum. When she read it, she says, “I fell off a log. I knew this diary was really important.” Rex Baker carried the diary himself to Canada. While there, he boarded the Acadia, which is now a floating museum in Halifax Harbor, for the first time. Elliott met him as he stepped off the ship. “There were tears in his eyes,” she recalls. Baker says his father “spoke to no one in the family about that experience at all.” After reading the diary, though, he says that as he walked about the Acadia, “I felt almost a presence. 
Like he was standing behind me.” Marc Wortman has written three history books, including 1941: Fighting the Shadow War: A Divided America in a World at War.
c3c30159ebd47d5bbb68d9505ae7aeff
https://www.smithsonianmag.com/history/nikita-khrushchev-goes-to-hollywood-30668979/
Nikita Khrushchev Goes to Hollywood
Nikita Khrushchev Goes to Hollywood Fifty summers ago President Dwight Eisenhower, hoping to resolve a mounting crisis over the fate of Berlin, invited Soviet Premier Nikita Khrushchev to a summit meeting at Camp David. Ike had no idea of what he was about to unleash on the land whose Constitution he had sworn to defend. It was the height of the cold war, a frightening age of fallout shelters and "duck-and-cover" drills. No Soviet premier had visited the United States before, and most Americans knew little about Khrushchev except that he had jousted with Vice President Richard Nixon in the famous "kitchen debate" in Moscow that July and had uttered, three years before, the ominous-sounding prediction, "We will bury you." Khrushchev accepted Ike's invitation—and added that he'd also like to travel around the country for a few weeks. Ike, suspicious of the wily dictator, reluctantly agreed. Reaction to the invitation was mixed, to say the least. Hundreds of Americans bombarded Congress with angry letters and telegrams of protest. But hundreds of other Americans bombarded the Soviet Embassy with friendly pleas that Khrushchev visit their home or their town or their county fair. "If you'd like to enter a float," the chairman of the Minnesota Apple Festival wrote to Khrushchev, "please let us know." A few days before the premier's scheduled arrival, the Soviets launched a missile that landed on the moon. It was the first successful moonshot, and it caused a massive outbreak of UFO sightings in Southern California. That was only a prelude to a two-week sojourn that historian John Lewis Gaddis would characterize as "a surreal extravaganza." After weeks of hype—"Khrushchev: Man or Monster?" 
(New York Daily News), "Capital Feverish on Eve of Arrival" (New York Times), "Official Nerves to Jangle in Salute to Khrushchev" (Washington Post), "Khrushchev to Get Free Dry Cleaning" (New York Herald Tribune)—Khrushchev landed at Andrews Air Force Base on September 15, 1959. Bald as an egg, he stood only a few inches over five feet but weighed nearly 200 pounds, and he had a round face, bright blue eyes, a mole on his cheek, a gap in his teeth and a potbelly that made him look like a man shoplifting a watermelon. When he stepped off the plane and shook Ike's hand, a woman in the crowd exclaimed, "What a funny little man!" Things got funnier. As Ike read a welcoming speech, Khrushchev mugged shamelessly. He waved his hat. He winked at a little girl. He theatrically turned his head to watch a butterfly flutter by. He stole the spotlight, one reporter wrote, "with the studied nonchalance of an old vaudeville trouper." The traveling Khrushchev roadshow had begun. The next day, he toured a farm in Maryland, where he petted a pig and complained that it was too fat, then grabbed a turkey and griped that it was too small. He also visited the Senate Foreign Relations Committee and advised its members to get used to communism, drawing an analogy with one of his facial features: "The wart is there, and I can't do anything about it." Early the next morning, the premier took his show to New York City, accompanied by his official tour guide, Henry Cabot Lodge Jr., the United States ambassador to the United Nations. In Manhattan, Khrushchev argued with capitalists, yelled at hecklers, shadowboxed with Gov. Nelson Rockefeller, got stuck in an elevator in the Waldorf-Astoria Hotel and toured the Empire State Building, which failed to impress him. "If you've seen one skyscraper," he said, "you've seen them all." And on the fifth day, the cantankerous communist flew to Hollywood. There, things only got weirder. 
Twentieth Century Fox had invited Khrushchev to watch the filming of Can-Can, a risqué Broadway musical set among the dance hall girls of fin de siècle Paris, and he had accepted. It was an astounding feat: a Hollywood studio had persuaded the communist dictator of the world's largest nation to appear in a shameless publicity stunt for a second-rate musical. The studio sweetened the deal by arranging for a luncheon at its elegant commissary, the Café de Paris, where the great dictator could break bread with the biggest stars in Hollywood. But there was a problem: only 400 people could fit into the room, and nearly everybody in Hollywood wanted to be there. "One of the angriest social free-for-alls in the uninhibited and colorful history of Hollywood is in the making about who is to be at the luncheon," Murray Schumach wrote in the New York Times. The lust for invitations to the Khrushchev lunch was so strong that it overpowered the fear of communism that had reigned in Hollywood since 1947, when the House Committee on Un-American Activities began investigating the movie industry, inspiring a blacklist of supposed communists that was still enforced in 1959. Producers who were scared to death of being seen snacking with a communist screenwriter were desperate to be seen dining with the communist dictator. A handful of stars—Bing Crosby, Ward Bond, Adolphe Menjou and Ronald Reagan—turned down their invitations as a protest against Khrushchev, but not nearly enough to make room for the hordes who demanded one. Hoping to ease the pressure, 20th Century Fox announced that it would not invite agents or the stars' spouses. The ban on agents crumbled within days, but the ban on spouses held. The only husband-and-wife teams invited were those in which both members were stars—Tony Curtis and Janet Leigh; Dick Powell and June Allyson; Elizabeth Taylor and Eddie Fisher. 
Marilyn Monroe's husband, the playwright Arthur Miller, might have qualified as a star, but he was urged to stay home because he was a leftist who'd been investigated by the House committee and therefore was considered too radical to dine with a communist dictator. However, the studio was determined that Miller's wife attend. "At first, Marilyn, who never read the papers or listened to the news, had to be told who Khrushchev was," Lena Pepitone, Monroe's maid, recalled in her memoirs. "However, the studio kept insisting. They told Marilyn that in Russia, America meant two things, Coca-Cola and Marilyn Monroe. She loved hearing that and agreed to go....She told me that the studio wanted her to wear the tightest, sexiest dress she had for the premier." "I guess there's not much sex in Russia," Marilyn told Pepitone. Monroe arrived in Los Angeles a day ahead of Khrushchev, flying from New York, near where she and Miller were then living. When she landed, a reporter asked if she'd come to town just to see Khrushchev. "Yes," she said. "I think it's a wonderful thing, and I'm happy to be here." That provoked the inevitable follow-up question: "Do you think Khrushchev wants to see you?" "I hope he does," she replied. The next morning, she arose early in her bungalow at the Beverly Hills Hotel and began the complex process of becoming Marilyn Monroe. First, her masseur, Ralph Roberts, gave her a rubdown. Then hairdresser Sydney Guilaroff did her hair. Then makeup artist Whitey Snyder painted her face. Finally, as instructed, she donned a tight, low-cut black patterned dress. In the middle of this elaborate project, Spyros Skouras, the president of 20th Century Fox, dropped by to make sure that Monroe, who was notorious for being late, would arrive at this affair on time. "She has to be there," he said. And she was. Her chauffeur, Rudi Kautzsky, delivered her to the studio. When they found the parking lot nearly empty, she was scared. "We must be late!" she said. 
"It must be over." It wasn't. For perhaps the first time in her career, Marilyn Monroe had arrived early. Waiting for Khrushchev to arrive, Edward G. Robinson sat at table 18 with Judy Garland and Shelley Winters. Robinson puffed on his cigar and gazed out at the kings and queens of Hollywood—the men wearing dark suits, the women in designer dresses and shimmering jewels. Gary Cooper was there. So was Kim Novak. And Dean Martin, Ginger Rogers, Kirk Douglas, Jack Benny, Tony Curtis and Zsa Zsa Gabor. "This is the nearest thing to a major Hollywood funeral that I've attended in years," said Mark Robson, the director of Peyton Place, as he eyeballed the scene. Marilyn Monroe sat at a table with producer David Brown, director Joshua Logan and actor Henry Fonda, whose ear was stuffed with a plastic plug that was attached to a transistor radio tuned to a baseball game between the Los Angeles Dodgers and the San Francisco Giants, who were fighting for the National League pennant. Debbie Reynolds sat at table 21, which was located—by design—across the room from table 15, which was occupied by her ex-husband Eddie Fisher and his new wife, Elizabeth Taylor, who had been Reynolds' close friend until Fisher left her for Taylor. The studio swarmed with plainclothes police, both American and Soviet. They inspected the shrubbery outside, the flowers on each table and both the men's and women's rooms. In the kitchen, an LAPD forensic chemist named Ray Pinker ran a Geiger counter over the food. "We're just taking precautions against the secretion of any radioactive poison that might be designed to harm Khrushchev," Pinker said before heading off to check the soundstage where the premier would watch the filming of Can-Can. As Khrushchev's motorcade pulled up to the studio, the stars watched live coverage of his arrival on televisions that had been set up around the room, their knobs removed so nobody could change the channel to the Dodgers-Giants game. 
They saw Khrushchev emerge from a limo and shake hands with Spyros Skouras. A few moments later, Skouras led Khrushchev into the room and the stars stood to applaud. The applause, according to the exacting calibrations of the Los Angeles Times, was "friendly but not vociferous." Khrushchev took a seat at the head table. At an adjacent table, his wife, Nina, sat between Bob Hope and Frank Sinatra. Elizabeth Taylor climbed on top of table 15 so she could get a better look at the dictator. As the waiters delivered lunch—squab, wild rice, Parisian potatoes and peas with pearl onions—Charlton Heston, who'd once played Moses, attempted to make small talk with Mikhail Sholokhov, the Soviet novelist who would win the Nobel Prize in Literature in 1965. "I have read excerpts from your works," Heston said. "Thank you," Sholokhov replied. "When we get some of your films, I shall not fail to watch some excerpts from them." Nearby, Nina Khrushchev showed Frank Sinatra and David Niven pictures of her grandchildren and bantered with cowboy star Gary Cooper, one of the few American actors she'd actually seen on-screen. She told Bob Hope that she wanted to see Disneyland. As Henry Cabot Lodge ate his squab, Los Angeles Police Chief William Parker suddenly appeared behind him, looking nervous. Earlier, when Khrushchev and his entourage had expressed interest in going to Disneyland, Parker had assured Lodge that he could provide adequate security. But during the drive from the airport to the studio, somebody threw a big, ripe tomato at Khrushchev's limo. It missed, splattering the chief's car instead. Now, Parker leaned over and whispered into Lodge's ear. "I want you, as a representative of the president, to know that I will not be responsible for Chairman Khrushchev's safety if we go to Disneyland." That got Lodge's attention. "Very well, Chief," he said. "If you will not be responsible for his safety, we do not go, and we will do something else." 
Someone in Khrushchev's party overheard the conversation and immediately got up to tell the Soviet leader that Lodge had canceled the Disneyland trip. The premier sent a note back to the ambassador: "I understand you have canceled the trip to Disneyland. I am most displeased." When the waiters had cleared away the dishes, Skouras stood up to speak. Short, stocky and bald, Skouras, 66, looked a lot like Khrushchev. With a gravelly voice and a thick accent, he also sounded a lot like Khrushchev. "He had this terrible Greek accent—like a Saturday Night Live put-on," recalled Chalmers Roberts, who covered Khrushchev's U.S. tour for the Washington Post. "Everybody was laughing." Khrushchev listened to Skouras for a while, then turned to his interpreter and whispered, "Why interpret for me? He needs it more." Skouras may have sounded funny, but he was a serious businessman with a classic American success story. Son of a Greek shepherd, he had immigrated to America at 17, settling in St. Louis, where he sold newspapers, bused tables and saved his money. With two brothers, he invested in a movie theater, then another, and another. By 1932, he was managing a chain of 500 theaters. A decade later, he was running 20th Century Fox. "In all modesty, I beg you to look at me," he said to Khrushchev from the dais. "I am an example of one of those immigrants who, with my two brothers, came to this country. Because of the American system of equal opportunities, I am now fortunate enough to be president of 20th Century Fox." Like so many other after-dinner orators on Khrushchev's trip, Skouras wanted to teach him about capitalism: "The capitalist system, or the price system, should not be criticized, but should be carefully analyzed—otherwise America would never have been in existence." Skouras said he'd recently toured the Soviet Union and found that "warm-hearted people were sorrowful for the millions of unemployed people in America." He turned to Khrushchev. 
"Please tell your good people there is no unemployment in America to worry about." Hearing that, Khrushchev could not resist heckling. "Let your State Department not give us these statistics about unemployment in your country," he said, raising his palms in a theatrical gesture of befuddlement. "I'm not to blame. They're your statistics. I'm only the reader, not the writer." That got a laugh from the audience. "Don't believe everything you read," Skouras shot back. That got a laugh, too. When Skouras sat down, Lodge stood up to introduce Khrushchev. While the ambassador droned on about America's alleged affection for Russian culture, Khrushchev heckled him, plugging a new Soviet movie. "Have you seen They Fought for Their Homeland?" the premier called out. "It is based on a novel by Mikhail Sholokhov." "No," Lodge said, a bit taken aback. "Well, buy it," said Khrushchev. "You should see it." Smiling, the dictator stepped to the dais and invited the stars to visit the Soviet Union: "Please come," he said. "We will give you our traditional Russian pies." He turned to Skouras—"my dear brother Greek"—and said he was impressed by his capitalist rags-to-riches story. But then he topped it with a communist rags-to-riches story. "I started working as soon as I learned how to walk," he said. "I herded cows for the capitalists. That was before I was 15. After that, I worked in a factory for a German. Then I worked in a French-owned mine." He paused and smiled. "Today, I am the premier of the great Soviet state." Now it was Skouras' turn to heckle. "How many premiers do you have?" "I will answer that," Khrushchev replied. He was premier of the whole country, he said, and then each of the 15 republics had its own premier. "Do you have that many?" "We have two million American presidents of American corporations," Skouras replied. Score one for Skouras! Of course, Khrushchev was not willing to concede anything. "Mr. Tikhonov, please rise," the premier ordered. 
At a table in the audience, Nikolai Tikhonov stood up. "Who is he?" Khrushchev asked. "He is a worker. He became a metallurgical engineer....He is in charge of huge chemical factories. A third of the ore mined in the Soviet Union comes from his region. Well, Comrade Greek, is that not enough for you?" "No," Skouras shot back. "That's a monopoly." "It is a people's monopoly," Khrushchev replied. "He does not possess anything but the pants he wears. It all belongs to the people!" Earlier, Skouras had reminded the audience that American aid helped fight a famine in the Soviet Union in 1922. Now, Khrushchev reminded Skouras that before the Americans sent aid, they sent an army to crush the Bolshevik revolution. "And not only the Americans," he added. "All the capitalist countries of Europe and of America marched upon our country to strangle the new revolution. Never have any of our soldiers been on American soil, but your soldiers were on Russian soil. These are the facts." Still, Khrushchev said, he bore no ill will. "Even under those circumstances," he said, "we are still grateful for the help you rendered." Khrushchev then recounted his experiences fighting in the Red Army during the Russian civil war. "I was in the Kuban region when we routed the White Guard and threw them into the Black Sea," he said. "I lived in the house of a very interesting bourgeois intellectual family." Here he was, Khrushchev went on, an uneducated miner with coal dust still on his hands, and he and other Bolshevik soldiers, many of them illiterate, were sharing the house with professors and musicians. "I remember the landlady asking me: ‘Tell me, what do you know about ballet? You're a simple miner, aren't you?' To tell the truth, I didn't know anything about ballet. Not only had I never seen a ballet, I had never seen a ballerina." The audience laughed. "I did not know what sort of dish it was or what you ate it with." That brought more laughter. "And I said, ‘Wait, it will all come. 
We will have everything—and ballet, too.'" Even the tireless Red-bashers of the Hearst press conceded that "it was almost a tender moment." But of course Khrushchev could not stop there. "Now I have a question for you," he said. "Which country has the best ballet? Yours? You do not even have a permanent opera and ballet theater. Your theaters thrive on what is given to them by rich people. In our country, it is the state that gives the money. And the best ballet is in the Soviet Union. It is our pride." He rambled on, then apologized for rambling. After 45 minutes of speaking, he seemed to be approaching an amiable closing. Then he remembered Disneyland. "Just now, I was told that I could not go to Disneyland," he announced. "I asked, ‘Why not? What is it? Do you have rocket-launching pads there?' " The audience laughed. "Just listen," he said. "Just listen to what I was told: ‘We—which means the American authorities—cannot guarantee your security there.' " He raised his hands in a vaudevillian shrug. That got another laugh. "What is it? Is there an epidemic of cholera there? Have gangsters taken hold of the place? Your policemen are so tough they can lift a bull by the horns. Surely they can restore order if there are any gangsters around. I say, ‘I would very much like to see Disneyland.' They say, ‘We cannot guarantee your security.' Then what must I do, commit suicide?" Khrushchev was starting to look more angry than amused. His fist punched the air above his red face. "That's the situation I find myself in," he said. "For me, such a situation is inconceivable. I cannot find words to explain this to my people." The audience was baffled. Were they really watching the 65-year-old dictator of the world's largest country throw a temper tantrum because he couldn't go to Disneyland? Sitting in the audience, Nina Khrushchev told David Niven that she really was disappointed that she couldn't see Disneyland. Hearing that, Sinatra, who was sitting next to Mrs. 
Khrushchev, leaned over and whispered in Niven's ear. "Screw the cops!" Sinatra said. "Tell the old broad that you and I will take 'em down there this afternoon." Before long, Khrushchev's tantrum—if that's what it was—faded away. He grumbled a bit about how he'd been stuffed into a sweltering limousine at the airport instead of a nice, cool convertible. Then he apologized, sort of: "You will say, perhaps, ‘What a difficult guest he is.' But I adhere to the Russian rule: ‘Eat the bread and salt but always speak your mind.' Please forgive me if I was somewhat hot-headed. But the temperature here contributes to this. Also"—he turned to Skouras—"my Greek friend warmed me up." Relieved at the change of mood, the audience applauded. Skouras shook Khrushchev's hand and slapped him on the back and the two old, fat, bald men grinned while the stars, who recognized a good show when they saw one, rewarded them with a standing ovation. The lunch over, Skouras led his new friend toward the soundstage where Can-Can was being filmed, stopping to greet various celebrities along the way. When Skouras spotted Marilyn Monroe in the crowd, he hastened to introduce her to the premier, who'd seen a huge close-up of her face—a clip from Some Like It Hot—in a film about American life at an American exhibition in Moscow. Now, Khrushchev shook her hand and looked her over. "You're a very lovely young lady," he said, smiling. Later, she would reveal what it was like to be eyeballed by the dictator: "He looked at me the way a man looks on a woman." At the time, she reacted to his stare by casually informing him that she was married. "My husband, Arthur Miller, sends you his greeting," she replied. "There should be more of this kind of thing. It would help both our countries understand each other." Skouras led Khrushchev and his family across the street to Sound Stage 8 and up a rickety wooden staircase to a box above the stage. 
Sinatra appeared onstage wearing a turn-of-the-century French suit—his costume. He played a French lawyer who falls in love with a dancer, played by Shirley MacLaine, who was arrested for performing a banned dance called the cancan. "This is a movie about a lot of pretty girls—and the fellows who like pretty girls," Sinatra announced. Hearing a translation, Khrushchev grinned and applauded. "Later in this picture, we go to a saloon," Sinatra continued. "A saloon is a place where you go to drink." Khrushchev laughed at that, too. He seemed to be having a good time. Shooting commenced; lines were delivered, and after a dance number that left no doubts why the cancan had once been banned, many spectators—American and Russian—wondered: Why did they choose this for Khrushchev? "It was the worst choice imaginable," Wiley T. Buchanan, the State Department's chief of protocol, later recalled. "When the male dancer dived under [MacLaine's] skirt and emerged holding what seemed to be her red panties, the Americans in the audience gave an audible gasp of dismay, while the Russians sat in stolid, disapproving silence." Later, Khrushchev would denounce the dance as pornographic exploitation, though at the time he seemed happy enough. "I was watching him," said Richard Townsend Davies of the State Department, "and he seemed to be enjoying it." Sergei Khrushchev, the premier's son, wasn't so sure. "Maybe father was interested, but then he started to think, What does this mean?" he recalled. "Because Skouras was very friendly, Father did not think it was some political provocation. But there was no explanation. It was just American life." Sergei shrugged, then added: "Maybe Khrushchev liked it, but I will say for sure: My mother did not like it." A few moments later, Khrushchev slid into a long black limousine with huge tailfins. Lodge slipped in after him. The limo inched forward, slowly picking up speed. 
Having put the kibosh on Disneyland, Khrushchev's guides were compelled to come up with a new plan. They took the premier on a tour of tract housing developments instead. Khrushchev never did get to Disneyland. Peter Carlson spent 22 years at the Washington Post as a feature writer and columnist. He lives in Rockville, Maryland. Adapted from K Blows Top, by Peter Carlson, published by PublicAffairs, a member of the Perseus Book Group. All rights reserved.
https://www.smithsonianmag.com/history/no-costume-grab-sheet-and-rock-toga-180953199/
No Costume? Grab A Sheet And Rock a Toga
No Costume? Grab A Sheet And Rock a Toga If you've made it to October 31 without a game plan for a costume, it's time for a reality check: any costume store within driving distance of civilization will be more frightening than The Ring. Amazon's drones won't be able to express ship you that banana suit in time, which means that your costume is going to be homemade. Luckily, that's not a big deal, because you can create a perfectly classic costume with something that almost everyone (even the worst planners) has on hand: a simple bed sheet. The toga might have, in recent years, gained a bad reputation as the chosen garb of drunken coeds, but in reality, it's an easy—and historically interesting—way to pull together a last-minute costume. Traditionally, togas were worn like a modern-day tuxedo, a ceremonial garment designed to denote status among Roman male citizens. Noncitizens, slaves and women weren't permitted to wear togas, though prostitutes could. Togas were worn from the beginning of the Roman empire through to its end and originated from an Etruscan garment known as the "tebenna." Unless they were participating in an athletic event, Romans would wear their toga over a tunic, so wear a casual shirt and shorts (or pants) under your toga to ensure you won't be arrested for public indecency. Smithsonian.com got the scoop on how to perfect the toga-wrap from Mariah Hale, a costume designer whose work can be seen starting November 3 in the Folger Theater's production of Julius Caesar in Washington, D.C. (though this production forgoes togas in favor of more "timeless" costumes). To assemble your toga, you'll need three things: a bed sheet (crucial), some safety pins (helpful, but not crucial) and a decorative pin of some kind (fun, but also not crucial). You can use any size sheet, though Hale recommends using something bigger than a twin, which can be too small. You can also use any color, though most traditional Roman togas were white. 
If you're feeling fancy, try purple (Roman senators often had purple stripes in their togas, denoting status). Black togas were worn occasionally for mourning purposes, so unless you're feeling particularly dour, avoid dark colored sheets. Begin by folding the sheet in half lengthwise. If you want the toga to drape longer across the body, fold the sheet not-quite halfway. Drape one end of the toga across the left shoulder, adjusting the sheet so that the bottom hits above the left ankle. Then, using the left arm and the body to hold the toga in place, begin wrapping the sheet around the back, stopping when the sheet reaches across the back to the right side of the body. Gather the remaining width of the sheet in your hands, creating ripples/folds/an accordion-like texture. Continue wrapping the sheet the rest of the way around the body (under the right arm, across the front of the body), draping the remaining portion of the sheet over the left shoulder. For extra security, use a safety pin to secure the sheet over the left shoulder. If you're using a decorative pin of some kind, you can pin it on the shoulder or chest. Ta-da! Go forth and impress the world with your costume ingenuity and knowledge of toga history. (Animated gifs by Casey McAdams of the Smithsonian Digital Studio) Natasha Geiling is an online reporter for Smithsonian magazine.
https://www.smithsonianmag.com/history/november-anniversaries-4-64999169/
November Anniversaries
November Anniversaries 70 Years Ago Incline and Fall Washington’s four-month-old Tacoma Narrows Bridge collapses on November 7, 1940. Known as “Galloping Gertie” for its vertical ripple in a breeze, the span begins twisting laterally in the day’s 42 mph winds. “I saw a side girder bulge out,” says a witness. “Suddenly the bridge dropped from under me.” A dog is the lone fatality. Study of the collapse—caught on film—improves bridge design. 120 Years Ago Service Revelry A crowd of 500 gathers at the U.S. Military Academy at West Point as cadets face off against midshipmen from the U.S. Naval Academy, November 29, 1890, in the first Army-Navy football game. The contest is spearheaded by cadet Dennis Michie, a former high-school player who persuades West Point to field a team and challenge Navy. Navy triumphs, 24-0; Army wins a rematch in 1891. Though the teams will be eclipsed by other college rosters, the annual battle—set in 2010 for December 11—remains a fall classic. 175 Years Ago Mark My Words Samuel Langhorne Clemens is born in Florida, Missouri, November 30, 1835. His father dies when he is 12, and Clemens finds work as a typesetter and apprentice riverboat pilot before beginning the writing career that makes him famous as Mark Twain, a pen name he adopts in 1863. As a journalist, and in more than 10 novels, including Tom Sawyer (1876) and Huckleberry Finn (1884), Twain infuses his work with an acerbic wit and his memories of life along the Mississippi. The uncensored version of his autobiography, embargoed for 100 years after his death in 1910, is published in 2010. 390 Years Ago Safe Harbor Two months after leaving England, the 102 pilgrims of the Mayflower, foiled by bad weather in their attempt to reach their intended Hudson River destination, anchor at today’s Cape Cod, November 11, 1620. “The whole country, full of woods and thickets, represented a wild and savage hue,” colony leader William Bradford will later recall. 
To prevent dissenters from striking out on their own, the pilgrim men sign a compact creating a government in their new location. After a winter spent aboard ship, the 57 survivors head ashore to found a new English colony at Plymouth. 490 Years Ago Sea to Shining Sea Portuguese explorer Ferdinand Magellan and his men become the first Europeans to sail from the Atlantic Ocean to the Pacific, November 28, 1520. Navigating a winding 350-mile-long strait between mainland South America and Tierra del Fuego takes Magellan 38 days; on emerging, he orders artillery fire and rejoicing. Magellan is killed before his crew finishes the first sail around the world, but his strait will remain a vital trade route until the Panama Canal opens in 1914.
https://www.smithsonianmag.com/history/november-anniversaries-6-106885816/
November Anniversaries
November Anniversaries 40 Years Ago Leader of the Pack Six months after its launch, Mariner 9 reaches Mars on November 14, 1971, narrowly beating two Soviet missions to be the first spacecraft to orbit another planet. For 349 days the unmanned orbiter relays data about the composition and temperature of Mars and its atmosphere, sending back more than 7,000 images, including detailed looks at volcanoes, polar caps and moons Phobos and Deimos. A new Mars rover designed to search for conditions favorable to microbial life is slated to launch this month or next. 40 Years Ago One Jump Ahead A man calling himself Dan Cooper hijacks Northwest Orient Flight 305 on November 24, 1971, demanding and receiving $200,000 and four parachutes. When he jumps out of the 727 over Washington State and disappears into a dark and stormy night, the legend of D. B. Cooper is born. A child discovers some of the money by the Columbia River in 1980, but though the FBI checks out some 1,000 people—including an Oregon man whose niece claims in 2011 that he did it—Cooper has yet to be positively identified. 150 Years Ago Glory, Glory, Hallelujah Inspired by a foray to review Union troops near Washington, D.C. in November 1861, author and abolitionist Julia Ward Howe writes the poem that becomes known as “The Battle Hymn of the Republic.” Awakened in the night by an “attack of versification,” Howe scrawls words to fit the tune of the popular “John Brown’s Body”—itself a reworking of a song attributed to William Steffe. After the poem is published by the Atlantic Monthly in 1862, earning Howe four dollars, it becomes an unofficial Union anthem and later a patriotic standard. Howe dies in 1910. 160 Years Ago Hast Seen the White Whale? Herman Melville, 32, publishes Moby-Dick in the United States on November 14, 1851. 
He grounds his tale of Captain Ahab, whose obsessed pursuit of the “accursed” white whale dooms the Pequod, in his own experiences aboard a whaling vessel and earlier accounts of disastrous cetacean encounters. Reviews range from “thrilling” to “ridiculous.” Melville’s reputation sinks before Moby-Dick is hailed in the 1920s as a great American novel. 200 Years Ago Tall in the Saddle In the early hours of November 7, 1811, members of a Native American confederation in Indiana Territory, united by Shawnee brothers Tecumseh and Tenskwatawa to defend against increasing white settlement, attack an army led by territory governor William Henry Harrison. Confronting the Indians near their village on the Tippecanoe River, Harrison’s troops cripple the confederation. Tecumseh, away during the fight, sides with the British in the War of 1812; he dies in a battle against Harrison in 1813. In 1840 Harrison rides his fame as “Old Tippecanoe” into the White House, only to die in 1841, just weeks into his presidency.
https://www.smithsonianmag.com/history/one-and-only-time-major-party-embraced-third-party-candidate-180959923/
The Only Time a Major Party Embraced a Third-Party Candidate for President
The Only Time a Major Party Embraced a Third-Party Candidate for President By the time Michelle Obama and Bernie Sanders were finished speaking in Philadelphia last night, this year’s Democratic National Convention had already lasted longer than the shortest Democratic National Convention in history. That lightning confab was held in Baltimore in July 1872. It lasted just six hours, split over two days. Once the general election was decided that fall, party elders might have wished they’d taken more time. That election was held at an acutely volatile time, just seven years after the Civil War. The rights and roles of African-American citizens were still fiercely contested, in the North as well as in the South. The extent to which the federal government could or would enforce Reconstruction was in question. And when it came to rebuilding the war-battered economy, free traders were at loggerheads with tariff-wielding protectionists. These wedge issues were splintering both the Republicans and the Democrats, but the GOP had a strong enough center to re-nominate the incumbent president: Ulysses S. Grant. One of those splinter groups organized as the Liberal Republican Party. It railed against corruption in the Grant administration and contended that U.S. troops should be pulled out of the South because African-Americans now had political and civil rights. Convening in Cincinnati in May 1872, the Liberal Republicans nominated New York Tribune editor Horace Greeley for president. Two months later, the Democrats—smelling opportunity in the Republicans’ disarray—adopted Greeley, too, even though he had regularly blistered them in his newspaper over a variety of issues. No major party had embraced a third-party candidate before. No major party has done so since. 
Greeley was already famous for his newspaper’s anti-slavery crusading, and he was becoming famous for some career advice he dispensed to a young correspondent in 1871: “I say to all who are in want of work, Go West!” In 2006, biographer Robert C. Williams wrote that “Greeley’s personality and fame as a fearless editor and reformer, more than his political philosophy, made him a serious candidate. He symbolized virtue over corruption, reform over reaction, reconciliation over revenge, generosity over greed.” And yet: Greeley had a well-earned reputation as an erratic advocate, and among his contemporaries, he came off as an incorrigible scold. During the Civil War, he and President Abraham Lincoln sparred over the pace, timing and extent of emancipation. The abolitionist William Lloyd Garrison wrote that Greeley was “a first-class political demagogue, unless it may be charitably suspected that he is smitten with imbecility.” One of Greeley’s supporters thought he was “a sort of inspired idiot, neither a scholar, statesman or gentleman.” Grant believed he was “a disappointed man at not being estimated by others at the same value he places upon himself.” Even so, Greeley entered the 1872 campaign as the nominee of two parties to Grant’s one. It didn’t matter. Grant remained popular. Thomas Nast sharpened his caricaturist’s pen on Greeley’s foibles. “I have been assailed so bitterly,” said the Democrat/Liberal Republican, “that I hardly knew whether I was running for the presidency or the penitentiary.” On Election Day, Grant took 56 percent of the popular vote, besting Greeley by 12 percentage points. And then, that November 29, Greeley died, at the age of 61—the only time a candidate died between the popular vote and the balloting in the Electoral College. The 66 electoral votes that had been pledged to him were divvied out to five other candidates. But Grant had amassed 286, and so went on to his second term. Tom Frail is a senior editor for Smithsonian magazine. 
He previously worked as a senior editor for the Washington Post and for Philadelphia Newspapers Inc.
https://www.smithsonianmag.com/history/one-few-surviving-heroes-d-day-shares-his-story-180972323/
One of the Few Surviving Heroes of D-Day Shares His Story
One of the Few Surviving Heroes of D-Day Shares His Story As world leaders and assorted dignitaries join the throngs of grateful citizens and remembrance tourists in Normandy this year to commemorate the 75th anniversary of D-Day, one group in particular will command a special reverence: veterans of the actual battle. Their numbers are rapidly dwindling. The U.S. Department of Veterans Affairs estimates that fewer than 3 percent of the 16 million Americans who served in World War II are still living. For those who saw the fiercest combat, the numbers are even more sobering. One telling measure: As of mid-May, just three of the war’s 472 Medal of Honor winners were still alive. The youngest D-Day vets are now in their mid-90s, and it is generally understood, if not necessarily said aloud, that this year’s major anniversary salutes may be the final ones for those few surviving warriors. One of the returning American vets is 98-year-old Arnold Raymond “Ray” Lambert, who served as a medic in the 16th Infantry Regiment of the army’s storied First Division, the “Big Red One.” Lambert, then 23, was but one soldier in the largest combined amphibious and airborne invasion in history, a mighty armada of some 160,000 men, 5,000 vessels and 11,000 aircraft—the vanguard of the Allied liberation of Western Europe from what Churchill had called “a monstrous tyranny never surpassed in the dark, lamentable catalogue of human crime.” When D-Day finally arrived, after years of planning and mobilization, the Big Red One was at the point of the spear. In the early dawn of June 6, 1944, Lambert’s medical unit landed with the first assault wave at Omaha Beach, where Wehrmacht troops were especially well-armed, well-fortified and well-prepared. Drenched, weary and seasick from the nighttime Channel crossing in rough seas, the GIs faced daunting odds. 
Pre-dawn aerial bombardments had landed uselessly far from their targets; naval gunfire support had ended; amphibious tanks were sinking before they reached land. Many of the landing craft were swamped by high waves, drowning most of their men. Soldiers charged forward in chest-deep waters, weighed down by as much as 90 pounds of ammunition and equipment. As they came ashore, they faced withering machine gun, artillery and mortar fire. In the opening minutes of battle, by one estimate, 90 percent of the frontline GIs in some companies were killed or wounded. Within hours, casualties mounted into the thousands. Lambert was wounded twice that morning but was able to save well more than a dozen lives thanks to his bravery, skill and presence of mind. Impelled by instinct, training and a profound sense of responsibility for his men, he rescued many from drowning, bandaged many others, shielded wounded men behind the nearest steel barrier or lifeless body, and administered morphine shots—including one for himself to mask the pain of his own wounds. Lambert’s heroics only ended when a landing craft ramp weighing hundreds of pounds crashed down on him as he attempted to help a wounded soldier emerge from the surf. Unconscious, his back broken, Lambert was tended to by medics and soon found himself on a vessel heading back to England. But his ordeal was far from over. “When I came out of the army I weighed 130 pounds,” Lambert says. “I'd been in hospital for almost a year after D-Day, in England, then back in the States, before I was able to walk and really get around too good.” The now-annual D-Day commemorations initially dispensed with pomp and circumstance. On June 6, 1945, just a month after V-E Day, Supreme Allied Commander Dwight D. Eisenhower simply granted troops a holiday, declaring that “formal ceremonies would be avoided.” In 1964, Ike revisited Omaha Beach with Walter Cronkite in a memorable CBS News special. 
Twenty years later, President Ronald Reagan delivered a soaring address at Pointe du Hoc, overlooking the beach. He praised the heroism of the victorious allied forces, spoke of reconciliation with Germany and the Axis powers, which had also suffered greatly, and reminded the world: “The United States did its part, creating the Marshall Plan to help rebuild our allies and our former enemies. The Marshall Plan led to the Atlantic alliance—a great alliance that serves to this day as our shield for freedom, for prosperity, and for peace.” Ray Lambert has visited Normandy many times and is returning for the 75th anniversary to take part in solemn ceremonies, visit the war museums, and pay his respects to the 9,380 men buried in the American military cemetery in Colleville-sur-Mer, on the high bluff overlooking the hallowed beach. Lambert knew many of those men from D-Day and earlier amphibious assaults and pitched battles in North Africa and Sicily, where he earned a Silver Star, Bronze Star and two Purple Hearts. After D-Day, he was awarded another Bronze Star and Purple Heart. There is evidence that he earned two more Silver Stars—one each in Normandy and Sicily—but official paperwork has been lost or destroyed, and Lambert is not the sort of man to claim honors that might not be absolutely clear. The tranquil seaside scene of today’s Normandy coastline is very different from the one etched in Lambert’s soul. “Where tourists and vacationers see pleasant waves, I see the faces of drowning men,” Lambert writes in Every Man a Hero: A Memoir of D-Day, the First Wave at Omaha Beach and a World at War, co-authored with writer Jim DeFelice and published on May 28. “Amid the sounds of children playing, I hear the cries of men pierced by Nazi bullets.” He especially remembers the sound of combat, a furious cacophony unlike anything in civilian life. “The noise of war does more than deafen you,” he writes. 
“It’s worse than shock, more physical than something thumping against your chest. It pounds your bones, rumbling through your organs, counter-beating your heart. Your skull vibrates. You feel the noise as if it’s inside you, a demonic parasite pushing at every inch of skin to get out.” Lambert brought home those memories, which still rear up some nights. Yet he somehow survived the slaughter and came home to raise a family, thrive as a businessman and inventor, and contribute to the life of his community. Ray lives with his wife Barbara in a quiet lakeside home near Southern Pines, North Carolina, where they recently celebrated their 36th anniversary. His first wife, Estelle, died of cancer in 1981; they were married for 40 years. He enjoys meeting friends for 6 a.m. coffee at the village McDonald’s and says he keeps in touch with 1st Infantry Division folks at Fort Riley, Kansas. In 1995, he was named a Distinguished Member of the 16th Infantry Regiment Association. In that role, he tells his story to schoolchildren, Lions Clubs, and other organizations. Is Lambert the last man standing? Maybe not, but he’s certainly close. “I have been trying for months and months to track down guys who had been in the first wave,” says DeFelice, whose books include the bestselling American Sniper, a biography of General Omar Bradley, and a history of the Pony Express. He has spoken with Charles Shay, 94, a medic who served under Ray that morning, who will also participate in this week's Normandy ceremonies, and has learned of just one other veteran of the initial landing at Omaha Beach, a man in Florida who is not in good health. "Ray is definitely one of the last survivors of the first wave," DeFelice says. Longevity is in Lambert’s genes. “My father lived to be 101, my mother lived to be 98,” he says. “I have two children, four grandchildren and I think I’ve got nine great-grandchildren now,” he says. 
“For breakfast I like some good hot biscuits with honey and butter, or I like some fried country ham and a biscuit. The kids say, ‘Oh, Poppy, that’s not good for you.’ And I tell ’em, well I've been eating that all my life, and I’m 98 years old!” Lambert says he learned to look out for himself growing up in rural Alabama during the Great Depression, an experience he believes toughened him for later challenges. “We were always looking for work to help out the family, because there was no money to speak of,” he says. As a schoolboy, he cut logs for a dollar a day with a two-man, crosscut saw, right beside the grown men. He helped out on his uncle’s farm, tending to horses and cows, fetching firewood for the stove, learning to patch up balky farm machinery. “In those days,” he says, “we didn’t have running water or electricity. We had outhouses and we used oil lamps. I had to take my turn at milking the cows, churning the milk for butter and drawing well water with a rope and bucket. Sometimes we would have to carry that water for 100 to 150 yards back to the house. That was our drinking water and water for washing up.” At 16, he found work with the county veterinarian, inoculating dogs for rabies as required by law. He wore a badge and carried a gun. “I’d drive out to a farm—I didn’t have a license, but no one seemed too concerned in those days—and some of these farmers didn’t like the idea of you coming out and bothering them,” he says. “Many times I'd drive up and ask if they had any dogs. They would say no. Then all of a sudden the dog would come running out from under the house barking.” In 1941, months before Pearl Harbor, Lambert decided to enlist in the army. He told the recruiter he wanted to join a fighting unit and was placed in the 1st Division and assigned to the infantry’s medical corps, a nod to his veterinary skills. “Which I thought that was kind of funny,” he says. 
“If I could take care of dogs, I could take care of dogfaces—that’s what they called ’em.” DeFelice says it took months to persuade Lambert to do the book. Like many combat veterans, he is reluctant to call attention to himself or seek glory when so many others paid a heavier price. Some things are hard to relive, hard to return from. “We’re taught in our life, ‘Thou shalt not kill,’” Lambert says. “When you go into the military that all changes.” For him, the shift occurred during the North Africa campaign, when at first the Americans were being pushed around by hardened German troops led by Field Marshal Erwin Rommel. The U.S. commander, General Terry Allen, told his troops they had to learn how to kill. “And it wasn't but a few days until you saw your buddies getting killed and mangled and blown away before you realize you either kill or be killed,” Lambert says. “And then when you get back home, you’re faced with another change, a change back to the way you were, to be kind and all this kind of stuff. Many men can’t handle that very well.” Ultimately, he agreed to collaborate with DeFelice and write Every Man a Hero because of the army buddies he left behind, comrades who live on in memory and spirit. “I got to thinking very seriously about the fact that many of my men were killed,” he says. “Sometimes I was standing right by one of my guys, and a bullet would get him, and he’d drop dead against me. So I’m thinking about all my buddies that couldn’t tell their stories, that would never know if they had children, would never know those children or grow up to have a home and a loving family.” The responsibility he felt for those men on Omaha Beach 75 years ago has never left Ray Lambert, and it never will. Editor's note, June 4, 2019: This story has been updated with a clarifying quote from Jim DeFelice about his knowledge of other surviving first-wave veterans of D-Day. 
Jamie Katz is a longtime Smithsonian contributor and has held senior editorial positions at People, Vibe, Latina and the award-winning alumni magazine Columbia College Today, which he edited for many years. He was a contributing writer to LIFE: World War II: History’s Greatest Conflict in Pictures, edited by Richard B. Stolley (Bulfinch Press, 2001).
https://www.smithsonianmag.com/history/one-hundred-years-ago-four-day-race-riot-engulfed-washington-dc-180972666/
One Hundred Years Ago, a Four-Day Race Riot Engulfed Washington, D.C.
One Hundred Years Ago, a Four-Day Race Riot Engulfed Washington, D.C. By all accounts, the 1919 Fourth of July celebration in Washington, D.C., was one for the ages. Coming right on the heels of the end of the Great War, and with President Woodrow Wilson’s League of Nations peace plan still very much alive, Independence Day was a symbolic coming out party for the United States of America on the global stage. The local hacks sure played it up that way. Under the headline “Gorgeous Display As Jubilee Finale,” the Washington Evening Star described the Independence Day festivities as if the newspaper was owned by a sparklers and cherry bombs conglomerate: A ‘blaze of glory’ that easily surpassed any pyrotechnic display ever seen in Washington marked the close of the city’s most elaborate Fourth of July celebration last night, both the quantity and magnificence of the fireworks overshadowing anything of the kind seen in former years. It was one of a number of stories in the newspaper extolling American virtues, including an article detailing a stirring speech given by President Wilson on the deck of a presidential steamer, the George Washington, in between tug-of-war bouts between Army and Navy teams. President Wilson’s remarks declared it “the most tremendous Fourth of July ever imagined, for we have opened its franchise to the whole world.” Two weeks later, a brutal race riot would sweep across the city. ********** The riot broke out as so many others have broken out: following a white woman’s claim that black men had wronged her. As the Washington Post recently outlined, attacks in the weeks prior led to sensational headlines, massive showings of police force, scores of unfounded arrests, and an escalation of tensions throughout the city. In the July 18 incident that put the match to the tinder, 19-year-old Elsie Stephnick was walking to her home on 9th St. 
SW from her job at the Bureau of Engraving just a few blocks away when two African-American men allegedly collided with her and tried to steal her umbrella. The Evening Star reported her description of the “colored assailants” as “a short dark man” and a “taller man with a ‘bumpy’ face.” Stephnick claimed she staved them off until a carload of white men came to her aid. (Other than her word, no evidence or report suggests anything more than an attempted pilfering, if it even occurred in the first place.) Stephnick was married to a Naval Aviation Corps employee, and the story made the rounds among white soldiers and sailors in Washington on weekend holiday. The D.C. police quickly arrested Charles Ralls, a black man, for the alleged attack, but the tale quickly grew taller with each telling, a game of racist telephone that turned what was at worst a minor skirmish into a tale of marauding gangs of African-American rapists who’d been terrorizing the city for months. Four daily newspapers, in a heated fight for readers, fueled the fire with headlines like the Washington Post’s “Negroes Attack Girl. White Men Vainly Pursue” and the Washington Times’ “Negro Thugs.” The stories would get picked up on the newswires and made their way into papers across the nation. Police questioned Ralls, whereupon Stephnick’s husband, John, became convinced he was one of the men who had attacked his wife. A group of servicemen met up on Saturday night to get revenge, and as historian David F. Krugler describes the scene in 1919: The Year of Racial Violence, it didn’t take much time for an angry assemblage to form: “The result was a mob in uniform.” More than 100 servicemen, after hours of heavy drinking, gathered outside the illicit taverns, brothels and pool halls of the seedy neighborhood known as “Murder Bay,” today home to the federal buildings hugging Pennsylvania Ave NW. (Though not instituted nationwide yet, the District had already fallen under the lightly enforced spell of Prohibition.) 
“Brandishing pipes, clubs, sticks, and pistols,” the mob of veterans marched south across the Mall to a poor, black neighborhood then known as Bloodfield. George Montgomery, a 55-year-old man out buying produce, was the first to take a beating. The men soon spotted Ralls and his wife and began assaulting them until they broke free and ran home. For four days, Washington, D.C. became a battlefield with no real defense against the rampaging around the White House, the War Department, and the Capitol, and in predominantly black neighborhoods like LeDroit Park around Howard University, the U Street district, the Seventh St. commercial corridor, and even on random streets where unfortunate souls found themselves. That night, a black man named Lawrence Johnson was thrashed about the head by Marines wielding handkerchiefs filled with rocks, until they tired of that and used a pipe to bash him bloody on the sidewalk, just outside the White House. “There have been race riots throughout the breadth of American history, in every decade since the founding of the country, and the worst of it was in 1919,” says Cameron McWhirter, a Wall Street Journal reporter and author of Red Summer: The Summer of 1919 and the Awakening of Black America. “Every single one was instigated by white mobs and Washington was the pinnacle if for no other reason than the symbolism. When the sailors and soldiers gathered to raise hell over race, it was at the Peace Monument in front of the Capitol, which was erected to say we’re one nation following the Civil War.” ********** The term “Red Summer,” coined by the NAACP’s first black executive field secretary James Weldon Johnson (who also wrote “Lift Ev’ry Voice and Sing,” now commonly known as “The Black National Anthem”), referred to the blood being spilled in race riots across the country. From April to November, hundreds of Americans, mostly black, would die, and thousands more were injured. 
Lynchings and indiscriminate killings sparked 25 conflicts in small towns like Millen, Georgia, and in major cities such as Charleston, Chicago and Cleveland. Elaine, Arkansas, saw the most horrifying violence of all when 237 black sharecroppers were murdered over two days for trying to form a union. It was a year that would see 78 lynchings and 11 black men burned alive at the stake. Cultural, economic and military factors combined in 1919 to create conditions ripe for strife. D.W. Griffith’s 1915 Birth of a Nation—screened at the White House and enthusiastically received by President Wilson—glorified the Ku Klux Klan’s white-hooded terrorists as heroes, portraying the organization as saviors of southern white women during Reconstruction. The movie was a blockbuster and helped bring about a rebirth of the Klan, which grew from a few thousand members pre-release to an estimated 4-8 million by the mid-1920s. On July 6, 1919, local newspapers reported the Klan rode into Montgomery County—just outside Washington, D.C.—for the first time in 50 years. Meanwhile, the Great Migration saw tens of thousands of blacks move from the cotton fields of the South to the factories of the North. Soldiers returning from World War I sought jobs, too. Organized labor grew, as did labor unrest, and the Communist Party of the United States arose as an offshoot of the Socialist Party. As McWhirter writes, “The Red Summer arrived in tandem with the Red Scare.” A fear of radicalism spread, especially toward blacks who no longer acquiesced to the pre-World War I social order. The Red Summer was a moment when black citizens showed they had had enough, and fought back. Roughly 375,000 African-Americans served in World War I, and upon returning home, felt newly emboldened to fight for their rights. The righteous indignation was captured in a July poem, first published in The Liberator by seminal Harlem Renaissance writer Claude McKay. 
“If We Must Die” was the Red Summer anthem, a rousing 14-line verse ending with a literal call to arms:

What though before us lies the open grave?
Like men we’ll face the murderous, cowardly pack,
Pressed to the wall, dying, but fighting back!

The emerging resistance also saw itself reflected in the NAACP’s adoption of a more activist platform, flexing its strength in support of H.R. 11279, the anti-lynching bill first introduced in Congress by Congressman Leonidas Dyer of Missouri in 1918. The growth of the NAACP in 1919 was astounding, more than doubling its membership from 44,000 to 91,000. ********** In 1919, some 110,000 African-Americans (roughly a quarter of the city's population) called Washington, D.C. home, more than any other American city. McWhirter describes it as “black America’s leading cultural and financial center,” with more well-off African-Americans than anywhere else and numerous steady, decent-paying middle-class jobs working for politicians, bureaucrats, and the federal government, especially during the war effort. Black prosperity, though, was an affront to many white veterans who felt they had come back to a different country than the one they’d left, even though a number of black soldiers in the 372nd Infantry, composed of National Guard units from Connecticut, Maryland, Massachusetts, Ohio, Tennessee and the District of Columbia, had been awarded the Croix de Guerre, a French military decoration for heroism. “There were two major problems for soldiers returning after World War I,” says John M. Cooper Jr., professor emeritus in the history department at the University of Wisconsin-Madison and the author of Woodrow Wilson: A Biography. “You have all these Doughboys coming back flooding the labor market, so there’s unemployment. 
You also have the lifting of the wartime price controls, so there’s rampant inflation, which was called ‘High Cost of Living.’ In early August, Wilson gave his last speech before his stroke about the HCL and basically said everyone should be restrained in their spending because, sorry, the government can do very little about it.” The same could have been said, at least initially, for the spread of violence in D.C. that summer as the white mob’s collective anger came down on whatever unfortunate black person crossed their path. White servicemen yanked blacks off streetcars, pummeling them on the sidewalks until police showed up, when they would disperse and re-form, an amorphous mob that expanded on the night of Sunday, July 20, when a hundred more men stomped from the Navy Yard to terrorize local black residents. Gangs of rioters piled into “terror cars,” the street name for Model Ts used in indiscriminate drive-by shootings. Carter Goodwin Woodson, a noted black historian who was dean of Howard University at the time, later recalled the horrors he witnessed after hiding in the shadows for his safety: The mob “caught a Negro and deliberately helped him up as one would a beef for slaughter,” he wrote, “and when they had conveniently adjusted him for lynching they shot him.” Over the course of the weekend, newspapers continued to stoke the fires, reporting that 500 revolvers had been sold at pawn shops as battle lines were being drawn. A notorious Washington Post front-page story on Monday was headlined “Mobilization for Tonight” and urged every able-bodied serviceman to join a “‘clean-up’ that will cause the events of the last two evenings to pale into insignificance,” a barely coded call to inflict more pain upon the black community. The white throngs continued to unleash violence through mid-morning on Monday, when a group of black men drove a terror car of their own past the Navy Hospital and fired on patients milling about outside. 
To combat the “reign of hysteria and terror,” the city's black newspaper, the Washington Bee, urged blacks to arm themselves, and a brisk trade sprang up in firearms and ammunition purchased in Baltimore and smuggled into Washington. Rumors hit the streets that Howard University ROTC officers were handing out guns and ammo. Barricades were set up around Howard and the U Street area with rooftops patrolled by black men with rifles, including veterans of World War I. Meanwhile, some 400 white men heeded the Washington Post’s call at 9 p.m. and gathered at the Knights of Columbus recreation center on Pennsylvania Avenue at 7th St. NW. Victims of the violence filled the segregated hospitals and morgues, as dozens were injured and at least four were killed. According to the Washington Post, the first person killed was Randall Neale, a 22-year-old black man fatally shot in the chest by Marines said to be passing in a car. The Washington Bee reported Neale was just back from the war, describing his death as “one of the more cowardly murders that was ever perpetrated upon a young man who had been to France to fight for world democracy.” Sgt. Randall Neale would be buried in Arlington National Cemetery. Neval Thomas, a history teacher at Washington's Dunbar High School and an activist who would be appointed to the NAACP board of directors in 1919, wrote that no longer would white people wreak havoc with impunity, that blacks would “die for their race, and defy the white mob.” One incident in particular stands out amidst the news reports. Near Union Station, a 17-year-old black girl named Carrie Johnson was hiding under her bed on the second floor as 1,000 rioters stormed the area. Responding to reports of someone firing from the building’s roof, police broke down her bedroom door. Johnson shot and killed 29-year-old Metropolitan Police Detective Harry Wilson and claimed self-defense. She became a folk hero in the black press. 
A poem published in the Afro-American in August 1919 baldly stated:

“You read about Carrie Johnson, who was only seventeen,
She killed a detective, wasn’t she brave and keen.”

Johnson was charged with first-degree murder. In 1921, she was convicted of manslaughter, but another judge overturned the verdict after accepting that she feared for her life and acted in self-defense. Within two years, Johnson was a free woman. The worst hours of the racial war petered out early Tuesday morning as the rioters exhausted themselves. ********** The claims of a violent attack on Elsie Stephnick were sketchy at best, but given the hostility felt by many white residents of the city and the fact that the “white woman ravaged by black men” story spread so quickly, there is probably little chance the early rioting could have been prevented. However, nobody attempted to prevent escalation. Long before Congress granted D.C. home rule in 1973, the city was run by three presidentially appointed district commissioners. Former Tennessee newspaperman Louis “Brownie” Brownlow, given the job in 1915 based on his friendship with Woodrow Wilson, dithered while Washington exploded, sticking to his misguided plan to have the city’s 700-person police force, home auxiliary guards, and loaned troops keep things calm. It was a suspect decision given that D.C. falls under federal jurisdiction and Brownlow could have easily called up disciplined World War I troops from any of the nearby military installations. Later, Brownlow laid the blame at the feet of outside communist agitators. He was still fuming about it when his autobiography, A Passion for Politics, was published in 1955. Only on Tuesday, July 22, did President Wilson give authorization to mobilize 2,000 soldiers. Crowds were dispersed from street corners, theaters and bars were closed, auto traffic was restricted, and tanks equipped with machine guns were brought in from Fort Meade, 25 miles away in Maryland. 
Limited violence arose that night, but what really brought calm to the capital was a relentless hot summer night rainstorm. Still, the damage was done, and not just to the nation’s capital. The black press in America called out Wilson’s unwillingness to intercede and bring peace, while newspapers in Germany and Japan criticized him for promoting the League of Nations while black citizens were enduring a summer of hell across the country—and in his own backyard. The Atlanta Independent declared, “Our president seems to be in utter ignorance of the conditions obtaining at his door.” A full accounting of the Washington D.C. riot wasn’t on anyone’s mind, at least not anyone in power. No official death toll was ever given; at the time the “official” number was seven, while it’s now believed around 40 were slain. Similar accountings of injury and property damage were also never made by the government. By the time the rain let up and the last soldier left Washington D.C. on Sunday, July 27, the violence and tragedy of Red Summer had moved west. On the very same day, Chicago erupted in its own, even bloodier, 1919 race war that began when an African-American teenager was hit in the head by a rock thrown by a white man and drowned in Lake Michigan for the crime of swimming where he wasn’t supposed to be. The violence in D.C., though, marked a flashpoint in American racial dynamics. The 20th-century fight against the white power structure was at hand even if the riot itself was swept under the rug. Following the Washington race war, a “Southern black woman,” as she identified herself, wrote a letter to the NAACP magazine, The Crisis, praising blacks for fighting back: “The Washington riot gave me a thrill that comes once in a lifetime ... at last our men had stood up like men ... I stood up alone in my room ... and exclaimed aloud, 'Oh I thank God, thank God.' The pent-up horror, grief and humiliation of a lifetime -- half a century -- was being stripped from me.” 
Originally from Montana, Patrick Sauer is a freelance writer based in Brooklyn. His work appears in Vice Sports, Biographile, Smithsonian, and The Classical, among others. He is the author of The Complete Idiot’s Guide to the American Presidents and once wrote a one-act play about Zachary Taylor.
f88df414f4ab0332a78f728f00db4641
https://www.smithsonianmag.com/history/one-last-time-read-our-timeless-deep-dive-what-beloved-tv-show-got-right-and-wrong-180971094/
One Last Time, Read Our ‘Timeless’ Deep Dive Into What the Beloved TV Show Got Right and Wrong
One Last Time, Read Our ‘Timeless’ Deep Dive Into What the Beloved TV Show Got Right and Wrong Fans of the NBC show “Timeless” just couldn’t let the series end. They turned out the vote, choosing the time-travel procedural as the number-one show that should get renewed in USA Today’s Save Our Shows poll. They raised $20,000 to rent a helicopter to fly a #SaveTimeless banner over San Diego Comic Con. The lesson: Don’t mess with Team Clockblocker, basically. NBC in the end came to a compromise, of sorts, un-canceling the show a second time to allow the writers and producers one final wrap-up show, a two-hour finale to tie up the many loose ends left at the end of the second season this spring. Are you just joining us? You can catch up here, but here’s the 60-second summary: A shadowy secret organization known as Rittenhouse is trying to use a time machine to Make America Great Again by altering history to entrench white male power. They’re basically the Illuminati, but with time travel. Attempting to stop them is a ragtag team of Lucy Preston, a historian, Wyatt Logan, a soldier, and Rufus Carlin, an engineer, who together travel through history to fix or prevent the potential damage done by Rittenhouse. At the end of Season 2, though, things look real bad for the #timeteam. Rufus has died in San Francisco in 1888, the rest of the team is bruised and battered, and while Rittenhouse is down a few members, mostly thanks to infighting, the evil organization seems more evil than ever. Yet not all hope is lost. At the end of Season 2, older, more steampunk, badass versions of Lucy and Wyatt appear in a souped-up time machine. Older Lucy, sporting a distinct Lara Croft vibe, gives Present Lucy a gift—her own journal. “Figure it out together,” Older Lucy says before she and Older Wyatt disappear into the time machine. 
Tonight’s finale picks up there, but before the team can figure out the message in the journal, they get an alert that Rittenhouse has jumped to California in January 1848, at the dawn of the Gold Rush. Ever eager to stop their adversaries, Lucy, Wyatt, new pilot Jiya (also Rufus’s girlfriend) and baddie-turned-antihero Garcia Flynn chase after them. Once in Coloma, California, near the famous Sutter’s Mill where gold would be found, the heroes find themselves again in cowboy get-ups and wanted by the law. By happenstance (per usual), they team up with Joaquin Murrieta, a fellow fugitive and Mexican outlaw with plans to avenge the murder of his brother and assault of his wife at the hands of Americans. As in the show, Murrieta is considered the inspiration for Johnston McCulley’s pulp hero Zorro. The writers had a lot to cram into this two-hour episode, so the next few bits are a blur, but in essence, Wyatt decides that the only way to rescue Rufus is to eliminate Jessica from the timeline. I’m still confused about why this is the conclusion they came to—as my editor has pointed out, why not just time-travel to a time before Connor Mason invented a time machine and off him?—but inspired by this conversation, Flynn sneaks out at night, takes the time machine to the night Jessica was killed, and, in the best time-paradox moment of the episode, kills Jessica and the Rittenhouse agent protecting her. Turns out Jessica's mysterious killer was Flynn all along. (Time is not a straight line, but more of a Jeremy Bearimy.) Deciding he’d rather die a hero than live as a tormented ex-terrorist, Flynn sends the time machine back to 1848, while stranding himself in 2012, doomed to suffer and eventually die from the side effects of existing in two places at the same time. Still at breakneck speed, Rufus appears in 1848, rescuing Wyatt, Lucy and Jiya from bounty hunters, as if nothing had happened. 
(To him, nothing has happened—he doesn’t remember going to rescue Jiya in 1888 because in his timeline, Jessica never betrayed Wyatt, captured Jiya or brought her to 1888. No abduction, no rescue mission, no dead Rufus. Surely this isn’t the plan Future Wyatt and Future Lucy had envisioned.) Back in 2018, Emma, realizing that Jessica has been erased from the timeline, utters what is either the best or worst line of the episode. “Get the mothership ready,” she orders an underling. “What for?” “Hell.” Turns out “Hell” is North Korea a year into the Korean War—so, pretty accurate. Emma, now obsessed with eliminating Lucy, has set a trap: Lure the Time Team to North Korea in 1950. Bribe a U.S. Marine to kidnap them and drop them in enemy territory. If that plan doesn’t work, the Chinese soldiers, the bombing, or the sub-zero temperatures will. Our team quickly realizes they’re in a trap and dispatches the Marine off-camera. But now, they are miles away from their time machine, and it’s real cold. While Wyatt and Rufus hotwire an Army ambulance, Jiya and Lucy warm up in a church, where they meet a very pregnant woman named Eung-Hee. She says her dissident journalist husband and their young son have evacuated, and she’s planning to wait for them to return in a few days. But as troops pour into the church, Lucy convinces her to escape with them. The Hungnam Evacuation is a lesser known chapter of the Korean War. As Lucy and Wyatt explain, after the Battle of the Chosin Reservoir, facing heavy losses, the United Nations decided to evacuate its troops. Thousands of Korean refugees poured into the port of Hungnam hoping to escape. One ship, the SS Meredith Victory, designed to carry 60 people, ended up boarding 14,000 refugees. (That’s not a typo.) Miraculously, nobody died—and five babies were born on board. Lucy insists that they can get Eung-Hee to safety and then make it back to the Lifeboat to save themselves. 
While they do manage to get Eung-Hee—and the baby she delivered on the way there—to the port and reunited with her family, the team only makes it back as far as the church. They’re essentially waiting to die, when who appears but Agent Christopher in the Mothership! Back in the bunker in 2018, Agent Christopher and Mason had discovered photos of their colleagues killed by the Chinese Army on Christmas Day, 1950 in the Massacre of Usang-Ri. (This isn’t a thing.) In another tying up of loose ends, they bribed Lucy’s father into leading them to Emma’s safehouse, where they shackled her and forced her to take Christopher to 1950 for a rescue mission. After a brief confrontation, Emma is conveniently shot by Communists and the team escapes back to the present, where Mason destroys the Mothership, Christopher gives hand-knitted scarves to the whole team, and—most importantly for many—Lucy and Wyatt finally agree to give their relationship a chance. The episode—and for now, the series—ends with an epilogue. In 2023, Lucy and Wyatt have married and have twins named, naturally, Flynn and Amy. Lucy’s back to teaching history, and just made tenure, which is...surprisingly fast? Rufus and Jiya founded a startup called Riya Industries that spends some (but not enough, as the episode makes weirdly, snarkily, clear) of its profits funding youth science fairs. And the team has one last mission: to go back to 2014 and give Flynn the journal that started the entire (mis?)adventure. With that out of the way, they could theoretically smash the last time machine, but as Mason points out, once the technology has been invented once, there’s nothing to stop someone else from building one, so they may as well keep their spare, just in case. (This will surely be treated by some Clockblockers as a sign that a full Season Three isn’t fully out of the question.) 
The final final scene shows a young girl, the same one who showed off her Leyden jar to Rufus at the science fair, drawing up plans for a new time machine. Cue dramatic music ... and the history notes! There’s no magic time machine upgrade that allowed Lucy and Wyatt to travel to their own timeline. It turns out that it’s just, disappointingly, a case of bad side effects; Connor Mason says they start with headaches and end in insanity or death. Mostly the side effects manifest as characters getting migraines just as they’re about to spill an important plot point. As far as Murrieta goes, the writers are eliding history here for the sake of a larger truth. Historical records about Murrieta are scarce and many accounts of his life draw on an 1854 pulp novel as truth. Some say he wasn’t even a real person and was actually just an amalgam of many Mexican-American outlaws. But if he was real, he didn’t arrive in California until 1849, the height of the gold rush. When Murrieta says that he was kicked off his gold claim by “filthy Americanos,” he’s telling the story of the tens of thousands of Mexicans who became second-class citizens in 1848. When gold was discovered at Sutter’s Mill, California was still, technically, part of Mexico, and Mexico and the United States were at war. The Treaty of Guadalupe Hidalgo, and Mexico’s forced surrender of massive amounts of land including what would become California, would be signed eight days later. The Treaty gave Mexicans living in the newly ceded territories the opportunity to become American citizens, and on paper protected existing property rights, but as Hsuan L. Hsu writes in The Paris Review, the government failed to intervene when whites just took what they wanted. Later, Gen. 
Persifor Smith, the military governor of California, encouraged a rumor that it was illegal for noncitizens to mine gold (it wasn’t) and California in 1850 instituted a “foreign miner tax” that was “chiefly (and often violently) enforced against Mexican, South American, and eventually Chinese miners.” Even if Murrieta hadn’t yet experienced violence at the hands of white Americans, many other new Mexican-Americans had. Murrieta, after a few years of stealing horses and robbing miners, was chased down by the newly formed California State Rangers and purportedly beheaded in 1853. When Jiya says that she knows this to be true because she saw Murrieta’s pickled head in 1888, that’s distinctly possible—after collecting a $5,000 bounty for killing Murrieta, the rangers toured the state exhibiting a decapitated head preserved in alcohol, charging people $1 to see it. There are rumors that the head didn’t actually belong to Murrieta and that the bandit lived to a ripe old age, but we may never know the truth. Incidentally, what is thought to be the first piece of gold found at Sutter’s Mill is in the collection at the Smithsonian National Museum of American History. It seems eminently plausible that McCulley was inspired by Murrieta when creating Zorro. As Hsu points out, though, McCulley changed the setting for his masked vigilante to Mexican, not American, California, making Zorro’s antagonists Mexican rulers instead of white. Rufus: “You think you’ll get back together, or what, because I’m still totally shipping #TeamLyatt.” Lucy: “Huh?” The Hungnam Evacuation as described in the show sounds impossible, but it’s true. First, a little context: U.S. and U.N. troops had been winning the Korean War until Chinese forces surprised them at the Chosin Reservoir. This was a brutal battle over 17 days in severely cold weather—recorded at -40 degrees F at some points. Troops froze into their boots; many lost toes later. Medical supplies froze and weapons malfunctioned. 
The “Frozen Chosin” is considered one of the defining moments of the Marine Corps, even if it ended in a retreat. Facing heavy losses, troops retreated to Hungnam to evacuate to Busan, South Korea. A hundred Navy and merchant marine ships made nearly 200 trips to evacuate not just the troops but most of their equipment as well. Thousands of civilians got wind of what was happening and went to Hungnam too, hoping to escape North Korea. A military history says that the North Korean military was encouraging rumors that the Americans would evacuate any civilian who wished to leave, to create a mass movement of people that would hide spies and saboteurs. But while Gen. Edward M. Almond had planned to evacuate officials and the families of those who had assisted the Americans, he hadn’t planned to take anyone else. According to English-language newspaper Korea JoongAng Daily, an on-site interpreter by the name of Hyun Bong Hak “desperately pleaded for the transfer of as many civilians as possible, arguing that they would be massacred if they remained in the North.” Top brass ultimately made the decision to remove cargo to make room for refugees. (Dr. Hyun also makes a brief cameo in the episode as the man who offered to help deliver Eung-Hee’s baby.) The SS Meredith Victory was the most striking example. On a ship designed to carry 12 passengers and 47 crew, Captain Leonard LaRue fit 14,000 North Koreans. In total, 100,000 civilians—about half of those who came seeking help—escaped. Among the civilians evacuated were the parents of current South Korean president Moon Jae-In. Eung-Hee, it turns out, is not important to history (but as Lucy says, everyone’s important to someone). We may be meant to infer that Paulina, the young inventor of the new time machine, is Eung-Hee’s granddaughter, but that’s not clear. We do know that Eung-Hee lived a long, peaceful life, and her daughter grew up to be a teacher. Happy endings for all! 
Gotta love the nose-thumbing to the haters at the end. As we see Lucy talking with her history students on campus, one doofy guy asks her: “This was supposed to be a regular American history class. How come we’re only studying women?” “I meant to get to the men,” Lucy replies, “but I just didn’t have time.” One thing “Timeless” consistently did well throughout its run was tell less-known stories, especially those of women and people of color. Yes, the team saw Abraham Lincoln get shot and saved JFK from an untimely death, but they also met Benjamin Franklin’s mother, an early African-American NASCAR driver, and Katherine Johnson (before the movie Hidden Figures was released). Bravo to the writers for sticking to their guns on this one. This may be the true end for our heroes—but everyone gets a happy ending. Luckily, it’s available to stream on Hulu, so we can watch it again from the beginning. It’s the next best thing to having a time machine.
13ae99c615d56944689efc02361d7769
https://www.smithsonianmag.com/history/operation-desert-storm-was-not-won-smart-weaponry-alone-180957879/
Operation Desert Storm Was Not Won By Smart Weaponry Alone
Operation Desert Storm Was Not Won By Smart Weaponry Alone Technology has long been a deciding factor on the battlefield, from powerful artillery to new weaponry to innovations in the seas and the skies. Twenty-five years ago was no different, as the United States and its allies proved overwhelmingly successful in the Persian Gulf War. A combination of U.S. Army Apache attack helicopters, cruise missiles from naval vessels, and Lockheed F-117 Nighthawk “stealth fighters” soundly broke through Saddam Hussein’s army defenses in Kuwait during Operation Desert Storm, whose ground campaign became known as the “100-hour war.” The military response was a reaction to Hussein’s invasion of Kuwait the previous year. The United Nations Security Council had demanded that Hussein withdraw his troops by a mid-January 1991 deadline, or it would launch a counter-offensive. When troops remained on the ground past the cutoff date, Operation Desert Storm came to fruition. The swift and dominant victory made it seem like the future was now when it came to the science fiction-like military weaponry that helped win the day. The U.S., entrenched in the Cold War, had been heavily investing in its military technology for years leading up to the Gulf War. In the 1980s, President Ronald Reagan’s proposed missile defense system against the USSR, the Strategic Defense Initiative (SDI), signaled a commitment to the highest technology not just in space, but in other realms, says former defense analyst Robert English. English advised the military on national security in the 1980s, when much of the technology used in Operation Desert Storm was first put on the drawing board. At the time, English recalls, it was an uphill battle to get the Pentagon to approve spending money on high-tech projects. As a general rule, military brass were reluctant to introduce new technology, as they would rather stick with a larger quantity of battle-proven weaponry. 
But the “Star Wars” defense program, as SDI was dubbed, helped serve as an impetus for new investments in technology across the board. This led to the debut of the Patriot air-defense missiles, which targeted and intercepted Iraqi Scud ballistic missiles, and the Lockheed F-117, a “stealth fighter” first deployed when the United States invaded Panama in 1989. The fighter was described by Daniel Plesch and Michael Wardell for the Los Angeles Times in 1991. They wrote, “…It is intended to close in on its target unnoticed, virtually eliminating the enemy's capacity to react. Its radar signature is supposed to be no bigger than that of a duck.” Though the fighter proved effective against Iraqi forces, stealth technology was still in its infancy at the time of Desert Storm, as Plesch and Wardell point out in their piece. For instance, British allies on Royal Navy destroyers in the Gulf were able to pick up the F-117 up to 40 miles from its targets, using technology more than a decade old. Despite its glitches, the Nighthawks’ surgical strike capability was what “convinced the U.S. Air Force to make significant changes after the war,” writes Don Hollway for HistoryNet, moving the U.S. toward new technology and tactics. The F-117 would have a long shelf life. The 1,313th and final F-117 was delivered to the U.S. Air Force just this month. During Desert Shield, soldiers, sailors and air crews also used $25,000 Holographic One-Tube light-amplifying goggles to electronically capture and reflect visible light too dim for the naked human eye, “somewhat like the viewfinder on a home video camera, with magnification,” wrote Martha P. Hernandez for the Associated Press at the time. It was these glasses, she predicted in a piece published right after Operation Desert Storm began, that would give the U.S. and its allies a “major edge” over Iraqi forces in night battles. Perhaps one of the most effective technologies employed during the Gulf War was satellite surveillance. 
The war might have been prolonged had troops not been given GPS receivers, the United Kingdom’s Science Museum posits. Though the U.S. Department of Defense had been investing in GPS technology since the 1960s, it was unprepared to supply troops in the Gulf with enough GPS receivers. The museum writes: Manufacturers had to scramble to make new receivers and send them out to the troops. Often there were as few as two instruments for 100 vehicles. Some soldiers relied on members of their families to buy civilian GPS systems and ship them out, even though they were less accurate. Even the military equipment was not well designed for use in a theatre of war – tank crews and helicopter pilots stuck the devices to their vehicles with gaffer tape, for example. Yet despite supply problems, GPS receivers were what allowed troops to find Iraqi ground forces, as well as assess bombing damage. The Joint Surveillance Target Attack Radar Systems (JSTARS), U-2 reconnaissance planes, and reconnaissance satellites all relied on the surveillance equipment. However, the surveillance technology was not perfect, cautions Robert H. Gregory, Jr. in his book, Clean Bombs and Dirty Wars: Air Power in Kosovo and Libya. The technology was “susceptible to being fooled by Iraq's use of decoys, camouflage, and digging in of forces.” As Gregory points out, Iraq had in fact purchased “thousands of dummy tanks and artillery from an Italian company before the Gulf War,” which UN observers after the war called virtually “impossible to distinguish from actual equipment.” But for all the possibilities this “Computer War” offered, such as laser guidance systems on precision-guided munitions (PGMs) like cruise missiles (18-foot computer-guided flying bombs launched from warships), Operation Desert Storm was not won by smart weaponry alone. 
Rather, as English estimates, 90 percent of the munitions employed in Desert Storm were actually “dumb weapons.” The bombs, which weren't guided by lasers or satellites, were lucky to get within half a kilometer of their targets after they were dropped from planes. While dumb bombs might not have been exciting enough to garner the headlines during the attack, they were cheaper to produce and could be counted on to work. PGMs might have been the “invention that shaped the Gulf War,” as Malcolm W. Browne wrote for the New York Times in 1991, as they enormously enhanced the effectiveness of attacks, yet it was the dumb bombs that were the most commonly used weapons during the attack. But frequency of use doesn’t change the fact that history will remember Desert Storm for its smart weapons rather than its dumb ones. As Philadelphia Inquirer staff reporters Matthew Purdy, Karl Stark and Tim Weiner reported, “Almost all the new technology, built and paid for in the trillion-dollar military buildup of the 1980s and intended for a full-tilt war with the Soviet Union, had never before been tested in battle,” which meant that their success rates in Desert Storm proved “not as dazzling as initially believed.” The introduction of high-tech weaponry during the operation, however, set a precedent for how the U.S. would engage in the Balkans and, a dozen years later, back in Iraq. Jacqueline Mansky is a freelance writer and editor living in Los Angeles. She was previously the assistant web editor, humanities, for Smithsonian magazine.
995aa63ad7ddf17dc26b5a8558e5fd4f
https://www.smithsonianmag.com/history/othmar-ammanns-glory-173362465/
Othmar Ammann’s Glory
Othmar Ammann’s Glory It was called the most beautiful bridge in the world. At the time of its 1931 opening, it certainly was the longest single span. To honor the engineering feat it represented, a stamp with its picture was issued, and the bridge became the subject of music, even a children's book. Yet, a section of suspension cable for the George Washington Bridge in the collections of the National Museum of American History can only hint at such glories. Three feet in diameter and ten feet long, the massive cylinder weighs an ungainly 34,000 pounds. From its ends protrude 26,474 individual steel wires, compacted under 400 tons of pressure. Before computers, this experimental section helped engineers model the effects of compression on the finished bridge's cables. Today, it represents an engineering marvel, whose creation spanned half a century of depressions, politics and the passions of two of America's greatest bridge designers. No matter when it was built, the first bridge to span the Hudson River from New Jersey to New York City was destined for fame. After the Civil War, a single span was determined most suitable for the wide, heavily trafficked river just west of the fast-growing metropolis. But materials and engineering skill lagged far behind the dream. Until 1888. Just five years after the completion of John Roebling's Brooklyn Bridge, then the world's longest suspension bridge, 38-year-old Austrian-born engineer Gustav Lindenthal put forth a plan for a suspension bridge across the Hudson. It was a grand concoction: six railroad tracks, more than a mile in total length. Its center span was to be nearly twice as long as that of Roebling's widely admired masterpiece. Great feats of engineering require greater feats of imagination. For both, Lindenthal was well qualified. With little formal education and a physique to match the size of his dreams, he had taught himself English and the rudiments of engineering. 
Immigrating to America in 1874, he quickly prospered in his adopted land, whose engineers had more use for quick thinking and practical energy than college degrees. By the turn of the century, Lindenthal was renowned among his peers. His Seventh Street and Smithfield Street bridges in Pittsburgh were some of the most significant of their time. In 1902, Lindenthal became commissioner of bridges for New York City, a political appointment that gave him considerable power and prestige as an engineer and designer. But his dream bridge still had not been built. Despite endorsement of Lindenthal's Hudson River bridge plan by the War Department, a rival bridge concern had sued to stop the project. By the time the case was settled, the depression of the early 1890s had dried up most of the funds. Replaced as commissioner after the 1903 city elections, Lindenthal found himself in the odd position of peddling new Hudson River bridge designs to myriad interested groups — with no agreement on location, cost or funding. In the meantime, the city grew. By 1912, Lindenthal was busy completing plans for a railroad bridge — the world's longest steel arch bridge, in fact — across the dangerous channel between Manhattan and Queens called Hell Gate. To help with the task, the august designer took on a 33-year-old assistant not long arrived from Switzerland. Slight in stature, with a quiet demeanor that hid a steely core, Othmar Ammann seemed the opposite of the large, bluff, practically educated Lindenthal. Ammann's degree, unlike any that Lindenthal might occasionally claim, was from a Swiss institute of technology considered one of the most prestigious in the world. Ammann was impressed by his mentor, one of the world's preeminent bridge builders — and the favor was returned. 
"I estimate an engineer one-third by his character, one-third by his ability, and one-third by his experience," Ammann recalled Lindenthal saying before promoting him for outstanding work on the Hell Gate Bridge project. Through all of this, Lindenthal's dream for a span over the Hudson continued. But what was grand in 1888 had, through decades of deferment, become fantastical. By 1923, Lindenthal's plan called for a bridge more than 200 feet wide, with two decks, one for 12 railroad tracks, the other for 20 vehicle lanes, including two for trolleys. Its massive concrete towers, at 825 feet high, would rise above even the ten-year-old Woolworth Building, then the world's tallest skyscraper. The price: at least a cool $200 million (nearly two billion in today's dollars). Ammann deferentially warned Lindenthal that such a costly project would never be realized. But the old master sharply rebuked his assistant for his "timidity and shortsightedness in not looking far enough ahead," as Ammann noted in his diary. "He stated that he was looking ahead for 1,000 years." A thousand years or no, his professional relationship with Lindenthal quickly deteriorated. "In vain," wrote a frustrated Ammann to his mother later that year, "I as well as others have been fighting against the unlimited ambition of a genius that is obsessed with illusions of grandeur. He has the power in his hands and refuses to bring moderation into his gigantic plans. Instead, his illusions lead him to enlarge his plans more and more." Working on his own, Ammann had developed another scheme. Quietly, he wrote to the governor of New Jersey with suggestions for a smaller, cheaper suspension bridge to be built across the Hudson at 179th Street. The newly formed Port of New York Authority, which enjoyed both states' cooperation and had a short time before rejected Lindenthal's expensive monstrosity, was immediately interested — to Lindenthal's understandable dismay. "Now it appears that A. 
has used his position of trust, the knowledge acquired in my service...to compete with me in plans for a bridge over the Hudson and to discredit my work on which I had employed him," Lindenthal wrote despairingly to the governor. "He does not seem to see that his action is unethical and dishonorable." But new forces were at work. With construction under way for what would be known as the Holland Tunnel, it was assumed that connecting the metropolis to its burgeoning New Jersey suburbs by underwater routes would be cheaper than a bridge (a notion proved wrong well before the tunnel's 1927 completion). By that time, too, necessarily heavy (and expensive) railroad spans across the Hudson were steadily being eclipsed by less costly ones dedicated to a newly popular conveyance: the car. Already, in Philadelphia and Detroit, huge suspension bridges had been built for cars. The future was clear. By 1925, Ammann was bridge engineer for the Port Authority, charged with designing not only the 179th Street bridge (then known as the Hudson River Bridge) but also a bridge between Staten Island and New Jersey—both mainly for cars. Construction of the Hudson bridge began in the fall of 1927, with more than 100,000 miles of cable wire strung across the river by John Roebling's company. By any standard, the bridge was monumental. With a 3,500-foot main span — nearly twice that of the next largest suspension bridge, built just two years before — its slender deck was to arch gracefully more than 200 feet above the Hudson. Its twin 604-foot towers would stand nearly 50 feet taller than the Washington Monument. And each of its four cables could support more than 90,000 tons — ten times more than each Brooklyn Bridge cable. For his design, Ammann owed as much to material advances since that 1883 wonder as he did to his own ingenuity. 
Improved steel ensured that when drawn to only 0.196 inch in diameter, each of the 26,474 wires that made one cable had a strength of at least 240,000 pounds per square inch—more than one and a half times that of the cable wires in the Brooklyn Bridge. And better machinery allowed the wires to be hung from the towers (a process called spinning) 16 times faster than in 1883. Engineers followed what they had learned from the behavior of their model, that ten-foot section of cable today in the Smithsonian's collections, to compress the wires together into their final, three-foot-diameter, cylindrical form. In the relentless Great Depression, the bridge became a sort of savior in steel. Completed six months ahead of schedule, it cost less than the $60 million originally allocated. "Fulfilling a dream of three-quarters of a century," ran the ecstatic headline in the New York Times. On October 24, 1931, in front of thousands of spectators, New York governor (and soon to be President) Franklin Roosevelt and New Jersey governor Morgan Larson opened the bridge, newly named in honor of George Washington. In tribute to his mentor, Ammann drove with Gustav Lindenthal onto the bridge that the older man had spent his lifetime fruitlessly dreaming of. Even more revolutionary than its length was the bridge's lack of a common design feature. Until the George Washington, modern suspension bridges were stiffened with steel trusses and beams to limit motion in traffic and wind (an important consideration when a bridge's length is large relative to its width and depth, like the George Washington's). But such stiffening often gave bridges less attractive, thicker decks — and added cost. Ammann reasoned that the sheer weight of his span, and its necessarily heavy cables, would by themselves provide sufficient stiffness. The George Washington's resulting slender profile — both from the side as well as from above — fueled engineers' aesthetic sensibilities. 
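The cable figures quoted in this piece can be cross-checked with simple arithmetic. A minimal back-of-the-envelope sketch (assuming each of the 26,474 wires carries its full rated 240,000 pounds per square inch over its 0.196-inch-diameter cross-section, and using short tons of 2,000 pounds) squares with the earlier claim that each of the four cables could support more than 90,000 tons:

```python
import math

# Published figures from the article: 26,474 wires per cable,
# each 0.196 in. in diameter, rated at 240,000 pounds per square inch.
WIRES_PER_CABLE = 26_474
WIRE_DIAMETER_IN = 0.196
STRENGTH_PSI = 240_000

wire_area_sq_in = math.pi * (WIRE_DIAMETER_IN / 2) ** 2        # cross-section of one wire
wire_strength_lb = STRENGTH_PSI * wire_area_sq_in              # rated load of one wire
cable_strength_tons = wire_strength_lb * WIRES_PER_CABLE / 2_000  # short tons per cable

print(f"{cable_strength_tons:,.0f} tons per cable")  # roughly 96,000 tons
```

At just under 96,000 short tons per cable, the sketch lands comfortably above the "more than 90,000 tons" figure, and about ten times the roughly 9,000-ton ratio implied for each Brooklyn Bridge cable.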
Just six years later, the Golden Gate Bridge astounded the world with a narrower and yet even longer span. If such gracefully thin and relatively light bridges were sometimes disconcertingly flexible in a breeze (as drivers and engineers noted), they also were lovely to look at. In 1940, however, the extremes of Ammann's innovation were dramatically demonstrated in the wind-driven collapse of the aptly nicknamed "Galloping Gertie," otherwise known as the Tacoma Narrows Bridge. After his investigation of that famous failure, which had been captured on film for the nation to see, Ammann wrote, "Its smaller weight and extreme narrowness has drastically revealed that this practice has gone too far." By the early 1960s, when the George Washington's lower deck was added (as specified in the original plans), Ammann had all but eclipsed his mentor. Ammann's other 1931 creation, the Bayonne Bridge connecting Staten Island and New Jersey, was until 1977 the world's largest steel arch bridge — more than 600 feet longer than the previous record holder, Lindenthal's Hell Gate Bridge. Months before his death in 1965, Ammann gazed through a telescope from his 32nd-floor Manhattan apartment. In his viewfinder was a brand-new sight some 12 miles away: his Verrazano-Narrows suspension bridge. As if in tribute to the engineering prowess that made Ammann's George Washington Bridge great, this equally slender, graceful span would not be surpassed in length for another 17 years.
9b0728417c44958ecd4aa71dae29b3d5
https://www.smithsonianmag.com/history/our-top-ten-stories-2019-180973857/
Our Top Ten Stories of 2019
Our Top Ten Stories of 2019 The last year of the decade was filled with a dizzying array of headlines, from the January government shutdown to the devastating fire at Notre-Dame Cathedral, the discovery of a new human ancestor species, the first-ever image of a black hole, the U.S. Women’s Soccer Team World Cup victory, the Amazon rainforest’s unprecedented fire season and the end of a new Star Wars trilogy. Archaeologists unearthed fascinating finds like a Pompeiian sorceress’ kit, “helmets” made from the skulls of children and 1,700-year-old Roman eggs. In the cultural sphere, Maurizio Cattelan’s $120,000 banana epitomized the public’s confusion and exasperation over conceptual art, while a Renaissance nun’s Last Supper painting made its public debut after 450 years in hiding. We lost luminaries including author Toni Morrison, celebrity cat Lil Bub, opera singer Jessye Norman, adventurer Barbara Hillary and Tuskegee airman Robert Friend, and welcomed new arrivals like royal baby Archie Mountbatten-Windsor. From the hidden story of the Apollo 11 mission to a new method for calculating dogs’ age, the pythons overtaking the Florida Everglades and a 16-million-year-old sequoia tree, these were Smithsonian magazine’s top ten stories of 2019. Smithsonian magazine’s most-read story of the year centered on Ancient Earth, an interactive map that allows users to visualize how different parts of the world have evolved over the past 750 million years. Plug a specific address or more generalized region, such as a country or province, into the tool, then choose the desired date from a dropdown menu of 26 options spanning the Cryogenian Period to the present. To truly appreciate the project’s scale, start at the beginning of the map’s timeline and watch as the world shifts from unrecognizable masses to the supercontinent of Pangea and, finally, the seven continents seen today. 
Ahead of the release of Netflix’s The King, we profiled the movie’s eponymous monarch, England’s Henry V. Portrayed by Timothée Chalamet in the admittedly historically faulty film, the real Lancastrian king is best remembered as a warrior who led his country to victory against overwhelming odds at the Battle of Agincourt in 1415. This feature teased out the complexities behind the much-mythologized ruler, who spent much of his nine-year reign fighting (or negotiating with) the French. To date, a team led by archaeologist Robert Muckle has recovered more than 1,000 artifacts—among others, the list includes rice bowls, buttons, ceramics, teapots, pocket watches and sake bottles—from a long-forgotten 20th-century Japanese settlement in the forests of British Columbia’s North Shore mountains. Populated by immigrants and their Canadian-born children, the community likely acted as a refuge from the rampant racism of pre-World War II Vancouver but was abandoned around 1942, when, as Muckle told Smithsonian magazine’s Brigit Katz earlier this year, its residents were “incarcerated or sent to road camps.” The June issue of Smithsonian magazine marked the 50th anniversary of the moon landing with a deep dive on the Apollo 11 mission by Charles Fishman. As the author of One Giant Leap: The Impossible Mission That Flew Us to the Moon asked readers, “The leap to the Moon in the 1960s was an astonishing accomplishment. But why? What made it astonishing? … What exactly was the hard part?” Fishman outlines the answer to these questions in an immersive, behind-the-scenes exploration of the race to the moon, documenting everything from President John F. Kennedy’s personal lack of interest in space to the Soviets’ secretive launch of an unmanned craft called Luna 15 just days before Apollo 11’s entry into orbit. 
Tourists stopping by the long-awaited “Hall of Fossils—Deep Time” exhibition at the Smithsonian’s National Museum of Natural History are welcomed by an enormous sequoia fossil dated to some 16 million years ago. Containing about 260 tree rings, the slab represents the culmination of curators’ efforts to place dinosaurs, megafauna and other relics of the past in perspective, offering visitors the opportunity to fully absorb just how much time has passed since the sequoia tree first sprang out of the earth in what is now central Oregon. “Time is so vast,” Smithsonian paleobotanist Scott Wing told contributing writer Riley Black in June, “that this giant slab of a tree is just scratching the surface.” Contrary to popular belief, one dog year isn’t actually equivalent to seven human years. To come up with a more accurate aging formula, geneticists led by Tina Wang of the University of California, San Diego, compared humans’ “epigenetic clocks,” or estimated age as indicated by a phenomenon called DNA methylation rates, to those of canines. The team found that young puppies and human infants have similar methylation rates, but these figures diverge over time, with dogs’ epigenetic clocks speeding up during the first year of life before slowing to align more closely with humans in later stages of life. Overall, the researchers reported, a 2-year-old dog is roughly equivalent to a 42-year-old human, while a 10-year-old dog is the equivalent of a 67.8-year-old person. Smithsonian magazine’s January/February 2019 issue focused on America at war, exploring the nation’s involvement in conflicts through infographics, features, polls and photo stories. 
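(An aside on the dog-age item above: the online version of that story included an interactive calculator. A minimal sketch of the conversion, assuming the natural-log formula the UCSD team published, human_age = 16 × ln(dog_age) + 31 for dogs at least a year old, reproduces the figures quoted in the item.)

```python
import math

def dog_to_human_years(dog_age: float) -> float:
    """Epigenetic-clock conversion (assumed UCSD formula), not the old
    'multiply by seven' rule of thumb. Intended for dogs roughly one
    year old and up."""
    if dog_age <= 0:
        raise ValueError("Only enter numbers greater than zero")
    return 16 * math.log(dog_age) + 31

print(round(dog_to_human_years(2)))      # a 2-year-old dog: roughly 42 human years
print(round(dog_to_human_years(10), 1))  # a 10-year-old dog: 67.8 human years
```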
This map, compiled by Stephanie Savell and her colleagues at Brown University’s Costs of War Project, tracked the United States’ military involvement across the globe, revealing that the country’s armed forces operate in 40 percent of the world’s nations. Florida has a python problem—to say the least. As many as hundreds of thousands of Burmese pythons are scattered across the Everglades, wreaking havoc on the region’s native wildlife populations and largely evading those tasked with curbing their reach. Smithsonian contributor Ian Frazier joined local bounty hunters and biologists fighting the snakes’ invasion, recording these individuals’ forays into Florida’s swampland in vivid detail for the July 2019 issue of the magazine. In this online feature, historian Kevin M. Levin explored the lives of the 6,000 to 10,000 enslaved individuals who travelled with General Robert E. Lee’s Army of Northern Virginia in the summer of 1863 into the enemy territory north of the Mason-Dixon line. Drawing on diaries written by Confederate soldiers, Levin, author of Searching for Black Confederates: The Civil War’s Most Persistent Myth, keys in on the role slavery played in the war. Some of the enslaved escaped once on the friendlier battleground of Pennsylvania, but others, perhaps out of fear, remained close to their owners. Levin shares the story of Moses, who buried his owner Captain William McLeod of the 38th Georgia following his death at the Battle of Gettysburg, and in the end surmises that “camp slaves and other enslaved workers—the entire institution of slavery, really—were crucial to the … Confederate insurgency as a whole.” For this “Deep Time” story, Hans-Dieter Sues, curator of fossil vertebrates in the National Museum of Natural History’s paleobiology department, recounted scientists’ attempts to unravel the mystery of the Devil’s Corkscrews, an unusual type of fossil named for its bedeviling spiral appearance. 
As it turns out, the fossils are actually corkscrew-shaped burrows built by the extinct beaver species Palaeocastor. Meilan Solly is Smithsonian magazine's assistant digital editor, humanities. Website: meilansolly.com.
e0858051d565326b037172e0d7a1240c
https://www.smithsonianmag.com/history/pandemic-isnt-first-time-hajj-has-been-disrupted-muslims-180974735/
This Pandemic Isn’t the First Time the Hajj Has Been Disrupted for Muslims
This Pandemic Isn’t the First Time the Hajj Has Been Disrupted for Muslims Saudi Arabia has urged Muslims to delay their plans for the hajj, amid speculation that the obligatory pilgrimage may be canceled this year due to the coronavirus. Earlier this year, Saudi authorities halted travel to holy sites as part of the umrah, the “lesser pilgrimage” that takes place throughout the year. Canceling the hajj, however, would mean a massive economic hit for the country and many businesses globally, such as the hajj travel industry. Millions of Muslims visit the Saudi kingdom each year, and the pilgrimage has not been canceled since the founding of the Saudi Kingdom in 1932. But as a scholar of global Islam, I have encountered many instances in the more than 1,400-year history of the pilgrimage when its planning had to be altered due to armed conflicts, disease or just plain politics. Here are just a few. One of the earliest significant interruptions of the hajj took place in A.D. 930, when a sect of Ismailis, a minority Shiite community, known as the Qarmatians raided Mecca because they believed the hajj to be a pagan ritual. The Qarmatians were said to have killed scores of pilgrims and absconded with the black stone of the Kaaba—which Muslims believed was sent down from heaven. They took the stone to their stronghold in modern-day Bahrain. Hajj was suspended until the Abbasids, a dynasty that ruled over a vast empire stretching across North Africa, the Middle East to modern-day India from A.D. 750-1258, paid a ransom for its return over 20 years later. Political disagreements and conflict have often meant that pilgrims from certain places were kept from performing hajj because of lack of protection along overland routes into the Hijaz, the region in the west of Saudi Arabia where both Mecca and Medina are located. In A.D. 983, the rulers of Baghdad and Egypt were at war. 
The Fatimid rulers of Egypt claimed to be the true leaders of Islam and opposed the rule of the Abbasid dynasty in Iraq and Syria. Their political tug-of-war kept various pilgrims from Mecca and Medina for eight years, until A.D. 991. Then, during the fall of the Fatimids in A.D. 1168, Egyptians could not enter the Hijaz. It is also said that no one from Baghdad performed hajj for years after the city fell to Mongol invasion in A.D. 1258. Many years later, Napoleon’s military incursions aimed at checking British colonial influence in the region prevented many pilgrims from hajj between A.D. 1798 and 1801. Much like the present, diseases and other natural calamities have also gotten in the way of the pilgrimage. There are reports that the first time an epidemic of any kind caused hajj to be canceled was an outbreak of plague in A.D. 967. And drought and famine caused the Fatimid ruler to cancel overland hajj routes in A.D. 1048. Cholera outbreaks in multiple years throughout the 19th century claimed thousands of pilgrims’ lives during the hajj. One cholera outbreak in the holy cities of Mecca and Medina in 1858 forced thousands of Egyptians to flee to Egypt’s Red Sea border, where they were quarantined before being allowed back in. Indeed, for much of the 19th century and the beginning of the 20th century, cholera remained a “perennial threat” and caused frequent disruption to the annual hajj. So did the plague. An outbreak of bubonic plague in India in 1831 claimed thousands of pilgrims’ lives on their way to perform hajj. In fact, with so many outbreaks in such quick succession, the hajj was frequently interrupted throughout the mid-19th century. In more recent years, too, the pilgrimage has been disrupted for many similar reasons. In 2012 and 2013 Saudi authorities encouraged the ill and the elderly not to undertake the pilgrimage amid concerns over Middle East Respiratory Syndrome, or MERS. 
Contemporary geopolitics and human rights issues have also played a role in who was able to perform the pilgrimage. In 2017, the 1.8 million Muslim citizens of Qatar were not able to perform the hajj following the decision by Saudi Arabia and three other Arab nations to sever diplomatic ties with the country over differences of opinion on various geopolitical issues. The same year, some Shiite governments such as Iran leveled charges alleging that Shiites were not allowed to perform the pilgrimage by Sunni Saudi authorities. In other cases, faithful Muslims have called for boycotts, citing Saudi Arabia’s human rights record. While a decision to cancel the hajj will surely disappoint Muslims looking to perform the pilgrimage, many among them have been sharing online a relevant hadith—a tradition reporting the sayings and practice of the prophet Muhammad—that provides guidance about traveling during a time of an epidemic: “If you hear of an outbreak of plague in a land, do not enter it; but if the plague breaks out in a place while you are in it, do not leave that place.” Ken Chitwood is a lecturer and journalist-fellow at the USC Center for Religion and Civic Culture at Concordia College New York.
6fe33fa87d5d1b14f43e7fd577cab593
https://www.smithsonianmag.com/history/path-of-the-monuments-men-through-europe-180949681/
The Path of the Monuments Men Through Europe
The Path of the Monuments Men Through Europe The map above shows how widespread the challenges were that faced the Monuments Men, or as they are more formally known, the Monuments, Fine Arts and Archives Program. The personal papers of some of the Monuments Men, including conservator George Leslie Stout (a version of whom is played by George Clooney in the new movie), are held in the Smithsonian Institution's Archives of American Art. The Archives also has a fantastic series of oral history recordings with the soldiers, which are well worth a listen. For more on the true story of the Monuments Men, read our coverage of their travails in Italy and Austria. Esri is a GIS-mapping company based in Redlands, California.
c35fcd0c641b6da6e55e5b23873c0785
https://www.smithsonianmag.com/history/petra-jordan-drone-3d-scan-digital-modeling-180970310/
Once you’ve been to Petra, it stays with you. Long after you’ve left you will find grit from Petra’s red sandstone in the tread of your shoes; your fingernails will have a faint rosy tinge; a fine pinkish dust will cling to your clothing. For some time you will close your eyes and still be able to relive the startling moment you first saw this ancient stone city rising out of the desert floor; you will savor the memory of this place, its grandeur and strangeness, even after you manage to wash away the traces of its red rocks. Driving southwest across the dull plateau from Amman for a few hours, you suddenly tip into the dry basin of Jordan’s Arabah Valley and tumble down through mountain passes. The landscape is cracked and sandy, seared and unpromising. It is hardly the setting in which you expect to find a city of any sort, let alone one this rich and extravagant and refined. There seems to be no water, no possibility of agriculture, no means of livelihood or sustenance. The fact that the Nabatean people, the nomadic Arabs who crisscrossed the region until they grew wealthy from trade, made Petra the capital of their empire by the fourth century B.C. is baffling. Yet here, at the valley’s center, are the remains of this once-lavish city, watered by hidden aqueducts that run for miles from an underground spring. It looks like no other place I’ve ever seen. The “buildings” are punched into the rock cliffs—in other words, they are elaborate caves, recessed in the sandstone and fronted with miraculously carved ornate facades. It is probably one of the world’s only cities that was made by subtraction rather than addition, a city you literally enter into, penetrate, rather than approach. Petra will draw you in, but at the same time, it is always threatening to disappear. The sandstone is fragile. The wind through the mountains, the pounding of feet, the universe’s bent toward disintegration—all conspire to grind it away. 
My trip here was to see the place and take a measure of its evanescent beauty, and to watch Virtual Wonders, a company devoted to sharing and documenting the world’s natural and cultural wonders, use all manner of modern technology to create a virtual model of the site so precise that it will, in effect, freeze Petra in time. * * * I arrived in Petra just as the summer sun cranked up from roast to broil; the sky was a bowl of blue and the midday air was piping hot. The paths inside the Petra Archaeological Park were clogged. Horse-drawn buggies clattered by at a bone-joggling speed. Packs of visitors inched along, brandishing maps and sunscreen. In a spot of shade, guides dressed as Nabateans kneeled to conduct their midday prayers. At its peak, 2,000 years ago, Petra was home to as many as 30,000 people, full of temples, theaters, gardens, tombs, villas, Roman baths, and the camel caravans and marketplace bustle befitting the center of an ancient crossroads between east and west. After the Roman Empire annexed the city in the early second century A.D., it continued to thrive until an earthquake rattled it hard in A.D. 363. Then trade routes shifted, and by the middle of the seventh century what remained of Petra was largely deserted. No one lived in it anymore except for a small tribe of Bedouins, who took up residence in some of the caves and, in more recent centuries, whiled away their spare time shooting bullets into the buildings in hopes of cracking open the vaults of gold rumored to be inside. In its period of abandonment, the city could easily have been lost forever to all but the tribes who lived nearby. But in 1812, a Swiss explorer named Johann Ludwig Burckhardt, intrigued by stories he’d heard about a lost city, dressed as an Arab sheikh to beguile his Bedouin guide into leading him to it. His reports of Petra’s remarkable sites and its fanciful caves began drawing oglers and adventurers, and they have continued coming ever since. 
Two hundred years later, I mounted a donkey named Shakira and rode the dusty paths of the city to ogle some of those sites myself. This happened to be the middle of the week in the middle of Ramadan. My guide, Ahmed, explained to me that he had gotten permission to take his blood pressure medication despite the Ramadan fast, and he gobbled a handful of pills as our donkeys scrambled up rock-hewn steps. Ahmed is a broad man with green eyes, a grizzled beard, a smoker’s cough, and an air of bemused weariness. He told me that he was Bedouin, and his family had been in Petra “since time began.” He was born in one of Petra’s caves, where his family had been living for generations. They would still be living there, he said, except that in 1985, Petra was listed as a Unesco World Heritage site, a designation that discourages ongoing habitation. Nearly all the Bedouin families living in Petra were resettled—sometimes against their wishes—in housing built outside the boundaries of the new Petra Archaeological Park. I asked Ahmed if he preferred his family’s cave or his house in the new village. His house has electricity and running water and Wi-Fi. “I liked the cave,” he said. He fumbled for his phone, which was chirping. We rode on, the donkeys’ hard hooves tapping a rhythmic beat on the stone trail. Petra sprawls and snakes through the mountains, with most of its significant features collected in a flat valley. Royal tombs line one side of the valley; religious sites line the other. A wide, paved, colonnaded street was once Petra’s main thoroughfare; nearby are the ruins of a grand public fountain or “nymphaeum,” and those of several temples, the largest of which was probably dedicated to the Nabatean sun god Dushara. Another, the once free-standing Great Temple—which probably served as a financial and civic center in addition to a religious one—includes a 600-seat auditorium and a complex system of subterranean aqueducts. 
On a small rise overlooking the Great Temple sits a Byzantine church with beautiful intact mosaic floors decorated with prancing, pastel animals including birds, lions, fish and bears. The grander buildings—that is, the grander caves—are as high and spacious as ballrooms, and the hills are pocked with smaller caves as well, their ceilings blackened by the soot left from decades of Bedouin campfires. Some of the caves are truly imposing, like the Urn Tomb, with its classical facade carved into the cliff on top of a base of stone-built arches, and an eroding statue of a man (perhaps the king) wearing a toga. Others are easy to miss, such as the cave known as the Triclinium, which has no facade at all but possesses the only intricately carved interior at Petra, with stone benches and walls lined with fluted half-columns. Standing inside the valley, it is easy to see why Petra thrived. The mountains contain it, looming like sentries in every direction, but the valley itself is wide and bright.

This article is a selection from the October issue of Smithsonian magazine.

So much of Petra feels like a sly surprise that I became convinced the Nabateans must have had a sense of humor to have built the city the way they did. They were gifted people in many ways. They had a knack for business, and cornered the market in frankincense and myrrh. They had real estate savvy, establishing their city at the meeting point of several routes on which caravans shipped spices, ivory, precious metals, silk and other goods from China, India and the Persian Gulf to the ports of the Mediterranean. They had a talent for melding the dust and dirt around them into a hard, russet clay from which they made perfume bottles and tiles and bowls. They were expert artisans. And while it isn’t recorded in historical texts, they clearly appreciated the hallmarks of architectural showmanship—a good sense of timing, a flair for theatrical siting. 
The most convincing evidence of this begins with the Siq, the main entrance to the city, a natural ravine that splits the towering rocks for almost a mile. It’s a compressed, confined space; its rock walls lean this way and that. Once you inch your way through it, you are spilled out onto a sandy apron and confronted with the most dramatic structure in Petra—Al Khazneh, or the Treasury, a cave more than a hundred feet high, its facade a fantastical mash-up of a Greco-Roman doorway, an Egyptian “broken” pediment and two levels of columns and statues etched into the sheer face of the mountain. The Treasury wasn’t actually a treasury at all—it gets its name from the riches said to have been stored in the great urn atop the circular building at the facade’s center. The statues adorning the colonnaded niches suggest it may have been a temple, but most scholars think it was a tomb housing the remains of an important early king. (A favorite candidate is the first century B.C. Aretas III, who used the word Philhellenos on his coins—“friend of the Greeks”—which might explain the building’s Hellenistic flair.) Inside the cave there are just three bare chambers, today empty of whatever remains once rested there. Perhaps the Nabateans placed this grand building here because the Siq served as a buffer to marauders, much like a wall or a moat. But I can’t help but think that they knew that forcing visitors to approach the Treasury via a long, slow walk through the Siq would make a perfect lead-up to a great reveal, designed to delight and astonish. The gradual approach also leaves the world with a timeless pun, because coming upon the Treasury this way makes you feel as if you’ve found a treasure at the end of a secret grotto. 
Petra was a nexus of commerce and cultural exchange. When the Nabateans established their capital at Petra, they ensured that it was well connected to booming trade routes: the Silk Road to the north, Mediterranean ports to the west, Egypt and southern Arabia to the south. With trading partners across the ancient world, the seat of Nabatean power was “the very definition of a cosmopolitan trade center,” writes the classicist Wojciech Machowski.

* * *

As Ahmed and I rode along, I could just make out in the distance the team from Virtual Wonders, who had spent the day flying a drone over the Great Temple, shooting high-resolution images of it from above. The company was formed in 2018 by three friends with complementary talents. Mark Bauman, a longtime journalist and former executive at Smithsonian Enterprises and National Geographic, knew the people in charge of historical locations like Petra and how to work with local authorities. Corey Jaskolski, a one-time high school dropout turned computer whisperer (he eventually earned a graduate degree from MIT in electrical engineering) who has patented systems for impossible-seeming robotic cameras and 3-D scanning for use underwater, on land and from the air, would manage the technological challenges of image capture and digital modeling. Kenny Broad, an environmental anthropologist at the University of Miami, is a world-class cave diver and explorer for whom scrambling around a place like Petra was a piece of cake; he would serve as chief exploration officer. The three of them shared a passion for nature and archaeology and a concern with how to preserve important sites. While outfits such as the Getty Research Institute and the nonprofit CyArk have been capturing 3-D images of historical sites for some time, Virtual Wonders proposed a new approach. They would create infinitesimally detailed 3-D models. 
For Petra, for instance, they would capture the equivalent of 250,000 ultra-high-resolution images, which will be computer-rendered into a virtual model of the city and its breathtaking structures that can be viewed—even walked through and interacted with—using a virtual-reality headset, gaming console or other high-tech “projected environments.” Virtual Wonders will share these renderings with authorities and other scholarly and educational partners (in this case, the Petra National Trust). Detailed modeling of this kind is at the leading edge of archaeological best practices, and according to Jordan’s Princess Dana Firas, the head of the Petra National Trust, the data will help identify and measure the site’s deterioration and assist in developing plans for preservation and managing visitors. “It’s a long-term investment,” Firas told me. By the time I arrived in Petra, the Virtual Wonders team had scanned and imaged more than half of Petra and its significant buildings using an assortment of high-tech methods. A DJI Inspire drone—for which a military escort is required, because drones are illegal in Jordan—uses a high-resolution camera to collect aerial views, shot in overlapping “stripes” so every inch is recorded. Exact measurements are done by photogrammetry, with powerful lenses on 35-millimeter cameras, and Lidar, which stands for Light Detection and Ranging, a revolving laser mechanism that records minute measurements at the rate of a million per second. When combined and rendered by computers, those measurements form a detailed “texture map” of an object’s surface. All of this data will be poured into computers, which will need about eight months to render a virtual model. None of this is cheap. In Petra, the Virtual Wonders team hiked around with about a half-million dollars’ worth of gear. 
According to Bauman, the company’s hope is that the cost of the projects will be recouped, and exceeded, by licensing the data to film companies, game developers and the like, with a portion of the revenue going back to whoever oversees the site, in this case the Petra National Trust. This isn’t an idle hope. Petra is so spectacular that it has been used as a location in films, most famously Indiana Jones and the Last Crusade; countless music videos; and as a setting in at least ten video games including Spy Hunter, OutRun 2 and Lego Indiana Jones. If its approach succeeded, Virtual Wonders hoped to move on to similar projects around the world, and since I left Jordan the company has begun work at Chichen Itza, the Mayan city in the Yucatán. It has also scored a clear success with an immersive virtual reality exhibit titled “Tomb of Christ: the Church of the Holy Sepulchre Experience,” at the National Geographic Museum in Washington, D.C. I left my donkey and crossed through the ruins of the flat valley to join the team on a ridge overlooking the Great Temple. “We’re shooting stripes,” Jaskolski called out as the buglike drone rose and jetted across the open sky toward the temple. Jaskolski’s wife, Ann, was monitoring the drone on an iPad. She reached out and adjusted the drone’s landing pad, a gray rubber mat, which was weighed down with a rock to keep the gusty breeze from toying with it. The drone made a burbling sizzle as it darted over the temple. Somewhere in the distance a donkey brayed. A generator coughed and then commenced its low grumbling. “We’re killing it!” Jaskolski called to Bauman, sounding a little like a teenager playing Fortnite. “I’m really crushing the overlap!” Bauman and I hiked along the ridge to another building known as the Blue Chapel. A few crooked fingers of rebar stuck out of some of the rock—evidence that some clumsy restoration had been attempted. 
But otherwise, the structure was untouched, another remnant of the city that Petra once had been, a bustling capital, where lives were lived and lost; an empire etched in time, where the city’s carapace is all that remains. * * * On the far side of the valley from the Treasury, across the plain, Petra’s architects kept another great trick up their sleeve: Ad Deir, or the Monastery. This ancient temple is thought to have been dedicated to a deified Nabatean king named Obodas I, and possesses Petra’s largest carved facade. But the path there gives you no glimpse of it at all. For 40 minutes Ahmed and I clung on as our donkeys climbed up the steep path. I kept my eyes glued to the back of Ahmed’s head so I wouldn’t have to see the sheer drop-off along the edge of the trail. As we made yet another turn with no building in sight, I began to wonder if I had misunderstood our destination. Even when Ahmed stopped and announced that we had arrived, there was nothing to see. The heat was getting to me and I was impatient. I grumbled that I didn’t see anything. “Over there,” Ahmed said, gesturing around a ragged rock wall. When I turned the corner, I was met with the full-frontal view of an enormous facade with an array of columns and doorway-shaped niches, almost 160 feet wide and nearly as tall, carved into a rocky outcropping. It was so startling and beautiful that I gasped out loud. Like so many of the monuments here, the Monastery’s interior is deceptively simple: a single rectangular room with a niche carved into the back wall, which probably once held a stone Nabatean icon. The walls of the niche itself are carved with crosses, suggesting the temple became a church during the Byzantine era—hence the name. The Monastery is said to be the best example of traditional Nabatean architecture—simplified geometric forms, the urn atop a rounded building at the center. 
It is believed that the Monastery’s architect took inspiration from the Treasury but pointedly stripped away most of its Greco-Roman flourishes. There are no statues in the spaces cut between the columns, and overall it’s rougher, simpler. But out here, all alone, in front of a wide stone courtyard where Nabateans and travelers from across the ancient world came to worship or feast, the sight of the Monastery is profound. I stared at Ad Deir for what felt like an eternity, marveling not only at the building but the way it had provided the exquisite pleasure of delayed gratification. When I returned to Ahmed, he was on the phone with his 2-year-old daughter, who was begging to get a new teddy bear on their upcoming trip to town. Ahmed has five other children. His oldest son, Khaleel, also works as a guide in the park. Khaleel had taken me earlier in the day to a ledge above the Treasury, a view even more vertiginous than the trail to Ad Deir. I needed several minutes before I could inch to the edge and appreciate the view. When I steadied my nerves and was able to peek out through squeezed eyes, I could grasp the monumentality of the Treasury—how it loomed, emerging out of the mountainside like an apparition, a building that wasn’t a building, a place that was there but not there. What will it mean to create a perfect model of a place like Petra—one that you might be able to visit sitting in your living room? Will it seem less urgent to see Petra in person if you can stick on a pair of virtual reality goggles and make your way through the Siq, gawk at the Treasury, hike up to the Monastery, and inspect ruins that are thousands of years old? Or will having access to an almost-real version of Petra make it easier for more people to learn about it, and that, in turn, will make more people care about it, even if they never walk over its red rocks or slide their way through the Siq? 
The preservation aspect of projects like Virtual Wonders’ is undeniably valuable; it saves, for posterity, precise images of the world’s great sites, and will allow people who won’t ever have the opportunity to travel this far to see the place and experience it almost as it is. But visiting a place—breathing in its ancient dust, confronting it in real time, meeting its residents, elbowing its tourists, sweating as you clamber up its hills, even seeing how time has punished it—will always be different, more magical, more challenging. Technology makes it easier to see the world almost as it is, but sometimes the harder parts are what make travel memorable. The long climb to Ad Deir, with its scary path and surprising reveal, is what I will remember, long after the specific details of the building’s appearance have faded from my memory. The way Petra is laid out means you work for every gorgeous vision, which is exactly what I imagine the Nabateans had in mind. * * * As soon as I left Petra, I found myself staring at the pictures I had taken and finding it hard to believe I had been there; the images, out of context, were so fantastical that they seemed surreal, a dream of a red stone city dug into the mountainside, so perfectly camouflaged that as soon as you drive the steep road out of the park, it seems to disappear, as if it were never there. In Amman, where signs advertised this fall’s Dead Sea Fashion Week (“Bloggers and Influencers Welcome!”), my driver pulled up to the front door of my hotel and I stepped out, passing a sign directing Fashion Week attendees to the ballroom. The hotel had just opened for business—it was a glossy, glassy building that advertised itself as being in the heart of the new, modern Amman. But ancient Jordan was here as well. The entry was puzzlingly dark and small, with a narrow opening that led to a long hallway with walls that were akimbo, leaning in at some points and flaring out in others, with sharp angles jutting out. 
I inched along, dragging my suitcase and banging a corner here and there. Finally, the dark hall opened wide onto a big, bright lobby, so unexpected that I stopped cold, blinking until my eyes adjusted to the light. The young man at the reception desk nodded at me and asked if I liked the entrance. “It’s something special,” he said. “We call it the Siq.”
https://www.smithsonianmag.com/history/philip-macedonia-even-greater-alexander-the-great-180974878/
Colossal Ambition
I drive on a dirt road in Northern Greece through the ruins and spectral presence of a once-great city. Behind it, cloud shadows move across steep, forested mountains. Small birds dart from bushes. Wind combs the grass. Chunks of limestone, quarried more than 23 centuries ago, protrude from the earth. In the passenger seat, talking and gesticulating, is an archaeologist named Angeliki Kottaridi, a slight, forceful woman in her early 60s with bright coppery-dyed hair. She is the director of operations here at Aigai, the ancient royal capital of Macedonia, now protected by Unesco as one of the most important archaeological sites in Europe. This is where Philip II of Macedon, having conquered nearly all of classical Greece, built his monumental palace in the fourth century B.C. For too long, Philip has been regarded as a minor figure in ancient history, remembered primarily as the father of Alexander the Great. But Philip was a colossus in his own right, a brilliant military leader and politician who transformed Macedonia and built its first empire. At Aigai, it is Philip who looms largest among the ruins, even though the place was vitally important for Alexander too. Excavations have revealed that Philip transformed the ancient city, revolutionized its political culture, and turned it into a symbol of power and ambition. We pass the worn-down remains of the outdoor theater that Philip built near his palace. This is where he entertained dignitaries from across Greece and the Balkans, and where he ultimately met his death in a shocking public assassination. Kottaridi hopes to start excavating and restoring the theater soon, but this is an extremely busy year at Aigai. She and her team are preparing the exhibits for a massive new museum, scheduled to open to the public in January 2021. It will showcase artifacts found at the site—a selection of more than 6,000 items, spanning 13 centuries. 
Meanwhile, digging continues in the vast burial grounds and other parts of the city, and a staff of 75 is working to complete a $22 million partial restoration of Philip II’s palace—the largest building in classical Greece, three times the size of the Parthenon in Athens. For Kottaridi, decades of work are coming to fruition, and for anyone interested in Philip and Alexander, Aigai is now a must-see destination.

This article is a selection from the June 2020 issue of Smithsonian magazine.

And yet there is so much more to learn. “We have excavated only a tiny portion of the site, less than 1 percent, and this has taken decades,” says Kottaridi. “We are constantly making new discoveries, so many that it’s a problem, because we must also preserve what we have, restore the most important structures, write everything up and present our discoveries to the public. There is enough work for three or four lifetimes.” Kottaridi grew up in the northern Greek city of Thessaloniki and studied at the Aristotle University there. Now she lives near Aigai in a house that she shares with a rescue dog and a retinue of 30 cats. Kottaridi doesn’t drive, won’t fly, refuses to use a smartphone, ignores most of her email and has planted more than 1,600 trees at Aigai, mainly for the birds. She has published six books and 150 academic papers, and in 2008 she was awarded the prestigious Golden Cross of the Order of the Phoenix by President Karolos Papoulias of Greece for her contributions to knowledge of the ancient world. “People ask why I have no children,” she says. “It’s really because I adopted Alexander the Great. I fell in love with him when I was young—not the mythical figure but the man. He was so much more than a military genius. He opened up the Silk Road. He built these amazing Hellenistic cities in Tajikistan, Afghanistan, Pakistan, Egypt, with freedom of religion, tolerance for different cultures, equal opportunity. 
And it all began right here in Aigai.” This is where Alexander launched his famous invasion of the Persian Empire. Without denying Alexander’s greatness, it’s important to remember that he was using his father’s army, and that the expedition was Philip’s idea. * * * Kottaridi and her colleagues have found graves and ornamental burial goods dating back perhaps 3,000 years, but Aigai didn’t become a city until the seventh century B.C. That’s when the Temenids, a Macedonian royal dynasty that claimed direct descent from Zeus and Hercules, established their capital here. According to legend, the first Temenid king, Perdiccas, was told by the oracle at Delphi that a herd of white goats would lead him to the site of his kingdom’s capital. Perdiccas followed the goats to the foothills of the Pierian Mountains, overlooking the Haliacmon River as it crosses the wide green Macedonian plain. “The word aigai means ‘goats’ in ancient Greek,” says Kottaridi, as we admire the same view. The culture of the ancient Macedonian people, who originated as herding and hunting tribes north of Mount Olympus, became more Greek under Temenid rule. They spoke a dialect of the Greek language and worshiped Greek gods. “One of the important discoveries at Aigai was the tombstone carvings,” says Kottaridi. “They taught us that everyone here had Greek names. They thought of themselves as Macedonians and Greeks.” In the eyes of sophisticated Athenians, however, they were northern barbarians who mangled the language, practiced polygamy, guzzled their wine without diluting it, and were more likely to brawl at the symposium than to discuss the finer points of art and philosophy. 
The Athenian politician Demosthenes once described Philip II as “a miserable Macedonian, from a land from which previously you could not even buy a decent slave.” When Philip was growing up at the Macedonian court—based at the administrative capital of Pella, with Aigai reserved for royal weddings, funerals and other ceremonial occasions—he learned to hunt, ride and fight in combat. He also studied Greek philosophy, drama and poetry, and absorbed the necessity for ruthlessness in politics. The palace was a viper’s nest of treachery and ambition, and royal children were frequently murdered by rivals to the throne. Macedonia was a violent, unstable, hypermasculine society surrounded by enemies. In 359 B.C., Philip, 23, saw his older brother King Perdiccas III and 4,000 men get slaughtered by the Illyrians, a rebellious warlike people in Upper Macedonia. His other brother had been murdered in a palace conspiracy, and since Perdiccas III’s heir was a small child, the Macedonian Assembly appointed Philip as regent to the throne, and then as king. “He inherited a very old-fashioned tribal kingdom, with an economy based on livestock,” says Kottaridi. “Philip had lived in Thebes for a few years, and he brought new ideas from Greece. He introduced coinage. He turned this city into a politically functioning space, and he completely revolutionized the military.” Macedonia had no full-time professional soldiers, just conscripts and volunteers. Philip instituted regular pay, better training and weapons, a promotion pathway, and a system of cash bonuses and land grants in conquered territories. He invented a highly effective new weapon, the sarissa, a 14- to 18-foot pike with an iron spearhead, and he trained his infantry to fight in a new phalanx formation. Like a traditional Macedonian warrior-king, Philip always led from the front in battle, charging toward the enemy on horseback. 
In addition to minor wounds, he lost an eye to an arrow, shattered a collarbone, maimed a hand and suffered a near-fatal leg wound, which left him limping for the rest of his life. The Greek biographer Plutarch tells us that “he did not cover over or hide his scars, but displayed them openly as symbolic representations, cut into his body, of virtue and courage.” Philip inherited 10,000 part-time infantrymen and 600 cavalry, and built this up to 24,000 infantry and 3,000 cavalry. None of the city-states in Greece had such large standing armies. Nor did they foresee that Philip would use his military, along with cunning diplomacy and seven strategic marriages, to bring nearly all of Greece, a large swath of the Balkans and part of what is now Turkey under ancient Macedonian rule. “This is an incredible achievement for someone they dismissed as a barbarian, and very important for Alexander,” says Kottaridi. * * * Nineteen miles from Aigai, just outside the village of Naoussa, lies a tranquil clearing with caves, springs and ancient carved limestone benches. This is Mieza, or Sanctuary of the Nymphs. When Plutarch came here in the second century A.D., locals told him that this was where Aristotle had tutored the young Alexander. Guidebooks and travel websites impart the same information to modern tourists, and road signs point the way to “Aristotle’s School.” It is immeasurably intriguing that Alexander, the ancient world’s greatest conqueror, was taught by Aristotle, the great philosopher. How did the experience shape Alexander’s intellect, decision-making, interests and outlook? Would history have run a different course if the young prince had been tutored by someone more ordinary? It was Philip’s idea. Alexander, the son of his fourth wife, Olympias, was a bold, headstrong boy of unusual intelligence. When Alexander reached age 13, Philip summoned Aristotle to the Macedonian court. 
There was a connection between the two families: Aristotle’s father had been a friend and court physician to Philip’s father, Amyntas III. There was also bad blood: Philip had razed Aristotle’s hometown of Stagira six years previously and sold most of its inhabitants into slavery. Nonetheless, the two men came to an agreement. Aristotle would instruct Alexander, and in return Philip would rebuild Stagira and resettle its citizens there. For the next three years, Aristotle, a curmudgeonly figure who had small eyes, wore many rings and spoke with a lisp, tutored Alexander in biology, ethics, literature, mathematics, medicine, philosophy, politics, rhetoric and zoology. Plutarch describes the two of them sitting on the stone benches and discussing philosophy, and strolling through nearby orchards and vineyards. Modern guidebooks and history books repeat this romantic description, much to Kottaridi’s annoyance. “It is idiotic!” she says. “From 13 to 16, Alexander and his peers learned how to fight. They would have done this in a gymnasium, a combination of school and military academy, with different areas to sleep, eat, study and fight. There is no evidence of facilities like this at the Mieza sanctuary. There is no room for them!” In fact, Kottaridi’s colleagues have partially excavated the remains of a gymnasium seven miles away, near an ancient theater, and they have dated it to the time of Philip II. To the displeasure of the villagers in Naoussa, for whom “Aristotle’s School” has constituted a tourist attraction since the second century, local archaeologists now believe that Aristotle taught Alexander and probably 150 other students at this gymnasium. Philip likely built it in order to supercharge his elite warrior class, in preparation for his planned invasion of the Persian Empire. I visit the place with Ioannes Graekos, an affable archaeologist who used to work at Aigai and now oversees a museum in the nearby town of Veria. 
There isn’t much to see at the gymnasium site—a few old digs on a large area of overgrown land—because the excavation stalled for lack of funding. Nonetheless, Graekos is able to conjure up what once stood here: a massive two-story building with dining rooms, wrestling and fighting areas, and classrooms. “Alexander and Aristotle probably visited the Mieza sanctuary, because it was so close, and so pleasant, but the real schooling took place here,” he says. Aristotle’s fascination with nature, and his belief in the scientific method, exerted a strong influence on Alexander, who took naturalists with him as he marched his army across Asia. Alexander apparently sent their reports back to Aristotle, accompanied by flora and fauna samples. He also included scientists, engineers and philosophers in his retinue, and opened up intellectual contacts between East and West. When their student-teacher relationship ended in 340 B.C., Aristotle gave his own, annotated copy of the Iliad to Alexander, who carried the book to Asia and famously placed it under his pillow, next to his dagger, while he slept. In one important regard, Alexander and Aristotle disagreed. The philosopher thought that all non-Greek people were barbarians and potential slaves. When Alexander started hiring foreigners in his army and administration, the relationship cooled. “Alexander wanted to expand the world and prove what a mixture of people can do and be,” says Graekos. “He wanted citizenship to mean the same thing for his subjects in Afghanistan and Persia as in Macedonia. This was anathema to Aristotle, who advised Alexander to treat people from other nations as you treat plants and animals.” Anthony Everitt, the British author of the recent biography Alexander the Great, agrees that Aristotle was a hard-core nationalist. 
Talking by phone, he jokingly compares the philosopher to a “Brexiteer.” But he disagrees with Graekos’ and Kottaridi’s portrayal of Alexander as a pan-ethnic idealist who wanted to bring races and creeds together. “Alexander was driven by the excitement of fighting, which he loved, and the Homeric idea that war brought glory,” he says. “Once he had defeated the Persian Empire, he needed a practical way of governing a vast territory with many different languages. His solution was to hire locals. Gradually this led to the blending of cultures.” * * * Angeliki Kottaridi was a 20-year-old archaeology student in 1977 when her professor, Manolis Andronikos, invited her on a dig at Aigai. He had been excavating the tumuli, or burial mounds, near the modern village of Vergina. An English historian, Nicholas Hammond, had suggested that the tumuli and ruined palace belonged to the lost city of Aigai, and Andronikos agreed with him. After the breakup of the Macedonian kingdom by the Romans in the second century B.C., Aigai fell into decline and obscurity. Then, in the first century A.D., a massive landslide buried the city and consigned it to oblivion, although a large burial mound remained clearly visible at the edge of the plain. Andronikos called it the Great Tumulus, and that’s where he and Kottaridi were digging. “I was thrilled that he chose me to help, but it was a very ugly excavation,” she says. “Just earth, earth, earth. Nothing but earth for 40 days. Then the miracle.” Excavating 16 feet down with a small hoe, Andronikos uncovered two royal tombs and dated them to the fourth century B.C. Other royal tombs discovered nearby had been looted in antiquity. But these newly unearthed ones were sealed and intact. That night, with guards posted at the dig, the two researchers barely slept. The following day, they pried open the marble door to the first tomb. 
They stepped into a large, vaulted, double chamber strewn with smashed pottery, silver vases, bronze vessels, armor and weapons, including a golden breastplate and a beautiful gilded arrow quiver. Painted on one wall was a breathtaking frieze depicting Philip II and a young Alexander, both on horseback, hunting lions and other animals. Opening a marble sarcophagus with trembling hands, Andronikos found a small golden coffin, or larnax, with a relief star on the lid. Lifting it, he saw burned bones and a golden wreath. A shiver ran down his spine. He was unable to breathe. If the dating was correct, he was almost certainly holding the bones of Philip II. “It was far too terrifying an idea for my brain to assimilate,” he later wrote. The discovery, widely reported in the news media, was hailed as the archaeological find of the century. (Some archaeologists have disputed that Philip II’s bones were in the golden larnax, but the latest research, and the weight of professional opinion, now indicates that Andronikos was correct.) The following year, with Kottaridi as his assistant, Andronikos unsealed the unlooted tomb of Alexander IV, the son of Alexander the Great. “I was the first to catalog the items coming out of these tombs, to describe, measure and draw them,” Kottaridi says. “An unbelievable honor.” After finishing her dissertation in 1981, she worked as Andronikos’ assistant until he retired in 1989. Kottaridi took charge of Aigai in 1991 and has been overseeing it ever since. “When Manolis was here, we found the theater, the acropolis on the mountain, and four royal tombs,” she says. “Since I’ve been in charge, we have excavated more than a thousand tombs and found sanctuaries, new city districts, farmhouses, streets, fortifications. We have a much clearer idea of the history and the form of the city. It was spread out with different districts serving different functions.” Kottaridi’s plan for Aigai is based on the same principle. 
She has been creating a “Polycentric Museum,” with separate and distinct units scattered over a wide area and integrated with the ongoing archaeology. The Museum of the Royal Tombs, completed in 1993, is a dark, atmospheric, underground space inside the Great Tumulus. Here one can see the tombs, frescoes and spectacular golden grave goods of Philip II, Alexander IV and other kings. The site of the palace is nearly a mile away, on a broad terrace of land in the foothills. On a quiet Sunday afternoon, with Kottaridi in the passenger seat, I drive up there. Here Philip’s immense structure, under restoration by Kottaridi, is rising for the second time. The peristyle, or main courtyard, is 130,000 square feet—room for 8,000 people to gather. “This was a political building, not a home, and it was open to the public,” she says. “It was a place for feasts, political meetings, philosophical discussions, with banqueting rooms on the second floor and a library. The peristyle was flanked by stone colonnades, which we are restoring to a height of six meters. We are redoing all the mosaics on the floor. It is very difficult to find stonemasons and mosaic-makers who can do this work by hand.” The great palace, “utterly revolutionary and avant-garde for its time,” Kottaridi says, was two stories high and visible from the entire Macedonian basin. It was a symbol of Philip’s power and sophistication, a reflection of his ambition, and a retort to the Athenians who had derided him and were now his subjects. Philip’s vast royal complex, covering an area of nearly four acres, larger than any monument in Athens, must have reminded his Greek neighbors that his kingdom had defeated them. By 336 B.C., after little more than two decades on the throne, Philip had transformed Macedonia from a struggling backwater into an imperial superpower. Now he was planning to invade the Persian Empire in Asia Minor. He had already sent an advance contingent of 10,000 troops. 
The rest of the army would join them after the marriage of his daughter Cleopatra (no connection with the Egyptian queen) in October. He turned the wedding into a huge gala for dignitaries and ambassadors from all over Greece and the Balkans. “They crowned Philip with golden wreaths,” says Kottaridi. “The wedding took place right here in the palace and there was a huge feast. The next morning they all gathered at the theater for the final celebration.” It began with a sunrise procession. Twelve men came through the theater holding up statues of the 12 Olympian gods. They were followed by a statue of Philip, suggesting that he had crossed the permeable line between men and gods and was now divine. Then came one-eyed Philip himself, scarred and limping, but radiating power and authority. He wore a white cloak and a golden crown, and most dramatically, he was unarmed. Macedonian men typically wore their weapons, but Philip wanted to convey his invincibility. When he reached the center of the theater, he stopped and faced the cheering crowd. Suddenly one of his bodyguards stabbed him in the chest with a dagger, “driving the blow right through the ribs,” according to the historian Diodorus. Philip fell dead and his white cloak turned red. The assassin sprinted to the city gates, where horses were waiting for him. Three bodyguards who were friends of Alexander gave chase, caught him and killed him on the spot. The assassin was Pausanias of Orestis, in Upper Macedonia, and Philip had recently jilted him for a new male lover. Pausanias was then gang-raped by a man named Attalus and his cronies, and turned over to the stable hands for more sexual abuse. When Pausanias reported this outrage to Philip, the king did nothing. Did Pausanias murder Philip for not punishing Attalus, as some scholars believe? Or was Pausanias the paid instrument of more powerful individuals who wanted Philip dead, as other scholars believe? 
We know that Olympias loathed her husband and yearned for Alexander to take the throne. King Darius III of Persia is another suspect with an obvious motive: Philip was preparing to invade his empire. Prominent Athenians are under suspicion, because they resented Macedonian rule. The finger has also been pointed at Alexander, who had quarreled with his father and would gain the throne with his death. That last theory is a foolish slander against Alexander, Kottaridi says. She suspects a plot by a rival faction of nobles. Palace intrigues had long been a blood sport in Macedonia. The kings at Aigai—Philip was 46—almost never died of old age. * * * The semicircular theater is a short distance from the palace and was built as part of the same complex. For Kottaridi, it is a place of the greatest historical significance, and she yearns to restore it. Standing in the wind, gazing at the grassed-over ruins, she describes the aftermath of Philip’s murder, the chaos and panic, 19-year-old Alexander and his supporters marching up from the theater into the palace, where Alexander swiftly gained the support of the generals and was declared king. She sighs and dabs tears from her eyes. “This is the very place where, in one moment, the history of the world changed for all eternity.” Alexander threw the biggest funeral in Macedonian history for his father. After burning the body on a pyre, attendants retrieved the bones, washed them in wine, wrapped them in purple cloth and laid them in a golden larnax. The larnax was then placed in a sarcophagus and the tomb was sealed. Alexander, facing a revolt in Greece, marched out to crush it, and when he returned to Aigai a year later he threw a party. He invited many of the same dignitaries who had attended Cleopatra’s wedding, and he presented a nine-day drama at the theater where they had witnessed his father’s murder. 
After the celebrations, he launched his invasion of the Persian Empire, carrying out his father’s plan with his father’s army, siege machinery and many of the same generals. Although Alexander was a brilliant commander, and his campaign in Asia would far exceed anything that Philip had imagined, it was his inheritance that made it possible. Without Philip’s war machine, there would have been no Alexander the Great.
https://www.smithsonianmag.com/history/photographic-requiem-americas-civil-war-battlefields-180955746/?no-ist
A Photographic Requiem for America’s Civil War Battlefields
A Photographic Requiem for America’s Civil War Battlefields In “Poem of Wonder at the Resurrection of the Wheat,” Walt Whitman describes a landscape that is oblivious to human suffering, with “innocent and disdainful” summer crops rising out of the same ground where generations lie buried. He published the lyric in 1856, not long before the Civil War transformed peach orchards and wheat fields into vistas of mortal anguish. The Civil War: A Visual History The “Broken Land” photography series, by Eliot Dudik, seems to challenge Whitman’s vision of an indifferent earth: In these battlefield panoramas, the new life of 150 summers can’t seem to displace death. Seasonal change is just another ghostly note in these images. Fresh snow, high cotton—it hardly matters. Moss advances in Shenandoah River bottoms and clouds storm Lookout Mountain, but nature never conquers memory here. The soil still looks red. Dudik, who spent his childhood in Pennsylvania, moved to South Carolina in 2004. “Conversations there always seemed to turn toward the Civil War,” he says, and that made him “realize the importance of remembering and considering.” He embarked on “Broken Land” three years ago, and so far has photographed about a hundred battlegrounds in 24 states. He’s now founding a photography program at the College of William & Mary in Williamsburg, Virginia; this summer, while he’s on break, he hopes to add battlegrounds in three more states. Using an antique view camera that weighs 50 pounds, he typically takes only a single, painstaking picture of each battlefield he visits. He prefers to shoot in winter, and “in rain, and on really overcast and nasty days. Blue sky is kind of my nemesis.” The subdued light makes landscapes look perfectly even. “I avoid the grandiose, the spectacular, the beautiful. It helps the viewer consider what’s being photographed.” In Dudik’s pictures, trees are everywhere. “If I could take pictures of trees for the rest of my life, I would,” he says. 
He likes how their vertical forms balance long horizons, but they are spiritual presences, too. They go gray or blue, depending on the light. They hold the line, beckon, surrender. A frequent contributor to Smithsonian, Abigail Tucker is the author of The Lion in the Living Room: How House Cats Tamed Us and Took Over the World. More information is available at her website: abigailtucker.com
https://www.smithsonianmag.com/history/ping-pong-diplomacy-60307544/
Ping-Pong Diplomacy
Ping-Pong Diplomacy Thirty years ago: April 1972. The Cold War is entering its 26th year with no end in sight. In Vietnam, war still rages. On April 12, a Pan Am 707 lands in Detroit, Michigan, carrying the People's Republic of China's world champion table tennis team for a series of matches and tours in ten cities around the United States. The era of Ping-Pong diplomacy had begun 12 months earlier when the American team—in Nagoya, Japan, for the World Table Tennis Championship—got a surprise invitation from their Chinese colleagues to visit the People's Republic. Time magazine called it "The ping heard round the world." And with good reason: no group of Americans had been invited to China since the Communist takeover in 1949. Why had they been invited? The Chinese felt that by opening a door to the United States, they could put their mostly hostile neighbors on notice about a possible shift in alliances. The United States welcomed the opportunity; President Richard M. Nixon had written: "We simply cannot afford to leave China outside the family of nations." Soon after the U.S. team's trip, Nixon, not wanting to lose momentum, secretly sent his national security adviser, Henry Kissinger, to Peking to arrange a Presidential visit to China. Nixon's journey seven months later, in February 1972, would become one of the most important events in U.S. postwar history. "Never before in history has a sport been used so effectively as a tool of international diplomacy," said Chinese Premier Chou En-lai. For Nixon, it was "the week that changed the world." In February 2002, President George W. Bush, in his second trip to China, recalled the meeting that came out of Ping-Pong diplomacy, telling President Jiang Zemin: "Thirty years ago this week, President Richard Nixon showed the world that two vastly different governments could meet on the grounds of common interest and in a spirit of mutual respect."
https://www.smithsonianmag.com/history/ponce-de-leon-never-searched-for-the-fountain-of-youth-72629888/
Ponce De Leon Never Searched for the Fountain of Youth
Ponce De Leon Never Searched for the Fountain of Youth Half a millennium ago, in 1513, the Spanish explorer Juan Ponce de León departed Puerto Rico for the verdant island of “Bimini”—an uncharted land in what is now the Bahamas. He eventually landed instead in Florida, where he staked a claim for the Spanish Crown and ensured himself a spot in the annals of history. As legend has it, and as scholars have maintained for centuries, Ponce was in search of the Fountain of Youth, a fabled wellspring thought to give everlasting life to whoever bathed in or drank from it. But new scholarship contradicts the old fable and suggests that Ponce was interested not in longevity but political gain. The real story goes something like this: In 1511, messy political squabbling forced Ponce to surrender the governorship of Puerto Rico, an appointment he had held since 1509. As a consolation prize, King Ferdinand offered him Bimini, assuming the stalwart conquistador could finance an expedition and actually find it. J. Michael Francis, a historian at the University of South Florida, St. Petersburg who has spent decades studying the Spanish colonies in the Americas, says no mention of a Fountain of Youth occurs in any known documents from Ponce’s lifetime, including contracts and other official correspondence with the Crown. In fact, Ponce’s name did not become connected with the Fountain of Youth until many years after his death, and then only thanks to a Spanish court chronicler out to discredit him. Gonzalo Fernández de Oviedo y Valdés disliked Ponce, contending that he was gullible, egocentric and dull-witted. The animosity probably had something to do with court politics: Oviedo aligned himself with Diego Columbus, who was the son of Christopher and the man who helped push Ponce out of Puerto Rico. 
In Historia general y natural de las Indias, Oviedo’s account of the Spanish settling of the Americas, he relates a tale in which Ponce, deceived by Indians, goes tromping off on a futile hunt for the Fountain of Youth. It’s all a literary device intended to make Ponce appear foolish. Although visits to spas and mineral baths were common in the 16th century, actually believing water could reverse aging was apparently considered pretty silly. Oviedo’s satiric version of Ponce’s travels stuck. “You’ve got this incredible story that started out as an invention,” Francis says, “and by the 17th century, it has become history.” (For what it’s worth, Ponce died at age 47 after being wounded by an arrow in a fight with an Indian tribe in Florida.) Of course, not all tall tales are codified by the passing years into something approaching fact. Sherry Johnson, a historian at Florida International University, says the myth of Ponce de León and his magical fountain remains because of the romance. “Instinctively, we latch on to it—this idea that we might never get old,” she says. It also fits the self-made mythos of America, a young country where, we’re taught, anything is possible. Florida continues to capitalize on what could be its greatest legend, with hundreds of tourists drinking each day from the stone well at St. Augustine’s Fountain of Youth Archaeological Park. Despite debunking efforts by Francis and others, the story of Ponce’s fountain just won’t die.
https://www.smithsonianmag.com/history/powers-that-be-74850594/
Powers That Be
Powers That Be Presidential historian Robert Dallek is probably best known as the author of An Unfinished Life: John F. Kennedy, 1917-1963. It is to Kennedy, who was sworn into office 50 years ago this month, that Dallek traces a significant expansion of presidential power in this country (“Power and the Presidency” ). The turning point, he says, “was the Cuban missile crisis, when we came closer than any other time during the cold war to a hot war and to a nuclear exchange.” During those fateful 13 days, “Kennedy set up what was known as the ExComm, the executive committee. He did consult with some people in Congress, but they didn’t make policy. It was essentially done by the president himself, with the advice of these advisers.” Since that time, says Dallek, the power to make war and peace has been “very much in the hands of the executive.” Dallek sees his article as “a cautionary tale. We’re not going to want to eliminate executive power or even inhibit the president to too great an extent. But by the same token, we need to pay attention.” John Morthland first wrote about wild hogs for Texas Monthly more than a decade ago after hearing stories about cowboys who roped the wily creatures for fun. “But at that time,” he says, “the invasion was fairly recent. Now, it’s happening on a huge scale everywhere. They are now in 39 states and four Canadian provinces.” Morthland’s article about the animals (“A Plague of Pigs”) raises the question of how to deal with an animal that does a surprising amount of harm. He emphasizes “the need to keep this population contained. People who live in the suburbs think, ‘Well, this is somebody else’s problem. This is never going to affect me.’ But it does. The pigs have moved into the suburbs. They have moved into city parks. So it’s really everybody’s problem.” What in heaven’s name! Our latest Smithsonian Collector’s Edition, Mysteries of the Universe, is now on sale at selected newsstands and bookstores. 
Or you may order it at 1-800-250-1531 or by going to Smithsonian.com/universe. This is an amazing compilation about—among other mind-bending phenomena and events—the recent discoveries of new planets, the expanding universe, black holes and dark energy. It’s everything you ever wanted to know about our solar system and beyond. Far, far beyond. Carey Winfrey was Smithsonian magazine's editor in chief for ten years, from 2001 to 2011.
https://www.smithsonianmag.com/history/priest-abu-grahib-180971013/
Joshua Casteel was 24 years old when he learned he would be sent to Iraq as an interrogator with the 202nd Military Intelligence Battalion. This was his first deployment. It was June 2004, and the war in Iraq had been going on for a little more than a year. Casteel packed a copy of the Book of Common Prayer and didn’t stop reading until he saw the lights of Baghdad in the desert below. From Ali Al Salem Air Base, outside Kuwait City, he took a military bus overnight to Baghdad International Airport. Out his window he saw oil fires, roadside weddings, sand that went on forever. The next day, he suited up in body armor, strapped on his M-16, and took a heavily armored three-vehicle convoy 20 miles outside Baghdad to Abu Ghraib prison. On the way, he was thinking about Pope John Paul II, who wrote about suffering, human dignity and the nature of personhood and its relationship to the divine. Then the commander asked about newcomers: “Who has never done this before?” Casteel raised his hand. The commander explained that they didn’t fire warning shots. “If you move your selector lever from ‘safe’ to ‘semi-automatic,’ you shoot to kill,” he said. Casteel stood 6-foot-1 and weighed 240 pounds. He was a blond, blue-eyed evangelical Christian from Cedar Rapids, Iowa. The deployment came six weeks after the revelation of prisoner torture and abuse at Abu Ghraib shocked the world. An Army intelligence officer and a patriot who’d long dreamed of serving his country in uniform, Casteel also had doubts about the morality of the so-called war on terror. Two weeks before he got his assignment letter from the Army, he was accepted to seminary school. He chose Iraq. His mother, Kristi Casteel, could never picture her son as an interrogator. “He just wasn’t cruel to anyone,” she told me. She worried the job would change him. Casteel tried to rationalize. 
“Better that they have someone like me in the interrogation room,” he told her, “than someone who doesn’t care about the Geneva Conventions, or just wants to drop bombs.” This article is a selection from the January/February issue of Smithsonian magazine Abu Ghraib was already a prison before the Americans arrived, where Saddam Hussein incarcerated, tortured and executed Iraqi dissidents. When Saddam’s regime collapsed, the Americans took the place over and replaced Saddam’s portrait with a banner that read “America is the friend of all Iraqi people.” There was hardly any vegetation, just expanses of dirt and mud between buildings. “At the prison’s edge is a teetering skyline—minaret, palm trees, the mosaic dome of a mosque, rooftops,” Casteel wrote home to his parents. “At sunset I can hear the calls to prayer from the south and from the east. At times it may even appear as if in a round, like choirs of a cathedral, one folded atop the other. But always a few hours after the sun has fallen there is the intermittent echo of small-arms fire, the howling of dogs.” The complex, which now also housed a U.S. military base, had a chapel, a couple of cafeterias, an entertainment shed. When Casteel got to his sleeping quarters, everything was covered in ash. Outside, he saw a plume of smoke from a giant trash pile. The pit burned 24 hours a day, seven days a week. Sometimes the smoke blew right through Casteel’s sleeping quarters. Casteel was told that the military’s top priority, above even the search for Osama bin Laden, was to hunt down Abu Musab al-Zarqawi, the leader of Al Qaeda in Iraq, nicknamed the “Sheik of the Slaughterers.” Casteel’s job would be to interrogate prisoners to learn more about Zarqawi’s chief lieutenant, a man named Omar Hussein Hadid, whose army of insurgents had killed 95 Americans with rocket-propelled grenades and crude bombs during the Battle of Fallujah. For the first week Casteel sat in on interrogations. 
There were six booths on each side of a long hallway; down the center was a two-way mirror that didn’t always work well, and when it didn’t, the prisoners watched you watch them. The rooms held little beyond plastic chairs, cheap tables, maybe zip ties on the chair legs. Sometimes a steel hook was attached to the floor. Every now and then prisoners were led to a more comfortable room, to confuse them, make them relax. The goal was to make them slip up. Sometimes Casteel saw men kept naked. Sometimes they were handcuffed to chairs. During lessons, Casteel’s supervisors explained how to use fabricated stories and charges of homosexuality to shame the prisoners and manipulate them. The commanders were clear about who they were dealing with, Casteel remembered. “These men,” they said, “are the agents of Satan, gentlemen.” * * * I met Casteel in 2009, when we were both graduate students in the writing program at the University of Iowa. We took a class together on the art of memoir, and on the side, Casteel told me, he took courses in philosophy and theology. I was surprised when I learned he had been an interrogator at Abu Ghraib prison. He wasn’t like any soldier I had ever met. He loved to sing solos from Les Misérables and gave frequent sermons at local churches. I often saw him in a corduroy blazer, books piled under one arm. A few years later, I contacted Casteel’s mother, Kristi, because I wished I had gotten to know him better. She invited me to her home in Cedar Rapids and gave me access to a Dropbox account containing Joshua’s many writings and files. The folders had titles like “Heidegger and the Mystery of Pain,” “Flesh and Finitude,” “Heidegger and Sartre on God and Bodies,” “Technologies of Humanness” and “The Rhetoric of Pain.” Kristi said, “Joshua had a complexity about his life.” There were folders for academic papers, diary entries, plays—Casteel got a dual master’s degree in playwriting and nonfiction writing—and many jotted-off musings. 
A small publisher, Essay Press, had put out a short book by Casteel in 2008 titled Letters from Abu Ghraib, composed of selected emails he wrote to friends and family during his six-month deployment. And there were a lot of unfinished projects, including a memoir called No Graven Images. Peeking into Casteel’s files felt a little like having a conversation with him, even if it was one-sided. But there was so much I still wished to know. Casteel often made difficult and even contradictory choices, which to many people who knew him seemed incomprehensible. He was constantly trying to make sense of how his Christianity fit with the war and his time in Iraq. For him, questioning this paradox at the heart of his life was analogous to figuring out the mystery of Christ. “If Jesus is anything,” Casteel wrote in the introduction to his unfinished memoir, “he is incomprehensible. This is my story of wrestling with that incomprehensibility.” * * * Casteel was born into a family of evangelists and raised in Cedar Rapids. His father was an ordained minister with River of Life Ministries, and both of his parents worked as Christian marriage therapists. Joshua was the youngest child of three, and the only boy. For years Casteel soaked up the ecstasy of Pentecostalism, spoke in tongues, attended miracles. On Sundays, he listened to sermons, Scriptures, hymns, and learned about the fight between good and evil. He was a kid driven by questions of meaning and significance. He lived with what people now like to call “intentionality.” He told his mother he wanted to give himself up to a higher cause—either his country, or God, or both. He even told his mother that his calling might include the ultimate sacrifice. He covered his bedroom walls with cutouts from Army brochures and Marine recruiters, the American flag and the U.S. Constitution, and a large wooden cross. 
He attended his first presidential caucus events at age 7, and in high school became president of the local chapter of the Young Republicans. In his parents’ garage he would hold press conferences in a White House built from cardboard, wearing a suit and clip-on tie, his hair parted like Ronald Reagan’s. He got his first gun at 11, during the Gulf War—a .22-caliber rifle with a long-range scope. Rush Limbaugh was a constant presence. So were Billy Graham and Ralph Reed, then head of the Christian Coalition. “On the one hand,” Casteel wrote in his memoir, “the political banter of our ‘fundamentalist’ Christian household hovered around familiar conservative themes: family values, small government, private enterprise (Dad was an entrepreneur). But also always present was what Thomas Friedman refers to as the invisible fist behind the invisible hand in the economy: strong national defense.” Casteel was consumed by feelings of loyalty to America and believed in America as a “Shining City on a Hill.” His father had been a captain in the Army, and his grandfather had fought in World War II, Korea and Vietnam. At his grandfather’s funeral, Joshua placed an old West Point badge in his casket. One summer, at Bible camp, when Casteel was 14 years old, a man named Steve, a self-declared prophet, had a revelation that Casteel was destined to be a powerful and historically significant man. When Steve was kicked out of the ministry for false prophecy, Casteel asked the camp pastor whether the prophecy was still worth anything. “It doesn’t mean it wasn’t true,” the pastor said. “God can speak through a false prophet.” * * * Kristi Casteel describes her son as a happy and affectionate child, obedient as they come. The two forged a close and trusting relationship right from the beginning. One day when Casteel was 3 years old she found him sobbing uncontrollably. He brought her outside. “It’s really bad,” he said. “A little worm is dead.” The worm had dried out in the sun. 
Casteel dug a tiny grave and buried it. “Jesus loves the little wormies,” he told his mother. “All the little wormies of the world.” As a teenager he made small but symbolic acts in the name of God. He torched his collection of unholy CDs. He anointed the high school doorways and baseball dugouts with oil from the Christian bookstore. He blew a shofar from centerfield. His mother said he could sometimes get lonely, staying home on weekends rather than partying or socializing with other teenagers. He didn’t drink or do drugs. Some of his friends took to calling him “Mama’s Boy.” Other classmates thought he was gay because many of his friends were girls, because he acted in school plays and musicals, because he had a hormone imbalance called gynecomastia that gave him breasts. For years, until he had surgery, he was teased in the locker room, and refused to take off his shirt to swim or change backstage during school plays. He and his mother talked about everything—faith, friendships, girls, dreams, disappointments, fears, philosophy, theology, art, literature, music. “We were very much alike in many ways, and just naturally connected on a deep level,” Kristi told me. Joshua was never as close to his father, Everett, who didn’t share his son’s temperament or interests. (In 2010, Everett Casteel died from complications related to a brain tumor.) With his mother, Joshua was always sweet. He gave her a tiny crystal swan, a ragged cotton bunny (she collected bunnies), a pink chiffon blouse, a large print of an angel that he thought looked like her, and a framed poem he wrote about her and the meaning of her name. Casteel was always praying to Mary, the mother of God. For Kristi, it made sense. “We identified with Mary and Jesus—it just seemed to naturally evolve,” she says. “People mentioned his likeness to Christ again and again.” Kristi had always worried that God would take her son. 
She had gone into his bedroom at night when he was a few weeks old and heard God talking: Give him back to me. You need to let him go. She tried to make sense of it. She later thought of the story of Isaac, when Abraham raised a knife above his son’s head to prove his faith in God. “Whenever that fear entered my mind,” she told me, “I reminded myself that all of our children are on loan to us, and I shouldn’t live in fear of something I couldn’t know would happen.” * * * Casteel never forgot Steve’s prophecy, and a month after he turned 17 he enlisted as an Army reservist in Iowa City under the delayed entry program, in part to help his chances of getting accepted to West Point. That summer, between junior and senior year of high school, Casteel joined hundreds of other recruits for haircuts, immunizations and barrack assignments at basic combat training at Fort Leonard Wood, Missouri. At bayonet practice, he learned to impale a body with a footlong knife affixed to an automatic assault rifle. The soldiers thrust bayonets into dummies and repeated a chant: “Kill, kill, kill without mercy, Sergeant! Blood, blood, bright red blood, Sergeant! Blood makes the green grass grow!” Casteel stumbled and found that he couldn’t repeat the words. During his senior year he applied to West Point, Wheaton College and Notre Dame. Casteel prayed to God for a sign, because he was having doubts. He found he was more drawn to a life in academia, maybe even the priesthood, than one in the military. But the rejection letter from Notre Dame arrived the day after his prayer, and this was followed by an acceptance letter from West Point. He matriculated in June 1998. In his civilian duffle he packed a copy of Joseph Conrad’s Heart of Darkness. “Is this where you feel God has led you?” his father asked. Casteel said it was. But he quickly found that he was miserable at the military academy. The academic environment was highly restrictive. 
The cadets lacked creativity and pandered to expectations. He didn’t feel the camaraderie. Sometimes he’d drink a couple of canteens of water at night so he’d have an excuse to read Edgar Allan Poe in the bathroom. Poe had attended West Point. He lasted seven months. Casteel lasted three. The night before he quit he was so nervous he polished a hole clean through his garrison cap. He left on a lonely Saturday in September, not knowing whether he was walking away from God’s plan for his life, or walking into it. Casteel returned home, where he enrolled at the University of Iowa, and he spent the second semester of his senior year abroad, at Oxford University, in a program devoted to Medieval and Renaissance studies. This is where he met Timothy Roth, Jacob Florer and Joseph Clair, who would become lifelong friends. Roth remembers that the first time he saw him, Casteel was listening to Jay-Z, drinking SoCo and reading Heidegger. He wore a black turtleneck and a beret. They busked on the balcony with their guitars and took requests from passers-by. Roth started playing “I Believe I Can Fly” by R. Kelly and sang the first few words, but Casteel took over and he sang the whole song. “He wasn’t a macho dude,” Roth told me. “He was sincere, open, honest, vulnerable. No need to put on a front.” But the attacks of September 11, 2001, four months before he left for Oxford, hung over Casteel’s time abroad, and as the U.S. government began to talk about war Casteel’s demeanor changed. He started lifting weights, getting huge. He talked to friends about quitting school and joining a war that was at first officially named Operation Infinite Justice. Casteel graduated from the University of Iowa in May 2002, and that September he entered the basic interrogator course at Fort Huachuca, in Arizona, home to the Army’s military intelligence center. 
Soon after, he enrolled in the immersive language course at the Defense Language Institute Foreign Language Center, in Monterey, California, to study Arabic. His zeal for serving his country was matched, even fueled, by his faith. While in Monterey, he befriended an Episcopal priest at a nearby church named Father William Martin, and again thought about becoming a priest, maybe even an Army chaplain. Then Casteel traveled to Seattle to attend a lecture by a pacifist theologian from Duke University named Stanley Hauerwas, who believed that Christians ought to remain nonviolent even in times of war. The lecture haunted him. After the talk, Casteel walked up to Hauerwas and asked what a Christian already enlisted in the military should do. Hauerwas replied that he should leave the military right away—get as far away from it as possible. Casteel applied to divinity school, and meanwhile he obsessively read the literature on “just war” theory, pacifism, ethics and international relations, the politics of nonviolence, the writings of Pope John Paul II, of Martin Luther King Jr., Gandhi and the pacifist monk Thomas Merton, as well as the history of the Mennonite tradition and of pacifism in the Roman Catholic and Anglican traditions. Then, in May 2004, he was accepted to seminary at the Graduate Theological Union in Berkeley, California. Two weeks later, his deployment letter to Iraq turned up in his mailbox. In the end he understood what he had to do. He would bring moral order to the interrogation room. He wanted to “be on the first plane,” he wrote in an email, to “ensure that nothing of the sort happens under my watch.” * * * At Abu Ghraib, Casteel woke at 5, exercised, prayed, slipped on 70 pounds of body armor and walked through 120-degree heat to work. Mornings were for preparation, afternoons for interrogations. 
Some mornings he researched insurgent groups, their cell structures and fighting tactics, or scanned satellite images of American and mujahedin traffic-control points. Almost every day he read inventories and personality biographies of terrorists. For some interrogations, he received a pile of background information on the man he was scheduled to interrogate. Other times he walked into the interrogation room knowing nothing at all. He prepared to encounter a range of prisoners, from bloodthirsty adolescents and Al Qaeda fanatics to good people in the wrong place at the wrong time. Mostly he interrogated good, normal people—Iraqi schoolboys, taxi drivers and imams. Between interrogations he read commentaries on the New Testament and Western philosophy and brushed up on his biblical Greek and Aramaic to reread key scriptural passages about the ethics of soldiering and violence. People learned pretty quickly that he’d been accepted to seminary school before his deployment to Iraq. Everybody called him “Priest.” Some fellow soldiers took to confessing their sins to him in the Abu Ghraib bathroom stalls. Casteel soon realized the interrogation room wasn’t all that different from a church confessional, and he imagined himself a priest and the prisoner a confessor. I won’t coerce, he decided, but will guide the Iraqi toward self-disclosure. “To that end, empathy and understanding go a long way,” he wrote to his parents. Everyone wants to be understood, no one wants to carry around their shame and their secrets all alone. Being empathetic, he went on, “forces a person to question the legitimacy of their training and indoctrination. In many ways, I have no other recourse but to identify with these people.” It turned out he couldn’t help but feel bad for the prisoners. It didn’t matter if the prisoner was a wrongly accused farmer or a jihadist bent on Casteel’s destruction. 
His orders commanded that he approach prisoners as assets to manipulate, but when Casteel walked into the interrogation room and saw the prisoner, he thought, This is a man in need of redemption. “From my very first interrogation,” he wrote later, “I have simply lacked the ability to look at the person I interrogate in a way that does not demand I also think about what is best for him.” Soon Casteel was attending confession with an Army chaplain after each interrogation, because “of an overwhelming burden to atone for what I considered the sin of reducing individuals to strategic ‘objects of exploitation.’” Once, he told a prisoner “You are not a criminal, you are not a terrorist,” and the prisoner wept, because no American had ever called him anything but evil. At the same time, Casteel was extracting more information from the prisoners than other interrogators. During interrogations, Casteel smiled a lot and tapped his foot or smoked a cigarette to give the prisoner time to think, or sometimes because he didn’t quite know what to do next. He tried to show respect. He listened more than he spoke. He paid attention to a prisoner’s words, tone of voice, body language. “Some good news came in today,” he wrote to his parents after a month in Iraq. “I was just notified that the results of my past three interrogations received special recognition from ‘higher up.’ I guess my cigarettes and smiles with the ruthless man I spoke briefly of earlier did something profitable for the commanders in the field. That was a big boost of confidence, being as the best thing I did was simply respect him.” But as time went on, he got disillusioned. The Americans, he learned, raided villages and arrested all the males over the age of 14, who were then sent en masse to Abu Ghraib. And after each interrogation, Casteel typed up a report, and sometimes those reports led to a person’s imprisonment or death. 
They might reveal information about a particular village or a family, and soon enough the Americans would raid or drop bombs on the target, and kill innocents in the process. “The weight of the job sometimes is more painfully present to me than other times,” he wrote home. “While I understand quite clearly the role of judgment and wielding authority for the punishment/prevention of crime in society, this is a duty I assume with no joy. I do so because it is what has been asked of me....But how I would much rather speak of Grace with those across my table, and tell them of the alternative to their chosen path.” In August 2004, Casteel was promoted to the rank of sergeant—a goal he’d set from the very beginning. “I wanted the leader’s post of a working man,” he wrote home, “to earn my stripes and my respect from a job well done, earned by exertion, not by my college success.” It no longer felt like something to celebrate. The past weeks had seen him get angrier and angrier—at the execution of the war, the failure of the Americans to find weapons of mass destruction or establish a link between Iraq and the attacks of September 11, at the helplessness and misery of the vast majority of Iraqis, pawns in a larger struggle between violent extremists on one side and an invading foreign army that, even granting their good intentions, routinely killed Iraqi civilians. And he was incensed by the way American politicians used Christian language to justify the war. At lunch Casteel ate with local Iraqi workers instead of Americans, and grew tired of the way the Americans mocked the country. He ate enormous amounts of food to relieve stress, or sometimes he didn’t eat at all. He lost and gained weight, smoked cigarettes, drank excessive amounts of coffee. He stayed up late reading because he didn’t want the next day to come. He read so much that even during mortar attacks he shopped for new books on Amazon. Another book by Stanley Hauerwas, Wittgenstein, Derrida, Aquinas, Kant. 
Another treatise on Christian pacifism and the kingdom of God. He was looking for answers, or an excuse to get out of the war, or an excuse to stay in it. “Every day I wonder if I have reached my limit,” he wrote to his family. “Every day I wonder if I have come to my breaking point.” * * * On October 18, 2004, Casteel was scheduled to interrogate a prisoner from Saudi Arabia who was unusual because he was a self-professed jihadist. By now Casteel had been in Abu Ghraib five months, and had conducted more than 100 interrogations. This one would change the course of his life. What we know of the interrogation is drawn from Casteel’s emails, speeches, journals and other writings. He described the experience with the prisoner as almost mystical, an apotheosis. “I was dumbstruck,” he wrote in a letter home to his mother. “I left praising Christ, and thanking God for this enemy.” Casteel began the interrogation with customary questions: Where are the access points into Iraq from Syria and Saudi Arabia? When did you arrive here? What are your intentions here in Iraq? Then Casteel asked the jihadist why he’d come to Iraq to kill. The jihadist looked at Casteel but didn’t answer. Instead, he asked the same of him. “Why did you come to Iraq to kill?” It was a fair question. It was the question that had haunted him ever since he had arrived, and though he didn’t admit it, Casteel was far more afraid of killing than of being killed. Casteel’s left hand was in his pocket, gripping the crucifix on his rosary. The prisoner counted prayer beads in his right hand. “If there were a knife on the table,” Casteel asked, “would you pick it up and kill me?” “I will have to think about it,” the prisoner said. “What is there to think about?” “Maybe I will pick it up and I will kill you. Maybe I will wait.” Against protocol, Casteel began to speak to the prisoner about the teachings of Christ. 
He told him that the violence condoned by Islam was not the only path in life: Christ had taught another way. The jihadist responded that vengeance was his right, since the Americans were an invading army on Arab lands. “You claim to be a Christian,” he said, “and yet you don’t follow Christ to pray for those who persecute you, or pray for your enemies. Your Lord, our prophet Isa, tells you to turn the other cheek, to love those who hate you. Why do you not do this?” Casteel had spent his whole life preaching the word of Christ, and now his words were being returned to him by a jihadist in a gentle, evangelical tone. Every challenge to the prisoner came back as a challenge to himself. It was clear the jihadist had peace because of his belief in Islam. He told Casteel that if he was put in prison and never released, he would be okay with that, because he was acting justly, in accordance with his faith. Did Casteel have that same kind of peace? “What is terrifying is that this jihadist genuinely wanted my conversion,” Casteel wrote afterward. “And I felt him actually care for me, desire for my good.” Casteel’s job as an interrogator faded to the periphery. Finally, he succumbed. “You’re right,” he said to the jihadist’s gentle admonishments. To his parents afterward, Casteel wrote, “I confessed to him my sins, and asked him to look at his own. I’m certain that this interrogation was not ‘doctrinal’ by Army standards. Pardon my bluntness, but to hell with the Army and their ‘doctrines.’ Today was a moment when life mattered!” After the interrogation, he wrote, “I left and I prayed I would be given the chance to see him one day in the future when I could say, ‘I left that world behind me, so can you.’” * * * Casteel walked out of the interrogation room and told his superiors that if they wanted to continue interrogating this man, it would have to be done by someone else. He went on scheduled leave to Qatar, where he shot off emails to family and friends. 
“So, I just experienced why it is I am here in Iraq,” he began one email. “Other than all the struggles I’ve been wrestling with...I just ‘met’ my reason—a young foreign jihadist who said he might kill me if he had the chance (that is, as long as I am a U.S. soldier in Muslim lands). The Gospel came out of his mouth unwittingly, while trying to convert me to Islam. It was beautiful.” It was soon clear to Casteel what he had to do. He would apply for an honorable discharge as a conscientious objector, which meant he’d have to prove not that he simply opposed the war in Iraq, but that he had a “firm, fixed, and sincere objection to participation in war in any form or the bearing of arms, by reason of religious training and belief.” Not everybody was supportive. He had already been exchanging long, anguished emails full of competing interpretations of Scripture with his father, who as a devoted Christian conservative couldn’t understand his son’s righteous turn against the war. Casteel’s friend Miguel Bowser, a Navy petty officer he’d met at the language institute in Monterey, told me that he thought Casteel’s “CO nonsense,” as he put it, “was a cop-out at first.” He wrote to Casteel, “If you’re going to be a CO take off your uniform, run off into the desert, into Baghdad, and join the church there and live your life.” Father Martin responded even more harshly. “Joshua, please refrain from sending me any more examples of the Narcissistic Personality. The degree of your present self-absorption is indicative of a spiritual sickness that issues forth into the betrayal of many. And besides, I am presently trying to help soldiers and sailors whose obedience and humility reveal much more of Jesus to me than your psychic journey through cloud-cookooland.” Christ had been misunderstood, too. When Casteel returned to Abu Ghraib, he told his commander he could no longer serve in the Army. 
He believed his job as a soldier and interrogator contradicted his obligations as a Christian. It had been two weeks since his encounter with the jihadist. To Casteel’s shock, his commander was encouraging. “He totally supports me,” Casteel wrote to his friend Jacob Florer in manic excitement. “He told me he thought I was one of the best interrogators under his command, that he very much appreciated the way I have handled my duties in light of my ethical dilemmas, that he has been nothing but impressed with my nobility, leadership and maturity (good thing he doesn’t read my tempestuous emails!), and that I am the interrogator with the greatest knowledge of Arabs, the greatest sensitivity and creativity, outright. Needless to say, I was pretty dumbstruck.” During a hearing connected to his application, Casteel said, “To take another’s life is the quintessential statement of divine judgment, and faithlessness toward the possibility of reconciliation and redemption....I wish to end the delusion that good is gained by evil means, or that even maintaining my own economic and physical security is something to be defended by means of violence. I believe that idea to be a lie.” While Casteel waited for approval, the Army gave him something to do outside of the interrogation room. He was given trash duty, which meant he helped to pour jet fuel onto the waste in a different pit from the one burning day and night near his sleeping quarters. The military burned paint and plastic and batteries and ordnance. It burned Styrofoam, petroleum, soda cans, medical waste, human limbs. For weeks, Casteel worked right on the edge of the pit, making sure the fire didn’t spread. * * * He returned to Iowa in January 2005. In May he was honorably discharged as a conscientious objector, one of the few service members to apply for and receive approval in the early years of the Iraq War. It was perhaps the only decision he made with absolute clarity in Iraq. 
Casteel described coming home as “like stepping into the twilight zone.” His second day back, his sister Rebekah brought him to a mall in Cedar Rapids to buy him a leather jacket, a late Christmas present. He slipped the jacket over his arms and looked in the mirror. “Two days earlier I’d been in Kuwait, toting an M-16; three weeks before that, in Iraq, strapped in body armor and aiming that same M-16 at little boys,” he wrote to a friend. “I don’t know how to go from flak vests and assault rifles to fashion cycle jackets.” He suffered from depression and post-traumatic stress disorder. Strangely, he missed the clarity of war. Everything over there came down to life and death. Many nights he drank. “He had a lot of turmoil going on at the time,” says Roth. “He needed redemption, to turn over a new leaf.” That winter, Casteel and Florer, who had traveled to Iraq with a Christian peace organization known as the International Centre for Reconciliation, met up with Roth and Clair to tour the Pacific Northwest and give speeches about their experiences in Iraq. On the cold Oregon coast, Casteel stripped naked to baptize himself in the ocean. “He was just that kind of passionate, unpredictable person,” Roth says. “He felt much better after that catharsis, and it seemed to sharpen his resolve.” Casteel became involved with the nonprofit organizations Iraq Veterans Against the War and the Catholic Peace Fellowship, which supports active military members and veterans who are struggling, as the group puts it, with “the contradiction between their personal participation in war and their consciences.” In 2006, Casteel enrolled in the writing program at the University of Iowa, where he wrote several plays about the dilemmas of soldiers at Abu Ghraib. In Returns, the protagonist is an interrogator named James who, Casteel wrote in the play’s introduction, “suffers from post-traumatic stress disorder and is haunted by the faces of the men he interrogated. 
A man of faith, James is nicknamed Priest by his Army buddies, which he resents for the holiness it implies, and wishes he still possessed.” The play premiered in 2007 in Iowa City, with Casteel in the lead role. Roth attended the opening, and he says he didn’t know just how tormented Casteel was until he saw the play. “It featured so much screaming that if you were in the front you’d be wearing his spit,” he told me. “But it all came down to the last phrase, where it was just him under a spotlight, turning the typical post-tour-of-duty question back on the audience: ‘What was it like?’ That sent chills down my spine. It was like asking the audience, What was this play full of screaming and crying like for you? It was confrontational. Of course, it was also a critique of the absurdity of asking a war vet such a casual question. It was then that I knew I didn’t ever want to ask him that question because it could never be answered. That well was just too deep and dark.” * * * In Iraq, Casteel had started to cough up a thick black mucus. Lots of other soldiers were doing it too. “It’s no big deal,” they said, just “Iraqi crud.” Besides, Casteel had suffered from allergies since he was a child, and Abu Ghraib was full of dust and sand and smoke. Now back in the States, Casteel was often congested and coughing. Doctors at the Veterans Affairs hospital told him he had asthma. He walked out with a prescription for Albuterol, a common asthma medication. When he returned, they prescribed more inhalers. Over the next few years, Casteel wrote and traveled widely, lecturing about his experiences across the United States and in England, Ireland, Sweden and South Korea. He was invited to the Vatican to speak to officials about the church’s role in advocating nonviolence, and a photograph of Casteel shaking hands with Pope Benedict XVI now hangs in the offices at the Catholic Peace Fellowship in South Bend, Indiana. 
During a 2009 lecture at Elizabethtown College, in Pennsylvania, Casteel reminded the audience that Christians make up the vast majority of uniformed American troops. “I have a simple message today as we approach the sixth anniversary of the invasion of Iraq,” he said. “If Christians wish to lessen the spread of evil throughout the world, they need only to refrain from doing it.” The next year, after enrolling in divinity school in Chicago, and getting a job teaching a graduate writing workshop, Casteel began experiencing terrible back pain. He visited the chiropractor but nothing helped. At home in Iowa, his mother drove him to the VA. The doctors diagnosed bronchitis. They sent him home. On Halloween, he coughed so hard something snapped in his back. At the emergency room, the doctors took X-rays for the first time and discovered a mess of tumors in his lungs. The next morning, Casteel called Kristi and told her she needed to come to the hospital. He didn’t tell her that the doctors said it was probably cancer, because he didn’t want her worrying during the drive. She was frantic. In the waiting room, she held a pamphlet she’d tossed into her purse on her way out, and read the same line over and over, something about protecting us from evil, from the onslaught of the enemy. Then the doctors walked in and announced that her 31-year-old son had Stage 4 adenocarcinoma, cancer of the lungs. They said this was rarely found in otherwise healthy young men—mostly it affects geriatric patients and Asian women. Joshua had tumors in his lungs, liver and spine. Later it would spread to his arms, legs and finally to his brain. Kristi fell forward onto the floor and started sobbing. She cried so hard the maintenance man dropped his tools and ran over to comfort her. Joshua came out. “Do you know?” she asked him. He said he knew. “But I don’t think I’m done,” he said. “I have too much to do.” He wanted to finish his PhD. He wanted to start a magazine and a film company. 
He had a memoir to finish. He thought he might want to have a family, or enter the priesthood—he hadn’t given up that dream. I’m not going to die because I haven’t been made a priest yet, he told himself. Casteel visited a lot of specialists. The pain was so bad he slept an hour at a time. He had spinal surgery, radiation therapy, chemo. His hair fell out. Fall turned to winter, then spring, then summer, and not 20 minutes passed without pain. He used a cane to walk. People visited him and prayed. Casteel started to wonder if he was being punished or tested by God for what the Americans had done in Iraq. Hundreds of thousands of innocents killed. Iraq torn apart, close to a decade after the invasion. The wider region was destabilized and only getting worse. Then a friend stopped by to visit, a woman whose father had suffered most of his adult life from exposure to Agent Orange. Do you think maybe the cancer has anything to do with Iraq? she asked. His friend told him respiratory illnesses were being diagnosed in many soldiers returning from the war, and some doctors were connecting them back to the burn pits. The condition was called “Iraq/Afghanistan War-Lung Injury.” Some families of veterans had started a registry to document those affected, and were talking about lawsuits. The burn pits were used on U.S. military bases throughout Iraq and Afghanistan to get rid of waste even though incinerators, safer mechanisms for disposing of the waste, often sat unused. Casteel had slept 100 yards away from an enormous burn pit and in his final weeks he had manned a second one. The connection had never occurred to him. He had never used a mask when he worked. His concerns had been metaphysical. In November 2011, his muscles spasmed from his waist up. Casteel was in agony, lips locked between clenched teeth. At the hospital, Kristi lifted her son and walked him around like a marionette, because the nurses wouldn’t help. 
She had to scream at a nurse in the hallway to persuade her to give Casteel a shot for the pain. Kristi took Casteel home to Cedar Rapids in December. Two weeks later he had a seizure so strong the vertebrae in his spine collapsed. He went back to the hospital, but Casteel believed he had been blessed with a mysterious grace from the very beginning. “I’ll suffer,” he said, “and then I’ll get better.” He was determined to continue his work. He kept teaching and giving speeches as long as he could. He appeared in a PBS documentary called “Soldiers of Conscience.” “I feel a sense of relief that I get to share in the suffering of the Iraqi people,” he told his friends and family. “Because the Americans burned toxins in their fields and on their earth. In a certain sense it’s an opportunity to climb up on the cross with them.” The war was a path for him to become more like Christ. A year or two earlier, Casteel had written in his journal about the time he asked his mother, “When you look down the road of my life, what do you see me doing?” “I think you’re going to follow in the footsteps of Christ,” she’d replied. “Me too,” he’d told her. * * * In early 2012, Casteel traveled with his mother to New York, where he would take part in a clinical trial of an experimental cancer drug. They lived at a housing center for cancer patients and their families in Midtown Manhattan called Hope Lodge. For a while Casteel was able to manage the stairs to the subway. But the cancer mutated and spread. When a nurse told him it was time for palliative care, he said he didn’t like the word “palliative,” which he knew came from the Latin for “to cover or cloak,” disguising the inevitable. Soon Casteel’s spine crumbled. He became emaciated. Movement was so excruciating that his eyes rolled back in his head. His mother and sisters took turns holding his arms above his head for hours, hoping it would relieve the pain. 
One Saturday, on the way to the hospital, Kristi passed a church with a statue that made her pause. It was the limp body of Christ, just after he had been pulled from the cross. That statue looks just like Joshua, she thought. Kristi walked to the hospital. She told her son how much she loved him. How he had been the joy of her life, that she could never have asked for a better son. “I think at that moment I knew what was coming,” Kristi said. In the days that followed he weakened to the point of no recovery. One afternoon Casteel started having a hard time breathing. Kristi fled to the bathroom. She couldn’t bear it. Then she began thinking about Mary, how it must have been for her when Jesus was going to die and return. Kristi had a necklace Joshua had given her with a blue heart, which to her symbolized Mary. In the bathroom, God told her, Don’t desert him now. This is his walk up the hill. He was climbing up on the cross. “When Joshua was near the end, that was the image that God brought to me,” Kristi said. “We were walking up the hill together.” It happened at 3:30 p.m. on a Saturday. * * * “I know that to follow Christ is to suffer,” Kristi said shortly after Joshua died. “And when I’m in my sound mind, and not in desperate pain and loneliness that comes much more often than I want it to, I can genuinely say this suffering is a small price to pay to have had my son for 32 years.” After her son died, Kristi established the Joshua Casteel Foundation, which raises awareness about Iraqis and veterans in need of help, in particular those affected by the burn pits. More than 10,000 veterans of the wars in Iraq and Afghanistan have filed medical claims connected to the pits, and the Department of Veterans Affairs has acknowledged that burn pit exposure is related to at least 2,200 of the cases. Kristi’s foundation also works with a Catholic community center in Alliance, Ohio, named the Joshua Casteel House. 
Near the end of his life, Casteel told his friends and family that he’d found peace. He found it in spite of his illness, in spite of the war, in spite of the world’s imperfection and because he hoped others would keep fighting for peace once he was gone. “I have been given back a hope I remember from childhood,” he wrote to them, “but which has been chastened by suffering and baptized by a voluntary love. And I am a happier man.” David Burnett first got his start in photojournalism as a college student working for Time in 1967. In 50-plus years since, he’s documented some of the standout moments of the 20th century, including the Vietnam War and the Iranian Revolution. In journalism, Burnett told Smithsonian, “you never stop learning and never stop trying to understand the world and never stop discovering things that you didn’t know yesterday.” Burnett, who has photographed veterans and sites of military conflict, worked with Smithsonian for the America at War issue. His images accompany Jennifer Percy’s article “The Priest of Abu Ghraib,” which tells the story of the late Joshua Casteel, an Army interrogator turned conscientious objector. “He was a very sensitive and spiritual young man and died far too young,” Burnett said. Photographing Casteel’s mother was especially powerful: “her dedication to trying to share with that spirituality is something that’s really impressive.” In these two videos, Burnett reminisces about his time jumping out of an airplane with D-Day veterans and his experience as a war photographer in Vietnam.
A Private Tour of the CIA’s Incredible Museum
A chill wind whipped off the Warnow as a retired railroad worker shuffled through the streets of the port city of Rostock one winter night in 1956. He wore the drab clothes typical of East German residents. But when a second man appeared from the shadows, the elderly German revealed that he was wearing a pair of distinctive gold cuff links embossed with the helmet of the Greek goddess Athena and a small sword. The second man wore an identical pair. Wordlessly, he handed the German a package of documents and retreated back into the shadows. The German caught a train for East Berlin, where he handed the package and the cuff links to a CIA courier. The courier smuggled them to the agency’s base in West Berlin—to George Kisevalter, who was on his way to becoming a legendary CIA case officer. The man who retreated back into the shadows was Lt. Col. Pyotr Semyonovich Popov, an officer of the GRU, the Soviet military intelligence agency. Three years earlier, Popov had dropped a note into an American diplomat’s car in Vienna saying, “I am a Soviet officer. I wish to meet with an American officer with the object of offering certain services.” He was the CIA’s first Soviet mole, and Kisevalter was his handler. Popov became one of the CIA’s most important sources through the 1950s, turning over a trove of Soviet military secrets that included biographical details on 258 of his fellow GRU officers. It was Kisevalter who had decided on the cuff links as a recognition signal. He gave them to Popov before Moscow recalled the GRU officer in 1955, along with instructions: If Popov ever made it out of the USSR again and renewed contact with the CIA, whoever the agency sent to meet him would wear a matching set to establish his bona fides. Popov renewed contact after he was assigned to Schwerin, East Germany, and the cuff links worked as intended. He fed Kisevalter information through the retired railroad worker for another two years. 
But after Popov was recalled to Moscow in 1958, he was arrested by the KGB. There are various theories on why he fell under suspicion. However, in a series of interviews two decades ago, Kisevalter told me it was the result of a botched signal: He said George Payne Winters Jr., a State Department officer working for the CIA in Moscow, “got the instruction backward” and mistakenly mailed a letter addressed to Popov at his home. The KGB spotted him in the act and fished the letter out of the mailbox. Popov was doomed. The Soviets expelled Winters from Moscow in 1960, the same year they executed Popov—by firing squad, Kisevalter believed. He told biographer Clarence Ashley he doubted a rumor that Popov had been thrown alive into a furnace as a lesson to other GRU officers, who were required to watch. Today, the cuff links rest in one of the most compelling and least visited museums in the United States. The museum has an extraordinary collection of spy gadgets, weapons and espionage memorabilia from before World War II to the present—more than 28,000 items, of which 18,000 have been cataloged—and hundreds are on display. But the museum is run by the CIA and housed at its headquarters in Langley, Virginia, eight miles outside Washington, D.C. The agency’s entire campus is off-limits to the public, and the museum is open only to CIA employees, their families and visitors on agency business. By special arrangement, Smithsonian magazine was allowed to tour the museum, take notes and photograph select exhibits. Our guide through the looking glass was Toni Hiley, the museum’s director. “Every day, CIA officers help to shape the course of world events,” Hiley said. “The CIA has a rich history, and our museum is where we touch that history.” SILENT THREAT The Hi-Standard .22-caliber pistol is described in the exhibit as “ideal for use in close spaces or for eliminating sentries.” Developed by Stanley P. 
Lovell, the chief of gadgets and weapons for the Office of Strategic Services, the CIA’s World War II predecessor, the long-barreled weapon was flashless and silencer-equipped—designed to kill without making a sound. How quiet was it? According to Lovell’s account, Maj. Gen. William J. “Wild Bill” Donovan, the chief of the OSS, was so eager to show off his agency’s latest lethal gadget that he took a Hi-Standard and a sandbag to the Oval Office. While President Franklin D. Roosevelt was busy dictating to his secretary, Lovell wrote in his book Of Spies and Stratagems, Donovan fired ten rounds into the sandbag. FDR took no notice and never stopped talking, so Donovan wrapped his handkerchief around the still-hot barrel and presented the weapon to the president, telling him what he had just done. Roosevelt is said to have responded, “Bill, you’re the only wild-eyed Republican I’d ever let in here with a weapon.” Donovan gave FDR one of the guns, Hiley told me: “It was displayed in Hyde Park. But the OSS came one day and said they’d have to take it back because it was classified.” THE PURLOINED LETTER As the Nazi regime collapsed in 1945, a young OSS officer sat down to write a letter to his son in the United States. “Dear Dennis,” he wrote, The man who might have written on this card once controlled Europe—three short years ago when you were born. Today he is dead, his memory despised, his country in ruins. He had a thirst for power, a low opinion of man as an individual, and a fear of intellectual honesty. He was a force for evil in the world. His passing, his defeat—a boon to mankind. But thousands died that it might be so. The price for ridding society of bad is always high. Love, Daddy The card on which Richard Helms was writing was a piece of Adolf Hitler’s personal stationery. It bore a gold-embossed eagle holding a swastika above the Nazi leader’s name. 
To the right was printed the word “Obersalzberg,” referring to Hitler’s retreat high in the Bavarian Alps above Berchtesgaden. “I found the letter when I was in high school, in a bunch of scrapbooks my mother kept, but I had no idea of its significance,” Dennis Helms, now 72 and a lawyer in New Jersey, told me. “It just sat there in a suitcase I kept under my bed, tucked away in a scrapbook with the Christmas pictures.” He donated it to the agency in 2011. He says the letter gave him insight into the secretive and private nature of his father, who served as CIA director from 1966 to 1973, when he was dismissed by President Richard M. Nixon. Richard Helms died in 2002. “The letter was a very emotional expression for my father,” he said. “He was not known for emotions. He was all about the facts. He was the most understated guy on the planet. “I knew early on he was in the CIA. When friends asked, I would say he worked for the State Department. They would ask what he did and I said, ‘I don’t know.’ They said, ‘You must be pretty stupid.’ ” When Dennis asked his father how he had managed to snare a piece of Hitler’s stationery, he received a vague answer. Although the letter was dated V-E Day—May 8, 1945—Richard Helms wasn’t even in Germany that day, though he was later stationed in Berlin. Dennis says he wasn’t surprised that his father’s life remained surrounded by mysteries: “I found things in the museum that he had never mentioned.” LISTEN HERE In spy fiction, an electronic bug is usually small enough to fit inside a cellphone or to be sewn into the lining of a jacket an unwitting victim takes to the cleaners. In spy life, an electronic bug can be ten feet long. The bug in this instance is an insulated metal reinforcing bar, one of dozens the KGB embedded in the walls of the U.S. embassy in Moscow, and thus a relic of one of the most awkward episodes in the U.S.-Soviet détente. 
In a purportedly helpful move, the Soviet Union offered to sell the United States precast concrete modules for the building, supposedly to ensure that it would be up to code, and the United States accepted. But mid-construction inspections beginning in 1982, including X-rays, revealed that the Soviets were turning the building into a huge antenna, with some bugs so sophisticated they could transmit each keystroke from the embassy’s IBM Selectric typewriters. After that, the top floors of the embassy were torn down and replaced by a secure “top hat” of four floors. The project took more than four years—and was done by American contractors. PROCEED WITH THE ASSAULT Just two weeks after the terrorist attacks of September 11, 2001, the CIA began inserting personnel into Afghanistan to prepare for the U.S. response to Osama bin Laden and his compatriots in Al Qaeda, and the agency is still active there. The museum’s Afghan Gallery has objects ranging from the patriotic—such as the “Don’t Mess With the U.S.” T-shirt an agency logistics officer bought after she found out she would be deployed in 2003—to the bemusing, such as the photograph of a CIA K-9 explosives-detection team in which the security measures extend to obscuring not only the faces of the three men in the frame, but the dog’s face as well. Among the most sobering are those related to the hunt for bin Laden. The search took ten years, from bin Laden’s disappearance into the Afghan mountains soon after 9/11 to the CIA’s picking up the trail of a courier that led to a compound in Abbottabad, in northeastern Pakistan, in 2011. Surveillance photographs showed a tall man occasionally pacing in the compound’s courtyard. Could it be bin Laden? The agency developed evidence that it was, but analysts could not be sure. After an extensive debate, the Obama administration made a decision: Any assault would be made by a team of Navy SEALs working under the aegis of the CIA. 
Technicians at the National Geospatial-Intelligence Agency, mapmakers for the intelligence community, built three scale models of the compound, Hiley said. The original was used to brief the assault team and President Obama; of the two created for the historical record, one is in the CIA museum. The SEALs also trained on a full-scale mock-up at an undisclosed CIA site. “We don’t say where the training on the mock-up took place, but it was one of the CIA’s covert sites,” Hiley said. The training was widely reported to have taken place in North Carolina. The assault team destroyed parts of the mock-up every day, Hiley said, but it was rebuilt. At the CIA, then-director Leon Panetta awaited word from the White House. If anything went wrong, President Obama would take the blame, but so would he. At 10:35 a.m. on April 29, 2011, Panetta got a call from the president’s national security adviser. He reached for a sheet of stationery bearing the words, “The Director, Central Intelligence Agency, Washington, D.C. 20505” and began writing a memo for the record, which is preserved under glass at the museum: “Received phone call from Tom Donilon who stated that the President made a decision with regard to AC1 [Abbottabad Compound 1]. The decision is to proceed with the assault....The direction is to go in and get Bin Ladin and if he is not there, to get out. Those instructions were conveyed to Admiral McCraven at approximately 10:45 AM.” In the moment he added an extra “c” to the name of then-Vice Adm. William H. McRaven, commander of the U.S. Special Operations Command. The raid proceeded shortly after 1 a.m. on May 2 in Pakistan. After it succeeded, some of the SEALs told agency debriefers the mock-up had been so accurate they felt as if they’d been to the compound before. 
The museum has two artifacts from Abbottabad: a brick from bin Laden’s compound and an assault rifle, a Russian-made AKMS modeled on the Kalashnikov AK-47 but, for reasons unknown, with counterfeit Chinese markings. “The rifle was found next to bin Laden when he was killed,” Hiley said. “So we assume it was his rifle.” The Liberator, or FP-45, never had the cachet of the silent Hi-Standard .22—it fired just one .45-caliber bullet, and that bullet had a tendency to wobble off course beyond a range of 25 feet. But the weapon was designed to be air-dropped to resistance forces behind enemy lines, as much for its psychological value as its dubious firepower. “The idea was, you would use the gun to liberate a better weapon from an enemy,” Hiley explained. In the summer of 1942, “GM made a million of these in three months, and thousands were shipped to China.” The staff of Gen. Dwight D. Eisenhower had little enthusiasm for the weapon, and authorized the dropping of only 25,000, for the French resistance.
https://www.smithsonianmag.com/history/proliferation-happiness-180968468/
The Proliferation of Happiness
It took only ten minutes for Harvey Ball to create the Smiley face. In 1963, the State Mutual Life Assurance Company in Worcester, Massachusetts, hired him to come up with a design that would help raise the morale of its employees. Ball was an artist formally educated at the Worcester Art Museum School and a trained sign painter. After he presented the Smiley face, the company paid him $45 for his work. Neither Ball nor the insurance company took out a trademark. Before too long, tens of millions of buttons with the iconic image (two black marks for eyes and a black grin on a bright yellow background) were in circulation. In the early 1970s, the brothers Murray and Bernard Spain secured a trademark for a combination of the face with the phrase “Have a Happy Day,” later changed to “Have a Nice Day.” The rest is history—images and sayings that we are all familiar with. Finally, in 1999, Ball created the World Smile Corporation to license one version of the image. He used the proceeds to help improve the lives of children, and his son Charles said that his father was not sorry that he made so little money off what he wrought. "He was not a money-driven guy, he used to say, 'Hey, I can only eat one steak at a time, drive one car at a time.'" Ball died in 2001 at age 79, too soon to witness the full flowering of positive psychology and happiness studies, scholarly fields that combine Eastern religions, neuroscience, evolutionary biology, and behavioral economics—but above all represent a shift of focus among some psychologists from mental illness to mental health, from depression and anxiety to subjective well-being. When a cultural movement that began to take shape in the mid-twentieth century erupted into mainstream American culture in the late 1990s, it brought to the fore the idea that it is as important to improve one's own sense of pleasure as it is to manage depression and anxiety. 
Ball’s own commitments underscore two key findings of positive psychology, insights based on science. Although some of these insights were available before he died, it is unlikely he knew about them—and yet, he lived them. If there was a moment when positive psychology emerged on the American scene with organizational heft, it was in 1998, when University of Pennsylvania psychology professor Martin Seligman delivered the presidential address at the American Psychological Association, in which he defined positive psychology as “a reoriented science that emphasizes the understanding and building of the most positive qualities of an individual: optimism, courage, work ethic, future-mindedness, interpersonal skill, the capacity for pleasure and insight, and social responsibility.” Harvey Ball didn’t need psychologists to tell him of their discovery of the Helper’s High, the pleasure that a person gets from giving, the basis of the link between altruism and a sense of well-being. Nor did he need to read the research that demonstrated that above a certain level of income ($70,000 is the one most commonly mentioned), additional income provides only marginally meaningful increments of happiness. As with almost any finding in a new and burgeoning scientific field, claims about the impact of greater income are contested. However, they led to important consequences. The caution that more income above a certain level did not necessarily enhance positivity caused some political activists to call for a more egalitarian distribution of income; studies of the relationship between a nation’s Gross Domestic Product and its citizens’ well-being seem to reinforce that push. The World Happiness Report—an annual survey conducted since 2012—determined that citizens of Finland, the Netherlands, and Denmark report more life satisfaction than do those living in the United States, which has a higher GDP per capita. 
Ball would no doubt have evidenced a grin on his own face when in 2015, Dan Price, the head of Gravity Payments, a Seattle credit-card-processing firm, having learned that incomes over $70,000 do not make people appreciably happier, decided to reduce his own salary from $1 million to $70,000 and increase those of his employees to at least $70,000. The move is still paying dividends. Of course, just as international comparisons are controversial, so too was Price’s decision. His brother, who had co-founded Gravity Payments, unsuccessfully sued him. Rarely have academic findings so quickly influenced a culture. Some of this is coincidence, representative of how experimental findings and cultural change occur simultaneously but independently. In the mid-1990s, Oprah Winfrey reconfigured her show to shift from a focus on personal problems to opportunities for personal growth. Positive psychology might have gained significant traction under different conditions, but television evangelism, TED talks, and the proliferation of apps and websites devoted to aspects of positive psychology and self-improvement, along with Oprah’s enterprises, greatly accelerated and amplified the field’s reach. Support from private foundations and government agencies also helped launch, build, and define the field’s presence, inside and more notably outside university walls. So, too, did opportunities to spread happiness via positive coaching and positive institution building. While some assertions of positive psychology can be questioned—there are those who say its practitioners have moved too quickly from experimental findings to bold assertions, as well as those, relying on the works of Marx and Foucault, questioning its politics—certain insights are indeed significant. 
Investigations underscore the connection between physical health and mental well-being, the importance of social relationships, what we can (our perspectives) and cannot (our genetic composition) control, and the benefits of character strengths such as grit and compassion. Ball’s influence, too, has been pervasive. In January 2005, Time Magazine placed multiple Smiley faces on its cover and announced that inside readers could learn of “The Science of Happiness”—and answers to why optimists live longer, whether God wants us to be happy, and if joy is in our genes. In January 2009, Psychology Today put a Smiley face on its cover, and announced that with the number of books on happiness growing from 50 published in 2000 to 4,000 published 8 years later, a “happiness frenzy” had arrived. “Herein,” the cover story promised, “we report the surest ways to find well-being.” Then in July 2016 Time offered a special edition, on “The Science of Happiness” with no less than 15 smiley faces—one with a halo, one with two hearts, and one with a blinking eye. Inside were “NEW DISCOVERIES FOR A MORE JOYFUL LIFE,” including an emphasis on relationships, meditation, and exercise. Had Harvey Ball lived to see these covers, he likely would have smiled.
https://www.smithsonianmag.com/history/proposal-change-vocabulary-we-use-when-talking-about-civil-war-180956547/
A Proposal to Change the Words We Use When Talking About the Civil War
A new generation of scholarship, not to mention high-profile films like 12 Years a Slave and Lincoln, has changed the way that the public understands American history, particularly slavery, capitalism, and the Civil War. Our language should change as well. The old labels and terms handed down to us from the conservative scholars of the early to mid-20th century no longer reflect the best evidence and arguments. Terms like “compromise” or “plantation” served either to reassure worried Americans in a Cold War world, or uphold a white supremacist, sexist interpretation of the past. But the Cold War is over, and historians, and in turn the general public, must consider rejecting faulty frameworks and phrases. We no longer call the Civil War “The War Between the States,” nor do we refer to women’s rights activists as “suffragettes,” nor do we call African-Americans “Negroes.” Language has changed before, and I propose that it should change again. Legal historian Paul Finkelman has made a compelling case against the label “compromise” to describe the legislative packages that avoided disunion in the antebellum era. In particular, Finkelman has dissected and analyzed the deals struck in 1850. Instead of the “Compromise of 1850,” which implies that both North and South gave and received equally in the bargains over slavery, the legislation should be called the “Appeasement of 1850.” Appeasement more accurately describes the uneven nature of the agreement. 
In 1849 and 1850, white Southerners in Congress made demands and issued threats concerning the spread and protection of slavery, and, as in 1820 and 1833, Northerners acquiesced: the slave states obtained almost everything they demanded, including an obnoxious Fugitive Slave Law, enlarged Texas border, payment of Texas debts, potential spread of slavery into new western territories, the protection of the slave trade in Washington, D.C., and the renunciation of congressional authority over slavery. The free states, in turn, received almost nothing (California was permitted to enter as a free state, but residents had already voted against slavery). Hardly a compromise! Likewise, scholar Edward Baptist has provided new terms with which to speak about slavery. In his 2014 book The Half Has Never Been Told: Slavery and the Making of American Capitalism (Basic Books), he rejects “plantations” (a term pregnant with false memory and romantic myths) in favor of “labor camps”; instead of “slave-owners” (which seems to legitimate and rationalize the ownership of human beings), he uses “enslavers.” Small changes with big implications. These far more accurate and appropriate terms serve his argument well, as he re-examines the role of unfree labor in the rise of the United States as an economic powerhouse and its place in the global economy. In order to tear down old myths, he eschews the old language. Similar changes and constructions should be made surrounding the language we use for the Civil War.  
I suggest that we drop the word “Union” when describing the United States side of the conflagration, as in “Union troops” versus “Confederate troops.” Instead of “Union,” we should say “United States.” The employment of “Union” instead of “United States” implicitly supports the Confederate view of secession wherein the nation of the United States collapsed, having been built on a “sandy foundation,” as Alexander Stephens, the Vice President of the Confederacy, put it in his “Cornerstone Speech.” In reality, however, the United States never ceased to exist. The Constitution continued to operate normally; elections were held; Congress, the presidency, and the courts functioned; diplomacy was conducted; taxes were collected; crimes were punished. Yes, there was a massive, murderous rebellion in at least a dozen states, but that did not mean that the United States disappeared. The dichotomy of “Union v. Confederacy” lends credibility to the Confederate experiment and undermines the legitimacy of the United States as a political entity. The United States of America fought a brutal war against a highly organized and fiercely determined rebellion; it did not stop functioning or morph into something different. We can continue to debate the nature and existence of Confederate “nationalism,” but that discussion should not affect how we label the United States during the war. Compromise, plantation, slave-owners, Union v. Confederacy, etc.: these phrases and many others obscure rather than illuminate; they serve the interests of traditionalists; they do not accurately reflect our current understanding of phenomena, thus they should be abandoned and replaced. Let us be careful and deliberate with our wording; though we study the past, let us not be chained to it. This article was first published on the History News Network. 
Michael Todd Landis, an assistant professor of history at Tarleton State University, is the author of Northern Men with Southern Loyalties: The Democratic Party and the Sectional Crisis (Cornell, 2014). https://DrMichaelLandis.com
https://www.smithsonianmag.com/history/ptsd-civil-wars-hidden-legacy-180953652/
Did Civil War Soldiers Have PTSD?
In the summer of 1862, John Hildt lost a limb. Then he lost his mind. The 25-year-old corporal from Michigan saw combat for the first time at the Seven Days Battle in Virginia, where he was shot in the right arm. Doctors amputated his shattered limb close to the shoulder, causing a severe hemorrhage. Hildt survived his physical wound but was transferred to the Government Hospital for the Insane in Washington, D.C., suffering from “acute mania.” Hildt, a laborer who’d risen quickly in the ranks, had no prior history of mental illness, and his siblings wrote to the asylum expressing surprise that “his mind could not be restored to its original state.” But months and then years passed, without improvement. Hildt remained withdrawn, apathetic, and at times so “excited and disturbed” that he hit other patients at the asylum. He finally died there in 1911—a casualty of a war he’d volunteered to fight a half-century before. The Civil War killed and injured over a million Americans, roughly a third of all those who served. This grim tally, however, doesn’t include the conflict’s psychic wounds. Military and medical officials in the 1860s had little grasp of how war can scar minds as well as bodies. Mental ills were also a source of shame, especially for soldiers bred on Victorian notions of manliness and courage. For the most part, the stories of veterans like Hildt have languished in archives and asylum files for over a century, neglected by both historians and descendants. This veil is now lifting, in dramatic fashion, amid growing awareness of conditions like post-traumatic stress disorder. A year ago, the National Museum of Civil War Medicine mounted its first exhibit on mental health, including displays on PTSD and suicide in the 1860s. Historians and clinicians are sifting through diaries, letters, hospital and pension files and putting Billy Yank and Johnny Reb on the couch as never before. 
Genealogists have joined in, rediscovering forgotten ancestors and visiting their graves in asylum cemeteries. “We’ve tended to see soldiers in the 1860s as stoic and heroic—monuments to duty, honor and sacrifice,” says Lesley Gordon, editor of Civil War History, a leading academic journal that recently devoted a special issue to wartime trauma. “It’s taken a long time to recognize all the soldiers who came home broken by war, just as men and women do today.” Counting these casualties and diagnosing their afflictions, however, present considerable challenges. The Civil War occurred in an era when modern psychiatric terms and understanding didn’t yet exist. Men who exhibited what today would be termed war-related anxieties were thought to have character flaws or underlying physical problems. For instance, constricted breath and palpitations—a condition called “soldier’s heart” or “irritable heart”—was blamed on exertion or knapsack straps drawn too tightly across soldiers’ chests. In asylum records, one frequently listed “cause” of mental breakdown is “masturbation.” Also, while all wars are scarring, the circumstances of each can wound psyches in different ways. The relentless trench warfare and artillery bombardments of World War I gave rise to “shell shock” as well as “gas hysteria,” a panic prompted by fear of poison gas attacks. Long campaigns in later conflicts brought recognition that all soldiers have a breaking point, causing “combat fatigue” and “old sergeant’s syndrome.” In Vietnam, the line between civilians and combatants blurred, drug abuse was rampant and veterans returned home to an often-hostile public. In Iraq and Afghanistan, improvised explosive devices put soldiers and support personnel at constant risk of death, dismemberment and traumatic brain injury away from the front. 
Civil War combat, by comparison, was concentrated and personal, featuring large-scale battles in which bullets rather than bombs or missiles caused over 90 percent of the carnage. Most troops fought on foot, marching in tight formation and firing at relatively close range, as they had in Napoleonic times. But by the 1860s, they wielded newly accurate and deadly rifles, as well as improved cannons. As a result, units were often cut down en masse, showering survivors with the blood, brains and body parts of their comrades. Many soldiers regarded the aftermath of battle as even more horrific, describing landscapes so body-strewn that one could cross them without touching the ground. When over 5,000 Confederates fell in a failed assault at Malvern Hill in Virginia, a Union colonel wrote: “A third of them were dead or dying, but enough were alive to give the field a singularly crawling effect.” Wounded men who survived combat were subject to pre-modern medicine, including tens of thousands of amputations with unsterilized instruments. Contrary to stereotype, soldiers didn’t often bite on bullets as doctors sawed off arms and legs. Opiates were widely available and generously dispensed for pain and other ills, causing another problem: drug addiction. Nor were bullets and shells the only or greatest threat to Civil War soldiers. Disease killed twice as many men as combat. During long stretches in crowded and unsanitary camps, men were haunted by the prospect of agonizing and inglorious death away from the battlefield; diarrhea was among the most common killers. Though geographically less distant from home than soldiers in foreign wars, most Civil War servicemen were farm boys, in their teens or early 20s, who had rarely if ever traveled far from family and familiar surrounds. Enlistments typically lasted three years and in contrast to today, soldiers couldn’t phone or Skype with loved ones. 
These conditions contributed to what Civil War doctors called “nostalgia,” a centuries-old term for despair and homesickness so severe that soldiers became listless and emaciated and sometimes died. Military and medical officials recognized nostalgia as a serious “camp disease,” but generally blamed it on “feeble will,” “moral turpitude” and inactivity in camp. Few sufferers were discharged or granted furloughs, and the recommended treatment was drilling and shaming of “nostalgic” soldiers—or, better yet, “the excitement of an active campaign,” meaning combat. At war’s end, the emotional toll on returning soldiers was often compounded by physical wounds and lingering ailments such as rheumatism, malaria and chronic diarrhea. While it’s impossible to put a number on this suffering, historian Lesley Gordon followed the men of a single unit, the 16th Connecticut regiment, from home to war and back again and found “the war had a very long and devastating reach.” The men of the 16th had only just been mustered in 1862, and barely trained, when they were ordered into battle at Antietam, the bloodiest day of combat in U.S. history. The raw recruits rushed straight into a Confederate crossfire and then broke and ran, suffering 25 percent casualties within minutes. “We were murdered,” one soldier wrote. In a later battle, almost all the men of the 16th were captured and sent to the notorious Confederate prison at Andersonville, where a third of them died from disease, exposure and starvation. Upon returning home, many of the survivors became invalids, emotionally numb, or abusive of family. Alfred Avery, traumatized at Antietam, was described as “more or less irrational as long as he lived.” William Hancock, who had gone off to war “a strong young man,” his sister wrote, returned so “broken in body and mind” that he didn’t know his own name. Wallace Woodford flailed in his sleep, dreaming that he was still searching for food at Andersonville. 
He perished at age 22, and was buried beneath a headstone that reads: “8 months a sufferer in Rebel prison; He came home to die.” Others carried on for years before killing themselves or being committed to insane asylums. Gordon was also struck by how often the veterans of the 16th returned in their diaries and letters to the twin horrors of Antietam and Andersonville. “They’re haunted by what happened until the end of their lives,” she says. Gordon’s new book on the 16th, A Broken Regiment, is but one of many recent studies that underscore the war’s toll on soldiers. In another, Living Hell: The Dark Side of the Civil War, historian Michael Adams states on the first page that his book describes “the vicious nature of combat, the terrible infliction of physical and mental wounds, the misery of soldiers living amid corpses, filth, and flies.” Not all scholars applaud this trend, which includes new scholarship on subjects such as rape, torture and guerrilla atrocities. “All these dark elements describe the margins not the mainstream of Civil War experience,” says Gary Gallagher, a historian at the University of Virginia who has authored and edited over 30 books on the war. While he welcomes the fresh research, he worries that readers may come away with a distorted perception of the overall conflict. The vast majority of soldiers, he adds, weren’t traumatized and went on to have productive postwar lives. Tony Horwitz was a Pulitzer Prize-winning journalist who worked as a foreign correspondent for the Wall Street Journal and wrote for the New Yorker. He is the author of Baghdad without a Map, Midnight Rising and the digital best seller BOOM. His most recent work, Spying on the South, was released in May 2019. Tony Horwitz died in May 2019 at the age of 60.
0641c752872c25df6f6d86f54176bda3
https://www.smithsonianmag.com/history/puzzle-given-ellis-island-immigrants-test-intelligence-180962779/
This Jigsaw Puzzle Was Given to Ellis Island Immigrants to Test Their Intelligence
This Jigsaw Puzzle Was Given to Ellis Island Immigrants to Test Their Intelligence The face puzzle, a box of wooden jigsaw pieces, looks like a child’s game, a primitive, flattened-out version of Mr. Potato Head. Start with the biggest piece, a half-inch-thick hunk of wood shaped like a head. Place the others where they belong: the eye-shaped piece, the nose, the mouth and several more that together form an ear. Finish it and you have a profile of a bald man with sharp features smiling a tight little smile. Imbeciles: The Supreme Court, American Eugenics, and the Sterilization of Carrie Buck Howard Andrew Knox: Pioneer of Intelligence Testing at Ellis Island The wooden puzzle might look like fun, but it was anything but that to the men and women who were once required to solve it. The Feature Profile Test, in the collections of the Smithsonian National Museum of American History, was administered to immigrants at Ellis Island in the early 20th century. Those who failed to assemble it correctly could be labeled “feebleminded” and sent back home. The Feature Profile Test encapsulates the complex feelings America had toward the immigrants of its time. It was a tool for ushering suitable foreigners into citizenship—and for turning others away. It constituted an idealistic effort to be fair—while at the same time being cruelly unjust. Yet it represents an almost benign era in American immigration history—because what followed would be far worse. New York’s Ellis Island was, from 1892 to 1954, the nation’s main immigration gateway, which some 12 million people passed through. For these new arrivals, who in many cases came from simple rural villages, Ellis Island could be a frightening place—a bedlam of unruly crowds and indecipherable tongues, presided over by grim-faced immigration officers. 
Immigrants in the early 1900s were examined for physical and mental illness, questioned about their ability to support themselves financially, and challenged on whether they held radical views. As part of the inquisition, the U.S. Public Health Service administered primitive intelligence tests. “The purpose of our mental measuring scale at Ellis Island,” Howard A. Knox explained in 1915, “is the sorting out of those immigrants who may, because of their mental make-up, become a burden to the State or who may produce offspring that will require care in prisons, asylums, or other institutions.” It was Knox, a physician, who developed the Feature Profile Test, which he administered from 1912 to 1916. (Knox resigned his post at Ellis Island that year, eventually establishing a practice as a country doctor in New Jersey.) The puzzle represented a progressive reform of sorts. Before it, the public health service measured intelligence with traditional I.Q. tests, whose questions required cultural and linguistic knowledge that many immigrants did not have, causing perfectly intelligent people to test as “imbeciles.” The Feature Profile Test relied on more universal knowledge—around the world, noses and ears are in the same places. And it could be “administered with minimal use of language, ideally by use of pantomime alone on the part of both examiner and examinee,” notes John T.E. Richardson, author of Howard Andrew Knox: Pioneer of Intelligence Testing at Ellis Island. For all of the democratic impulses behind it, the Feature Profile Test nevertheless could be viewed as an outgrowth of a deplorable ideology. American immigration policy of the time was grounded in eugenics, the pseudoscience of trying to uplift humanity by preventing the “unfit” from having children or, if they lived outside the country, keeping them out. When Knox administered the Feature Profile Test, the stakes were high, and the conditions far from ideal. 
Typically, the test-takers had just arrived after a long voyage aboard ship, often in horrific conditions, and they were in a foreign land. They might be sleep-deprived, depressed or ill. And they might never have taken a test before. If they did not complete the puzzle in five minutes, that failure—along with other factors the doctors weighed—could lead to a mother being ripped from her family and shipped back to the Old World. Immigrants were turned back often enough, for a variety of reasons, that Ellis Island earned the nickname “The Island of Tears.” In the fiscal year ending June 30, 1914, 957 of the more than one million immigrants examined—nearly one per 1,000—were deported as mentally defective. As crude as the puzzle test may seem today, it reflected the belief that healthy immigrants should be admitted. Within a decade, though, anti-immigrant, eugenic and racist forces would persuade Congress to pass the Immigration Act of 1924, which dramatically cut back immigration of Italians, Eastern European Jews and other groups considered undesirable. The immigrants who were shut out of America—including many Jews who would, only a short time later, try to flee the Holocaust—would have gladly taken their chances with Dr. Knox’s wooden puzzle. This article is a selection from the May issue of Smithsonian magazine
0f6285c7484a863fc335f4153fff2e78
https://www.smithsonianmag.com/history/recapping-the-jetsons-episode-07-the-flying-suit-109680696/
Recapping ‘The Jetsons’: Episode 07 – The Flying Suit
Recapping ‘The Jetsons’: Episode 07 – The Flying Suit This is the seventh in a 24-part series looking at every episode of “The Jetsons” TV show from the original 1962-63 season. The seventh episode of “The Jetsons” premiered on American television November 4, 1962, and was titled “The Flying Suit.” In this episode we’re introduced to Mr. Cogswell (we don’t learn until the 1980s that his first name is Spencer) whose company Cogswell’s Cosmic Cogs is Mr. Spacely’s direct competitor. We discover that the cigar-chomping Cogswell is trying to merge with Spacely Sprockets in a sort of 21st century semi-hostile takeover. Cogswell’s company has developed the X-1500 flying suit which will likely force Spacely Sprockets to sell out to Cogswell, but thanks to a comedic mix-up at the 30-second dry cleaners, George winds up with the flying suit, depriving Cogswell of his invention. Both companies are confused about the source of their respective powers (and lack thereof) after the mistake at the cleaners and George is convinced that his son Elroy has developed a pill that allows people to fly. But after both sides return to the cleaners and the mix-up is rectified (unbeknownst to both parties) the status quo is restored, with George returning to his regular job and the two companies returning to their bitter rivalry manufacturing cogs and sprockets. Cogswell’s Cosmic Cogs, introduced in the Jetsons episode “The Flying Suit” Life on the Ground As I mentioned last week, the sixth episode of the series, titled “The Good Little Scouts” shows what might be our first glance at the ground. The Jetsons’ world is largely made up of many buildings on platforms in the sky — but often we get a look at something ambiguous; something that may be resting on the earth. In “The Flying Suit” we get our first look at something more clearly on the ground. Strangely enough, that something is a bird. 
A bird on the ground in the seventh episode of ‘The Jetsons’ in 1962 “What’s happening on the ground?” is one of the most common questions people have when they work from vague memories of The Jetsons, having watched the show as children. Last week someone vandalized the Wikipedia page for “The Jetsons,” inserting a story about why the people of the future live up in the sky: apparently zombies had attacked and forced people to build homes where they couldn’t be preyed upon by the undead. This, of course, isn’t true (though someone has no doubt written up this fanfic already). What is true is that we do get a few glimpses of life on the ground in the year 2062. Aside from the bird who has been forced to live on the ground thanks to so many humans zipping around in the sky, we learn that hobos and layabouts live on lower levels. Perhaps more jarring than meeting a character in poverty is the revelation that people can still be in such a situation a hundred years hence. It’s obviously not given a lot of screen time (and only serves to assist a joke) but the idea that poor people still exist in the year 2062 is counter to many of the post-scarcity narratives so prominent in 20th century futurism. Americans were told, even in the depths of the Depression, that the people of the 21st century would be capable of providing for everyone; that a new form of economics would evolve wherein no one would do without the most basic of goods. In fact, people would thrive and the evolution of humanity and the American economy itself would mean that no one could go hungry. But just as the Jetsons sought to project the model American family into the future without challenging any social norms, the world of 1962 American poverty (albeit a cartoonish version of it) is projected into 2062. 
A hobo living on a lower level in the seventh episode of ‘The Jetsons’ from 1962 Jetpack Lite: The Flying Suit of the Future Bell Aerospace’s rocket belt in Hopi Buttes, Arizona (source: USGS 1966) As we’ve seen time and again while exploring the world of “The Jetsons,” the show takes many plausible, futuristic ideas of the 1950s and early ’60s and adds a heightened cartoon twist. In this episode the idea of personal flight machines — jetpacks of the early 1960s, which were becoming more plausible with each passing day — was done away with to provide a comedic storyline of futuristic travel. Since the dawn of humanity it seems we’ve been fascinated with flight. Powered flight is a relatively recent invention, and it strikes me as something special to live in a time when we can know such common-sense-defying thrills as human flight. Yet for many retro-futurists of today, we’re still waiting on that jetpack. Wendell F. Moore applied for a patent in 1960 and on February 13, 1962 was granted patent number 3,021,095 for his rocket belt. I use the term “jetpack” because it’s more commonly understood as the personal aircraft device that people of the retro-future would zip around on. But as Mac Montandon explains quite well in his 2008 book Jetpack Dreams, the devices researched and developed successfully at Bell Aerospace in the early 1960s are more appropriately named rocket belts. The patent explicitly explains the desire for the rocket belt to be used by military personnel, but much like other innovations of the American military, the public expected that they would one day get a jetpack of their very own. From the 1960 propulsion unit patent of Wendell Moore and Bell Aerospace in New York: For a number of years, there has been a need for increasing the mobility of military personnel, for example, infantrymen, by way of providing some means to directly lift and transport an individual soldier. 
It is of primary concern in connection with the present invention to provide such means in the form of a safe, reliable and easily controllable rocket propulsion system having sufficient total impulse to lift and propel an individual for distances up to approximately two or three miles. It is a further object of this invention to provide a device in accordance with the above which is capable of being utilized by the average soldier with an absolute minimum of training. That desire to achieve “two or three miles” was the largest hurdle that the jetpack would face, as it’s not efficient to propel a person in such a manner — you simply can’t store and burn enough fuel in such a compact device to make it a practical means of transportation. Thus, the jetpack has been relegated to concerts and Super Bowls as an entertaining spectacle. George Jetson wearing the flying suit We may not have a jetpack, and we may not be living on platforms in the clouds, but take solace my fellow retro-futurists: the world still has 50 years to deliver on the techno-utopianism that was the promise of the Jetsons’ future. Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
58e8d3387d46fb7f879bb94d2f842b80
https://www.smithsonianmag.com/history/recapping-the-jetsons-episode-09-elroys-tv-show-137116925/
Recapping ‘The Jetsons’: Episode 09 – Elroy’s TV Show
Recapping ‘The Jetsons’: Episode 09 – Elroy’s TV Show This is the ninth in a 24-part series looking at every episode of “The Jetsons” TV show from the original 1962-63 season. When I was a kid I didn’t quite understand how TV and movies were made. Around the age of four or five, I had a basic understanding of how live TV was recorded with cameras and beamed to homes all around the country. And I understood that every time I put my Captain EO VHS tape (I think we recorded it off TV, since it was never issued officially) into the VCR, I would get to watch Michael Jackson singing and dancing. But I conflated the two and believed that every time I put in that VHS tape I was somehow telling people in some distant production studio to stage a live performance of Captain EO. As a kid, there’s something magical about learning how the things you like are made, even if you’re a bit fuzzy on the details. Whether it’s crayons or robots or movies, I and many others have fond memories from childhood where we felt like we were being let in on a wonderful production secret. There’s no story that writers, actors and producers of media like telling more than their own, and these self-reflexive tales serve as an important guide in our long-term understanding of the media itself. Even if it’s done for laughs, we’re meant to absorb something akin to a mission statement when producers poke at the artifice of their own creations. A TV cameraman shoots Elroy as “Space Boy” on Jupiter (1962) The ninth episode of “The Jetsons” aired on November 18, 1962 and featured pneumatic tubes, flying cars, videophones, and even another look at the ground in 2062! But the most important aspect of this episode, titled “Elroy’s TV Show,” was that it gave kids a peek behind the curtain, letting them in on the secret of how television was made. People who grew up prior to the YouTube generation most often learned about media production from watching the media itself. 
And “The Jetsons” delivered, poking fun at TV writers as lazy, directors as control freaks and actors (and their overbearing parents, in this case) as impossibly difficult prima donnas. George, Elroy and Astro travel to Jupiter to shoot Elroy’s TV show and kids of the 1960s were let in on the secret of how television is made, albeit in a heightened cartoonish form. The episode highlights the perennial debate over the role of TV programming in the American home. The latter half of the 20th century saw numerous fights over the regulation of TV programming and the battles were especially vicious when this episode premiered in 1962. The public airwaves were (and still are) regulated by the government and networks were obligated to devote some time each day to educational and public service broadcasting (such as news shows and the like). Of course, many of these FCC regulations are still on the books, but the 1980s declawing of the FCC meant that media deregulation advocates largely won that battle, arguing that TV networks should answer only to the market rather than what regulators deem to be the public interest. In fact, that’s what this episode argues, as Jane Jetson says that she doesn’t watch TV anymore since it’s “over her head.” Instead she wants more “doctor and cowboy shows.” When a TV producer named Mr. Transistor visits Jane to pitch a show based on the adventures of her son Elroy and her dog Astro, she says that she doesn’t want any more education on TV. Mr. Transistor replies, “I don’t blame you.” The Asteroid TV production building in the ninth episode of “The Jetsons” The Jetsons was rather infamously billed by broadcasters in the 1990s as an example of “educational TV” because it taught kids about the future. While that is in some ways true, it’s certainly a stretch. 
Many early experimenters saw television as a promising tool for educating people — especially in rural farming communities where distance prohibited some from traveling to a major university for their education. But today we take it for granted that television is an entertainment medium first and foremost, often forgetting the many battles of previous decades. What are we meant to take from this episode? That despite the battles being waged over TV regulations, in the future Americans will get the action-packed (read: low-brow) programming they want. Entertainment finds a way, if you will. And while the episode is obviously not malicious in its intent to call educational programming uncool, such a message rings loudly throughout. George, Elroy and Astro on Jupiter shooting Elroy’s TV show (1962) Elroy Jetson was voiced by Daws Butler who also did classic cartoon characters like Yogi Bear, Snagglepuss and Huckleberry Hound. But it was Lucille Bliss who was originally offered the job of Elroy. Bliss was a voice actress best known for her work as Smurfette on the 1980s TV show “The Smurfs,” and she died earlier this month. Bliss is reported to have lost the job of voicing Elroy Jetson in 1962 when she refused to be credited under a pseudonym. Apparently it was somewhat scandalous for an adult woman to be voicing a cartoon boy, though it’s obviously quite common and not at all controversial today. Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
0aa57883fb95c22c3fd969884310cd79
https://www.smithsonianmag.com/history/reckless-breeding-of-the-unfit-earnest-hooton-eugenics-and-the-human-body-of-the-year-2000-15933294/
Reckless Breeding of the Unfit: Earnest Hooton, Eugenics and the Human Body of the Year 2000
Reckless Breeding of the Unfit: Earnest Hooton, Eugenics and the Human Body of the Year 2000 Illustration of the human bodies of the future by Abner Dean in the January 1950 issue of Redbook magazine In the early 1950s, many people speculated that the average American’s body would look dramatically different by the early 21st century. Some thought that the average woman of the year 2000 might be over six feet tall, incredibly athletic and just as strong as the average man. Others believed that modern conveniences like the automobile would have disastrous effects on the human body of the 21st century, creating a society of fat weaklings and scrawny depressives. You can place Earnest A. Hooton in the latter school of thought. The January 1950 issue of Redbook magazine included the predictions of Hooton, a pioneering and often controversial anthropologist who advocated eugenics as a solution to many of America’s ills. As Hooton saw it, the progressive trends of the first half of the 20th century had only served to produce humans less fit for survival: The human animal has undergone astonishing bodily changes during the last half century. The physical features of our population in 2000 A.D. can be predicted with grim assurance unless present trends are corrected by a science of man. Changes in the physiques of Americans through more than fifty years are recorded in the gymnasium records of universities and colleges, in successive surveys of soldiers during two wars, of immigrants, delinquents and other elements of the population. Among the best data are those on Harvard sons and fathers and corresponding information from four Eastern women’s colleges. Harvard sons are bigger than their fathers in twenty-seven of thirty measurements. Notably, they are more than one and one-third inches taller, more than ten pounds heavier, longer in the legs relative to trunk length, larger in breadths and girths of the torso and longer in the forearms and lower legs. 
Girls differ from their mothers similarly, but have much narrower hips. These bigger dimensions sound well until studies are made of individual body types from photographs as well as measurements. Then it appears that short, broad, muscular builds are decreasing, along with the stubby, strong but fat types. On the contrary, long, taper-legged, obese types of inferior structure are on the increase, and, above all, the tall, weak “stringbeans.” With increased stature, heads are getting narrower, faces longer and narrower, palates more pinched, teeth less regular, noses more razor-backed. January 1950 cover of Redbook magazine Hooton believed that criminals were biologically different than non-criminals, coming down firmly on the side of nature in the “nature versus nurture” debate. He also believed that things like body type were closely tied to one’s temperament. In this vein, artist Abner Dean produced an illustration (above) for the piece which showed off the humans of the future — the happy rotund man, the depressed skinny man, and the tall, slender and largely content woman of tomorrow. Different body types are associated with distinct kinds of temperaments and well-defined physical and social aptitudes and disabilities. Broad, muscular men (usually short) tend to be aggressive, domineering, insensitive, practical and unimaginative, military and political but not intellectual and artistic leaders. Fat types are generally easy-going, kindly, “fond of the good things of life,” sociable, admirable in family relations, etc. The tall and skinny are commonly shy, nervous, repressed, emotionally unstable, intellectual and idealistic, but difficult in social relations. The auto has made walking obsolete (witness the poorly muscled modern legs). Work requiring strenuous muscular exertion is no longer usual for growing youth and for most adults. 
Sports and physical education hardly compensate for the sedentary habits that have sapped the stamina of the masses in our nation. Infant and juvenile mortality has decreased astoundingly through improved medical care and sanitation. The upsurge of the tall and skinny among adults is probably due in part to the preservation of elongate, fragile babies who now live to reproduce their kind. The proportion of the aged, too, has increased enormously, partly because of better medical care, but also because of easy living. So we have more of those too weak for work because of youth or age. As Nicole Rafter notes in her 2004 paper on the biological tradition in American criminology, Hooton believed that financial aid to the poor was hindering the progress of the human race: “The welfare programs of the New Deal seemed to Hooton to coddle an already weak segment of the population that might better be allowed to die off; unwittingly, government policies were encouraging regressive trends in human evolution. Deeply disturbed by the apparent downward rush of civilization, Hooton predicted social, political and genetic doom.” This description of Hooton is in line with his distaste for the “reckless breeding of the unfit” (terminology that largely fell out of fashion in academia after WWII). There can be little doubt of the increase during the past fifty years of mental defectives, psychopaths, criminals, economic incompetents and the chronically diseased. We owe this to the intervention of charity, “welfare” and medical science, and to the reckless breeding of the unfit. In 2000, apart from the horde of proliferating morons, the commonest type of normal male will be taller and more gangling than ever, with big feet, horse-faces and deformed dental arches. The typical women will be similar—probably less busty and buttocky than those of our generation. 
These spindly giants will be intelligent, not combative, full of humanitarianism, allergies and inhibitions—stewing in their own introspections. Probably they will be long-lived; the elongated shrivel and buckle, but hang on. There will also be a strong minority of towering heavyweights—melon-shaped, with knock-kneed shanks, small hands and feet and sociable dispositions. Ultimately this type may lead, because it is philoprogenitive, if not overly prolific. The lean and hungry Cassii and Cassandras propagate briefly and parsimoniously, then separate and sulk in celibacy. The stubby, bone-and-muscle Mr. Americas of today seem doomed to disappear or to be reduced to the ranks of institutionalized malefactors (judging from studies of present types of juvenile delinquents), instead of becoming dictators, they will be outlaws, since with attenuation of body-build the temperaments of the masses will probably change, so that idealism and intelligence will not be enslaved by brutishness. Sex illusions will persist. Men will still think women beautiful; women will still regard men as brainy and virile; reproduction will go on. But a science of man could intervene to effect a real improvement of the human animal within the next half-century. Hooton passed away just four years after publication of this article at the age of 66. He remained an advocate of eugenics until his death. Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
2390a245ef2d756ad74afcc706195e73
https://www.smithsonianmag.com/history/reconsiderations-63769826/
Reconsiderations
Reconsiderations Caitlin O’Connell-Rodwell, a Stanford ecologist and author of The Elephant’s Secret Sense, has been studying elephant behavior since 1992, primarily in Namibia’s Etosha National Park. In “Male Bonding,” she describes a hair-raising confrontation between two bull elephants that overturned all her preconceptions—and the conventional wisdom—about challenges to pachyderm hierarchy. “I thought for sure I was going to see the dominant bull get displaced by the bull in musth,” she says. “When I didn’t, I realized that social structures are not all black and white; once you start focusing, you see there is a lot of gray. There are all these fascinating soap operas to try to figure out.” The biggest challenge with studying elephants is time—they can live up to 70 years in the wild. “Every year there are different dynamics within a social group, but if you have enough years to string together, you can see how relationships are shaped and how they are maintained—or not.” Thomas Powers is best known for his probing dissections of the CIA and other intelligence agencies. His latest book, The Killing of Crazy Horse, from which “How Little Bighorn Was Won” was adapted, would seem a departure. It’s a moment-by-moment account of the battle that cost George Armstrong Custer and his men their lives—told from the Indian point of view. But as with his earlier books, Powers had to piece the story together like a puzzle. “The most challenging thing,” he says, “was to stop trying to figure out what Custer did and start listening to the Indians. When you do that, everything becomes much simpler. It stops being a mystery; after all, there were thousands of Indians but only some 200 soldiers under Custer.” Powers was stunned by the soldiers’ callousness—and ineptitude. “They raced across the country to attack these Indians, who were not at war with them and were not threatening them in any way. 
They were attacked out of the blue, basically, by soldiers who didn’t know who they were attacking, didn’t know the lay of the land and didn’t know how to reach the Indians. That was startling to realize, that this thing was such a complete botch. I promise there will be 10,000 guys out there that will rise up and want to lynch me because they don’t like to think of Custer that way.” A reminder: Smithsonian’s 8th annual photo contest closes December 1, 2010, at 2 p.m. E.S.T. Visit Smithsonian.com/photocontest to enter your photographs and to view past winners and finalists. Carey Winfrey was Smithsonian magazine's editor in chief for ten years, from 2001 to 2011.
e9d5ad9ca7cb8f5f13534a687b56c3c5
https://www.smithsonianmag.com/history/rehabilitating-cleopatra-70613486/
Rehabilitating Cleopatra
Rehabilitating Cleopatra Cleopatra VII ruled Egypt for 21 years a generation before the birth of Christ. She lost her kingdom once; regained it; nearly lost it again; amassed an empire; lost it all. A goddess as a child, a queen at 18, at the height of her power she controlled virtually the entire eastern Mediterranean coast, the last great kingdom of any Egyptian ruler. For a fleeting moment she held the fate of the Western world in her hands. She had a child with a married man, three more with another. She died at 39. Catastrophe reliably cements a reputation, and Cleopatra's end was sudden and sensational. In one of the busiest afterlives in history, she has become an asteroid, a video game, a cigarette, a slot machine, a strip club, a synonym for Elizabeth Taylor. Shakespeare attested to Cleopatra's infinite variety. He had no idea. If the name is indelible, the image is blurry. She may be one of the most recognizable figures in history, but we have little idea what Cleopatra actually looked like. Only her coin portraits—issued in her lifetime, and which she likely approved—can be accepted as authentic. We remember her, too, for the wrong reasons. A capable, clear-eyed sovereign, she knew how to build a fleet, suppress an insurrection, control a currency. One of Mark Antony's most trusted generals vouched for her political acumen. Even at a time when female rulers were no rarity, Cleopatra stood out, the sole woman of her world to rule alone. She was incomparably richer than anyone else in the Mediterranean. And she enjoyed greater prestige than every other woman of her time, as an excitable rival king was reminded when he called for her assassination during her stay at his court. (The king's advisers demurred. In light of her stature, they reminded Herod, it could not be done.) Cleopatra descended from a long line of murderers and upheld the family tradition, but was for her time and place remarkably well behaved. 
She nonetheless survives as a wanton temptress, not the first time a genuinely powerful woman has been transmuted into a shamelessly seductive one. She elicited scorn and envy in equal and equally distorting measure; her story is constructed as much of male fear as of fantasy. Her power was immediately misrepresented because—for one man's historical purposes—she needed to have reduced another to abject slavery. Ultimately everyone from Michelangelo to Brecht got a crack at her. The Renaissance was obsessed with her, the Romantics even more so. Like all lives that lend themselves to poetry, Cleopatra's was one of dislocations and disappointments. She grew up amid unsurpassed luxury and inherited a kingdom in decline. For ten generations her family, the Ptolemies, had styled themselves pharaohs. They were in fact Macedonian Greek, which makes Cleopatra about as Egyptian as Elizabeth Taylor. She and her 10-year-old brother assumed control of a country with a weighty past and a wobbly future. The pyramids, to which Cleopatra almost certainly introduced Julius Caesar, already sported graffiti. The Sphinx had undergone a major restoration—more than 1,000 years earlier. And the glory of the once-great Ptolemaic empire had dimmed. Over the course of Cleopatra's childhood Rome extended its rule nearly to Egypt's borders. The implications for the last great kingdom in that sphere of influence were clear. Its ruler had no choice but to court the most powerful Roman of the day—a bewildering assignment in the late Republic, wracked as it was by civil wars. Cleopatra's father had thrown in his lot with Pompey the Great. Good fortune seemed eternally to shine on that brilliant Roman general, at least until Julius Caesar dealt him a crushing defeat in central Greece. Pompey fled to Egypt, where in 48 B.C. he was stabbed and decapitated. 
Twenty-one-year-old Cleopatra was at the time a fugitive in the Sinai—on the losing side of a civil war against her brother and at the mercy of his troops and advisers. Quickly she managed to ingratiate herself with the new master of the Roman world. Julius Caesar arrived in Alexandria days after Pompey's murder. He barricaded himself in the Ptolemies' palace, the home from which Cleopatra had been exiled. From the desert she engineered a clandestine return, skirting enemy lines and Roman barricades, arriving after dark inside a sturdy sack. Over the succeeding months she stood at Caesar's side—pregnant with his child—while he battled her brother's troops. With their defeat, Caesar restored her to the throne. For the next 18 years Cleopatra governed the most fertile country in the Mediterranean, guiding it through plague and famine. Her tenure alone speaks to her guile. She knew she could be removed at any time by Rome, deposed by her subjects, undermined by her advisers—or stabbed, poisoned and dismembered by her own family. In possession of a first-rate education, she played to two constituencies: the Greek elite, who initially viewed her with disfavor, and the native Egyptians, to whom she was a divinity and a pharaoh. She had her hands full. Not only did she command an army and navy, negotiate with foreign powers and preside over temples, she also dispensed justice and regulated an economy. Like Isis, one of the most popular deities of the day, Cleopatra was seen as the beneficent guardian of her subjects. Her reign is notable for the absence of revolts in the Egyptian countryside, quieter than it had been for a century and a half. Meanwhile the Roman civil wars raged on, as tempers flared between Mark Antony, Caesar's protégé, and Octavian, Caesar's adopted son. Repeatedly the two men divided the Roman world between them. 
Cleopatra ultimately allied herself with Antony, with whom she had three children; together the two appeared to lay out plans for an eastern Roman empire. Antony and Octavian's fragile peace came to an end in 31 B.C., when Octavian declared war—on Cleopatra. He knew Antony would not abandon the Egyptian queen. He knew too that a foreign menace would rouse a Roman public that had long lost its taste for civil war. The two sides ultimately faced off at Actium, a battle less impressive as a military engagement than for its political ramifications. Octavian prevailed. Cleopatra and Antony retreated to Alexandria. After prolonged negotiation, Antony's troops defected to Octavian. A year later Octavian marched an army to Egypt to extend his rule, claim his spoils and transport the villain of the piece back to Rome, as a prisoner. Soundly defeated, Cleopatra could negotiate only the form of her surrender. She barricaded herself in a vast seaside mausoleum. The career that had begun with a brazen act of defiance ended with another; for the second time she slipped through a set of enemy fingers. Rather than deliver herself to Octavian, she committed suicide. Very likely she enlisted a gentle poison rather than an asp. Octavian was at once disappointed and in awe of his enemy's "lofty spirit." Cleopatra's was an honorable death, a dignified death, an exemplary death. She had presided over it herself, proud and unbroken to the end. By the Roman definition she had at last done something right; finally it was to Cleopatra's credit that she had defied the expectations of her sex. With her death the Roman civil wars came to an end. So too did the Ptolemaic dynasty. In 30 B.C. Egypt became a province of Rome. It would not recover its autonomy until the 20th century A.D. Can anything good be said of a woman who slept with the two most powerful men of her time? Possibly, but not in an age when Rome controlled the narrative. 
Cleopatra stood at one of the most dangerous intersections in history: that of women and power. Clever women, Euripides had warned 400 years earlier, were dangerous. We do not know whether Cleopatra loved either Antony or Caesar, but we do know that she got them to do her bidding. From the Roman point of view, she "enslaved" them both. Already it was a zero-sum game: a woman's authority spelled a man's deception. To a Roman, Cleopatra was thrice suspect, once for hailing from a culture known—as Cicero had it—for its "fribbling, fawning ways," again for her Alexandrian address, lastly for her staggering wealth. A Roman could not pry apart the exotic and the erotic; Cleopatra was a stand-in for the occult, alchemical East, for her sinuous, sensuous land, as perverse and original as its astonishment of a river. Men who came in contact with her seem to have lost their heads, or at least to have rethought their agendas. The siren call of the East long predated her, but no matter: she hailed from the intoxicating land of sex and excess. It is not difficult to understand why Caesar became history, Cleopatra a legend. Her story differs from most women's stories in that the men who shaped it enlarged rather than erased her role, for their own reasons. Her relationship with Antony was the longest of her life—the two were together for the better part of 11 years—but her relationship with Octavian proved the most enduring. He made much of his defeat of Antony and Cleopatra, delivering to Rome the tabloid version of an Egyptian queen, insatiable, treacherous, bloodthirsty, power-crazed. Octavian magnified Cleopatra to hyperbolic proportions to do the same with his victory—and to smuggle Mark Antony, his real enemy and former brother-in-law, out of the picture. As Antony was erased from the record, Actium was wondrously transformed into a major engagement, a resounding victory, a historical turning point. Octavian had rescued Rome from great peril. 
He had resolved the civil war; he had restored peace after 100 years of unrest. Time began anew. To read the official historians, it is as if with his return the Italian peninsula burst—after a crippling, ashen century of violence—into Technicolor, the crops sitting suddenly upright, crisp and plump, in the fields. "Validity was restored to the laws, authority to the courts, and dignity to the senate," proclaims the historian Velleius. The years after Actium were a time of extravagant praise and lavish mythmaking. Cleopatra was particularly ill-served; the turncoats wrote the history. Her career coincided as well with a flowering of Latin literature. It was Cleopatra's curse to inspire its great poets, happy to expound on her shame, in a language inhospitable to her. Horace celebrated her defeat before it had occurred. She helpfully illuminated one of the poet Propertius's favorite points: a man in love is a helpless man, painfully subservient to his mistress. It was as if Octavian had delivered Rome from that ill as well. He restored the natural order of things. Men ruled women, and Rome ruled the world. On both counts Cleopatra was crucial to the story. She stands among the few losers whom history remembers, if for the wrong reasons. For the next century, the Oriental influence and the emancipation of women would keep the satirists in business. Propertius set the tone, dubbing Cleopatra "the whore queen." She would later become "a woman of insatiable sexuality and insatiable avarice" (Dio), "the whore of the eastern kings" (Boccaccio). She was a carnal sinner for Dante, for Dryden a poster child for unlawful love. A first-century A.D. Roman would falsely assert that "ancient writers repeatedly speak of Cleopatra's insatiable libido." Florence Nightingale referred to her as "that disgusting Cleopatra." Offering Claudette Colbert the title role in the 1934 movie, Cecil B. DeMille is said to have asked, "How would you like to be the wickedest woman in history?"
Inevitably affairs of state have fallen away, leaving us with affairs of the heart. We will remember that Cleopatra slept with Julius Caesar and Mark Antony long after we remember what she accomplished in doing so: that she sustained a vast, rich, densely populated empire in its troubled twilight. A commanding woman versed in politics, diplomacy and governance, fluent in nine languages, silver-tongued and charismatic, she has dissolved into a joint creation of the Roman propagandists and the Hollywood directors. She endures for having seduced two of the greatest men of her time, while her crime was in fact to have entered into the same partnerships that every man in power enjoyed. That she did so in reverse and in her own name made her deviant, socially disruptive, an unnatural woman. She is left to put a vintage label on something we have always known existed: potent female sexuality. It has forever been preferable to attribute a woman's success to her beauty rather than to her brains, to reduce her to the sum of her sex life. Against a powerful enchantress there is no contest. Against a woman who ensnares a man in the coils of her serpentine intelligence—in her ropes of pearls—there should, at least, be some kind of antidote. Cleopatra would unsettle more as sage than as seductress; it is less threatening to believe her fatally attractive than fatally intelligent. As one of Caesar's murderers noted, "How much more attention people pay to their fears than to their memories!" A center of intellectual jousting and philosophical marathons, Alexandria remained a vital center of the Mediterranean for a few centuries after Cleopatra's death. Then it began to dematerialize. With it went Egypt's unusual legal autonomy for women; the days of suing your father-in-law for the return of your dowry when your husband ran off with another woman were over. After a fifth-century A.D. earthquake, Cleopatra's palace slid into the Mediterranean. 
Alexandria's magnificent lighthouse, library and museum are all gone. The city has sunk some 20 feet. Ptolemaic culture evaporated as well; much of what Cleopatra knew would be neglected for 1,500 years. Even the Nile has changed course. A very different kind of woman, the Virgin Mary, would subsume Isis as entirely as Elizabeth Taylor has subsumed Cleopatra. Our fascination with the last queen of Egypt has only increased as a result; she is all the more mythic for her disappearance. The holes in the story keep us coming back for more. Adapted from Cleopatra: A Biography, by Stacy Schiff. Copyright © 2010. With permission of Little, Brown and Company. All rights reserved. Stacy Schiff won the Pulitzer Prize for her 1999 biography, Véra (Mrs. Vladimir Nabokov): Portrait of a Marriage.
Step Right Up! See the Reinvention of the Great American Circus! Printed in red capital letters on the back of the instructor’s black T-shirt is what seems to me a loaded question: WHY WALK WHEN YOU CAN FLY? Looking down from nearly 20 feet up in the air, perched atop a 5-foot-wide platform, I can tell you why. I’m afraid of heights. I have a bad shoulder. There’s no such thing as “the friendly skies.” Furthermore, if jumping from this platform and dangling from a steel pole is safe, why did I have to sign a liability waiver? “You can do it!” shouts our instructor, Ailsa “Al” Firstenberg, from below, flashing two thumbs up. My six classmates at trapeze school, all younger than I, look less certain, but are visibly riveted by my evident panic and the potential for disaster. Standing beside me, another instructor, Patrick Howlett, an Australian doppelgänger for actor Chris Hemsworth, extends a Thor-like arm and catches the bar a co-worker on the far opposite platform sends sailing our way. Patrick smiles. “Come on, Hols,” he purrs, instantly nicknaming me. “Time to fly.” It is so not time to fly. Just scaling the ladder without supplemental oxygen induced colon cramps. Going down? I think. No way. Mind you, I’m no wimp. I’ve survived dangerous assignments: swimming with sharks in the Caribbean; riding a water buffalo in Brazilian rainforest; standing in line at a Nicholas Sparks book signing in Greenville, South Carolina. Surely flying at España-Streb Trapeze Academy in Williamsburg, Brooklyn is not going to kill me. Right? Learning the flying trapeze is, after all, the most popular offspring of the traditional traveling circus, whose demise has unearthed a flourishing ecosystem of boutique circuses and participatory upstarts across the country. Though the Ringling Bros. retired in May, dry your eyes and pop your clown nose on; there are plenty more circuses you can visit wide-eyed or run away to and join. 
No joke: Circus scholar Janet Davis counts some 85 circus schools and training centers scattered across the country, where everyone from bona fide big-top and art-house pros to curious civilians and energetic youngsters learn the ropes, high wires and German wheels of circus yore. More grounded types can master juggling and clowning arts, while fitness fanatics elevate into yoga aerialists and trampoline acrobats. And roving troupes and single-ring spectacles abound. Ninety percent of us live within an hour’s drive of a performing circus, according to the World Circus Federation, each with its own special flair for wow. Like Circus Amok, whose clowns in drag perform free outdoor shows, spotlighting social issues from AIDS to immigration to gentrification. Or Absinthe, a naughty Las Vegas cabaret-circus hybrid the New York Times cheers as “Cirque du Soleil as channeled through the Rocky Horror Picture Show.” Cirque des Voix, based in Sarasota, Florida, sets aerial routines to choral music performed by more than a hundred singers and a 40-piece orchestra, and Atlanta-based UniverSoul, the only African-American-owned circus, is an extravaganza of black culture from around the world. From Montreal, there’s Les 7 Doigts de la Main (The Seven Fingers of the Hand), which recently toured the United States with its show “Cuisine & Confessions,” in which a juggling, dancing, storytelling, acrobatic troupe also cooks and feeds the audience. In simpler times, the big top was a thrilling escape from monotony. In today’s topsy-turvy world, these shows and scores of others offer an interactive and intimate respite from our tech-age overload—our emails, smartphones, Twitter feeds, queued-up Netflix TV shows, all demanding our attention, stealing our time, depriving us of memories. 
Hence, my heart-pounding predicament at the España-Streb Trapeze Academy, which was founded by renowned acrobatic choreographer Elizabeth Streb and fifth-generation circus legends Noe and Ivan España, where almost everybody can learn to fly, as long as they’re between the ages of 5 and 85. I grasp the trapeze bar in one hand while Patrick moves behind me to grip my safety belt, so I can lean forward beyond the platform to grab the distant other end with my free hand. “The bar is heavy, so you’re going to feel like you want to bend forward,” Patrick says. “But keep your shoulders back and push your hips forward, nice and tall. Do not look down.” Stretched out over the abyss, white-knuckling the bar, I wait for a spotter named Viktor, manning the safety ropes on my belt from below, to call out the commands. “Ready” means bend the knees. “Hep” means go. (Circus people tend not to say “go,” as it could be mistaken for “no.”) “Ready! Hep!” I jump, stunned by the cement weight of my body, which threatens to rip away from my shoulders and leave my limbs behind on the bar. My hands burn. I’m about to give up, let go, cry Uncle!, when the tonnage of flesh and bones and blood lightens on the upswing, and the magical sensation of flying kicks in. At the highest point, I feel featherweight and roller-coaster giddy as the air holds me in its breath before releasing me to swing back again. It’s physics, Viktor explains later. “When you’re vertical, you experience three times your body weight in your grip. At apex—when your body peaks horizontal to the floor—you’re weightless.” (This is the moment when acrobats do tricks.) Four trips up the ladder later and I’m launching myself, swinging upside down by my knees, and dismounting with a back flip into the gigantic airbag below, a superhero with a newfound power and an ego to match. ********** Do you speak Circus? Yes, you do! Ever ordered jumbo fries? Those are named for the plus-sized zoo elephant bought and made famous by P.T. 
Barnum in 1882. Called someone a geek? That’s a sideshow freak. Gotten the show on the road or jumped on the bandwagon? Or—my personal favorite—been ditched? If so, the circus didn’t bother to formally fire you—it just left you standing beside the tracks after the train deviously pulled out of the station early. For the citizens and 54 railway cars of the Ringling Bros. and Barnum & Bailey Circus Xtreme, Providence, Rhode Island, is the last stop on the line. Kenneth Feld, whose family owns the circus, appears and thanks the sold-out crowd of 14,000 for 146 years of “making the impossible possible. And now, for the Greatest Show on Earth—one last time!” The long-ballyhooed goodbye begins! There are fire jugglers, camel-riding contortionists, glow-in-the-dark bungee-jumping acrobats, snake charmers wrapped in bright yellow pythons, a Mongolian strongman who lifts a 551-pound mass of Mongol gals and kettlebells with his “jaws of steel.” Clowns pop up and out all over the place, and I’m gleefully overstimulated. Then a 20-foot cannon, wheeled into the ring, grabs my attention. A fuse is lit. The audience counts down from five and bang! “Nitro” Nicole Sanders flies more than a hundred feet at 66 miles per hour into the pillowy embrace of a giant airbag, just as pioneer cannonballer Rosa “Zazel” Richter did 140 years earlier. And who rigged the first human cannon, you ask? That was funambulist (tightrope walker) William Leonard Hunt, a.k.a. the Great Farini, which raises the question, why wasn’t he the first human cannonball? 
(“Zazel, you go first.”) After the blast, “Nitro” Nicole takes a bow, and intermission is announced with a reminder of how much the world has changed: “In the event of firearms, stay calm and look for the nearest exit.” The highlight of the second half includes 12 tigers strutting inside a massive cage, encircling their buff, bald-headed trainer, Tabayara “Taba” Maluenda, a sixth-generation Chilean circus performer dressed in a bedazzled green sleeveless velvet jumpsuit, matching armbands and knee-high leather boots. With a flick of Taba’s whip, the regal beasts sit, jump from stool to stool, lie down side by side, roll over one after the other. Taba sweats bullets throughout, mopping his mug. But when he faces us and takes a bow, it’s clear those are tears streaming down his face. The trainer turns and kisses one of the man-eaters on the nose. Sobbing, he addresses them. “For 30 years you put food on my table,” he says. “Catana, I have had you for 13 years, since you were 6 months old.” He calls Catana to him and buries his head in her fur. Then he dismisses the cats one by one, thanking each by name. With the last one gone, Taba kisses the empty floor. To close the evening, and an era, Kristen Michelle Wilson, Ringling’s first (and last) female ringmaster, calls some 300 cast and crew into the ring, to sing “Auld Lang Syne.” From backstage, husbands, wives and children come join them. None of the babies are crying, but all of the grown-ups are. “We circus people always say, ‘We’ll see you down the road,’” Wilson says, her voice rising with emotion. “So, ladies and gentlemen, children of all ages: We’ll see you down the road!” ********** After nearly 150 years of Ringling Bros. and Barnum & Bailey hogging the circus spotlights, you might suppose they were the big bang of it all, but not so. Step right up and I’ll tell you a story of freaks and fantasy and flight and fortunes and a great American capitalistic dream come true. 
Excuse me, sir, please turn off your iPhone. The first American circus debuted in Philadelphia, then the nation’s capital, on April 3, 1793. The founder and star was John Bill Ricketts, a dashing Scottish horseman, who would ride a stallion round a ring standing in the saddle, with a 9-year-old boy—also standing—on his shoulders. One of the show’s attractions was a Revolutionary War hero—a horse named Jack once ridden by Gen. George Washington (or so the tale goes), a confirmed circus fan who entrusted the steed to Ricketts for his show. Soon ragtag troupes were driving wagons through small towns staging “mud shows” in canvas tents, inspired by the productions of their European forebears. Because this was the U.S.A., you had to have a gimmick; and what the American impresarios added was exotic fauna: lions and tigers and bears and other talented wildlife netted along the way. The golden age of the American circus coincided with the Gilded Age, and one Phineas Taylor Barnum (P.T. for short) was a living emblem of both: a New York City swindler who called himself the “Prince of Humbug” and began his career selling tickets to see a mummified “mermaid” made of a monkey’s head sewn to a fish. P.T. Barnum’s Grand Traveling Museum, Menagerie, Caravan & Hippodrome filled not one but three tents—and sometimes as many as seven—dividing the audience’s attention among outlandish, phantasmagoric displays. To the lion tamers, clowns and trick riders he added freak shows: human zoos of bearded women and “armless wonders.” When Barnum merged with his competitor, J. A. Bailey, in 1881, they crowned their union the “Greatest Show on Earth.” By the turn of the century, village schools, mills and shops shuttered for “Circus Day,” and hardscrabble farmers and their children boarded discounted trains to the nearest town center where the tent was raised. For kids seeing camels march down Main Street, “running away with the circus” became a dream—and an option. 
The latter was true for five of the Ringling brothers, raised by a harness maker first in Iowa and later Wisconsin. After visiting the circus in 1870, they hand-stitched a rag tent in their backyard, charged a penny admission and earned enough to upgrade to muslin. By the time Barnum & Bailey returned from a six-year European tour in 1902, the Ringling circus was a potential usurper. The brothers had harnessed the same global gymnastics trend that revived the Olympics in 1896. Freaks and geeks were très passé; the Ringlings’ focus was action-oriented fare. When the rivals coupled up in 1918, the combined show was called the “Big One.” They weren’t bragging: In the 1920s, the Big One had 1,600 performers traveling on four 100-car trains. It was all fun and fantastical until the Great Depression. Soon afterward, the talkies seduced audiences. There were attempts to modernize: whole shows based on a single theme or orchestrated like complex ballets, including 1942’s Ballet of the Elephants, choreographed by George Balanchine with an original score by Igor Stravinsky. In the 1970s, the nouveau cirque, groovy one-ring productions influenced by arty affairs from Europe that eschewed sideshows and animal acts, cast the seeds of the renewal blossoming today: smaller operations like the San Francisco-based Pickle Family Circus, with its cooperative structure and ensemble juggling, and the clown-focused Big Apple Circus (which, after shutting down in 2016, announced earlier this year that it would return with new ownership this fall). In 1984, a band of 20 Québécois street performers led by the fire-breathing, stilt-walking accordionist and high-stakes poker player Guy Laliberté became Cirque du Soleil. Like everything in the ’80s—hair, shoulder pads, attitude—it went big and broad, reinventing spectacle on a grand international scale, with giant tents, lavish costumes and elaborate theatrics combined with awesome acrobatic skill. 
While Cirque grew into a billion-dollar industry, Ringling dwindled under the pressure of animal rights activists and shrinking ticket sales. “It was a business model they just couldn’t continue,” says Linda Simon, author of The Greatest Shows on Earth: A History of the Circus. “They kept their ticket prices down, but to mount that kind of extravaganza, how are they going to support their railroad cars and their thousands of employees? And there you have it.” ********** Inside the lobby of Madison Square Garden, I watch two male hand-balancers in red-and-white-striped leotards and wonder if they know their skintight onesies were first worn by the 19th-century French aerialist Jules Léotard, who created his namesake get-up to fly through the air with the greatest of ease, and without a wedgie. The duo shifts from one circus-sutra position to the next in a show of statuesque strength, as looky-loos and their little-loos drink cocktails and sodas and gobble popcorn and candy. Chimes beckon all to their seats for the grand spectacular, Circus 1903: The Golden Age of Circus, a new traveling homage to the kind of old-timey show you would have seen more than a century ago, after Barnum & Bailey’s circus returned from its tour of Europe with the crème de la crème of foreign talents in tow. A mustachioed, top-hatted ringmaster named William Winterbottom Whipsnade (a.k.a. David Williamson, a magician) scans the crowd. “I need a kid with personality!” he booms. Lucky Lucas, 7, gets plucked up. Whipsnade sits on a short stool and asks, “You ragamuffin, you want a good look at the elephants?” You bet! Whipsnade pulls a balloon from his pocket, blows it up, and twisting it into an elephant says, “I like you, Lucas. You’re weird like me. You’ve got sawdust in those veins!” This is a big tease. The magical appeal of Circus 1903 is a new breed of pachyderm: hyper-realistic, life-size puppets, by the creators of the Broadway smash War Horse. 
As Lucas runs off with his prize, Whipsnade scoffs at the light applause: “You’re not at the theater! You’re at the circus!” Not to be a killjoy, but technically speaking, we’re not at a circus, as circus is Latin for circle. Any Roman will tell you that, and then try to take credit for starting the whole thing in a ring. And while they did innovate the ring, “the real origins of the circus,” says Simon, were “street entertainers in Europe, responding to things in their culture, showing off their talents.” Which brings us full circle-ish back to here and now and Circus 1903, whose good-natured, kid-friendly shenanigans are presented facing the audience, from a stage. Among the world-class stars: the Sensational Sozonov, balancing on a teeterboard atop sky-high cylinders. The Cycling Cyclone, a “wizard of the wheel” on a bicycle—spinning, rearing, balancing—and doing on a bike what Philip Astley, father of the modern circus, did on a horse at the London opening of Astley’s Amphitheatre, in January 1768. “Now for the weird and wonderful side of the human species,” Whipsnade bellows. “The sideshow!” He unveils the (faux) Bearded Lady, the (somewhat) Strong Man, and the Man-Eating Chicken: a man...eating chicken. “Now, for the beguilingly bizarre!” The Elastic Dislocationist, an apparently spineless woman from Ethiopia, bends in two, with her buttocks on her head. She stares hypnotically between her legs and proceeds to walk them 180 degrees spiderishly around herself. “Make her stop!” cries a tot next to me. More bizarre than beguiling, I want to look away, but toward what? Then it hits me. This charming little circus is missing something: an audience on the other side of a ring, their expressions of joy, fear and awe amplifying my own, exciting and uniting all of us. (Gotta give it to the Romans.) 
I replay the moment for Simon, the historian, who gets it: “That communal experience of everybody being amazed at something, and knowing that everyone else is amazed—that’s lost.” My grievance is cut short with the grand entrance of the elephant Queenie and her calf Peanut, who elicit a collective gasp and cheers from the crowd. The molded foam-and-fabric-mesh puppets, with their realistic glass eyes, completely capture the lumbering walk, weight and emotion of their wild mates, thanks to the four puppeteers on stilts half-hidden inside Queenie and the one beneath Peanut, precisely manipulating hinged trunks and limbs. Mother teaches child to do circus tricks—stand on a stool, turn in a circle, bow down, each to great, guilt-free applause. PETA would be proud. But, for me, the real breath-takers are fifth-generation Mexican tightropers Los Lopez, who don’t just walk the wire but jump rope, ride unicycles and bike on it, too—with a bar balanced on their shoulders, while a woman in the middle slides into splits. This lady knows how to put the fun in funambulist. Hey, when it comes to the circus, it takes all kinds. “Life is on the wire,” mused Karl Wallenda, founder of the celebrated circus troupe the Flying Wallendas. “The rest is just waiting.” For most of us, waiting is just fine, as long as we get to watch something worth waiting for. And that, in a circus peanut nutshell, is why the show will go on. “The future of the circus,” says Simon, “is a combination of different genres—so there’s dance, acrobatics, trapeze, satire, critique, juggling, all of that in a different kind of intimate experience.” ********** Even so, I would like to file a complaint. More often than not, these newfangled newbies seem to have ditched the very symbol of the circus and its beating emotional heart: the clown. Which brings me to, of all places, Yale. On an overcast day this spring, students sporting red rubber noses wander around a classroom exhibiting raw bursts of emotion. 
If you suffer from coulrophobia, you’d be freaking out right now. Then again, if you, like me, have always wanted to say, “I went to Yale,” this class is more fun than skipping school. Christopher Bayes, Yale School of Drama’s head of physical acting, gives the students vocal cues. “Anxiety!” There’s nail-biting, furrowed brows, hunched shoulders cowering in corners. At “Anger!” the twenty-somethings look like me on the phone with Time Warner Cable. “Despair!” They keen, wail, implore heaven; some even really cry. “I try to get these guys to go primary, expressing without filter,” says Bayes, boyishly handsome in jeans, a gray T-shirt and wire-rimmed glasses. He starts with negative emotions. “Then we can find our way to play—have a Yay! party.” He adds, “It’s not therapy, but it can be therapeutic.” Which is fitting, as clowns embody the spirit of the circus much as aerialists and acrobats represent its raw physicality. Each imbues the other with meaning, creating a balance. “After watching people fly through the air and do all kinds of death-defying stunts, clowns are something just really human, to make us laugh in a really simple way,” Bayes says. “They draw people further and further into the show in a way that’s much more naive and grounding.” While the red nose was reportedly inspired by the rubicund honkers of buffoonish boozehounds, a nose is not required. Ancient cultures from Egypt and China to Greece and the American Indians had a version of the clown. Our modern exemplars include Charlie Chaplin, the Marx Brothers, Carol Burnett, Steve Martin and numerous “Saturday Night Live” icons. 
Not for nothing, President Nixon, clown lover, signed Proclamation 4071 on August 2, 1971, declaring the first week of August “National Clown Week.” But it wasn’t long after that that the clown’s rep took a hit, thanks in part to the serial killer John Wayne Gacy Jr., to Pennywise, the killer clown in Stephen King’s novel It, and more recently to reports of real-life violent clowns lurking around certain American neighborhoods. “I think the U.S. is the only place where we have that kind of culture surrounding clowns,” Bayes says. “They don’t have it in Europe. They don’t have Bozo, Krusty, these clowns who laugh for no good reason, who are grotesque, the creepy ones who put on a clown outfit but are not clowns.” Which means the American clown’s future seems fairly uncertain. Bayes’ students won’t go to the circus, he guesses. “They’re going to be comic actors, some of them; some will make a lot of money, some will struggle. I’m trying to be a kind of infection: to send these beautiful students out into the world to start their own kind of revolution.” He is training them “to ungrow up,” he says, “and welcome back a kind of playfulness as something that has value.” ********** The morning after my trapeze class, I’m back inside Elizabeth Streb’s SLAM warehouse (a.k.a. Streb Lab for Action Mechanics), where in addition to her trapeze academy she rents warehouse space to practicing professional daring-doers. There’s a girl spinning in aerial silks; guys flipping between trapeze bars; and the Streb Extreme Action company—a troupe of six men and three women equal in size and strength—rehearsing for the company’s show SEA (Singular Extreme Actions). They launch from a trampoline, flying like synchronized missiles, full-body-planting into a floor mat one after another in hair-raising succession, side by side by side. Like cartoon characters, they incredibly survive impact, spring up, and go again and again: thud, thud, thud, thud. 
At first, the sound of raining bodies hitting the ground is slightly sickening, but soon it grows into an organic drumbeat, rhythmic and cool. “Get some air, get some air!” shouts Streb, 67, seated in a metal folding chair a few feet away from the landing pad. “Yes! That’s it! Watch out!” Streb combs a hand through her thick black punk-rock hair, adjusts her thick black-framed glasses. Dressed in a black suit with gold piping, the pants stuffed into knee-high motorcycle boots, she looks equal parts Goth ringmaster, avant-garde artist and intellectual godmother to the cirque new wave. All of which she is, as well as a 1997 MacArthur Foundation “Genius” Fellow, awarded for her “original approach to choreography that is action oriented and gravity defying.” “I always tell them, ‘Harder, faster, sooner, higher!’ That’s the mantra,” Streb says. (Moments later, she yells: “Fall slower!”) Streb has choreographed spectacles of all sizes, including a series of performances during the 2012 Olympic festival, when her troupe made jaw-dropping use of London landmarks: bungee-jumping acrobatics from the Millennium Bridge, “walking” down the side of the City Hall building, and dancing, while tethered, atop the spokes of the massive, revolving London Eye. Her wild ideas were born in a tent in Rochester, New York, where Streb grew up going to the Shrine Circus every year. “It was my obsession,” she says. “I loved odd things: the smells, the sawdust, the dirtiness, the fact that it was in a tent. It was a magical world. I wanted to be a troubadour like that. I wanted that lifestyle right away. 
I knew it.” After studying dance in college (though she’d never taken a dance class), Streb struck out for San Francisco before moving to New York, where her one-woman shows grew into the ensemble of acrobats she calls “action heroes,” who perform net-less, near-death, whimsically freakish physical feats that might incorporate ropes, cinder blocks and an iron beam, or trusses and giant custom-made machines like spinning ladders and wheels. Ask how her troupe has evolved from the circus, and Streb points to the synchronized fliers, crashing flat-bodied against the floor. “The thing that we do that the other circuses won’t do—and now they’re going to steal my idea—is we land,” she says. “Why does the circus pretend that gravity doesn’t exist? And why do they think that’s beautiful? You’re lying about physicality!” “In the traditional circus, you do the trick, you pose, you smile, they applaud,” says aerial specialist Bobby Hedglin-Taylor, a Streb instructor and actor who also trains Broadway stars. “Those days are gone. One thing that attracted me to Streb and her work is she doesn’t compete with the circus. She’s made it her own.” A week later, Streb, dressed in a black suit with a Pac-Man print, looks anxious and excited as she paces before an audience of all ages and every race. An M.C. whips up the crowd: “We encourage you to make noise! Take pictures! Film the show! Post it to social media! Get the word out! And thank you for coming!” Streb’s action heroes, in their shiny red footless unitards, fly and flip and fall. But an act called “Steel” steals the show. An eight-foot-long, 200-pound I-beam is lowered from the ceiling by a thick chain, and stops a foot from the ground. A performer on each end sends it spinning, the sound of their hands pounding against the metal ringing like a gong, the air from the whirling beam fanning the audience. 
One by one, the troupe dodges and rolls under the whirling death girder, sitting up and lying down over and over as the beam misses their heads by mere inches, risking a major dental bill at best and brain scrambling at worst. It’s stomach-churning. Half the crowd is watching through splayed fingers. When the show ends, Streb comes over, greets me with a hug and asks if I’ve gone flying lately. No, actually, I say: I threw out my back after dropping my keys and bending over to pick them up. She shakes her head and smiles. “Life is a dangerous game.” ********** On the subway back to Manhattan, three teens congregate in the middle of the car. One wearing a black baseball cap announces, “Ladies, gentlemen! May we have your attention please! We are not homeless. We do not do drugs. The cops don’t like us because their daughters do.” At this, heads locked on smartphone screens look up, and there’s a chorus of laughter. A boombox starts playing dance music, and a kid in a New England Patriots T-shirt grabs the parallel poles that run along the subway car’s ceiling and starts flipping and performing perfectly executed tricks and maneuvers. His friends cheer him on and in turn perform spinning stunts on the center passenger pole. Riders slide away to give the flying limbs room. Soon everybody’s encouraging them with “Woo-hoo’s!” and applause. As the train pulls into the station, it occurs to me that you can always find a circus, and sometimes the circus will find you. Editor's Note: In “Divas and Daredevils,” we said Leona Dare’s mother was killed by a stray bullet at the Alamo. In fact, her grandmother was killed there. This article is a selection from the July/August issue of Smithsonian magazine. Holly Millea is a New York-based writer and the “beauty adventure” columnist for Elle magazine. Millea has written extensively about the movie industry for New York, Premiere and Talk magazines.
6b4f6715b0b1acbf787ea83fcd9a9cef
https://www.smithsonianmag.com/history/remedy-spread-fake-news-history-teachers-180961310/
The Remedy for the Spread of Fake News? History Teachers
The Remedy for the Spread of Fake News? History Teachers Few people would approach a complete stranger on the street for information about the pressing issues of the day, and yet that is just how many behave on the internet. In the wake of the 2016 election, reporting from Buzzfeed and other outlets has made it increasingly clear that the American voter is woefully lacking in the skills needed to judge the veracity of a news website. Among the many headlines from fake news websites were reports that Pope Francis endorsed President-elect Trump, that Hillary Clinton used a body double throughout the campaign and that she sold weapons to ISIS. The founders and authors of these fake news promulgators craft their stories for the sole purpose of maximizing visitor hits and, in turn, generating massive revenue. Their deceptions play to readers’ worst fears regardless of whether the writers themselves subscribe to the political leanings of the article's content. "It is not intended to pose an alternative truth," writes author Neal Gabler, "as if there could be such a thing, but to destroy truth altogether, to set us adrift in a world of belief without facts, a world where there is no defense against lies." In comparison with news outlets (and other sites) that offer ideologically biased takes on the most pressing issues of the day, fake news operations occupy a unique place on the web and constitute an obvious and menacing threat to unsuspecting visitors. The inability of so many readers to distinguish between the two, or to know when to steer clear of a website altogether, is undoubtedly concerning. For those of us on the frontlines of education, especially for history teachers, this problem is nothing new, given the ways in which the rise of the internet has transformed the teaching of the subject over the past 15 years. Students and teachers now have access to a vast amount of information about the past, but few know how to discern what is reliable and what is not. 
The problem surfaced for me in 2001 when a student handed in a research paper on the early history of the Ku Klux Klan that minimized the level of racial violence during Reconstruction and characterized the Klan's relationship with black Southerners as largely positive. The sources were drawn almost entirely from websites published by individual Klan chapters. The student had not thought about the obvious bias of the websites or whether they constituted legitimate historical sources. The episode was an important learning experience for the student, but even more so for me. Even as late as 2001, my students still relied primarily on printed materials rather than Internet sources. Librarians maintained control over new additions to the stacks, allowing for a certain level of quality control, but with each passing year the availability of faster personal computers, handheld devices and increased access to the web provided students with easier access to information about an ever-expanding number of historical subjects. Students and teachers benefited immensely from this increased access. Teachers could now introduce their students to a deep well of primary sources and historical figures that never made it into textbooks. Opportunities for students to conduct their own research through primary and secondary sources were soon limitless, defined only by the time they were willing to spend researching. On the other hand, the technology quickly outpaced educators’ ability to police or even guide students as to how best to search and assess online information. An unsubstantiated narrative, perpetuated by the media, that children are digital natives, naturally hardwired to understand how to use computers, helped to exacerbate the problem even further. Students were left to figure it out on their own as schools gradually cut back on the purchase of additional printed sources or purged their collections entirely. 
Where once librarians taught students how to research, few schools appreciated the important role they could play in educating students how to search and assess information on the Web. A recent study of Internet literacy among students by the Stanford History Education Group shows that they are incapable of "distinguishing advertisements from news articles or identifying where information came from." There is no denying that access to primary sources from the Library of Congress and other research institutions, along with secondary sources from the scholarly community, has enriched the teaching of history, but their availability means little if they cannot be accessed or distinguished from the vast amount of misinformation that awaits the uneducated user online. In 2008, George Mason University professor T. Mills Kelly created a course called "Lying About the Past" in which students were encouraged to create fake websites about a historical subject. Students worked on creating a fake Wikipedia page, blog, and videos about Edward Owens, a fictitious Virginia oyster fisherman who took up piracy in the Chesapeake Bay in the 1870s. This fake historical narrative was complemented by fake primary sources, including Owens's “legal will.” Although the project was met with some skepticism, and with more serious charges from Wikipedia founder Jimmy Wales, Kelly hoped his students "would become much more skeptical consumers of online information." It's difficult to imagine a more effective method of driving home such an important lesson. In the years since Kelly first taught the class, opportunities to publish and share information online have expanded even further through Facebook, YouTube and Twitter and blogging platforms such as WordPress and Medium. The opportunity to publish can be an empowering experience. 
History teachers who embrace these digital tools can shift from assignments that would never see the outside of their classroom's walls to projects that have the potential to reach a wide public audience. Educators can engage students about the ethical responsibilities related to how information should be published on the web. But if the public is left unprepared and without the skills needed to determine what is real and what is suspect, there can be real consequences. Consider, for instance, the publication of Our Virginia: Past and Present, a fourth-grade textbook written by Joy Masoff. The error was first discovered by William and Mary historian Carol Sheriff, whose child was then in the fourth grade: the chapter on the Civil War included a statement that "thousands of Southern blacks fought in Confederate ranks, including two battalions under the command of Stonewall Jackson." The myth of the Confederate black soldier is an insidious one, traced back to the late 1970s and a small group of Confederate heritage advocates who hoped to distance the history of the Confederacy from slavery. If black men fought as soldiers in the army, they argued, then it would be difficult to maintain that the Confederacy fought to protect and expand the institution of slavery. Not a single academic historian came forward in support of the textbook's claim. Later it was learned that Masoff had discovered the information on a website published by the Sons of Confederate Veterans. There are thousands of websites published by individuals and organizations who believe black Confederate soldiers existed. Websites such as the Petersburg Express include photographs and even primary sources that to the uneducated may appear legitimate. The purveyors of these stories often insist that they are providing a public service by uncovering accounts that academic historians have intentionally ignored. 
Regardless of the motivation for publishing the material in question, these websites present visitors with some of the same challenges as fake news sites. The history classroom is an ideal place in which to teach students how to search and evaluate online information given the emphasis that is already placed on the careful reading and analysis of historical documents. Even the most basic guidelines can steer students away from misinformation. History classrooms that emphasize the critical evaluation of bias and perspective in primary sources will also provide students of all ages with the necessary skills to evaluate the links that regularly appear in their Twitter and Facebook feeds. Healthy and well-deserved skepticism can go a long way. The ease with which we can access and contribute to the web makes it possible for everyone to be his or her own historian, which is both a blessing and a curse. The internet is at once a goldmine of information and a minefield of misinformation and distortion. Teaching our students how to discern the difference will not only help them steer clear of fake history and fake news, but reinforce the importance of a responsible and informed citizenry. In doing so, we strengthen the very pillars of democracy. Kevin M. Levin is a historian and educator based in Boston. He is the author of Remembering the Battle of the Crater: War as Murder (2012) and is currently at work on Searching For Black Confederate Soldiers: The Civil War’s Most Persistent Myth for the University of North Carolina Press. You can find him online at Civil War Memory and Twitter @kevinlevin.
98e820b5ecbd7923d7c43208b7eb687d
https://www.smithsonianmag.com/history/remote-cold-war-radar-system-has-new-use-warming-world-180952777/
A Remote Cold War Radar System Has New Use in a Warming World
A Remote Cold War Radar System Has New Use in a Warming World How cold was the cold war? The workers who built the DEW (Distant Early Warning) Line in the mid-1950s liked to toss a glass of water into the air just so they could hear the firecracker-like report as the droplets instantaneously froze. They were working in some of the most remote places on earth, on a new line of defense commissioned by the U.S. and Canadian governments: a series of 63 radar and communications stations, most of them manned, running some 3,000 miles from Alaska to Baffin Island and eventually to Iceland, to sound the alarm if attacking Soviet bombers came over the polar horizon. The DEW Line searched the skies until the 1980s, when it was replaced by the North Warning System, a string of 51 unmanned radar stations, such as LAB-1 in Newfoundland and Labrador, the subject of Donovan Wylie’s new book of photographs, North Warning System. Now that the cold war is over and the planet is warming, more foreign ships—particularly Russian and Chinese ships—are exploring newly accessible Arctic waters, and military officials are considering whether the system should be updated to detect marine threats as well. Tom Frail is a senior editor for Smithsonian magazine. He previously worked as a senior editor for the Washington Post and for Philadelphia Newspapers Inc.
b206c5e2e4c596da4c5406fc8d87f71a
https://www.smithsonianmag.com/history/resurrecting-the-dead-with-computer-graphics-77706610/
Resurrecting the Dead With Computer Graphics
Resurrecting the Dead With Computer Graphics A couple of weeks ago audiences at the Coachella music festival got to see Tupac perform live (NSFW language), despite the fact that he’s been dead for fifteen years. Countless websites have already dissected why the technology used to create this “Tupac hologram” isn’t actually a hologram, but rather a Pepper’s Ghost effect that dates back to the mid-19th century, so I won’t get into that. But the other fascinating element to this story is the fact that we can now RESURRECT OUR FAVORITE ENTERTAINERS FROM THE DEAD. Bringing back popular entertainers was the promise of the future in the 1980s and ’90s. As computer graphics improved in the 1980s (with movies like Tron) and then in the 1990s (with movies like Terminator 2: Judgment Day and Jurassic Park) people imagined that actors like Clark Gable, Marilyn Monroe and even a Laurence Olivier/Abraham Lincoln mash-up would be able to star in the computer-enhanced movies of tomorrow. Arthur C. Clarke’s 1986 book July 20, 2019: Life in the 21st Century includes a fictional movie listing for the year 2019: Still Gone with the Wind. The sequel picks up several years after where the 80-year-old original left off, with Rhett and Scarlett reuniting in their middle age, in 1880. Features the original cast (Clark Gable, Olivia de Havilland, and Vivien Leigh) and studio sets resurrected by computer graphic synthesis. Still Gone sets out to prove that they do make ‘em like they used to (Selznick Theater, 2:00 and 8:00 P.M.) The June, 1987 issue of Omni magazine featured an article by Marion Long, who spoke with six directors to get their ideas for the kinds of movies that they would want to direct in the year 2001. One of the directors that Long spoke to was Susan Seidelman, who in 1987 directed a movie called Making Mr. Right starring John Malkovich. 
Seidelman’s hypothetical movie of the year 2001 was called Yankee Doodle Sweetheart, and was imagined as starring Marilyn Monroe, Robert De Niro, Debra Winger and Jimmy Stewart. Marilyn Monroe had been dead for 25 years by the time this article came out, and though Jimmy Stewart didn’t die until 1997, he was still pictured as playing a much more youthful (and completely computer-generated) version of himself. The synopsis of the film is below: Seidelman electronically recreates Marilyn Monroe. The sex goddess of the Fifties plays a showgirl off to the front lines of a war on a Bob Hope USO tour. In sharp contrast to Monroe’s innocence and naivete stands Debra Winger, a military nurse acutely aware of the horrors of war. But this is Monroe’s story—her coming-of-awareness. Robert De Niro, a Marine sergeant deadened to human emotion, wants one thing: the showgirl. So does his friend, a young recruit, played by a computer-simulated Jimmy Stewart. Monroe falls in love with—you’ll have to see the film. The 1982 book The Omni Future Almanac imagined even more radical computer creations, combining the acting skills of one actor with the appearance of a historical figure: It is possible that dramatic performances, even actors’ lines, will be altered, via computer synthesis, yielding a perfect first “take” every time. Some actors, specifically character types, might be totally synthesized. One actor’s performance might easily be combined with another person’s distinctive physical look or voice. By using computer synthesis, a director would be able to marry the acting skills of Laurence Olivier to photographic images of Abraham Lincoln. Marilyn Monroe as a computer simulation (March, 1994 Popular Mechanics) Marilyn Monroe popped up a number of times in predictions about future movies, which may have had something to do with the fact that she died so young—she was just 36 years old. 
A 1993 article in the San Francisco Examiner predicted that one day, “dead actors such as Humphrey Bogart and Marilyn Monroe could be ‘resurrected’ by using computers to generate their visages and act out scenes they never did,” while the following year, Popular Mechanics ran a story that also featured Marilyn Monroe. The March, 1994 issue had an article called “Beyond Jurassic Park,” which predicted a world of resurrected movie stars now that Jurassic Park had shown just how far computer graphics had come. Marilyn Monroe moves smoothly under a red kimono, and the audience gasps with delight. The scene cuts to Marilyn seated in a swinging trapeze far above the ground. Her face is animated and happy, platinum hair flying in the breeze and her short skirt flipping up over her sleek, attractive thighs. As in her previous life, nobody really knows this Marilyn. This Marilyn is a computer construct—a proof-of-concept synthetic human actor used to advance the science and art of realistic 3D digital animation. The 1990s saw TV advertisements wherein Fred Astaire danced with a vacuum cleaner and John Wayne drank beer, long after both had passed away, but it seems the “Tupac hologram” has revived interest among audiences of the 2010s in the idea that we might see our favorite celebrities perform for us once again. There’s speculation that Michael Jackson may be next to take the stage from beyond the grave. Or that maybe a digital Lisa “Left Eye” Lopes will allow TLC to reunite. But allow me to be the first to request a “hologram Sheb Wooley.” Because why not, that’s why. And, what about you? If you were making a computer-enhanced film, who would be in your dream cast of living and dead actors? Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
ce191071d4c129f0f3c61f1964347289
https://www.smithsonianmag.com/history/rise-fall-sleeping-car-king-180971240/
The Rise and Fall of the Sleeping Car King
The Rise and Fall of the Sleeping Car King George M. Pullman literally raised Chicago from the mud. He introduced luxury to the nation’s rail lines. He even created a model company town for his workers—a feat that prompted some to proclaim him the “Messiah of a new age.” Then, in the greatest labor uprising of the nineteenth century, he found himself cast as the villain and his reputation turned to dust. Pullman began his career lifting buildings. Taking over a business started by his father, he moved warehouses and barns to allow a widening of the Erie Canal. During the 1850s, officials in Chicago decided to raise their whole city ten feet to allow for drainage of its mud-clogged streets. Pullman jumped at the opportunity. Directing hundreds of men armed with screw jacks and cribbing, he lifted houses and hotels, even an entire city block, without breaking a single pane of glass. More than anything, Pullman wanted to raise himself. The word “businessman” had recently been coined—a man who was neither merchant nor manufacturer but a mobilizer of capital, an entrepreneur. Pullman was a businessman by instinct—shrewd, gifted at calculating value, and always open to the new. Lifting and moving buildings was an exacting operation—hesitation or a lapse of control could mean disaster. It required careful planning, a commanding presence, and steady nerves. These were the qualities on which George Pullman built his success. Railroads had begun to dominate the landscape before the Civil War, and those who could look beyond that terrible conflict could see opportunity approaching. Pullman hired a substitute to take his place in the Union army and set to work fashioning a high-quality sleeping car. It was ready before the war was over. When the first transcontinental rail line opened in 1869, his business took off. George Pullman did not invent the sleeping car—most of the credit went to Theodore T. Woodruff, an upstate New York wagon maker whose car debuted in 1857. 
But Pullman contributed his share of innovations. He based his success on two ideas: luxury and revenue. Employing both traditional craftsmen and an early version of the assembly line, he created cars that appealed to the Victorian taste for ornamentation—lush carpeting, brocade upholstery, and chandeliers. He installed double-glazed windows and an improved suspension for a quieter, more comfortable ride. Rather than sell the cars, he retained ownership and contracted with the various railroads to add them to passenger trains as an enticement to customers. Pullman then pocketed the extra fare each passenger paid for an upgrade to Pullman luxury. This arrangement gave him a steady stream of revenue. It also meant that he kept complete control over the operation and maintenance of the cars. And those cars proved irresistible. Business travelers could sleep while they rode to the next day’s meeting. Middle-class customers could bask in tony amenities and attentive service. Hungry passengers could feast on gourmet fare in an ornate dining car, another Pullman innovation. For the very wealthy, he offered absurdly opulent private cars. Through buyouts and mergers, Pullman’s company gained a monopoly in the business. The name Pullman came to stand for quality and class. A staunch Republican, George Pullman followed the spirit of Lincoln when he offered jobs to freed slaves. The men served as porters on the cars. They catered to passenger needs and performed the intricate task of transforming a coach car into a rolling dormitory for the night. The Pullman Company soon became the largest employer of African Americans in the country. Concerned about the tenements and squalor that had accompanied industrialization and about the trouble that unrest might bring to capitalists, Pullman constructed a model town adjacent to his huge factory on Chicago’s outskirts. 
Pullman, Illinois, featured the Midwest’s first indoor shopping mall and an elegant library, along with parks, playing fields, and neat brick homes for the workers. A local clergyman said it was “how cities should be built.” Of George Pullman, the Chicago Times predicted that “future generations will bless his memory.” But in the conflict between George Pullman’s idealism and his instinct for making money, money usually won. He hired African American porters in need of work, but he paid them starvation wages—they had to rely on tips and endure the scorn of racist passengers. He created a town replete with flowers and greenery, but he charged exorbitant rents, posted demeaning rules, and allowed no town government. The company ran the show and Pullman’s spies invaded employees’ privacy. The patriotic Pullman was stung when economist Richard Ely criticized his model town as “well-wishing feudalism” that was ultimately “un-American.” The human aspect of affairs did not come naturally to Pullman. One of his office workers noted, “I never knew a man so reserved.” His boss, he felt, would have liked to treat people as friends, “but he couldn’t. He just didn’t know how.” Still, his company prospered and Pullman reveled in his position as one of the grandees of Chicago society. His sumptuous mansion on Prairie Avenue, “the sunny street that held the sifted few,” was the scene of gala parties. Pullman and his wife spent a week with President Grant at the White House, and the sleeping car magnate hired Lincoln’s son Robert as his personal lawyer. Then came trouble. In 1893 a financial panic plunged the nation into the worst depression that American citizens had yet seen. Pullman laid off workers and cut wages, but he didn’t lower rents in the model town. Men and women worked in his factory for two weeks and received only a few dollars’ pay after rent was deducted. Fed up, his employees walked off the job on May 12, 1894. 
The Pullman strike might have attracted little notice—desperate workers struck against hundreds of companies during the depression. But the Pullman employees were members of the American Railway Union, the massive labor organization founded just a year earlier by labor leader Eugene V. Debs. At their June convention, delegates of the ARU, a union open to all white railroad employees, voted to boycott Pullman cars until the strike was settled. At the convention, Debs advised members to include in their ranks the porters who were essential to the Pullman operation. But it was a time of intense racial animosity, and the white workers refused to “brother” the African Americans who manned the trains. It was a serious mistake. The boycott shut down many of the nation’s rail lines, particularly in the West. The workers’ remarkable show of solidarity brought on a national crisis. Passengers were stranded; rioting broke out in rail yards. Across the country, the price of food, ice, and coal soared. Mines and lumber mills had to close for lack of transportation. Power plants and factories ran out of fuel and resources. George Pullman refused to accede to his employees’ demand that a neutral arbitrator be assigned to decide the merits of their complaints. The company, he proclaimed, had “nothing to arbitrate.” It was a phrase that he would repeat endlessly, and one that would haunt him to his grave. Railroad corporations cheered him on and fired employees who refused to handle Pullman cars. The railroad managers, determined to break the ARU, had a secret weapon in the fight. U.S. 
Attorney General Richard Olney, a practicing railroad lawyer even while in office, declared that the country had reached “the ragged edge of anarchy.” He asked courts for injunctions making the strike illegal, and he convinced President Grover Cleveland to send federal troops to Chicago and other hot spots to face down strikers. Although state governors had not requested federal intervention, U.S. cavalry troops and soldiers with bayonets were soon confronting rioters. Several dozen citizens were shot dead. Debs and other union leaders were arrested. Nonunion workers began to operate trains. The strike was soon over. That summer, the Pullman workers returned to their jobs on George Pullman’s terms. But their 63-year-old boss had little to celebrate. Many thought the nation’s distress could have been avoided if Pullman had shown more humanity. He was scorned even by some of his fellow tycoons—one thought a man who wouldn’t meet his employees halfway was a “God-damned fool.” Eugene Debs, although he had lost the strike, was lionized. One hundred thousand cheering supporters welcomed him when he emerged from a six-month jail term for defying the injunction. Frustrated by government intervention on the railroads’ side, Debs turned to socialism as the only way to rectify the nation’s industrial ills. He led the Socialist Party for almost a quarter century, running for president five times under its banner. George Pullman’s public image never recovered. The federal commission that investigated the strike judged that his company’s paternalism was “behind the age.” A court soon ordered the company to sell off the model town. When Pullman died three years after the strike, he left instructions that his body be encased in reinforced concrete out of fear it would be desecrated. A clergyman exclaimed at Pullman’s funeral, “What plans he had!” But most remembered only how his plans had gone awry. 
Eugene Debs offered the simplest eulogy for his pompous antagonist: “He is on equality with toilers now.” Jack Kelly is a historian and novelist. His latest book is The Edge of Anarchy: The Railroad Barons, the Gilded Age, and the Greatest Labor Uprising in America.
86dc771eaec03b511ae70c3ebd97fa26
https://www.smithsonianmag.com/history/rise-zombie-mall-180973086/
The Rise of the Zombie Mall
The Rise of the Zombie Mall “Who wants to sit in that desolate-looking spot?” Frank Lloyd Wright carped of the atrium inside the first enclosed shopping mall, the Southdale Center in Edina, Minnesota. But 75,000 people rushed there the day it opened in October 1956 and marveled at the 72 stores on two floors, the 800,000 square feet of retail, the 5,200-space parking lot, the 70-degree controlled climate. The Austrian-born architect Victor Gruen, already acclaimed for building the nation’s largest open-air shopping center, had birthed a new phase of American culture. Over the next 40 years, another 1,500 enclosed malls would dot the landscape, from suburb to shining suburb, insinuating themselves into everyday life so profoundly that just “going to the mall” became a pastime. Hundreds of malls, meanwhile, have closed and been demolished or converted, overtaken by a renewed emphasis on walkable neighborhoods and challenged by that overwhelming force of 21st-century living: online shopping. But rumors of the shopping mall’s death may be premature, if the mega-mall opening this October is any indication. The $5 billion, three-million-square-foot American Dream complex in northern New Jersey houses a theme park, a water park, a ski and snowboard park, an ice rink, an aquarium, a movie theater and a Ferris wheel. Oh, and stores. Hundreds of luxury and designer stores. The original developer, Mills Corporation, conceived of the American Dream when Amazon Prime didn’t even exist. The project has faced 16 years of trouble, including a Securities and Exchange Commission investigation of Mills Corp. The company reportedly paid $165 million plus interest to settle the case, and sold the project. A second developer stopped construction when a major lender broke a financing deal. The Triple Five Group—which built the Mall of America in Minnesota in 1992—rescued the project in 2011, but continued to battle environmentalists, neighbors and advocates of vigorous downtowns. 
Critics voiced skepticism. “I don’t know which is worse—if it fails or if it succeeds,” Jeff Tittel, director of the New Jersey Sierra Club, told New York Magazine in 2011. “If it fails, New Jersey is going to be out of $350 million in taxpayer subsidies. And if it succeeds, it will be the worst traffic, and it will destroy shopping areas in cities and malls all over the state.” The future of enclosed malls is uncertain enough, and they’ve been around long enough, that symptoms of nostalgia are cropping up more and more in the mainstream. The latest season of the hit show “Stranger Things” features a neon-lit 1980s mall, enabling a new generation to see how teens at the height of the craze hung out—under skylights, on elevators, around fountains full of pennies. “Don’t romanticize it,” warns Lizabeth Cohen, a Harvard professor of American studies who has written about the rise of shopping malls. Developers built them in white suburbs, far removed from cities and public transportation routes, fashioning castles of commerce for the white middle class. The mallification of America continued through the ’70s, ’80s and ’90s (19 malls opened in 1990 alone). But by the turn of the millennium the Congress for the New Urbanism was worrying aloud about “greyfields”—shuttered indoor malls that fell to an oversaturated market. In 2000, DeadMalls.com began memorializing the fallen. The Great Recession of 2008 didn’t touch A-grade luxury centers, but it pulverized other tiers of malls. Green Street Advisors, a California-based real estate research firm, says the country’s 37 top-performing malls account for nearly 30 percent of mall value nationwide. Yet Americans still go to the mall, spending some $2.5 trillion in 2014, according to the International Council of Shopping Centers. A 2018 study from the group—which is, admittedly, paid to promote brick-and-mortar retail—found that three-quarters of teens still prefer physical stores to shopping online.
Certainly malls are changing, as the nation does. Paco Underhill, a market researcher and founder of the consulting company Envirosell, points to La Gran Plaza in Fort Worth, Texas, which slumped to 10 percent occupancy before reinventing itself as a Hispanic-themed mall, in a region where 23 percent of the population speaks Spanish. Underhill once called the early years of this century the “postmall world,” but he now refers not to malls but to “alls,” extravagant facilities that offer almost everything. Life in 2019 moves at the speed of a tap, immeasurably faster than our traffic-beleaguered roads. Why travel among home, job and fun when you can move to a mall and never leave? The idea is not so different from Victor Gruen’s original vision of all-in-one shopping, which was inspired partly by cozy European town squares. He might like the variety of experiences available to visitors at the massive American Dream, but it’s safe to say he would hate the parking lots, and the impact on downtowns. Gruen had wanted malls to blend in with their surrounding communities; instead, oceans of asphalt isolated them. “I would like to take this opportunity to disclaim paternity once and for all,” the so-called father of the mall said in 1978, two years before his death. “I refuse to pay alimony to those bastard developments.” This article is a selection from the October 2019 issue of Smithsonian magazine Stephie Grob Plante is a features writer and essayist whose work has appeared in The Atlantic, Vox and The Verge.
https://www.smithsonianmag.com/history/robert-reid-mayor-of-nearby-middletown-remembers-three-mile-island-nuclear-accident-180971636/
For Those Living Nearby, the Memory of the Three Mile Island Accident Has a Long Half-Life
In mid-March 1979, Americans headed to the theaters to see The China Syndrome. Starring Jane Fonda, Michael Douglas and Jack Lemmon, the disaster thriller follows a broadcast journalist who discovers safety coverups at a nuclear power plant and the plant supervisor who tries to avert a nuclear disaster. Variety called it “moderately compelling” while the New York Times was a bit more generous, deeming it a “smashingly effective, very stylish suspense melodrama.” Whatever the critics said, The China Syndrome immediately spurred debate about the dangers of relying on nuclear power and the real-world plausibility of such a disaster. One nuclear power executive said the film was “an overall character assassination of an entire industry.” He reassured readers of the New York Times, “The systems are designed and built in such a way that a reactor will operate safely even if there is a significant equipment failure or human error.” But just 12 days after the film’s release, proponents of nuclear power were having to answer for a drastic real-life situation. On March 28, 1979, at the Three Mile Island Nuclear Generating Station in Dauphin County, Pennsylvania, a combination of technical malfunction and human error caused one of the reactors (Unit 2) to partially melt down and release a small amount of radiation into the atmosphere. The site took 14 years and $1 billion to clean up and, to date, Three Mile Island remains the worst nuclear accident in U.S. history. In the aftermath, a presidential commission investigated the accident and the Nuclear Regulatory Commission intensified its oversight of nuclear reactors, implementing new industry-wide safety standards. Many local residents became dedicated anti-nuclear advocates, while others continued working at the non-damaged reactor (Unit 1), which resumed operation in 1985.
Now, 40 years after the accident, Three Mile Island might be closing down for good. Unless Pennsylvania state legislators vote to save the power plant, it will shut in September. Elected in 1978, Robert Reid was the mayor of Middletown Borough, which is just three miles from the plant. On the occasion of the anniversary of the accident, Smithsonian spoke with Reid, who finished his last term as mayor in 2013, about what it was like on the ground when the reactor partially melted down.

How did you learn about the partial meltdown in one of the reactors?

I was teaching at the local high school, and I was on hall duty when the emergency preparedness coordinator called. He said, “Something’s going on down at the island.” They told us there was a problem, but no release of radiation. But we kept hearing different stories. Then they told me there was a small release of radiation. I thought they had been lying to us, but now I think that this was a new type of energy and things were developing so fast they didn’t know how to react. That was Wednesday. Then everything seemed to go back to normal. But there was still a problem. On Friday there was a hydrogen bubble that they thought was going to explode [and release radioactive material]. The governor called for an evacuation of pregnant women and preschool-age children. But most people left on their own. We figured three-quarters of the people left the borough.

What was the reaction like among the townspeople?

There was a run on the banks. Teenagers were going around town announcing that everyone had to evacuate. It was a mess. I recall standing on a street corner. People were hollering out of their car windows, “Mayor, watch the town!” I knew I couldn’t go. I kept thinking, I was born and raised here. If we had a heavy release of radiation, I'd have to leave this area and start a new life somewhere else. A lot of people thought about this. “What's going to happen to us? Where are we going to go?”

When did the people who evacuated come back?

There was no explosion, but most people didn’t return for a week or two. It took quite a while for things to get back to normal. In fact, some people never did return.

What was it like to be the mayor during this crisis?

Oh, it was tough. I was concerned, but I couldn't display it. I couldn't let people see that I was almost frightened, too. Someone had to be in charge that the people could look up to and say, “Well, we have someone. We have a leader here that knows what he’s doing, so we'll follow what he’s doing.” Just my displaying calm was a calming effect as far as the people were concerned. This is what people tell me now. My wife wouldn’t leave. I said, “Look, I can't be worrying about you and worrying about the town.” I said, “You’re going to have to take the kids and get out of town.” They left and went to Connecticut and stayed at my brother’s house. But I knew I couldn't go because I had a responsibility to be here.

Did you see the town’s public opinion turn against nuclear power?

Shortly after the accident, there was a referendum. It was a vote taken whether to keep the plant closed. It was put on the ballot [in] Dauphin County. Two to one to keep it shut. It wasn't a binding vote. [The plant re-opened in 1985.] As the years went on and on and on, people became a little more educated as far as nuclear energy was concerned. They're not concerned as much now. Today, if you took that same vote, it would be much different.

What changes did you see at the plant after the accident?

When it was first built, the so-called experts [were] looking down their noses at the people living in the area. The people weren't involved. Today, the plant owners involve the local citizens in just about everything they do. They have committees that meet with the owners of the plant and the engineers. They meet and they discuss things. We're part of the nuclear system that's in the area. Not like it was 40 years ago. [Before], they never reported anything to the local government. But after the accident, a fish couldn't jump out of the water unless they called me. “Mayor, a fish jumped out of the water so we're calling you to let you know what's going on.” It’s a little different today than it was years ago. They're better neighbors. Let's put it that way.

How do locals feel about Three Mile Island today?

Every once in a while, if the sirens blow down at the island, people ask what’s going on. But we’ve learned more about nuclear energy. Personally, I think we have the safest nuclear plant in the world because everybody is keeping an eye on it. Still, I kept a Geiger counter [an instrument that detects radiation] in my office. I looked at it every day. It reminded me to be prepared. This article is a selection from the April issue of Smithsonian magazine Anna Diamond is the former assistant editor for Smithsonian magazine.
https://www.smithsonianmag.com/history/rudolf-hess-tale-poison-paranoia-and-tragedy-180952783/
Rudolf Hess’ Tale of Poison, Paranoia and Tragedy
In August 1945, an Army major named Douglas Kelley was handed one of the most sought-after assignments in his profession: examining the most prominent Nazis who’d been taken prisoner of war. Kelley, a psychiatrist trained at Berkeley and Columbia, had been treating American soldiers in Europe for combat stress. He saw his new job as a chance to “learn the why of the Nazi success,” he later wrote in his book 22 Cells in Nuremberg, “so we can take steps to prevent the recurrence of such evil.” Before the historic war-crimes trials in Nuremberg, Kelley spent five months interviewing the 22 captive defendants at length, giving them Rorschach and other tests and collecting possessions they’d surrendered. He particularly enjoyed matching wits with Hermann Goering, Hitler’s second in command, whom he treated for an addiction to paracodeine. It was at the Nuremberg prison that Kelley interviewed Rudolf Hess, beginning in October 1945. Hess was a special case. Once Adolf Hitler’s deputy and designated successor, he’d been in custody for more than four years, far longer than the others. When Kelley talked to him, Hess would shuffle around his cell, slip into and out of amnesia and stare into space. But when Kelley asked why he’d made his ill-fated solo flight to England in the spring of 1941, Hess was clear: The British and the Germans should not be fighting each other, but presenting a united front against the Soviets. He had come to broker a peace. “I thought of the colossal naiveté of this Nazi mind,” Kelley wrote in an unpublished statement, “imagining you could plant your foot on the throat of a nation one moment and give it a kiss on both cheeks the next.” Hess saw himself as an envoy, and was shocked when the British took him prisoner.
As the months passed, he came to suspect that his captors were trying to poison him, so he took to wrapping bits of his food and medications in brown paper and sealing them with a wax stamp, intending to have them analyzed for proof that he was being abused. He also wrote a statement about his captivity that totaled 37 double-spaced pages. When Kelley returned to the United States, he boxed up everything from his work at Nuremberg—his notes, the tests, inmates’ belongings, including X-rays of Hitler’s skull, paracodeine capsules confiscated from Goering, and Hess’ food packets and statement—and took it home to Santa Barbara, California. “It was that Nazi stuff in the basement,” says his son Douglas Kelley Jr., a retired postal worker. “We all knew it was there.” The archive is now in his basement, in suburban Maryland, between boxes of family photographs and his niece’s artwork. Some of its contents have been published—Jack El-Hai’s recent book The Nazi and the Psychiatrist includes a portrait of Goering that the former Reichsmarschall autographed for Kelley. But the younger Kelley allowed Smithsonian to photograph Hess’ food packets for the first time. The packets, and Hess’ statement, provide a glimpse into the mind of a man who, the elder Kelley wrote in 22 Cells, “will continue to live always in the borderlands of insanity.” When he first landed in Scotland, Hess wrote, the British people “took care of me very well. They...put a rocking chair near the fireplace and offered me tea. Later, when I was surrounded by British soldiers, a young Tommy got up and gave me a bottle of milk which he had taken along for his guard duty.” The next day, he requested a meeting with the Duke of Hamilton, in the mistaken belief that the duke would be sympathetic to Hess’ peace plan. Hamilton said he would inform King George VI, but nothing ever came of it. 
Over the next few weeks, Hess was moved from Scotland to a military installation at Mytchett Place, about 40 miles southwest of London. “When I arrived...I instinctively distrusted the food,” Hess wrote. “Thus I did not eat or drink anything on the first day.” He grudgingly agreed to the suggestion that he eat with his doctors and guards for reassurance that he wasn’t being poisoned, but then, he said, he was offered food different from theirs. “Once, when I was careless and drank a little bit of milk by myself,” he wrote, “a short time later I got dizzy, had a terrific headache and could not see straight any more. Soon thereafter I got into an hilarious mood and increased nervous energy became apparent. A few hours later, this gave way to the deepest depression and weakness. From then on I had milk and cheese brought into my room every day but merely to deceive the people that I was eating that stuff.” Of course Hess was interrogated. “My correct answers evidently caused disappointment,” he wrote. “However, loss of memory which I simulated gradually caused satisfaction.” So he feigned amnesia more and more. Eventually, “I got to such a state that apparently I could not remember anything...that was further back than a few weeks.” He concluded that his questioners were trying “to weaken my memory” before a meeting with Lord Chancellor Simon, Britain’s highest-ranking jurist, that June. To prepare for the meeting, Hess fasted for three days to clear his mind. “I was well enough for a conference lasting two and a half hours, even though I was still under the influence of a small amount of brain poison.” The lord chancellor, however, found Hess’ peace plan unconvincing and his complaints of maltreatment incredible. He left, Hess wrote, “convinced I had become a victim of prison psychosis.” Soon it wasn’t just brain poison in his food. 
Hess believed that the British put a rash-inducing powder in his laundry, and that the Vaseline they gave him to treat the rash contained heart poison. He believed the guards added bone splinters and gravel to his meals to break his teeth. He attributed his sour stomach to their lacing his food with so much acid “the skin came loose and hung in little bits from my palate.” In desperation, he wrote, “I scratched lime from the walls in the hope that this would neutralize the other stuff but I was not successful.” When his stomach pains disappeared, it was because “my body readjusted” and so “they stopped giving me any more acid.” In November 1941, Hess sent a letter asking for a meeting with the Swiss envoy in London, who he thought could intervene on his behalf. “I had hardly mailed the letter,” Hess recalled, “when again huge quantities of brain poison were put in my food to destroy my memory.” The Swiss envoy visited Hess several times, and agreed to take samples of his medications for a laboratory analysis. When the tests determined that nothing was wrong, Hess concluded that “it was an easy matter for the secret service...to give orders that nothing should be found in them for reasons important to the conduct of the war.” As the months passed, Hess tried twice to kill himself, by jumping over a staircase railing and by stabbing himself with a butter knife. His obsession with food was unrelenting. When the Swiss envoy visited in August 1943, Hess had lost 40 pounds. In November 1944, Hess petitioned the British for a “leave of absence” in Switzerland to restore his health. It was denied. When Hess was transferred to Nuremberg in October 1945, he relinquished his food packets under protest and asked Kelley to make sure they were safe.
Kelley determined that while Hess suffered from “a true psychoneurosis, primarily of the hysterical type, engrafted on a basic paranoid and schizoid personality, with amnesia, partly genuine and partly feigned,” he was fit to stand trial. More than half a dozen other psychiatrists, from Russia, France, England and the United States, agreed. Most of the other Nuremberg defendants were sentenced to death, but Hess, convicted of two counts related to crimes against peace, was sentenced to life in prison. Douglas Kelley Sr. concluded that the Nuremberg defendants represented not a specifically Nazi pathology, but that “they were simply creatures of their environment, as all humans are.” Kelley killed himself on New Year’s Day 1958, swallowing a cyanide capsule in front of his family. (Goering, too, had taken cyanide, after he was sentenced to hang.) Hess spent 40 years complaining of the food and his health at Spandau Prison in western Berlin before he succeeded at what he’d tried twice before. He hanged himself with an extension cord on August 17, 1987. He was 93. Caren Chesler is a seasoned journalist who spent half her career as a general news reporter and half as a business reporter. Her work has appeared in many publications including the New York Times, Salon, Scientific American and Bloomberg Business News.
https://www.smithsonianmag.com/history/run-silent-run-deep-72432889/
Run Silent, Run Deep
The sonar room supervisor of the nuclear-powered fast-attack submarine USS Batfish (SSN 681) picked up his microphone and punched the intercom button for the officer of the deck a few feet away in the control room. "CONN, SONAR. SONAR CONTACT BEARING ZERO-SIX-TWO. CLASSIFIED POSSIBLE SOVIET SUBMARINE!" The sonar listeners had just picked up the distinctive sounds of a Soviet "Yankee" class ballistic missile submarine on a course toward the East Coast of the United States. The date of this undersea interception was March 17, 1978. The place: about 200 miles above the Arctic Circle in the Norwegian Sea. Concern over Soviet missile submarine patrols in areas that allowed them to target the eastern half of the United States had led to the Navy order that sent Batfish to sea under the command of Comdr. Thomas W. Evans (now a retired rear admiral). She would seek to intercept a Soviet missile submarine, follow her, and observe her operations throughout an entire patrol without being detected. The 120-odd men who served in an attack submarine such as Batfish lived and worked in incredibly cramped conditions in a steel tube about 300 feet long and 32 feet in beam. Though by no means small, these boats were dwarfed by the giant "boomers," or missile boats, of both the American and Soviet navies. Batfish and her sister submarines were quieter than their Soviet quarry. "Our task was to establish a trailing position well behind the Yankee from which we could maintain tactical control," says Batfish skipper Evans. "Maintaining tactical control means you're in that zone where you're close enough to hear the noises emitted by their machinery and propeller through all the other noises in the sea, but not too far away—where you can hear him and he can't hear you."
Batfish stayed with the Yankee for 50 days—throughout the Soviet submarine's patrol—following her through storms, fishing fleets, the cacophony of oil exploration explosions, traveling 10,369 shadowing miles to record the Yankee's 8,871 miles. As far as is known, she was never detected by the Soviet boat. Batfish surfaced off her home port of Charleston, South Carolina, on May 17 after 77 days submerged. The Soviets did ultimately learn of our trailing missions through the treachery of American spies in our own navy. However, Rear Adm. Sumner Shapiro, the director of Naval Intelligence in 1978, now retired, believes that such knowledge of our ability to track their submarines anywhere in the world's oceans made the Soviets realize that their ballistic missile submarine force, which they counted on for reliable second-strike capability, was not invulnerable. This realization could have been a major factor in bringing the Cold War to an end. If so, then the U.S. submariners of Batfish and other boats of the Navy's Silent Service made history. Now at least part of their story can be told.
https://www.smithsonianmag.com/history/sacrifice-amid-the-ice-facing-facts-on-the-scott-expedition-96367423/
Sacrifice Amid the Ice: Facing Facts on the Scott Expedition
Captain Lawrence "Titus" Oates with ponies. Photo: Wikimedia Commons For Lawrence Oates, the race to the South Pole had a portentous start. Just two days after the Terra Nova Expedition left New Zealand in November 1910, a violent storm killed two of the 19 ponies in Oates’s care and nearly sank the ship. His journey ended almost two years later, when he stepped out of a tent and into the teeth of an Antarctic blizzard after uttering ten words that would bring tears of pride to mourning Britons. During the long months in between, Oates’s concern for the ponies paralleled his growing disillusionment with the expedition’s leader, Robert Falcon Scott. Oates had paid one thousand pounds for the privilege of joining Scott on an expedition that was supposed to combine exploration with scientific research. It quickly became a race to the South Pole after the Norwegian explorer Roald Amundsen, already at sea with a crew aboard the Fram, abruptly changed his announced plan to go to the North Pole. “BEG TO INFORM YOU FRAM PROCEEDING ANTARCTIC—AMUNDSEN,” read the telegram he sent to Scott. It was clear that Amundsen would leave the collecting of rock specimens and penguin eggs to the Brits; he wanted simply to arrive first at the pole and return home to claim glory on the lecture circuit. Oates, circa 1911. Photo: Wikipedia Born in 1880 to a wealthy English family, Lawrence Oates attended Eton before serving as a junior officer in the Second Boer War. A gunshot wound in a skirmish that earned Oates the nickname “Never Surrender” shattered his thigh, leaving his left leg an inch shorter than his right. Still, Robert Scott wanted Oates along for the expedition, but once Oates made it to New Zealand, he was startled to see that a crew member (who knew dogs but not horses) had already purchased ponies in Manchuria for five pounds apiece. They were “the greatest lot of crocks I have ever seen,” Oates said.
From past expeditions, Scott had deduced that white or gray ponies were stronger than darker horses, though there was no scientific evidence for that. When Oates told him that the Manchurian ponies were unfit for the expedition, Scott bristled and disagreed. Oates seethed and stormed away. Inspecting the supplies, Oates quickly surmised that there was not enough fodder, so he bought two extra tons with his own money and smuggled the feed aboard the Terra Nova. When, to great fanfare, Scott and his crew set off from New Zealand for Antarctica on November 29, 1910, Oates was already questioning the expedition in letters home to his mother: “If he gets to the Pole first we shall come home with our tails between our legs and make no mistake. I must say we have made far too much noise about ourselves all that photographing, cheering, steaming through the fleet etc. etc. is rot and if we fail it will only make us look more foolish.” Oates went on to praise Amundsen for planning to use dogs and skis rather than walking beside horses. “If Scott does anything silly such as underfeeding his ponies he will be beaten as sure as death.” After a harrowingly slow journey through pack ice, the Terra Nova arrived at Ross Island in Antarctica on January 4, 1911. The men unloaded and set up base at Cape Evans, as some crew members set off in February on an excursion in the Bay of Whales, off the Ross Ice Shelf—where they caught sight of Amundsen’s Fram at anchor. The next morning they saw Amundsen himself, crossing the ice at a blistering pace on his dog sled as he readied his animals for an assault on the South Pole, some 900 miles away. Scott’s men had had nothing but trouble with their own dogs, and their ponies could only plod along on the depot-laying journeys they were making to store supplies for the pole run. Given their weight and thin legs, the ponies would plunge through the top layer of snow; homemade snowshoes worked only on some of them.
On one journey, a pony fell and the dogs pounced, ripping at its flesh. Oates knew enough to keep the ponies away from the shore, having learned that several ponies on Ernest Shackleton’s Nimrod expedition (1907-1909) had fallen dead after eating salty sand there. But he also knew some of his animals simply would not hold up on any lengthy journey. He suggested to Scott that they kill the weaker ones and store the meat for the dogs at depots on the way to the pole. Scott would have none of it, even though he knew that Amundsen was planning to kill many of his 97 Greenland dogs for the same purpose. “I have had more than enough of this cruelty to animals,” Scott replied, “and I’m not going to defy my feelings for the sake of a few days’ march.” “I’m afraid you’ll regret it, Sir,” Oates answered. The Terra Nova crews continued with their depot-laying runs, with the dogs becoming “thin as rakes” from long days of heavy work and light rations. Two ponies died of exhaustion during a blizzard. Oates continued to question Scott’s planning. In March of 1911, with expedition members camped on the ice in McMurdo Sound, a crew woke in the middle of the night to a loud cracking noise; they left their tents to discover they were stranded on a moving ice floe. Floating beside them on another floe were the ponies. The men hopped over to the animals and began moving them from floe to floe, trying to get them back to the Ross Ice Shelf to safety. It was slow work, as they often had to wait for another floe to drift close enough to make any progress at all. Then a pod of killer whales began circling the floe, poking their heads out of the water to see over the floe’s edge, their eyes trained on the ponies. As Henry Bowers described in his diary, “the huge black and yellow heads with sickening pig eyes only a few yards from us at times, and always around us, are among the most disconcerting recollections I have of that day. 
The immense fins were bad enough, but when they started a perpendicular dodge they were positively beastly.” Oates, Scott and others came to help, with Scott worried about losing his men, let alone his ponies. Soon, more than a dozen orcas were circling, spooking the ponies until they toppled into the water. Oates and Bowers tried to pull them to safety, but they proved too heavy. One pony survived by swimming to thicker ice. Bowers finished off the rest with a pick axe so the orcas at least wouldn’t eat them alive. “These incidents were too terrible,” Scott wrote. Worse was to come. In November 1911, Oates left Cape Evans with 14 other men, including Scott, for the South Pole. The depots had been stocked with food and supplies along the route. “Scott’s ignorance about marching with animals is colossal,” Oates would write. “Myself, I dislike Scott intensely and would chuck the whole thing if it were not that we are a British expedition.… He is not straight, it is himself first, the rest nowhere.” Scott's party at the South Pole, from left to right: Wilson, Bowers, Evans, Scott and Oates. Photo: Wikimedia Commons Unlike Scott, Amundsen paid attention to every detail, from the proper feeding of both dogs and men to the packing and unpacking of the loads they would carry, to the most efficient ski equipment for various mixtures of snow and ice. His team traveled twice as fast as Scott’s, which had resorted to manhauling their sledges. When Scott and his final group of Oates, Bowers, Edward Wilson and Edgar Evans reached the South Pole on January 17, 1912, they saw a black flag whipping in the wind. “The worst has happened,” Scott wrote. Amundsen had beaten them by more than a month. “The POLE,” Scott wrote. “Yes, but under very different circumstances from those expected. We have had a horrible day—add to our disappointment a head wind 4 to 5, with a temperature -22 degrees, and companions laboring on with cold feet and hands.… Great God! 
This is an awful place and terrible enough for us to have labored to it without the reward of priority.” The return to Cape Evans was sure to be “dreadfully long and monotonous,” Scott wrote. It wasn’t monotonous. Edgar Evans took a fall on February 4th and became “dull and incapable,” according to Scott; he died two weeks later after another fall near the Beardmore Glacier. The four survivors were suffering from frostbite and malnutrition, but seemingly constant blizzards, temperatures of 40 degrees below zero and snowblindness limited their progress back to camp. Oates, in particular, was suffering. His old war wound now practically crippled him, and his feet were “probably gangrene,” according to Ross D.E. MacPhee’s Race to the End: Amundsen, Scott and the Attainment of the South Pole. Oates asked Scott, Bowers and Wilson to go on without him, but the men refused. Trapped in their tent during a blizzard on March 16th or 17th (Scott’s journal no longer recorded dates), with food and supplies nearly gone, Oates stood up. “I am just going outside and may be some time,” he said—his last ten words. The others knew he was going to sacrifice himself to increase their odds of returning safely, and they tried to dissuade him. But Oates didn’t even bother to put his boots on before disappearing into the storm. He was 31. “It was the act of a brave man and an English gentleman,” Scott wrote. John Charles Dollman's A Very Gallant Gentleman, 1913. Photo: Wikipedia Two weeks later, Scott himself was the last to go. “Had we lived,” Scott wrote in one of his last diary entries, “I should have had a tale to tell of the hardihood, endurance and courage of my companions which would have stirred the heart of every Englishman. These rough notes and our dead bodies must tell the tale.” Roald Amundsen was already telling his tale, one of triumph and a relatively easy journey to and from the South Pole. 
Having sailed the Fram into Tasmania earlier in March, he knew nothing of Scott’s ordeal—only that there had been no sign of the Brits at the pole when the Norwegians arrived. Not until October 1912 did the weather improve enough for a relief expedition from Terra Nova to head out in search of Scott and his men. The next month they came upon Scott’s last camp and cleared the snow from the tent. Inside, they discovered the three dead men in their sleeping bags. Oates’s body was never found. Sources Books: Ross D.E. MacPhee, Race to the End: Amundsen, Scott and the Attainment of the South Pole, American Museum of Natural History and Sterling Publishing Co., Inc., 2010.  Robert Falcon Scott, Scott’s Last Expedition: The Journals, Carroll & Graf Publishers, Inc., 1996.  David Crane, Scott of the Antarctic: A Biography, Vintage Books, 2005.  Roland Huntford, Scott & Amundsen: The Race to the South Pole, Putnam, 1980. For Lawrence Oates, the race to the South Pole had a portentous start. Just two days after the Terra Nova Expedition left New Zealand in November 1910, a violent storm killed two of the 19 ponies in Oates’s care and nearly sank the ship. His journey ended almost two years later, when he stepped out of a tent and into the teeth of an Antarctic blizzard after uttering ten words that would bring tears of pride to mourning Britons. During the long months in between, Oates’s concern for the ponies paralleled his growing disillusionment with the expedition’s leader, Robert Falcon Scott. Oates had paid one thousand pounds for the privilege of joining Scott on an expedition that was supposed to combine exploration with scientific research. It quickly became a race to the South Pole after the Norwegian explorer Roald Amundsen, already at sea with a crew aboard the Fram, abruptly changed his announced plan to go to the North Pole. “BEG TO INFORM YOU FRAM PROCEEDING ANTARCTIC—AMUNDSEN,” read the telegram he sent to Scott. 
It was clear that Amundsen would leave the collecting of rock specimens and penguin eggs to the Brits; he wanted simply to arrive first at the pole and return home to claim glory on the lecture circuit. Oates, circa 1911. Photo: Wikipedia Born in 1880 to a wealthy English family, Lawrence Oates attended Eton before serving as a junior officer in the Second Boer War.  A gunshot wound in a skirmish that earned Oates the nickname “Never Surrender” shattered his thigh, leaving his left leg an inch shorter than his right. Still, Robert Scott wanted Oates along for the expedition, but once Oates made it to New Zealand, he was startled to see that a crew member (who knew dogs but not horses) had already purchased ponies in Manchuria for five pounds apiece. They were “the greatest lot of crocks I have ever seen,” Oates said. From past expeditions, Scott had deduced that white or gray ponies were stronger than darker horses, though there was no scientific evidence for that. When Oates told him that the Manchurian ponies were unfit for the expedition, Scott bristled and disagreed. Oates seethed and stormed away. Inspecting the supplies, Oates quickly surmised that there was not enough fodder, so he bought two extra tons with his own money and smuggled the feed aboard the Terra Nova. When, to great fanfare, Scott and his crew set off from New Zealand for Antarctica on November 29, 1910, Oates was already questioning the expedition in letters home to his mother: “If he gets to the Pole first we shall come home with our tails between our legs and make no mistake. I must say we have made far too much noise about ourselves all that photographing, cheering, steaming through the fleet etc. etc. is rot and if we fail it will only make us look more foolish.” Oates went on to praise Amundsen for planning to use dogs and skis rather than walking beside horses. 
“If Scott does anything silly such as underfeeding his ponies he will be beaten as sure as death.” After a harrowingly slow journey through pack ice, the Terra Nova arrived at Ross Island in Antarctica on January 4, 1911. The men unloaded and set up base at Cape Evans, as some crew members set off in February on an excursion in the Bay of Whales, off the Ross Ice Shelf—where they caught sight of Amundsen’s Fram at anchor. The next morning they saw Amundsen himself, crossing the ice at a blistering pace on his dog sled as he readied his animals for an assault on the South Pole, some 900 miles away. Scott’s men had had nothing but trouble with their own dogs, and their ponies could only plod along on the depot-laying journeys they were making to store supplies for the pole run. Given their weight and thin legs, the ponies would plunge through the top layer of snow; homemade snowshoes worked only on some of them. On one journey, a pony fell and the dogs pounced, ripping at its flesh. Oates knew enough to keep the ponies away from the shore, having learned that several ponies on Ernest Shackleton’s Nimrod expedition (1907-1909) had fallen dead after eating salty sand there. But he also knew some of his animals simply would not hold up on any lengthy journey. He suggested to Scott that they kill the weaker ones and store the meat for the dogs at depots on the way to the pole. Scott would have none of it, even though he knew that Amundsen was planning to kill many of his 97 Greenland dogs for the same purpose. “I have had more than enough of this cruelty to animals,” Scott replied, “and I’m not going to defy my feelings for the sake of a few days’ march.” “I’m afraid you’ll regret it, Sir,” Oates answered. The Terra Nova crews continued with their depot-laying runs, with the dogs becoming “thin as rakes” from long days of heavy work and light rations. Two ponies died of exhaustion during a blizzard. Oates continued to question Scott’s planning.
In March of 1911, with expedition members camped on the ice in McMurdo Sound, a crew woke in the middle of the night to a loud cracking noise; they left their tents to discover they were stranded on a moving ice floe. Floating beside them on another floe were the ponies. The men hopped over to the animals and began moving them from floe to floe, trying to get them back to the Ross Ice Shelf to safety. It was slow work, as they often had to wait for another floe to drift close enough to make any progress at all. Then a pod of killer whales began circling the floe, poking their heads out of the water to see over the floe’s edge, their eyes trained on the ponies. As Henry Bowers described in his diary, “the huge black and yellow heads with sickening pig eyes only a few yards from us at times, and always around us, are among the most disconcerting recollections I have of that day. The immense fins were bad enough, but when they started a perpendicular dodge they were positively beastly.” Oates, Scott and others came to help, with Scott worried about losing his men, let alone his ponies. Soon, more than a dozen orcas were circling, spooking the ponies until they toppled into the water. Oates and Bowers tried to pull them to safety, but they proved too heavy. One pony survived by swimming to thicker ice. Bowers finished off the rest with a pick axe so the orcas at least wouldn’t eat them alive. “These incidents were too terrible,” Scott wrote. Worse was to come. In November 1911, Oates left Cape Evans with 14 other men, including Scott, for the South Pole. The depots had been stocked with food and supplies along the route. “Scott’s ignorance about marching with animals is colossal,” Oates would write. “Myself, I dislike Scott intensely and would chuck the whole thing if it were not that we are a British expedition.… He is not straight, it is himself first, the rest nowhere.” Scott’s party at the South Pole, from left to right: Wilson, Bowers, Evans, Scott and Oates.
Photo: Wikimedia Commons Unlike Scott, Amundsen paid attention to every detail, from the proper feeding of both dogs and men to the packing and unpacking of the loads they would carry, to the most efficient ski equipment for various mixtures of snow and ice. His team traveled twice as fast as Scott’s, which had resorted to manhauling their sledges. When Scott and his final group of Oates, Bowers, Edward Wilson and Edgar Evans reached the South Pole on January 17, 1912, they saw a black flag whipping in the wind. “The worst has happened,” Scott wrote. Amundsen had beaten them by more than a month. “The POLE,” Scott wrote. “Yes, but under very different circumstances from those expected. We have had a horrible day—add to our disappointment a head wind 4 to 5, with a temperature -22 degrees, and companions laboring on with cold feet and hands.… Great God! This is an awful place and terrible enough for us to have labored to it without the reward of priority.” The return to Cape Evans was sure to be “dreadfully long and monotonous,” Scott wrote. It wasn’t monotonous. Edgar Evans took a fall on February 4th and became “dull and incapable,” according to Scott; he died two weeks later after another fall near the Beardmore Glacier. The four survivors were suffering from frostbite and malnutrition, but seemingly constant blizzards, temperatures of 40 degrees below zero and snowblindness limited their progress back to camp. Oates, in particular, was suffering. His old war wound now practically crippled him, and his feet were “probably gangrene,” according to Ross D.E. MacPhee’s Race to the End: Amundsen, Scott and the Attainment of the South Pole. Oates asked Scott, Bowers and Wilson to go on without him, but the men refused. Trapped in their tent during a blizzard on March 16th or 17th (Scott’s journal no longer recorded dates), with food and supplies nearly gone, Oates stood up. “I am just going outside and may be some time,” he said—his last ten words.
The others knew he was going to sacrifice himself to increase their odds of returning safely, and they tried to dissuade him. But Oates didn’t even bother to put his boots on before disappearing into the storm. He was 31. “It was the act of a brave man and an English gentleman,” Scott wrote. John Charles Dollman's A Very Gallant Gentleman, 1913. Photo: Wikipedia Two weeks later, Scott himself was the last to go. “Had we lived,” Scott wrote in one of his last diary entries, “I should have had a tale to tell of the hardihood, endurance and courage of my companions which would have stirred the heart of every Englishman. These rough notes and our dead bodies must tell the tale.” Roald Amundsen was already telling his tale, one of triumph and a relatively easy journey to and from the South Pole. Having sailed the Fram into Tasmania earlier in March, he knew nothing of Scott’s ordeal—only that there had been no sign of the Brits at the pole when the Norwegians arrived. Not until October 1912 did the weather improve enough for a relief expedition from Terra Nova to head out in search of Scott and his men. The next month they came upon Scott’s last camp and cleared the snow from the tent. Inside, they discovered the three dead men in their sleeping bags. Oates’s body was never found.

Sources

Books:
Ross D.E. MacPhee, Race to the End: Amundsen, Scott and the Attainment of the South Pole, American Museum of Natural History and Sterling Publishing Co., Inc., 2010.
Robert Falcon Scott, Scott’s Last Expedition: The Journals, Carroll & Graf Publishers, Inc., 1996.
David Crane, Scott of the Antarctic: A Biography, Vintage Books, 2005.
Roland Huntford, Scott & Amundsen: The Race to the South Pole, Putnam, 1980.

Gilbert King is a contributing writer in history for Smithsonian.com. His book Devil in the Grove: Thurgood Marshall, the Groveland Boys, and the Dawn of a New America won the Pulitzer Prize in 2013.
https://www.smithsonianmag.com/history/salk-sabin-and-the-race-against-polio-169813703/
Salk, Sabin and the Race Against Polio
They were two young Jewish men who grew up just a few years apart in the New York area during the Great Depression, and though they were both drawn to the study of medicine and did not know each other at the time, their names would, years later, be linked in a heroic struggle that played out on the front pages of newspapers around the world. In the end, both Albert Sabin and Jonas Salk could rightfully claim credit for one of humanity’s greatest accomplishments—the near-eradication of polio in the 20th century. And yet debate still echoes over whose method is best suited for the mass vaccination needed to finish the job: Salk’s injected, dead-virus vaccine or Sabin’s oral, live-virus version. Jonas Salk at the University of Pittsburgh. Photo: Wikimedia Commons In the first half of the 20th century, Americans lived in fear of the incurable paralytic poliomyelitis (polio) disease, which they barely understood and did not know how to contain. That the disease led to some kind of infection in the central nervous system that crippled so many children, and even a president (Franklin D. Roosevelt), was alarming enough. But the psychological trauma that followed a neighborhood outbreak resonated. Under the mistaken belief that poor sanitary conditions during the “polio season” of summer increased exposure to the virus, people resorted to measures that had been used to combat the spread of influenza or the plague. Areas were quarantined, schools and movie theaters were closed, windows were sealed shut in the heat of summer, public swimming pools were abandoned, and draft inductions were suspended. Worse, many hospitals refused to admit patients who were believed to have contracted polio, and the afflicted were forced to rely on home care by doctors and nurses who could do little more than fit children for braces and crutches.
In its early stages, polio paralyzed some patients’ chest muscles; if they were fortunate, they would be placed in an “iron lung,” a tank respirator with vacuum pumps pressurized to pull air in and out of the lungs. The iron lungs saved lives, but became an intimidating visual reminder of polio’s often devastating effects. Parents carry a stricken child during the polio scare. Photo: Wikipedia By the early 1950s, 25,000 to 50,000 people were becoming infected each year, and 3,000 died from polio in 1952. Parents and children lived in fear that they would be next. The public had been clamoring for some kind of relief as the media reported word of possible vaccines in development.  Government as well as corporate and private money flowed into research institutes, led by the National Foundation for Infantile Paralysis (which later became the March of Dimes, for its annual fund-raising campaigns). At the same time, the two New Yorkers, Salk and Sabin, now living in Pittsburgh and Cincinnati, respectively, raced against the clock, and each other, to cure the dreaded disease. Jonas Edward Salk was born in 1914, the son of Ashkenazi Jewish Russian parents who had immigrated to East Harlem. A gifted student, Salk enrolled at the New York University School of Medicine, but showed little interest in practicing. He was inspired by the intellectual challenges of medical research, particularly his study of  the influenza epidemic that claimed the lives of millions after World War I. With his mentor, Thomas Francis Jr., he worked to develop an influenza vaccine. Salk had an opportunity to pursue a PhD in biochemistry, but he did not want to leave medicine. 
“I believe that this is all linked to my original ambition, or desire,” he later said, “which was to be of some help to humankind, so to speak, in a larger sense than just a one-to-one basis.” During World War II, Salk began postgraduate work in virology, and in 1947 he began studying infantile paralysis at the University of Pittsburgh Medical School. It was there that he devoted his research to developing a vaccine against polio, concentrating not on the live vaccines that other researchers had been experimenting with (at great peril; one test killed six children and crippled three more), but with a “killed virus” that Salk believed would be safer. Dr. Albert Sabin. Photo: Wikimedia Commons Albert Bruce Sabin was born to Jewish parents in Poland in 1906 and came to the United States in 1921 when his family, fleeing religious persecution, settled in Paterson, New Jersey. Like Salk, Sabin attended medical school at New York University, and after graduating in 1931, he began research on the causes of polio. After a research stint at the Rockefeller Institute, Sabin left New York for the Children’s Hospital Research Foundation in Cincinnati, where he discovered that the polio virus lived and multiplied in the small intestine. An oral vaccine, he believed, might block the virus from entering the bloodstream, destroying it before it spread. Salk cultivated polio viruses on cultures of monkey kidney cells, killed the viruses with formaldehyde, then injected the killed virus into monkeys. The experiments worked. The next step was to test the vaccine on humans, but many wondered who would volunteer to be injected with the polio virus, killed or not. Salk provided the answer: He injected himself and his wife and children—the first humans to be inoculated. In 1954, a large-scale field trial was arranged, with the support of major pharmaceutical companies, and nearly two million schoolchildren between the ages of 6 and 9 participated in the study. 
One half received the vaccine, the other half a placebo. Then everyone waited. In Cincinnati, Sabin and his research associates swallowed live avirulent viruses and continued to perform trials on prisoners at a federal prison in Chillicothe, Ohio, where volunteer inmates were paid $25 and promised “some days off” their sentences. All thirty prisoners developed antibodies to the virus strains with none taking ill, and the trials were deemed successful. Sabin wanted to do even larger studies, but the United States would not permit it, so he tested his vaccine in Russia, East Germany and some smaller Soviet Bloc countries. Newspaper Headlines on April 13, 1955. Photo: March of Dimes On April 12, 1955, Dr. Thomas Francis Jr., who monitored the Salk trials, called a press conference at the University of Michigan. The conference was broadcast to 54,000 physicians who gathered in movie theaters; millions of Americans tuned in by radio. After Francis declared Salk’s vaccine to be “safe and effective,” church bells rang out and tearful families embraced. The polio panic would soon be over, as pharmaceutical companies rushed to create hundreds of millions of doses of the new vaccine. Sabin’s European trials were also deemed highly successful, and in 1957, his oral vaccine was tested in the United States. In 1963, it became the standard vaccine, and the one used in the effort to eradicate polio around the world. There has always been, with Sabin’s vaccine, a slight chance that the polio virus could mutate back into a dangerous virus—a risk the United States deemed unacceptable. A federal advisory panel recommended Salk’s killed-virus vaccine for use in Americans. Shopkeeper expresses gratitude in April, 1955.
Photo: Wikipedia Over the years, polio was found to be a highly contagious disease that spread, not in movie theaters or swimming pools, but from contact with water or food contaminated from the stool of an infected person, and yet polio panic was a source of anxiety among Americans surpassed only by fear of atomic attack. Although Jonas Salk is credited with ending the scourge of polio because his killed-virus vaccine was first to market, Albert Sabin’s sweet-tasting and inexpensive oral vaccine continues to prevent the spread of poliomyelitis in nearly every corner of the world.

Sources

Books:
David M. Oshinsky, Polio: An American Story, Oxford University Press, 2005.
Jeffrey Kluger, Splendid Solution: Jonas Salk and the Conquest of Polio, Berkley Trade, 2006.

Articles:
“Jonas Salk and Albert Bruce Sabin,” Chemical Heritage Foundation, www.Chemheritage.org.
“Conquering Polio,” by Jeffrey Kluger, Smithsonian magazine, April 2005. http://www.smithsonianmag.com/science-nature/polio.html
“Fear of Polio in the 1950s,” by Beth Sokol, University of Maryland, Honors Project, http://universityhonors.umd.edu/HONR269J/projects/sokol.html.
“Jonas Salk, M.D., The Calling to Find a Cure,” Academy of Achievement: A Museum of Living History. http://www.achievement.org/autodoc/page/sal0bio-1.

Gilbert King is a contributing writer in history for Smithsonian.com. His book Devil in the Grove: Thurgood Marshall, the Groveland Boys, and the Dawn of a New America won the Pulitzer Prize in 2013.
https://www.smithsonianmag.com/history/samarra-rises-131464352/
Samarra Rises
I'm standing on a street corner in the center of Samarra—a strife-scarred Sunni city of 120,000 people on the Tigris River in Iraq—surrounded by a squad of American troops. The crackle of two-way radios and boots crunching shards of glass are the only sounds in this deserted neighborhood, once the center of public life, now a rubble-filled wasteland. I pass the ruins of police headquarters, blown up by an Al Qaeda in Iraq suicide truck bomber in May 2007, and enter a corridor lined by eight-foot-high slabs of concrete—"Texas barriers" or "T-walls," in U.S. military parlance. A heavily guarded checkpoint controls access to the most sensitive edifice in the country: the Askariya Shrine, or Mosque of the Golden Dome, one of the holiest sites in Shia Islam. Here, in February 2006, Al Qaeda militants blew up the delicate gold-tile dome atop the thousand-year-old Shiite shrine, igniting a spasm of sectarian killing that brought the country to the edge of civil war. For the past year and a half, a committee led by Iraqi Prime Minister Nuri al-Maliki has been working with United Nations consultants to clear debris from the site and to begin rebuilding the Golden Dome—a $16 million project that aims to restore the shrine sufficiently to receive Shiite pilgrims by this summer. I've been trying for three days to get close to the shrine, stymied by an order from al-Maliki's office barring journalists from the site—an indication of how sensitive the bombing remains in this country. U.S. military officers in Samarra have pulled strings on my behalf with the mayor, Iraqi police officials and the Ministry of Planning in Baghdad. This time, after I reach the checkpoint, a friendly commander of the Askariya Brigade, a predominantly Shiite police force dispatched from Baghdad last year to guard the site, makes a call to his superiors in the Iraqi capital, then escorts me through. As I approach the shrine in the 120-degree heat, I take in evidence of battles between U.S. 
troops and Al Qaeda that ripped Samarra apart for five years, making it, according to one U.S. general, "the most destroyed city in Iraq." I pass a bullet-pocked hotel, shuttered trinket and mobile-phone shops, and a closed madrassah, or Islamic school. Heaps of debris have been neatly laid along both sides of the road. The stump of the once-glorious dome is now covered with wooden scaffolding. A few golden tiles still cling to jagged remnants of the bruised and broken structure. Near the main gate of the Askariya Shrine, I see the first sign of activity in an otherwise moribund landscape: a bulldozer, laden with fragments of the dome, rumbles through the portal toward a dumping ground nearby. A dozen laborers bustle about the courtyard, which is filled with broken pillars and chunks of concrete bristling with exposed rebar. The whine of a pneumatic drill and the rhythmic pounding of a hammer resound from inside the shrine. "We have 120 workers on the site, working day and night, in two 12-hour shifts," Haidar al-Yacoubi tells me. A Shiite from Baghdad who has served as a technical adviser to the project since April, he adds: "Al Hamdulillah [praise God], the dome will rise again." For nearly 11 centuries, the Askariya Shrine has been revered by Shiite Muslims as a symbol of sacrifice and martyrdom. The original building was constructed in A.D. 944, as the final resting place for Ali al-Hadi and his son, Hassan al-Askari, Shiite imams who had lived under house arrest—and were allegedly poisoned—at the military camp of the Sunni caliph al-Mu'tasim, when Samarra was the capital of the Islamic world. In 1905, the 150-foot dome, covered in 72,000 gold tiles and surrounded by pale-blue walls, was built above the shrine, signifying its importance; many of the faithful regard only the mosques of Najaf and Karbala as holier. 
Enhancing the sanctity of the compound is the adjacent Blue Mosque, built over a sardhab, or cellar, where Muhammad al-Mahdi, the Twelfth or Hidden Imam, withdrew and then disappeared in the ninth century. Shiites believe that al-Mahdi will one day rise from his "crypt" below the mosque, ushering in man's redemption and the end of the world. For many Shiites, something close to the end of the world occurred on the morning of February 22, 2006, after eight Al Qaeda terrorists disguised in Iraqi military uniforms entered the shrine, overpowered guards, fixed explosives to the golden dome and blew it to pieces. The attack was a key part of Al Qaeda's strategy to foment civil war between Shiite and Sunni Muslims in Iraq, thereby sowing chaos, driving out occupying U.S. forces and turning the country into a fundamentalist caliphate. No one was killed in the attack, but within hours, as Al Qaeda's leadership had hoped, the violent spiral began: Shiite militants set fire to at least two dozen Sunni mosques in Baghdad and killed three imams. Sunnis retaliated by killing Shiites. Soon Baghdad—and much of the rest of Iraq—was caught in a vicious cycle of car bombings, kidnappings, murders and ethnic cleansing. By the end of that year, more than 10,000 people had died across the country. Samarra, meanwhile, sank deeper into destitution and despair, neglected by the Shiite-dominated government, avoided by contractors, and fought over by U.S. forces and a range of insurgent groups. "The city was dead," Mahmoud al-Bazzi, mayor of Samarra, tells me. Today, however, after thousands of former Sunni insurgents came over to the American side; the "surge" of 30,000 U.S. troops ordered by President George W. Bush in early 2007 increased security; and a wave of successful U.S. and Iraqi strikes against Al Qaeda in Iraq put the terrorists on the defensive, the worst of Iraq's violence appears to be over. In Samarra, markets have come back to life and playgrounds are filled with children. 
And the very symbol of the country's descent into sectarian carnage—the Askariya Shrine—has brought together Sunnis and Shiites in a rebuilding effort. The endeavor, city officials and U.S. soldiers alike hope, will bring back hundreds of thousands of Shiite pilgrims from Iran, the Gulf States and beyond; restore Samarra's economic fortunes; and narrow Iraq's sectarian rift. "Rebuilding a Shia mosque in the heartland of the Sunni insurgency would have been unthinkable" less than a year ago, says Lt. Col. J. P. McGee, commander of the Second Battalion, 327th Infantry, based in Samarra since October 2007. "That's a powerful symbol of how Iraq has changed." But peace in Samarra, as in the rest of Iraq, remains fragile. The city has become, in effect, a giant prison, isolated by an encircling berm, and divided by mazes of T-walls and sandbagged checkpoints. Remnants of Al Qaeda lurk in the surrounding desert, still recruiting among Samarra's youth and waiting for opportunities to strike. Prime Minister al-Maliki, deeply suspicious of Sunni paramilitary units outside the jurisdiction of the Shiite-dominated government, has moved to take control of the former insurgents, known as the Sons of Iraq, and drastically reduce their numbers. The Sons of Iraq have asserted that if they don't receive jobs—either in the Iraqi security forces or in public works projects—they could take up arms again. Should that happen, the tenuous security in Samarra that has made the shrine project possible could collapse overnight. Moreover, the effort itself, although showcased by the government as a powerful example of reconciliation, has been mired in political gamesmanship and sectarian suspicion for the past year, and its success is by no means assured. I flew into Samarra by Black Hawk military helicopter from Baghdad on a steamy night early this past September, sweeping low over the Tigris River for much of the 70-mile, 45-minute journey. 
Although attacks against coalition forces have dropped dramatically, moving anywhere in the country remains risky: the next morning, I made the short journey from the airfield to the city in a vehicle called an MRAP (for mine-resistant ambush protected), a 38,000-pound armored behemoth with a 12-foot-high turret topped by a 50-caliber machine gun. The intimidating truck—also known as a Cayman—was introduced by the U.S. Army last February here in Salahuddin province to replace the Humvee, which is far more vulnerable to attacks by IEDs—improvised explosive devices. "The MRAPs have saved a lot of lives," a specialist riding in my Cayman told me. But they aren't foolproof: on July 9, 2008, Sgt. First Class Steven Chevalier—driving a Cayman through central Samarra—was killed by an RKG3 thermal grenade, a handheld canister filled with flammable pellets capable of penetrating armor. On August 15, a second RKG3 exploded inside another Cayman, critically burning four U.S. soldiers. We crossed the Tigris over a dam; just downstream, hundreds of Iraqis were trying to beat the oppressive heat by swimming off a sandy bank. Soon we arrived at Patrol Base Olson, a Saddam-era casino built along the river and cut off from the rest of the city by rows of T-walls. This heavily fortified compound is the home of the 150 soldiers of Charlie Company, which has led the fight against Al Qaeda in Samarra, recruited fighters from the Sons of Iraq and helped secure the area around the Askariya Shrine. We pulled into the compound in a cloud of dust, and I stepped from the vehicle into a parking lot littered with bullet casings and crushed, half-empty water bottles. Inside the former casino—now Charlie Company's weapons depot, cafeteria, Internet café and Tactical Operations Center (TOC)—I was welcomed by Capt. Joshua Kurtzman, 29, the company commander. 
An army officer's son and West Point graduate who crossed from Kuwait with the original invasion force, Kurtzman was now serving his third tour in Iraq. Sitting in his cluttered office at the TOC—one of the few corners of Patrol Base Olson with functioning air conditioning—Kurtzman recounted the marathon U.S. effort to bring Samarra under control during the past five years. U.S. forces arrived in the city in April 2003 and faced a growing insurgency within six months. A succession of U.S. offensives killed hundreds of militants and destroyed large parts of the city. But U.S. attempts to drive out the insurgents never succeeded. By late 2005, Al Qaeda controlled Samarra, with U.S. troops safe only inside Patrol Base Olson and a heavily fortified "Green Zone" adjacent to it. Kurtzman recalled the dark days of Al Qaeda's rule in the city: militants cruised the streets with antiaircraft machine guns mounted on white Toyota pickup trucks. Public executions were held in Samarra's main market. Contractors, shopkeepers, even Sunni imams, were forced to hand over salaries to the militants. Ninety percent of the 40 or so fuel trucks destined for Samarra every few days were hijacked by Al Qaeda, their contents sold on the black market for up to $50,000 per truckload. In June 2007, militants again infiltrated the Askariya Shrine and blew apart the minarets. A month earlier, a suicide truck bomber had attacked police headquarters, killing the commander and 11 of his troops, and driving the rest of the force—700 men—out of the city. "We were fighting daily with Al Qaeda," said Kurtzman. "We had nine IEDs in a three-hour period on [one road through town]. Every patrol we went on, we were in a firefight or were encountering IEDs." Then, in December 2007, the Iraqi government and its U.S. allies began to take back the city. The troops raised watchtowers and secured a berm that had been built around the city in 2005. 
A few months earlier, the Iraqi government had begun dispatching a national police brigade—4,000 strong—made up of both Sunnis and Shiites, along with a Kurdish battalion of the Iraqi Army. U.S. troops entered negotiations with Sunni insurgents, who had become fed up with Al Qaeda's tactics—including setting off car bombs inside Samarra. "Al Qaeda wanted to fight everybody," Abu Mohammed, leader of the Sons of Iraq in Samarra, told me. "They killed a lot of innocent people, from all levels of society." A deal was signed last February, and 2,000 Sunni fighters—many of whom had spent years arming IEDs to kill American troops—were given one to three days of weapons training. The Sons of Iraq manned checkpoints and began feeding their new U.S. allies intelligence. "They'd say, 'My brother, who lives in this neighborhood, told me there's a cache here and there are six guys guarding it,'" Kurtzman recounted. U.S. and Iraqi forces conducted pinpoint raids, engaged Al Qaeda in firefights and, in time, drove its members out of Samarra. In an innovation first tried in Anbar province, U.S. troops also undertook a census of Samarra, registering every adult male in the city, scanning irises and taking fingerprints. According to U.S. Army data, hostile actions against American troops dropped from 313 in July 2007 to 5 in October 2008. "I sit here now and say, 'Man, I wish we'd thought of this two years ago,'" says Capt. Nathan Adams, who was also based in Samarra in 2005. "But we were not ready then, and the Iraqi [insurgents] were not either. They needed to fight the superpower, to save face, then negotiate back to the middle ground." After six months of cooperation, "Al Qaeda's cells are dormant," Kurtzman told me. "They are hiding out in the middle of the desert, just trying to survive." One evening I toured Samarra with Kurtzman and a platoon of soldiers from Charlie Company. 
We climbed into three Caymans and rumbled into the moonless night; the delicate turquoise dome of the Blue Mosque, bathed in fluorescent light, loomed just beyond the patrol base. It was the first week of Ramadan, and the streets were nearly deserted; most people were still at home for iftar, the feast at sundown that breaks the dawn-to-dusk fast. Only a few groceries, textile shops and restaurants were open, lit by small generators. Samarra's sporadic electricity was out again—no surprise in a city with few functioning services. "The Iraqi provincial government put half a million dollars into a water treatment plant, but there's no chlorine, so you might as well be drinking the Tigris with a straw," Kurtzman told me. We dismounted and walked up the road to the main Sunni mosque in Qadisiya, an affluent quarter dominated during Saddam's time by high-level Baathists and army officers. Just a few months ago, Kurtzman said, troops returning to base from firefights with the militants would hear the muezzin call for jihad against America. But the main council of Sunni mosques in Iraq fired the imam last winter, and the radical messages stopped. "Six months ago, I would not have been standing right here," says Kurtzman. "I'd have been shot at." A crowd of children from an adjacent playground—a provincial government project completed a month ago—gathered around the platoon, along with a few adults. Kurtzman chatted them up, his interpreter by his side. "It's good to see everybody outside tonight." The kids clustered excitedly, trying out a few words of English, hoping for a pen or another small gift. "This must be the hottest place on earth right now," Kurtzman said. "The weather in Saudi Arabia is 105. It's 120 degrees here." The men murmured their assent. "So how much power are you getting here? Two hours on, five hours off?" "Maybe a couple of hours during the day, a couple of hours at night. That's all." 
A Sons of Iraq member stepped forward and began complaining about his employment prospects. I had been told that under intense pressure from the Iraqi government, the U.S. Army had dropped 200 Sunni fighters from its payroll in just the past month and would have to lay off another thousand in the months to come. In addition, salaries, now at $300 a month, were being renegotiated and could drop by a third. "There's a lot of anxiety out there," Kurtzman told me, as we climbed back into the Cayman. From its earliest days, the effort to rebuild the Askariya Shrine has been beset by the violence and sectarian tensions that tormented so much of Iraq. Immediately after the bombing, then-Prime Minister Ibrahim al-Jaafari, a Shiite, called for United Nations help in restoring it. A few weeks later, Unesco representatives in Paris and Amman, Jordan, agreed to underwrite an Iraqi proposal to train Iraqi technicians and architects, and help rebuild not only the shrine, but Sunni mosques and churches across Iraq. In April 2006, a team from the Iraqi Ministry of Planning set out for Samarra by road for the first on-site assessment. The trip was aborted, however, after word reached the team that an ambush was planned by Al Qaeda. For months afterward, "We searched for international experts to go there, but the reaction was, 'No way,'" Mohamed Djelid, director of Unesco in Iraq, told me. In June 2007, Unesco awarded a contract to Yuklem, a Turkish construction company, to conduct a feasibility study and make initial preparations—cleaning and production of architectural drawings—for the dome's reconstruction. "They sent one expert to Samarra, two times," Djelid said. Then came the destruction of the minarets in June 2007, which frightened off the Turks and made even some Unesco officials skittish about staying involved. "I myself was hesitating about whether Unesco should put our experts in this kind of situation," Djelid said. 
"But if we stopped, we were concerned about the consequences. What kind of message would that send?" Late that year came another setback: Turkish troops began pushing into Kurdish Iraq in pursuit of PKK Kurdish separatist guerrillas. In the face of an anti-Turkish backlash in Iraq, Yuklem became even more reluctant to send its technicians to Samarra. But in December 2007, a small team of Unesco experts from across the Muslim world—Egyptians, Turks and Iranians—arrived in Samarra and set up an office near the Askariya Shrine. "The shrine was a mess, it was catastrophic, it was clear it was going to be a big challenge," said Djelid. Then the contract with the Turkish company, which had failed to begin work on the risky mission, was canceled. Al-Maliki appointed a task force to take control of the feasibility study, clear the site, and stabilize and protect what remained of the Golden Dome. But while the reconstruction project has been gaining momentum, it still remains enmeshed in sectarian politics. Some Sunnis in Samarra believe that al-Maliki's committee is acting as a front for Tehran, and that the presence of Iranians on the Unesco team is part of a plot to impose Shiite dominance in a Sunni city. "The Iranians have taken over this project," charges Suhail Najm Abed, a local Unesco consultant. "We threw out Al Qaeda, but we are bringing in another Hezbollah," referring to the Lebanese Shiite guerrilla group funded by Iran. For his part, Djelid defends using Iranian engineers: "[They] have a lot of expertise," he says. "When we discuss it with the population of Samarra, most tell us, 'If the Iranians are coming under the umbrella of Unesco, we have no problem.'" Meanwhile, Unesco has been engaged in a debate with the Iraqi government about whether to rebuild the dome with modern materials or to remain faithful to the original construction, which could prolong the project by years. No one can predict with certainty when the dome will rise again. 
Unesco says that it expects only clean-up efforts and surveying to be completed by this summer. On my last evening in Samarra, Kurtzman took me to meet Abu Mohammed, a former insurgent commander turned Sons of Iraq leader. As the muezzin from an adjacent mosque was blaring the post-iftar call to prayer, we pulled up in three Caymans to a handsome villa in Qadisiya. Abu Mohammed—an imposing and lean-faced man in his early 50s, clad in a white dishdasha, or traditional robe—greeted us in his courtyard and motioned for us to sit on plastic chairs arranged in a circle. Half a dozen other members of the Sons of Iraq welcomed us, including Abu Farouk, a hawk-nosed chain smoker and former tank driver in the Iran-Iraq war. Kurtzman had told me earlier that Abu Mohammed had led mortar teams against U.S. troops at the height of the Iraq insurgency, drawing on his experience as a rocket battalion commander in the Iraqi Army under Saddam. "In every country being occupied, there will be resistance," the former insurgent now began, balancing his 5-year-old son, Omar, in his lap. "And this is the legal right for any nation." Abu Mohammed told me that his Sunni fighters had joined forces with the Americans last February only after their overtures to the Iraqi government had been rebuffed. "The U.S. was our last option," he acknowledged. "When the Americans came to this city, we didn't have a shared enemy. But now we have an enemy which both sides want to fight." The cooperation had been fruitful, Abu Mohammed said, yet he was concerned about the future. Al-Maliki's Shiite-dominated government was about to take control of the 53,000 Sunni fighters in Baghdad, and would soon turn its attention to Anbar and Salahuddin provinces. Despite talk of integrating the Sons of Iraq into the Iraqi security forces, he said, "we've tried to get the government to hire some of our fighters as policemen. But until now we didn't see a single person hired." 
Kurtzman confirmed that even though Samarra's police force is woefully understrength, the Iraqi government was dragging its feet in hiring. "A Shia-dominated central government in a city that blew up one of the holiest shrines in the Shia world has a lot of bitterness against the people [of Samarra]," Kurtzman said. "That's why, in nine months, you haven't gotten police hired from here." Abu Mohammed insisted that his men were committed to peace, that rebuilding the shrine would benefit everyone in Samarra. But stability, he said, depended on jobs for the Sons of Iraq, and "we don't trust the Iraqi government." Back at the Askariya Shrine, Haidar al-Yacoubi, the Shiite from Baghdad who serves as a technical adviser to the reconstruction project, gestured proudly at the workers sorting rubble in the courtyard. The integration of Shiites and Sunnis at the site, he said, would send a message to the world. "We don't make the Sunni-Shia difference important here," al-Yacoubi said, as we watched a Caterpillar bulldozer push debris through the mosaic-inlaid main gate. "Iraq is a kind of rainbow, so when we rebuild this mosque, we try to pick from each [group]." It remains to be seen, of course, whether such generous sentiments can be sustained—not only at the Mosque of the Golden Dome, but in Samarra and the rest of Iraq. Freelance writer Joshua Hammer is based in Berlin. Photographer Max Becherer lives in Cairo. Joshua Hammer is a contributing writer to Smithsonian magazine and the author of several books, including The Bad-Ass Librarians of Timbuktu: And Their Race to Save the World's Most Precious Manuscripts and The Falcon Thief: A True Tale of Adventure, Treachery, and the Hunt for the Perfect Bird.
https://www.smithsonianmag.com/history/santa-claus-builds-a-flying-machine-169033044/?no-ist
Santa Claus Builds A Flying Machine
Santa Claus Builds A Flying Machine Postcard showing “Santa Claus of the Future” from 1908 (Source: Novak Archive) Some people are up in arms over a recent update to Santa Claus that excised his smoking habit. However you feel about Santa losing his pipe, let me assure you that this won’t be the last time that Santa gets a makeover. It’s easy for some people to forget that every generation has “updated” Santa to fit with the times — or in some cases to fit with the future. As the 1800s gave way to the 1900s, many Americans felt like perhaps Santa Claus needed a new way of getting from house to house. Since the early 19th century, old Saint Nick had been using a sleigh and reindeer to deliver his presents. But by the 1890s some Americans thought an automobile would be a more modern form of transportation for the jolly old man. However, some illustrators didn’t think that the automobile was quite modern enough and wanted to blast Santa into the future with his very own flying machine. The postcard above (sent in 1908) shows Santa smoking his pipe in his flying machine and dropping a doll down some lucky kid’s chimney. A boy dreams of the radio parts Santa will bring him in his flying machine in the Dec 1922 issue of Science and Invention (Source: Novak Archive) The December 1922 issue of Science and Invention magazine included a list of the best radio parts to buy your little “radio bug.” The list included an illustration of a young boy dreaming about Santa Claus soaring through the sky in his flying machine. That large aerial sitting behind Santa lets us know that he’s definitely hip to the latest technology of the Roaring Twenties. Santa’s flying machine in the Dec 22, 1900 Duluth Evening Herald (Source: Minnesota Historical Society microfilm archive) The December 22, 1900 issue of the Duluth Evening Herald in Duluth, Minnesota ran a page claiming that Santa’s reindeer would be put out of work soon as he skims over the tops of houses in his flying machine. 
Santa of the future in yet another flying machine (Dec 21, 1900 Carbondale Press) The December 21, 1900, edition of the Carbondale Press in Carbondale, Illinois, included the illustration above — “The Twentieth Century Santa Claus.” Just as there were debates at the turn of the 21st century over whether to celebrate the year 2000 or 2001 as the beginning of the century, so too did people argue over the start of the 20th. Unlike the 21st century, however — where 2000 pretty much won out for those impatient yet Y2K-compliant souls — it was generally accepted that the year 1901 would be the proper time to celebrate the beginning of the 20th century. Santa Claus “up to date” in the December 24, 1901 Cedar Rapids Evening Gazette This illustration of Santa “up to date” comes from the December 24, 1901 Cedar Rapids Evening Gazette in Cedar Rapids, Iowa. This may be the most modern of them all because if you look carefully you’ll see that Santa Claus patented his flying invention. I guess he didn’t want the Easter Bunny biting his style. Santa’s flying machine from the December 19, 1897 issue of the Galveston Daily News The December 19, 1897, issue of the Galveston Daily News in Galveston, Texas ran a poem by Earle Hooker Eaton titled “The Song of Santa Claus.” The poem speaks of Kris Kringle’s new flying machine and how neglected the poor reindeer are. Here’s hoping their “pitiful fate” was simply being put out to pasture rather than meeting some grisly demise at the hands (or hooves) of modernity. With a whirr of my wings I’m away on the wind, Heigh-ho! Heigh-ho! Like a bird in the sky, And my home at the Pole soon is left far behind, Heigh-ho! Heigh-ho! But it’s cold up so high! I’ve a packet of trinkets and candy and toys, To slip in the stockings of misses and boys, Till heart after heart is a storehouse of joys, Heigh-ho! Heigh-ho! How delightful to fly! Every whir of my wings speeds me swift on my way Heigh-ho! Heigh-ho! What a wonderful gait! 
For the horse and the reindeer have both had their day, Heigh-ho! Heigh-ho! What a pitiful fate! Poor Dasher and Dancer no longer are seen, And Donder and Blitzen with envy are green, Kris Kringle now travels by flying machine, Heigh-ho! Heigh-ho! But I’m right up to date! Do you have a favorite vision of futuristic Santa Claus? How do you suppose Santa will get around in the year 2100? Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
https://www.smithsonianmag.com/history/sarah-winnemucca-devoted-life-protecting-lives-native-americans-face-expanding-united-states-180959930/
Sarah Winnemucca Devoted Her Life to Protecting Native Americans in the Face of an Expanding United States
Sarah Winnemucca Devoted Her Life to Protecting Native Americans in the Face of an Expanding United States For the first few years of her life, Sarah Winnemucca, who was born around 1844, did not know that she was American. Born Thocmetony (Shell Flower) among the Numa (known among whites as the Northern Paiute or “digger” Indians), she roamed with her people over western Nevada and eastern Oregon, gathering plants and fish from local lakes. But even during her early years, Winnemucca had learned to be afraid of the men with “white” (blue) eyes, who looked like owls because of their beards. For Winnemucca, being “American” was a complicated process of adopting the behaviors and language of people she had reason to distrust. Translating between the two cultures became her life’s work. And though she was uncomfortable with the role, her influence is still felt today: Winnemucca’s autobiography, Life Among the Paiutes, the first English narrative by a Native American woman, voices a thoughtful critique of Anglo-American culture while recounting the fraught legacy of federal lands, including Nevada’s Pyramid Lake and Oregon’s Malheur region, recently the site of a militia takeover. (The 19th-century Malheur Indian reservation lies immediately north of the current wetlands). As Winnemucca grew up, she came to understand that the settlers were not leaving and she began adopting Anglo-American habits, acquiring the Christian name Sarah and mastering English and Spanish. At her grandfather’s request, she and her sister went to a convent school in San Jose, California, but they were only there a few weeks when “complaints were made to the sisters by wealthy parents about Indians being in school with their children.” For most of her life, she sought to straddle American and Native cultures to help the Northern Paiutes. In 1859, land was set aside near Pyramid Lake for a reservation. 
Winnemucca and her family were expected to abandon their nomadic life for a settled, “American” lifestyle—and make a success of farming in a dry, arid landscape without any training. Many Paiutes died of starvation at Pyramid Lake. They were only given supplies the first year, with government agents pocketing the money intended for them for the following 22 years (a practice common on many reservations). After the first disastrous winter there, Winnemucca was driven to action, begging military leaders at Nevada’s Camp McDermit for help. Wagonloads of supplies were finally sent to the reservation. Winnemucca was hired as a military interpreter and her father and their band moved to the military camp. Translating was a means for Winnemucca to get better treatment for her people, but she was often in an untenable position. In the mid 1870s, she had to translate for agent William V. Rinehart, whom she found to be a hard, unlikeable man. If she translated Rinehart’s words without comment, she failed to protect her people; but if she tried to convey grievances from the Northern Paiutes, she might be (and was) fired from her position. Rinehart eventually banned her from Malheur. Winnemucca fared better in the military camps, where her knowledge of Paiute life garnered some respect. In 1878, she worked as a messenger, scout and interpreter for General O. O. Howard during the Bannock War, a skirmish between the U.S. military and the Bannock Indians. “This was the hardest work I ever did for the government in all my life … having been in the saddle night and day; distance, about two hundred and twenty-three miles. Yes, I went for the government when the officers could not get an Indian man or a white man to go for love or money. I, only an Indian woman, went and saved my father and his people,” she later wrote. Her courageous actions landed her on the front page of The New York Times in June 1878, but sowed mistrust between her and local tribes. 
This autobiographical work was written by one of the country's most well-known Native American women, Sarah Winnemucca. She was a Paiute princess and a major figure in the history of Nevada; her tribe still resides primarily in the state. The Bannock War ended badly for the Paiutes, who were mostly innocent bystanders. In 1879, military leaders forced the Paiutes at Camp McDermit to march more than 350 miles in winter to the Yakama reservation in Washington territory.  Winnemucca was devastated; she had promised the Paiutes they would be all right if they followed military orders. In Yakama she worked as an interpreter. She argued with the reservation agent, wrote letters to military and government leaders, and in the winter of 1880, accompanied her father and other Paiute leaders to Washington, D.C., to meet with the secretary of the interior, Charles Schurz. They succeeded in obtaining a letter allowing the Paiutes to return to Malheur, but the Yakama agent refused to let them leave. Several of the Paiutes accused Winnemucca of betraying them for money. She showed them Schurz’s letter and said, “I have said everything I could in your behalf ... I have suffered everything but death to come here with this paper. I don’t know whether it speaks truth or not. You can say what you like about me. You have a right to say I have sold you. It looks so. I have told you many things which are not my own words, but the words of the agents and the soldiers … I have never told you my own words; they were the words of the white people, not mine.” Winnemucca escalated her fight for reform. When face-to-face petitions and letters failed to improve conditions for the Paiutes, she began lecturing in San Francisco, dramatizing the plight of reservation Indians. These performances offered a carefully curated version of the “Indian princess” to various white crowds, and she often wore native dress. 
She told a reporter, “I would be the first Indian woman who ever spoke before white people, and they don’t know what the Indians have got to stand sometimes.” She described the abuses of reservation agents, particularly Rinehart. But her voice came at high cost: Rinehart responded by calling Winnemucca—in public and in letters to the Office of Indian Affairs—a drunk, a gambler and a whore. Winnemucca became famous. In 1883, sisters Elizabeth Palmer Peabody and Mary Peabody Mann, important educators, intellectuals and members of the Transcendentalist movement, invited her to lecture in New England. The Peabody sisters also arranged for the publication of Life Among the Paiutes later that year. In all, Winnemucca spoke nearly 300 times throughout New England, meeting John Greenleaf Whittier, Ralph Waldo Emerson, Supreme Court Justice Oliver Wendell Holmes, and Senator Henry Dawes, among others. “The lecture was unlike anything ever before heard in the civilized world—eloquent, pathetic, tragical at times; at others [her] quaint anecdotes, sarcasms, and wonderful mimicry surprised the audience again and again into bursts of laughter and rounds of applause,” wrote a reporter from The Daily Silver State in 1879. But despite her successful speaking, Sarah was not always as conformable as her audiences would like, and her writing about Americans often criticized their hypocrisy and challenged popular narratives about pioneers. Of the infamous Donner Party, who showed up when she was five, Winnemucca wrote, “Well, while we were in the mountains hiding, the people that my grandfather called our white brothers came along to where our winter supplies were. They set everything we had left on fire. It was a fearful sight. 
It was all we had for the winter, and it was all burned during that night.” Even more cutting, she reflected in her autobiography, “Since the war of 1860 there have been one hundred and three (103) of my people murdered, and our reservation taken from us; and yet we, who are called blood-seeking savages, are keeping our promises to the government. Oh, my dear good Christian people, how long are you going to stand by and see us suffer at your hands?” After the mid-1880s she abandoned lecturing, exhausted and disillusioned. In 1885 she told The Daily Silver State that she had fought “agents for the general good of [her] race, but as recent events have shown that they are not disposed to stand by me in the fight, I shall relinquish it.” She worked in both worlds, but was at home, ultimately, in neither. She once told an interviewer, “I would rather be with my people, but not to live with them as they live.” She turned her energies instead toward a school for Paiute children, teaching children to read and write in English and providing them with training in marketable skills. Unfortunately, funding for the school was a persistent problem, and in 1887, the Dawes Act mandated that Native children be taught in white-run, English-only schools. And so the school was closed. Winnemucca may have begun her life ignorant of Americans, but by the time she died in 1891, Americans were not ignorant of her—her obituary ran in The New York Times. And if her speeches and writing did not make the changes she hoped for, they remain a vivid, eloquent testimony of a life spent speaking for others. Rosalyn Eves wrote her PhD dissertation on 19th-century women's rhetoric in the American West, including Sarah Winnemucca. She teaches at Southern Utah University and her first novel is forthcoming from Knopf in 2017.
https://www.smithsonianmag.com/history/savoring-pie-town-85182017/
Savoring Pie Town
Savoring Pie Town The name alone would make a stomach-growling man wish to get up and go there: Pie Town. And then too, there are the old photographs—those moving gelatin-silver prints, and the equally beautiful ones made in Kodachrome color, six and a half decades ago, at the heel of the Depression, on the eve of a global war, by a gifted, itinerant government documentary photographer working on behalf of FDR’s New Deal. His name was Russell Lee. His Pie Town images—and there are something like 600 of them preserved in the archives of the Library of Congress—portrayed this little clot of high-mountain-desert New Mexico humanity in all of its redemptive, communal, hard-won glory. Many were published last year in Bound for Glory: America in Color 1939-43. But let’s get back to pie for a minute. “Is there a particular kind you like?” Peggy Rawl, co-owner of Pie Town’s Daily Pie Café, had asked sweetly on the phone, when I was still two-thirds of a continent away. There was clatter and much talk in the background. I’d forgotten about the time difference between the East Coast and the Southwest and had called at an inopportune hour: lunchtime on a Saturday. But the chief confectioner was willing to take time out to ask what my favorite pie was so that she could have one ready when I got there. Having known about Pie Town for many years, I was itching to go. You’ll find it on most maps, in west-central New Mexico, in Catron County. The way you get there is via U.S. 60. There’s almost no other way, unless you own a helicopter. Back when Russell Lee of the Farm Security Administration (FSA) went to Pie Town, U.S. 60—nowhere near as celebrated a highway as its more northerly New Mexico neighbor, Route 66, on which you got your kicks—called itself the “ocean to ocean” highway. Big stretches weren’t even paved. Late last summer, when I made the trek, the road was paved just fine, but it was still an extremely lonesome two-lane ribbon of asphalt. 
We’ve long licked the idea of distance and remoteness in America, and yet there remain places and roads like Pie Town and U.S. 60. They sit yet back beyond the moon, or at least they feel that way, and this, too, explains part of their beckoning. When I saw my first road sign for Pie Town outside a New Mexico town called Socorro (by New Mexico standards, Socorro would count as a city), I found myself getting cranky and strangely elevated. This was because I knew I still had more than an hour to go. It was the psychic power of pie, apparently. Again, I hadn’t planned things quite right—I’d left civilization, which is to say Albuquerque—without properly filling my stomach for the three-hour haul. I was muttering things like, They better damn well have some pie left when I get there. The billboard at Socorro, in bold letters, proclaimed: HOME COOKING ON THE GREAT DIVIDE. PIE TOWN USA. I drove on with some real resolve. Continental Divide: this is another aspect of Pie Town’s strange gravitational pull, or so I have become convinced. People want to go see it, taste it, at least in part, because it sits right on the Continental Divide, at just under 8,000 feet. Pie Town, on the Great Divide—it sounds like a Woody Guthrie lyric. Something there is in our atavistic frontier self that hankers to stand on a spot in America, an invisible demarcation line, where the waters start to run in different directions toward different oceans. Never mind that you’re never going to see much flowing water in Pie Town. Water, or, more accurately, its lack, has much to do with Pie Town’s history. The place was built up, principally, by Dust Bowlers of the mid- and late 1930s. They were refugees from their busted dreams in Oklahoma and West Texas. A little cooperative, Thoreauvian dream of self-reliance flowered 70 and 80 years ago, on this red earth, amid these ponderosa pines and junipers and piñon and rattlesnakes. 
The town had been around as a settlement since at least the early 1920s, started, or so the legend goes, by a man named Norman who’d filed a mining claim and opened a general store and enjoyed baking pies, rolling his own dough, making them from scratch. He’d serve them to family and travelers. Mr. Norman’s pies were such a hit that everybody began calling the crossroads Pie Town. Around 1927, the locals petitioned for a post office. The authorities were said to have wanted a more conventional name. The Pie Towners said it would be Pie Town or no town. In the mid-’30s, something like 250 families lived in the surrounding area, most of them in exile from native ground gone arid. By the time Russell Lee arrived, in the company of his wife, Jean, and with a trunk full of cameras and a suitcase full of flashbulbs, the town with the arresting name boasted a Farm Bureau building, a hardware and feed store, a café and curio shop, a hotel, a baseball team, an elementary school, a taxidermy business. There was a real Main Street that looked a little like a movie set out of the Old West. Daily, except Sunday, the stagecoach came through, operated by Santa Fe Trail Stages, with a uniformed driver and with the passengers’ luggage roped to the roof of a big sedan or woody station wagon. Lee came to Pie Town as part of an FSA project to document how the Depression had ravaged rural America. Or as the Magdalena News put it in its issue of June 6, 1940: “Mr. Lee of Dallas, Texas, is staying in Pietown, taking pictures of most anything he can find. Mr. Lee is a photographer for the United States department of agriculture. Most of the farmers are planting beans this week.” Were Lee’s photographs propagandistic, serving the aims of an administration back in Washington bent on getting New Deal relief legislation through Congress and accepted by the American people? Of course. That was part and parcel of the mission of the FSA/OWI documentary project in the first place. 
(OWI stands for Office of War Information: by the early ’40s, the focus of the work had shifted from a recovering rural America to an entire nation girding for war.) But with good reason, many of the project’s images, like the names of some of those who produced them—Walker Evans, Dorothea Lange, Arthur Rothstein, Ben Shahn, Marion Post Wolcott, John Vachon, Gordon Parks, Russell Lee—have entered American cultural myth. The results of their collaborative work—approximately 164,000 FSA/OWI prints and negatives—are there in drawer after drawer of file cabinets at the Library of Congress in a room I have visited many times. (Most of the pictures are now also on-line at http://memory.loc.gov/ammem/fsowhome.html.) Taken together, those images have helped define who we are as a people, or who we’d like to think we are; they amount to a kind of Movietone newsreel looping through our heads. Lee took plenty of pictures in Pie Town of the deprived living conditions; he showed how hard it all was. His pictures weren’t telling lies. And yet his pictures of people like the Caudills almost made you forget the deprived living conditions, forgive them, because the sense of the other—the shared food and good times at all-day community church sings—was so powerfully rendered. In front of Lee’s camera, the Caudills’ lives seemed to narrate the received American story of pluck and determination. Never mind that I now also knew—in the so-called more rational and objective part of my brain—that the Thoreauvian ideal of self-reliance had foundered badly in this family. For Doris and Faro Caudill (and their daughter, Josie, who was about 8 when Lee took his pictures), the Pie Town dream became closer to a nightmare. Faro got sick, got lung trouble, the family moved away (just two years after the pictures were taken). Faro sought work in the city, Faro ran around. An acrimonious divorce ensued. Doris ended up married to another man for 39 years. 
She even went to Alaska to try the American homesteading dream all over again. There is a beautiful book published several years ago about the Caudills and their saga, but especially about Doris: Pie Town Woman, by Joan Myers, a New Mexico author. In 1942, when Faro Caudill hitched the gate at his PieTown homestead for the last time, he scrawled on the wood: “Farewell, old homestead. I bid you adieu. I may go to hell but I’ll never come back to you.” And yet what you also get from Myers’ book about Doris in her very old age, not long from her death, is a deep longing to be there again, to have that life again. She told the author she’d like to have hot and cold running water, though. “As old as I am, I like to take a bath now and then. We would take a bath on Saturday night. We had a number three bathtub. I’d get the water all hot and then I’d bathe Josie and then I’d take a bath and then Faro would take a bath. . . . You kind of wore the water out.” What happened in this dot of civilization, to go on with PieTown’s history, is that the agricultural dream dried up—quite literally. The good growing years lasted not even a generation. It was the water once more, a grapes of wrath anew, the old Western saga of boom to bust. Somehow, by the ’50s, the climate had seemed mysteriously to shift, just as it had in the places abandoned earlier by those Okies and West Texans and Kansans. The winters became balmier. The snows wouldn’t fall, not like they once did; the earth refused to hold its moisture for the spring planting. The corn and pinto bean fields, which two decades before had yielded rich harvests, as long as their tillers were willing to give to them the sunup-to-sundown work that they demanded, withered. And so, many of those once-exiled families found themselves exiled again. Some of them had already long moved on to cities, to jobs in defense plants and airplane factories. 
They’d gone to Albuquerque, to California, where the life was said to be easier, the paychecks regular. But the town never died out entirely. Those who’d stayed behind made a living by any means they could: drilling wells, grazing cows, running mom and pop businesses, opening cafés called the Pie-O-Neer, recently reopened, or the Break 21. And new homesteaders always seemed to arrive, willing to try out the PieTown dream. The highway had already taken me through and around the parched mountains and mesas and across a vast moonlike tract from the Pleistocene age called the Plains of San Agustin. The land had begun to rise again, almost imperceptibly at first, and then rather dramatically. It was still desert, but the land looked more fertile now. That was mostly illusion. I couldn’t find any town at first. The “town” looked like no more than a wide spot in the road, with the Daily Pie Café and the post office and an art gallery just about the only visible enterprises. I just had to adjust my eyes, I just had to give it time—to find the drilling business, the realty office selling ranchettes, the mobile home campgrounds, the community center, the several churches, the fist of simple homes that stood along the old main street before they relocated U.S. 60, the long-closed old log hotel still standing on the old U.S. 60, home now to bats and spiders and snakes. Russ and Jean Lee had lodged there while he’d made his pictures. I just had to look around to find the town cemetery—windblown, weedy, ghostly, beautiful. There were graves piled with stones, and under them were Americans who had lasted 90 and more years. I walked into the offices of the Alegres Electric Company, a husband and wife operation owned by Judy and Bob Myers. They are both licensed electricians. The shop was in a little mud-dried house with a brown tin corrugated roof across the macadam from the Daily Pie. 
In addition to their electrical business, the Myers were also offering trail mix and soft drinks and flashlight batteries. “Hikers come through on the Divide,” Judy explained. She was sitting at a computer, a classic-looking frontier woman with deep facial lines set in a leathery tan. She said that she and her husband had chased construction jobs all over the country, and had somehow managed to raise their kids while doing it. They’d found PieTown four or five years ago. They intended to stick. “As long as we can keep earning some kind of living here,” Judy said. “As long as our health would hold.” Of course, there are no doctors or hospitals nearby. “I guess you could call us homesteaders,” Judy said. I encountered Brad Beauchamp. He’s a sculptor. He had topped 60. He was staffing the town Tourist and Visitor Information Center. There was a sign with those words in yellow lettering on the side of an art gallery. There was a big arrow and it directed me to the rear of the gallery. Beauchamp, instantly friendly, ten years a Pie Towner, is a transplant from San Diego, as is his wife. In California, they’d had a horse farm. They wanted a simpler life. Now they owned 90 acres and a cabin and an array of four-footed animals. They were making their living as best they could. Beauchamp, a lanky drink of water recovering from a bicycling accident, talked of yoga, of meditation, of a million stars in the New Mexico sky. “I’ve worked real hard on . . . being calm out here,” he said. “So are you calmer?” “I’ve got such a long way to go. You know, when you come to a place like this, you bring all your old stuff with you. But this is the place. We’re not moving.” Since the sculptor was staffing the visitor’s center, it seemed reasonable to ask if I could get some PieTown literature. “Nope,” he said, breaking up. “That’s because we don’t have any. We have a visitor information center, but nothing about PieTown. 
We do have brochures for a lot of places in the state, if you’d like some.” Outside the post office, on the community bulletin board, there was a hand-scrawled notice: “Needed. Support from Community for Pie Festival. 1) Organize a fiddle contest. 2) Help set up on Friday 10 Sept.” The planners of the all-day event were asking for volunteers for the big pie-eating contest. Judges were needed, cleanup committees. There would be the election of a Pie queen and king. Candidates for the title were being sought. Sixty-four years before, photographer Lee had written to his boss Roy Stryker in Washington: “Next Sunday at Pietown they are having a big community sing—with food and drink as well—it lasts all day so I’m going to be sure to be here for that.” Earlier Stryker had written to Lee about PieTown: “[Your] photographs, as far as possible, will have to indicate something of what you suggest in your letter, namely: an attempt to integrate their lives on this type of land in such a way as to stay off the highways and the relief rolls.” There had been no passage of years. It was as if the new stories were the old stories, just with new masks and plot twists. And then there was the Daily Pie. I’ve been to some restaurants where a lot of desserts were listed on the menu, but this was ridiculous. The day’s offerings were scrawled in a felt-tip pen on a big “Pie Chart” above my head. In addition to regular apple, there was New Mexican apple (laced with green chili and piñon nuts), peach walnut crumb, boysen berry (that’s the spelling in Pie Town), key lime cheesecake (in Pie Town it’s a pie), strawberry rhubarb, peanut butter (it’s a pie), chocolate chunk crème, chocolate walnut, apple cranberry crumb, triple berry, cherry streusel, and two or three others that I can no longer remember and didn’t write down in my notebook. The Pie Chart changes daily at the Daily Pie, and sometimes several times within a day. 
A red dot beside a name meant that there was at least a whole other pie of that same kind back in the kitchen. And a 1 or a 2 beside a name meant there were just one or two slices left, and apparently wouldn’t be any more until that variety came up in the cycle again. I settled on a piece of New Mexican apple, which was a lot better than “tasty.” It was zingy. And now that I’ve sampled my share of PieTown’s finest selections, I’d like to relay a happy fact, which is probably implicit anyway: at the Daily Pie Café—where so much of PieTown’s current life unfolds— they serve much more than pie. Six days a week they make a killer breakfast and a huge lunch, and two days a week they dish until 8 p.m., and on Sundays, the pièce de résistance, they’re glad to work you over with one of those all-afternoon, old-fashioned turkey, ham or roast-beef dinners with potatoes and three vegetables that your grandmother used to make, the kind that got sealed lovingly in family albums and in the amber of memory. For three days I took my meals at the Daily Pie, and as it happened, I became friendly with an old-timer named Paul Painter. He lives 24 miles from PieTown, off the main road. Six days a week—every day that it’s open—Painter comes in his pickup, 48 miles round trip, most of it by dirt road, arriving at the same hour, 11 a.m. “He’s steady as a damn stream coming out of the mountain,” said Mike Rawl, husband of Daily Pie Café pie chef Peggy Rawl, not to mention the café’s greeter, manager, shopper, cook and other co-owner. Every day Painter puts in the same order: big steak (either rib-eye or New York strip), three eggs, toast and potatoes. He’ll take two hours to dine. He’ll read the paper. He’ll flirt with the waitresses. And then he’ll drive home. Painter is deep in his 70s. His wife died years ago, his kids live away. He told me that he spends every day and night alone, except for those several hours at the café. 
“Only way I know what day of the week it is, is from a little calendar I keep right by the light bulb in my bedroom,” he said. “Every night I reach over and make a check. And then I turn out the light.” Said Rawl one day in his café, after the rush of customers: “I’ve thought about it a lot. I think the very same impulses that brought the homesteaders out here brought us out. My family. They had the Dust Bowl. Here you’ve got to come out and buy a tax license and deal with insurance and government regulations. But it’s the same thing. It’s about freedom, the freedom to leave one place and try to make it in another. For them their farms got buried in sand. They had to leave. Back in Maryland it never really seemed like it was for us. And I don’t mean for us, exactly. You’re helping people out. This place becomes part of the town. I’ve had people running out of gas in the middle of the night. (I’ve got a tank out back here.) You’re a part of something. That’s what I mean to say. It’s very hard. You have to fight it. But the life here is worth the fight.” I went around with “Pop” McKee. His real name is Kenneth Earl McKee. He has a mountain man’s untrimmed white beard. When I met him, his pants were held up by a length of blue cord, and the leather of his work boots seemed soft as lanolin. He had a little heh-heh caving-in-on-itself laugh. He has piercing blue eyes. He lives in a simple home not even 200 yards from where, in the early summer of 1940, a documentarian froze time in a box on a pine board elementary school stage. Pop McKee, past 70, is one of the last surviving links to Russell Lee’s photographs. He is in many of Russell Lee’s PieTown photographs. He is that little kid, third from right, in the overalls at the PieTown community school, along with his cousin and one of his sisters. The kids of PieTown are singing on a makeshift stage. Pop is about 8. 
In 1937, Pop McKee’s father—Roy McKee, who lies in the town cemetery, along with his wife, Maudie Bell—had driven a John Deere tractor from O’Donnell, Texas, toward his new farming dream, pulling a wagon with most of the family possessions. It took him about five days. Pop asked me if I wanted to go out to the old homestead. I sure did. “I guess we will then,” he said, cackling. “Life must have been so hard,” I said, as we drove to the homestead. It was out of town a little ways. “Yeah, but you didn’t know it,” he said. “You never wanted a better life, an easier one?” “Well, you didn’t know no better one. A fellow doesn’t know a better one, he won’t want one.” At the homeplace, a swing made from an old car seat was on the porch. It was a log house chinked with mortar. Inside, the dinnerware was still in a beautiful glass cabinet. There were canned goods on a shelf. No one lived at the homeplace, but the homeplace still somehow lived. “He had cows when he died,” Pop said of his dad, who made 90 in this life. “Did you tend him at the end?” “He tended himself. He died right over there, in that bed.” All of the family was present that day, May 9, 2000. Roy McKee, having come out to PieTown so long ago, had pulled each grown child down to his face. He said something to each one. And then turned to the wall and died.
https://www.smithsonianmag.com/history/scandalous-quarter-protest-wasnt-180962088/
The “Scandalous” Quarter Protest That Wasn’t
It started out innocently enough: In January 1917, the United States released a new quarter dollar it had minted at the end of the previous year. Just 52,000 copies of the 1916-dated quarter were produced. But this was no ordinary coin. Instead, it would become one of the most legendary and sought-after in American history. The reason: a single bare breast on Lady Liberty. From the first, the coin was a big hit. “Crowds Flock to Get New Quarters,” noted a New York Sun headline on January 17, 1917. “Miss Liberty’s Form Shown Plainly, to Say Least,” the Sun added, suggesting that Liberty’s anatomy might have something to do with the coin’s popularity. Indeed, the goddess’s garb gave newspapers across the land something to huff and/or snicker about. The Wall Street Journal primly observed that, “Liberty as attired on the new quarter just draws the line at license.” An Iowa newspaper sniffed about the “almost nude figure of a woman,” saying, “We can see no use in the government parading such pieces of art before the public.” An Ohio paper was a little more whimsical, observing that Liberty was “clad something after the manner of Annette Kellerman,” referring to a famous swimmer turned silent actress of the day who was supposedly the first star to appear naked in a Hollywood movie. (Alas, that 1916 film, A Daughter of the Gods, has been lost to time, like so many of its era.) The Los Angeles Times, meanwhile, reported that few buyers of the new coin in that city “found anything in her state of dress or undress to get excited about. In fact, Miss Liberty is dressed up like a plush horse compared to the Venus de Milo.” Prohibitionists meeting in Chicago, whose moral concerns apparently went beyond demon rum, may have been the group that condemned the coin most severely. “There is plenty of room for more clothes on the figure,” one Prohibitionist leader told reporters. 
“I do not approve of its nudity.” But a letter-to-the-editor writer in Tacoma, Washington, rose to Liberty’s defense. “I wonder why some people are always seeing evil in everything,” he said. “There are so many people who would be so thankful to have the quarter they would not notice or care about the draperies.” Eventually, the Prohibitionists got their wish. Though additional bare-breasted quarters were issued in 1917, later that year a new redesign went into circulation. The offending bosom was now covered with chainmail armor. In the ensuing decades, the story would evolve from one of bemusement and mild protest in some “quarters” to a tale of national outrage. By the late 20th century, the standard account had everything but irate mobs storming the U.S. Mint with pitchforks and flaming torches. Writers now repeated the tale of widespread public “uproar.” Adjectives like “scandalous,” “naughty,” and “risqué” popped up in nearly every article. One price guide referred to it as “America’s first ‘obscene’ coin.” A major auction house with a collection of quarters for sale called it a “Scandalous Rare Coin That Created Moral Outrage.” Some accounts even claimed that famous anti-vice crusader Anthony Comstock had personally led the attack against the coin. The only problem with that story? Comstock died in 1915. Not that he wouldn’t have joined in if he could. A longtime foe of scantily clad mythological figures, Comstock once unsuccessfully pressed for the removal of a gilded, 13-foot-tall and totally naked statue of the Roman goddess Diana mounted atop Manhattan’s Madison Square Garden. After decades of hype, a new generation of writers has finally taken a closer look at the alleged coin contretemps. One of them is Robert R. Van Ryzin, currently the editor of Coins magazine. Van Ryzin says he grew up believing the Liberty legend as a young collector. 
When he began writing about coins professionally, though, he could find little evidence that large numbers of Americans were incensed by a 25-cent piece—or that their complaints were the reason the Mint altered the coin. “I don’t know who started it,” he says of the long-accepted story. “But I suspect it was easy for people to believe such a thing.” In other words, it made sense to modern Americans that their 1917 counterparts were so prudish that they could be shocked by their pocket change. In fact, contemporary news accounts show nearly as much griping about the depiction of the eagle on one side of the quarter as about Liberty on the other. Squawked one bird buff: “It is well known that the eagle in flight carries his talons immediately under his body, ready for a spring, whereas in the quarter dollar eagle the talons are thrown back like the feet of a dove.” Other critics charged that the design of the coin made it likely to collect dirt and require washing. And the Congressional Record shows that when the U.S. Senate took up the question of a redesign, its complaint was that the coins didn’t stack properly—a problem for bank tellers and merchants—rather than how Lady Liberty was, uh, stacked. The coin’s designer, a respected sculptor named Hermon A. MacNeil, wasn’t happy with how it had come out, either. Given the opportunity to redesign the coin, he made a number of changes, just one of which was the addition of the chain mail. Liberty’s battle-ready look may have been a response to the First World War, which was raging in Europe and which the U.S. would officially join in April 1917, rather than a nod to modesty. All of those factors—more than a priggish populace—seem to have doomed the 1916 design. Though much of the myth has now been toned down, it still has legs. The decades of fuss—some of it real, much of it exaggerated—seem to have guaranteed the 1916 coin a lasting place among collector favorites. 
Today even a badly worn specimen can command a retail price of over $4,000, compared with about $35 for the more-chaste 1917 coin in the same condition. A mint condition quarter could be worth as much as $36,500. The low production volume of the 1916 coins accounts for some of that price, but hardly all of it. Even in the sedate world of coin collecting, usually not considered the sexiest of hobbies, there’s nothing like a little scandal to keep a legend alive. Greg Daugherty is a magazine editor and writer as well as a frequent contributor to Smithsonian.com. His books include You Can Write for Magazines.
https://www.smithsonianmag.com/history/science-fear-royal-scandal-made-france-modern-and-other-new-books-read-180974685/
The Science of Fear, the Royal Scandal That Made France Modern and Other New Books to Read
To confront her crippling fear of heights, journalist Eva Holland jumped out of an airplane and learned to rock climb. But while she endured these experiments with a semblance of aplomb, she found that the experience did little to assuage her fears. “I was facing my fear, but it was hard to imagine my resulting feelings, or my control over them, ever improving,” explains Holland in Nerve: Adventures in the Science of Fear, one of five new nonfiction titles featured in Smithsonian magazine’s weekly books roundup. The latest installment in our “Books of the Week” series, which launched in late March to support authors whose works have been overshadowed amid the COVID-19 pandemic, details Holland’s nerve-racking exploits, the stories of 50 forgotten female innovators, a 19th-century royal scandal that unmade France’s Bourbon dynasty, an investigation of how street addresses reflect race and class, and an overview of St. Louis’ turbulent history. Representing the fields of history, science, arts and culture, innovation, and travel, these selections are texts that piqued our curiosity with their new approaches to oft-discussed topics, elevation of overlooked stories and artful prose. We’ve linked to Amazon for your convenience, but be sure to check with your local bookstore to see if it supports social distancing-appropriate delivery or pickup measures, too. When Eva Holland’s greatest fear—her mother’s untimely passing—was realized in 2015, she decided to embark on a journey of self-discovery, examining “the extent to which her many fears had limited her … and whether or not it was possible to move past them.” Nerve, a work that contextualizes Holland’s personal phobias by delving into the latest scientific research, is the product of this years-long quest. 
As Holland writes in the book’s prologue, she began by breaking down fear into three “imperfect” categories: phobias, trauma, and the ephemeral. From there, she set out to answer key questions, including how and why humans feel fear, whether a cure for fear exists, and whether there is a “better way to feel afraid.” Over the course of her research, Holland grappled with her own fears, interviewed individuals who have a rare disease that prevents them from feeling fear and met with scientists working to cure phobias with a single pill. Though she freely admits that she “can’t say that I am now in perfect control over my fears,” the journalist does note that her relationship with fear is forever changed. With Nerve, Holland hopes to instill these same lessons in others. She adds, “Fear is an experience that unites, even as, in the moment, it makes each of us alone.” Street addresses, argues Deirdre Mask in The Address Book, convey crucial information about those who live there, including their race, wealth and identity. These numbers and names also reflect power—“the power to name, the power to shape history, the power to decide who counts, who doesn’t, and why.” As Mask writes in the book’s introduction, addresses come in handy when directing ambulances where to go, but at the same time, they “exist so people can find you, police you, tax you, and try to sell you things you don’t need through the mail.” Take, for instance, rural West Virginia, which had few street addresses prior to 1991, when a telecommunications company began an unprecedented address-making campaign aimed, “quite literally, [at putting] West Virginians on the map.” Locals, who had long been accustomed to providing directions based on geographic landmarks rather than street names, viewed the initiative with suspicion, writes Mask. 
Mask explores the tensions raised by street names—and the ripple effects of not having an address—through case studies of Nazi Germany, a Haitian cholera outbreak, ancient Rome and other communities across four continents. Per the New York Times’ review of The Address Book, the book is surprisingly encouraging for a story on “class, poverty, disease, racism and the Holocaust,” drawing on a “cast of stirring meddlers whose curiosity, outrage and ambition inspire them to confront problems ignored by indifferent bureaucracies.” The July Revolution of 1830 is perhaps best known for ending the Bourbon dynasty’s rule in France. But as Maurice Samuels writes in The Betrayal of the Duchess, the uprising had at least one unexpected side effect still evident in modern French society: namely, the rise of rampant anti-Semitism. Samuels traces France’s pervasive anti-Semitism to the 1832 betrayal of Marie-Caroline de Bourbon-Sicile, duchesse de Berry, by her trusted advisor, a “seductive yet volatile man” named Simon Deutz. The duchess, mother of the 11-year-old heir to the crown, had been exiled in the aftermath of the July Revolution, but far from placidly accepting this unwelcome turn of events, she rallied supporters and led a guerrilla army tasked with restoring the Bourbon dynasty to the throne. De Berry evaded authorities for six months, but on November 6, 1832, was found hiding in a Nantes home. Upon emerging from a secret compartment, she reportedly said, “I am the duchesse de Berry. You are French soldiers. I entrust myself to your honor!” Deutz, the man responsible for the duchess’ discovery, was a Jewish convert to Catholicism who gave up his former confidant for a small fortune. 
In the aftermath of the betrayal, according to Samuels, the duchess’ supporters came to view Deutz’s action as emblematic of modernity—in other words, a “symbol for the evils … ushered in by the French Revolution.” Adds Samuels, “The story transformed resistance to modernity into a passion play with the Jew as villain and, in so doing, helped make anti-Semitism a key feature of right-wing ideology in France.” As the geographic center of the United States of America, St. Louis has seen more than its fair share of historical happenings. In The Broken Heart of America, historian Walter Johnson traces the city’s evolution—including Lewis and Clark’s 1804 expedition, the Missouri Compromise, the 1857 Dred Scott decision, and the 2014 uprising in nearby Ferguson—from the nation’s “most radical city” to an urban center marred by racial inequality. “The story of human geography of St. Louis is as much a story of ‘Black removal’—the serial destruction of Black neighborhoods and the transfer of their population according to the reigning model of profit and policing at any given moment—as of white flight,” writes Johnson in the book’s introduction. Imperialism, capitalism and racism have long coalesced in St. Louis, but far from being a representative city at once torn between “east and west, north and south,” the historian argues, the Missouri city has, in fact, “been the crucible of American history,” much of which has “unfolded from the juncture of empire and anti-Blackness in the city of St. Louis.” Virginia Woolf’s A Room of One’s Own contains several sayings that have since become mainstays in the feminist lexicon. The 1929 essay’s title, for example, is commonly used to describe the privacy and independence needed to foster female creativity. 
Anonymous Is a Woman, a new offering from women’s rights expert Nina Ansary, derives its title from another oft-repeated Woolf quote: “I would venture to guess that Anon, who wrote so many poems without signing them, was often a woman.” In keeping with the British writer’s line of thinking, Anonymous Is a Woman explores the stories of 50 female innovators whose accomplishments have been largely overlooked. Beginning with En Hedu-Anna, an Akkadian woman who was the world’s first known female astronomer, and ending with Alice Ball, a 20th-century American chemist who discovered a treatment for leprosy, the book uses short biographical sketches illustrated by artist Petra Dufkova to unravel 4,000 years of gender inequality. As Ansary writes in the book’s opening chapters, “It was a challenge to select only fifty women. … [D]espite formidable cultural barriers, women have developed their skills and talents, employed their intellect and creativity, and achieved distinction in diverse endeavors.” Proceeds from the sale of Anonymous Is a Woman will be donated to the Center for Human Rights in Iran and the London School of Economics Centre for Women, Peace and Security. Meilan Solly is Smithsonian magazine's assistant digital editor, humanities. Website: meilansolly.com.
https://www.smithsonianmag.com/history/search-site-worst-indian-massacre-us-history-180959091/
The Search Is On for the Site of the Worst Indian Massacre in U.S. History
In the frigid dawn of January 29, 1863, Sagwitch, a leader among the Shoshone of Bia Ogoi, or Big River, in what is now Idaho, stepped outside his lodge and saw a curious band of fog moving down the bluff toward him across a half-frozen river. The mist was no fog, though. It was steam rising in the subzero air from hundreds of U.S. Army foot soldiers, cavalry and their horses. The Army was coming for his people. Over the next four hours, the 200 soldiers under Colonel Patrick Connor’s command killed 250 or more Shoshone, including at least 90 women, children and infants. The Shoshone were shot, stabbed and battered to death. Some were driven into the icy river to drown or freeze. The Shoshone men, and some women, meanwhile, managed to kill or mortally wound 24 soldiers by gunfire. Historians call the Bear River Massacre of 1863 the deadliest reported attack on Native Americans by the U.S. military—worse than Sand Creek in 1864, the Marias in 1870 and Wounded Knee in 1890. It is also the least well known. In 1863, most of the nation’s attention was focused on the Civil War, not the distant western territories. Only a few eyewitness and secondhand accounts of the incident were published at the time in Utah and California newspapers. Local people avoided the site, with its bones and shanks of hair, for years, and the remaining Bia Ogoi families quietly dispersed. But their descendants still tell the tale of that long-ago bloody day, and now archaeologists are beginning to unearth the remains of the village that didn’t survive. Darren Parry, a solemn man who is a council member of the Northwestern Band of the Shoshone Nation and Sagwitch’s great-great-great grandson, stands on a hill named Cedar Point. He looks down on the historic battlefield in its braided river valley. An irrigation canal curves along the base of the bluffs, and a few pickup trucks drive along U.S. 
Highway 91, following a route used by the Shoshone 200 years ago. These alterations to the landscape—roads, farms and an aqueduct, along with shifts in the river’s meandering course through the valley—have made it difficult, from a scientist’s perspective, to pinpoint the location of the Shoshone winter village. Parry, though, does not have this problem. “This spot overlooks everything that was important to our tribe,” he says. “Our bands wintered here, resting and spending time with family. There are warmer places in Utah, but here there are hot springs, and the ravine for protection from storms.” The So-So-Goi, or People Who Travel on Foot, had been living well on Bia Ogoi for generations. All their needs—food, clothes, tools and shelter—were met by the rabbits, deer, elk and bighorn sheep on the land, the fish in the river, and the camas lilies, pinyon nuts and other plants that ripened in the short, intense summers. They lived in loose communities of extended families and often left the valley for resources such as salmon in Oregon and bison in Wyoming. In the cold months, they mostly stayed in the ravine village, eating carefully stored provisions and occasional fresh meat. White-skinned strangers came through the mountain passes into the valley seeking beaver and other furs. These men gave the place a new name, Cache Valley, and the year a number, 1825. They gave the So-So-Goi a new name, too—Shoshone. The Shoshone traded with the hunters and trappers, who were little cause for concern since they were few in number and only passing through. But then people who called themselves Mormons came to the northern valley. The Mormons were looking for a place where they, too, could live well. They were many in number, and they stayed, calling this place Franklin. The newcomers cut down trees, built cabins, fenced the land to keep in livestock, plowed the meadows for crops and hunted the remaining game. They even changed Big River’s name to Bear. 
At first, relations between the Shoshone and the Mormons were cordial. The settlers had valuable things to trade, such as cooking pots, knives, horses and guns. And the Shoshone knowledge of living off the land was essential when the Mormons’ first crops failed. But eventually, the Shoshone “became burdensome beggars” in the eyes of the Mormons, writes Kenneth Reid, Idaho’s state archaeologist and director of the Idaho State Historic Preservation Office, in a new summary of the massacre for the U.S. National Park Service’s American Battlefield Protection Program. “Hunger, fear and anger prompted unpredictable transactions of charity and demand between the Mormon settlers and the increasingly desperate and defiant Shoshones. The Indians pretended to be friendly, and the Mormons pretended to take care of them, but neither pretense was very reassuring to the opposite party.” In Salt Lake City, the territorial commissioner of Indian affairs was well aware of the growing discord between the two peoples and hoped to resolve it through treaty negotiations that would give the Shoshones land—somewhere else, of course—and food. Conflict continued, however, and when a small group of miners was killed, Army Colonel Connor resolved to “chastise” those he believed responsible—the Shoshone people living in the ravine in the northern valley at the confluence of a creek and the Bear River. Pointing below Cedar Point, Parry says, “My grandmother told me that her grandfather [Sagwitch’s son Yeager, who was 12 years old and survived the massacre by pretending to be dead] told her that all the tipis were set up right here in the ravine and hugging the side of the mountain.” He continues, “Most of the killing took place between here and the river. Because the soldiers drove the people into the open and into the river.” In 2013, the Idaho State Historical Society began efforts to map and protect what may remain of the battlefield. 
The following year, archaeologists Kenneth Cannon, of Utah State University and president of USU Archeological Services, and Molly Cannon, director of the Museum of Anthropology at Utah State, started investigating the site. Written and oral accounts of the events at Bear River suggested the Cannons would find remains from the battle in a ravine with a creek that flowed into the river. And soon they did find artifacts from the post-massacre years, such as buckles, buttons, barbed wire and railroad spikes. They even found traces of a prehistoric hearth from around 900 A.D. But their primary goal, the location of the Shoshone-village-turned-killing-ground, proved elusive. There should have been thousands of bullets that had been fired from rifles and revolvers, as well as the remnants of 70 lodges that had sheltered 400 people—post-holes, hardened floors, hearths, pots, kettles, arrowheads, food stores and trash middens. Yet of this core objective, the scientists found only one piece of hard evidence: a spent .44-caliber round lead ball of that period that could have been fired by a soldier or warrior. The Cannons dove back into the data. Their team combined historic maps with magnetometer and ground-penetrating-radar studies, which showed potential artifacts underground, and geomorphic maps that showed how floods and landslides had reshaped the terrain. That’s when they found “something really exciting,” says Kenneth Cannon. “The three different types of data sources came together to support the notion that the Bear River, within a decade of the massacre, shifted at least 500 yards to the south, to its present location,” he says. The archaeologists now suspect that the site where the heaviest fighting and most deaths occurred has been buried by a century of sediment, entombing all traces of the Shoshone. “We had been looking in the wrong place,” Kenneth Cannon says. 
If the team can get funding, the Cannons will return to the Bear River valley this summer to resume their search for Bia Ogoi. Though the exact site of the village is still unknown, the massacre that destroyed it may finally be getting the attention it deserves. In 2017, the Idaho State Museum in Boise will host an exhibit on the Bear River Massacre. And the Northwestern Shoshone are in the process of acquiring land in the area for an interpretive center that would describe the lives of their ancestors in the Bear River valley, the conflicts between native people and European immigrants and the killings of 1863. This is a story, Parry says, that needs to be told. Editor's Note, May 13, 2016: After publishing, two corrections were made to this story. First, a sentence was clarified to indicate that archaeologists found evidence of a prehistoric hearth, not a dwelling. Second, a sentence was removed to avoid the implication that the scientists are looking for or collecting human bones as part of their research. Sylvia Wright is a science writer and photographer based in Davis, Calif. She tells stories about the work of researchers in the American West.
64fcb99e3f4263a1c7f97b5b608ac0ae
https://www.smithsonianmag.com/history/secret-societies-you-might-not-know-180958294/
Eight Secret Societies You Might Not Know
By their very name, secret societies inspire curiosity, fascination and distrust. When the Washington Post broke the story last month that Supreme Court Justice Antonin Scalia spent his final hours in the company of members of a secret society for elite hunters, people instantly wanted to know more about the group. The fraternity in question, the International Order of St. Hubertus, was incorporated by Count Anton von Sporck in 1695 and was originally intended to gather “the greatest noble hunters of the 17th Century, particularly in Bohemia, Austria and countries of the Austro Hungarian Empire, ruled by the Habsburgs,” according to its official website. After the organization denied membership to Nazis, notably military leader Hermann Goering, Hitler dissolved it, but the order reemerged after World War II, and an American chapter was founded in the late 1960s. The order is just one of many clandestine organizations that exist today, though the popularity of these secret clubs peaked in the 18th and 19th centuries, writes Noah Shachtman for Wired. Back then, many of these societies served as safe spaces for open dialogue about everything from academia to religious discourse, removed from the restrictive eye of the church and state. As Shachtman writes: These societies were the incubators of democracy, modern science, and ecumenical religion. They elected their own leaders and drew up constitutions to govern their operations. It wasn’t an accident that Voltaire, George Washington, and Ben Franklin were all active members. And just like today’s networked radicals, much of their power was wrapped up in their ability to stay anonymous and keep their communications secret. The emphasis on secrecy was what inspired so much distrust in the exclusive clubs. 
No less than the New York Times weighed in on secret societies in 1880, not wholly dismissing the theory that “Freemasonry brought about the civil war and acquitted President Johnson and… has committed or concealed crimes without number.” The Times comments, “This able theory of Freemasonry is not so readily believed as the theory that the European secret societies are the ruling power in Europe, but there are still many people as yet outside the lunatic asylum who firmly believe it.” Many religious leaders felt at the very least conflicted about secret orders. In 1887, Reverend T. De Witt Talmage wrote his sermon on “the moral effect of Free Masonry, Odd Fellowship, Knights of Labor, Greek Alphabet and other Societies.” The reverend, who said he had “hundreds of personal friends who belonged to orders” used Proverbs 25: 9 —"discover not a secret to another” —to ask his audience to question whether or not being a member of a secret society would be a positive or negative decision for them. Meanwhile, that same week, Cardinal James Gibbons took a more definitive stand on secret orders, saying that they had “no excuse for existence.” In the United States in the late-19th century, there was enough of a national uproar against secret societies that one concerned group created an annual “Anti-Secret Society Convention.” In 1869, at the national convention in Chicago, the attendees went after the “secular press.” The organization’s secretary said that the press "either approved or ignored secret societies” while “few religious papers have spunk enough to come out for Christ in opposition to Masonry.” But by 1892, the group, which deemed the societies an "evil to society and a menace to our civil institutions," had failed to “secure them anything but strong denunciation,” as the Pittsburgh Dispatch commented. 
While The Da Vinci Code novelist Dan Brown and his contemporaries have shined a light upon some of the bigger secret fraternal organizations like the Order of Skull and Bones, Freemasons, Rosicrucians and the Illuminati, there are still other, lesser-known groups that have compelling stories of their own. Here are just a few:

The Improved Benevolent and Protective Order of Elks of the World

In 1907, the Seattle Republican reported on the Order of Elks, writing that "it is claimed by members and officers that it is one of the most thriving secret societies among Afro-Americans of this city." According to the non-profit African American Registry, the fraternal order was founded in Cincinnati, Ohio, in 1899 after two black men were denied admission to the Benevolent and Protective Order of Elks of the World, which is still popular today and, despite questions raised about discriminatory practices, now allows any American citizen, 21 years or older, who believes in God to be invited to join its ranks. The two men decided to take the order’s name and make their own club around it. Formally called the Improved Benevolent and Protective Order of Elks of the World, the order was once considered to be at the center of the black community. During the era of segregation, the lodge was one of the few places where black men and women could socialize, the Pittsburgh Post-Gazette wrote. In recent years, however, the Post-Gazette commented that the secret organization has struggled to retain its relevance. Still, the secret society continues to sponsor educational scholarship programs, youth summer computer literacy camps, parades as well as community service activities throughout the world.

The Grand Orange Lodge

The Grand Orange Lodge, known more commonly as the “Orange Order,” got its name from Prince William III, the Prince of Orange, and was founded after the Battle of the Diamond outside a small village in modern-day Northern Ireland called Loughgall. 
Its purpose was to "protect Protestants," and that’s why, in 1849, the Lord Lieutenant of Ireland, George William Frederick Villiers, drew the ire of Dublin’s Waterford News for supporting the society. The paper wrote, "Lord Clarendon has been holding communication with an illegal society in Dublin for upwards of ten days. The Grand Orange Lodge, with its secret signs and pass-words, has been plotting with his Excellency during the whole of that period. This may seem strange, but it is a fact…” At the time, secret societies were banned from Ireland as they were said to have acted in “antagonism” to the Land League, an Irish political organization, according to Ireland’s official records on statistics of eviction and crime. The Grand Orange Lodge is still around today with clubs in Ireland, as well as others around the world. Prospective members of the Protestant fraternity don’t take a pledge; they just have to affirm their acceptance of the Principles of Reformation, as well as loyalty to their country. As to the question of whether they are “anti-Roman Catholic”, the official website states, “Orangeism is a positive rather than a negative force. It wishes to promote the Reformed Faith based on the Infallible Word of God - the Bible. Orangeism does not foster resentment or intolerance. Condemnation of religious ideology is directed against church doctrine and not against individual adherents or members.”

The Independent Order of Odd Fellows

Perhaps one needs to be a member of the altruistic and friendly society known as the Independent Order of Odd Fellows to know for sure when the club first started, but the first written record of the order comes in 1812, and it references George IV. 
Even before he was named Prince Regent of the United Kingdom, George IV had been a member of the Freemasons, but as the story goes, when he wanted a relative of his to be admitted to the society without having to endure the lengthy initiation process, the request was emphatically denied. George IV left the order, declaring he would establish a rival club, according to a history of the Independent Order of Odd Fellows published by the Philadelphia Evening Telegraph in 1867. The official website of the order, however, traces the club’s origins all the way back to 1066. Regardless of how it first started, it’s fair to say the king got his wish. The Independent Order of Odd Fellows is still around today, and the club counted British prime ministers Winston Churchill and Stanley Baldwin among its ranks. The Odd Fellows, as they call themselves, are grounded in the ideals of friendship, love and truth. There are real skeletons in the order’s lodges; they are used during initiation to remind its members of their mortality, the Washington Post reported in 2001.

The Knights of Pythias

The Knights of Pythias was founded by Justus H. Rathbone, a government employee in Washington, D.C., in 1864. He felt there was a moral need for an organization that practiced “brotherly love,” which would make sense, seeing as the country was in the midst of the Civil War. The name is a reference to the Greek legend of Damon and Pythias, the Pythagorean ideal of friendship. All of its founding members worked for the government in some capacity, and it was the first fraternal order to be chartered by an act of Congress, the order’s official website writes. The Knights of Pythias’ colors are blue, yellow and red. Blue signifies friendship, yellow charity and red benevolence, the North Carolina Evening Chronicle wrote in a special edition celebrating the 50th anniversary of the club in 1914. 
The Knights of Pythias is still active and is a partner of the Boy Scouts of America, the second organization to receive its charter from the United States Congress.

The Ancient Order of the Foresters

Known today as “Foresters Friendly Society,” the Ancient Order of the Foresters was initially established in 1834, according to the society’s website, albeit under a slightly different name. The Ancient Order was created before state health insurance began in England, so the club offered sick benefits to its working class members. In 1874, the American and Canadian branches left the Ancient Order and set up the Independent Order of the Foresters. Candidates looking to be admitted to the club had to “pass an examination by a competent physician, who is himself bound by his connection with the order,” the Boston Weekly Globe wrote in 1879. The society still provides insurance policies today for its members, who also engage in a variety of community service activities.

The Ancient Order of United Workmen

John Jordan Upchurch and 13 others in Meadville, Pennsylvania, founded the Ancient Order of United Workmen in 1868 with the goal of bettering conditions for the working class. Like the Foresters, it set up protections for its members. Initially, should a member die, all brothers of the order contributed a dollar to a member’s family. That number would eventually be capped at $2,000. The Ancient Order of United Workmen is no longer around, but its legacy continues, as the order unintentionally created a new kind of insurance that would influence other fraternal groups to add an insurance provision in their constitutions.

The Patriotic Order Sons of America

The Patriotic Order Sons of America dates back to the early days of the American Republic, according to its official website. 
Following in the footsteps of The Sons of Liberty, the Order of United Americans and Guards of Liberty, the Patriotic Sons of America, which later added the word “Order” to its name, became one of the “most progressive, most popular, most influential as well as strongest patriotic organizations” in the United States in the early 20th century, the Allentown Leader wrote in 1911. How progressive the order actually was is up to interpretation. In 1891, the Sons of America refused to delete the word “white” in its constitution, defeating a proposition that would allow black men to apply. Today, the order opens its membership up to “all native-born or naturalized American male citizens, 16 years and older, who believe in their country and its institutions, who desire to perpetuate free government, and who wish to encourage a brotherly feeling among Americans, to the end that we may exalt our country, to join with us in our work of fellowship and love.”

The Molly Maguires

In the 1870s, 24 foremen and supervisors in the coal mines of Pennsylvania were assassinated. The suspected culprit? Members of the secret society the Molly Maguires, an organization with Irish origins brought to the United States by Irish immigrants. The Maguires likely got their name because members used women’s clothing as a disguise while allegedly carrying out their illegal acts, which also included arson and death threats. The group was finally undone by a mole planted by the famed Pinkerton Detective Agency, which was hired by the mining companies to investigate the group. In a series of criminal trials, 20 Maguires were sentenced to death by hanging. The Order of the Sons of St. George, another secret organization, which was founded in 1871 to oppose the Maguires, also appears to have vanished. Jacqueline Mansky is a freelance writer and editor living in Los Angeles. She was previously the assistant web editor, humanities, for Smithsonian magazine.
3d3ecb5f2d68dbc8779ac77a6f56b188
https://www.smithsonianmag.com/history/secrets-of-the-maya-deciphering-tikal-2289808/
Secrets of the Maya: Deciphering Tikal
Tikal’s great plaza, at the heart of what was one of the most powerful city-states in the Americas, is surrounded by monumental structures: the stepped terraces of the North Acropolis, festooned with grotesque giant masks carved out of plaster and masonry; a steep pyramid called Temple I, whose roof comb towers 145 feet above the ground, and its mate across the plaza, Temple II, soaring 125 feet above the grass; and a complex of mysterious buildings called the Central Acropolis. At the peak of its glory, around a.d. 750, Tikal was home to at least 60,000 Maya and held sway over several other city-states scattered through the rain forest from the Yucatán Peninsula to western Honduras. Though magnificent, the ruins of Tikal visible today represent but a fraction of the original city-state. During its heyday, archaeologists say, “downtown” Tikal was about six square miles, though research indicates that the city-state’s population may have sprawled over at least 47 square miles. Yet most of Tikal—the heart of Guatemala’s Tikal National Park, about an hour’s drive northeast of the modern city of Flores—has not even been excavated. And until recently, the same could be said about the nature of the Maya themselves. For much of the 20th century, Maya experts followed the lead of Carnegie Institution of Washington archaeologist J. Eric Thompson, who argued that the Maya were peaceful philosophers and extraordinary observers of celestial events content to ponder the nature of time and the cosmos. Thompson, who died in 1975, theorized that Tikal and other sites were virtually unpopulated “ceremonial centers” where priests studied planets and stars and the mysteries of the calendar. It was a beautiful vision—but nearly all wrong. 
“For all of Eric Thompson’s important findings in many areas of Maya studies,” writes anthropologist Michael Coe in his 1992 book Breaking the Maya Code, “he singlehandedly held back the decipherment [of Mayan hieroglyphs] for four decades” and, consequently, the study of the Maya. When, in the 1960s, the hieroglyphs—the most sophisticated writing system created in the New World—were at last beginning to be deciphered, a new picture of these people emerged. Mayan art and writing, it turned out, contained stories of battles, sacrificial offerings and torture. Far from being peaceful, the Maya were warriors, their kings vainglorious despots. Maya cities were not merely ceremonial; instead, they were a patchwork of feudal fiefdoms bent on conquest and living in constant fear of attack. “Blood was the mortar of ancient Maya ritual life,” wrote groundbreaking epigrapher Linda Schele and art historian Mary Miller in their 1986 book The Blood of Kings. It is one of the ironies of this view that evidence for it has long been in plain sight. At the base of Tikal’s North Acropolis stands a row of tall carved stones, or stelae. Each stela depicts a sumptuously bedecked king, and the monoliths are covered in hieroglyphs that, once deciphered, illuminated our view of Maya life. During the Spanish conquest of Mesoamerica in the 16th century, the Catholic Church’s Friar Diego de Landa supervised the burning of hundreds of Maya codices—fig-bark books rich in mythological and astronomical information. Only four Maya codices are known to have survived. And one key to the glyphs from that time was saved: a manuscript that Landa wrote in 1566 about his contact with the Maya. It recorded what he mistakenly thought was the Mayan alphabet. Although parts of his manuscript were first published in 1864, nearly a century would pass before epigraphers understood that Mayan hieroglyphs are actually a combination of symbols using both logographs (words) and syllabic signs (units of sound). 
However, it was not until the 1970s that the full meaning of many hieroglyphs was understood. Today at least 85 percent of known Mayan texts have been read and translated. The descendants of the ancient Maya, who long ago lost the ability to read their ancestors’ writings, have been in the midst of a cultural revival. Having weathered the Catholic Church’s suppression of their culture during the 16th and 17th centuries and later endured a string of brutal dictators, including the notorious Efrain Ríos Montt—responsible for the murder of more than 100,000 Maya in the early 1980s—some Maya have begun openly to celebrate their heritage with pilgrimages to Tikal and other sites. Abandoned by its original inhabitants more than a thousand years ago, the city remained unknown to outsiders for almost a millennium. In 1525, Spanish conquistador Hernando Cortés passed within a few dozen miles of the place without learning of it. Likewise, in 1841, the American diplomat, journalist and explorer John Lloyd Stephens and the British illustrator Frederick Catherwood reported with great fanfare their “discovery” of ruins in the Maya region, but they missed Tikal. Guatemalan archives mention that local people lived in Tikal in the 18th century, but the first official expedition to the ruin wasn’t until 1848. Even “Tikal” is a relatively recent name, derived from the Mayan word ti ak’al, or “at the water hole.” A leader in the field of Mayan epigraphy is David Stuart, who was awarded a MacArthur Fellowship in 1984 at age 18—the youngest recipient of the so-called genius award—for his several publications and papers about deciphering Mayan hieroglyphs. He defined some previously unknown glyphs and refined the spelling rules of the Mayan writing system. Now 38, Stuart is the curator of Mayan hieroglyphs at Harvard University’s Peabody Museum of Archaeology and Ethnology. He has a special fondness for Tikal. “It’s the atmosphere of the place,” Stuart says. 
“Tikal is simply one of the most overpowering archaeological sites in the world.” Though Tikal may have been settled by at least 600 b.c., most of the city’s edifices were built during what is called the Classic period of Maya history, from a.d. 250 to 900. It was a time when the Maya created great artwork and amazing architecture across the region (see “Of Majesty and Mayhem,” p. 49). Recent finds may yet force scholars to redefine the beginning of this period. This spring, archaeologists working at the nearby city of Cival uncovered evidence that distinctively Mayan art and writing may have developed as early as 300 b.c., and a wall painting dating to about a.d. 100, the oldest known intact Maya mural to date, was discovered in an 80-foot-high pyramid at the ruins of San Bartolo, a ceremonial site in Guatemala. Still, Tikal stands out. “The buildings at Tikal are particularly well built, and they have stood up quite well against the onslaught of the jungle,” says Stan Loten, an architectural archaeologist and retired professor who conducted surface surveys of Tikal’s standing structures from 1964 to 1970. Beginning in the 1880s, well before other glyphs yielded up their meanings, researchers began decoding the Maya calendar from glyphs on stelae at sites all over the Maya world. Most stelae include the date of their creation, written in a five-number sequence known to scholars as the Long Count, or the number of days since the beginning of this current era. This system is built on a base of 20 rather than 10 and is made up of glyphs and combinations of a single dot for “one,” a bar for “five,” and a glyph that translated to mih, or “zero.” Once scholars figured out this system, they were able to correlate it with the Gregorian calendar, revealing an astonishing sense of time: the Long Count starts in 3114 b.c. The earliest dated monument yet discovered in Tikal and all of the Maya lowlands, Stela 29, has a Long Count date of 8.12.14.13.15, which translates to a.d. 
292. Understanding this calendar was an important step in understanding the history of the Maya. Of all the dated stelae found at Tikal, not one is from between a.d. 562 and 692. This period of monumental silence is known as the Hiatus. For decades, scholars were at a loss to explain what happened during those years. But after the discovery of the Long Count, one of the next breakthroughs in deciphering the Mayan writing system was recognizing what experts call the emblem glyph—a unique hieroglyph that represents a specific city-state. Tikal’s emblem glyph is read as mutal, which is based on the word mut, meaning “bound” or “tied.” The glyph resembles how a ruler’s tied-back hair might look from behind (see stela, page 46), and appears on stelae in ancient Maya city-states as far away as Copán, about 180 miles to the southeast. But why? As experts translated more glyphs, they learned that Tikal had lost a war with Caracol, a Maya city in present-day Belize. The evidence is a boast of the victory, in a.d. 562, inscribed on an altar found in Caracol. That crushing defeat must have hung over Tikal like a pall. Before the glyphs were read, no archaeologist would have dreamed that Caracol, though a substantial city-state, could have laid low the mighty Tikal. Other stelae at Caracol suggest that the key to its triumph was an alliance with Calakmul, another Maya powerhouse in present-day Mexico. For more than 100 years, then, Tikal may have been a conquered city-state, languishing in thrall to foreign rulers. Somehow, Tikal recovered. In 672, the city launched a war against Dos Pilas, about 70 miles to the southwest. An upstart Maya city less than 50 years old at the time, Dos Pilas had the nerve to use Tikal’s emblem glyph, calling itself in effect “New Tikal.” In the war, Tikal was triumphant. Glyph-covered stone stairways at Dos Pilas record the city’s defeat. 
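The Long Count arithmetic described above is mechanical enough to sketch in a few lines of Python. This is an editorial illustration, not part of the original article; it uses the standard published place values for the five positions (baktun, katun, tun, uinal, kin), including the well-documented irregularity that the tun holds 18 uinals (360 days) rather than a full 20:

```python
# Place values for the five Long Count positions:
# baktun (144,000 days), katun (7,200), tun (360), uinal (20), kin (1).
# The system is essentially base 20, except that a tun is 18 uinals.
PLACE_VALUES = (144000, 7200, 360, 20, 1)

def long_count_to_days(date: str) -> int:
    """Total days since the Long Count epoch (3114 b.c.) for a
    dotted date such as '8.12.14.13.15'."""
    digits = [int(part) for part in date.split(".")]
    return sum(d * v for d, v in zip(PLACE_VALUES, digits))

# Stela 29's date, the earliest yet found in the Maya lowlands:
print(long_count_to_days("8.12.14.13.15"))  # prints 1243715
```

Dividing 1,243,715 days by 365.25 gives roughly 3,405 years; counted forward from 3114 b.c. (with no year zero), that lands at a.d. 292, matching the date the article gives for Stela 29.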
So explicit are Mayan glyphs that archaeologists have by now compiled a chronology of 33 rulers of Tikal (including at least one queen) spanning 800 years. Scholars formerly named these rulers after the glyphs that signified them, such as Double Bird, Jaguar Paw and Curl Snout. As epigraphers learned to sound out the glyphs, they assigned phonetic names. The architect of the first phase of Tikal’s revival was Nuun Ujol Chaak, a warrior king also known as Shield Skull. Nuun Ujol Chaak’s era was hardly peaceful. As a young king, he fled Tikal when Calakmul declared war in a.d. 657. But he returned to lead Tikal’s defeat of Dos Pilas in 672. Then, only five years later, Nuun Ujol Chaak lost again to Dos Pilas, which was most likely collaborating with Calakmul, probably the greatest Maya power at the end of the seventh century. Victory over Tikal’s rivals was finally achieved by his son, Jasaw Chan K’awiil I, on August 5, 695. A drawing on a building in the Central Acropolis shows Jasaw carried in triumph into the city on a litter, leading his captive—perhaps the defeated lord of Calakmul—by a tether. Temple IV, erected about a.d. 741, is a dizzying pyramid that stands 212 feet above the ground, the tallest Maya structure ever built. Only the upper levels of Temple IV have been restored, but thanks to a pair of wooden staircases that surmount the rubble, visitors can climb nearly to the top of this structure for the finest view at Tikal. A seemingly limitless green expanse of rain forest billows into the distance like waves on a chlorophyll ocean. There is no sign of any other human settlement. Yet hidden in the jungle below is another of Tikal’s mysteries. The Lost World is a complex of pyramids and buildings southwest of the Great Plaza. It was excavated and restored between 1979 and 1985 by Guatemalan archaeologists working on the Tikal National Project. The area, according to Guatemalan epigrapher Federico Fahsen, served as an observatory from about 500 b.c. to a.d. 
250. During the early Classic period, it vied with the North Acropolis as the ceremonial epicenter of Tikal and served as a royal burial ground. Around the Lost World, architectural and artistic features suggest Tikal had links to Teotihuacán, a city in the highlands of Mexico whose culture flourished between a.d. 150 and 650, entirely separate from the Maya. Because Teotihuacán lies 630 miles from Tikal, many scholars originally doubted that the two empires were even aware of each other’s existence. Yet ceramic designs found at Tikal and other Maya sites seem to mirror the iconography of the Teotihuacán culture—especially its grim-visaged storm god, Tlaloc. Only six years ago, David Stuart untangled a series of fourth-century glyphic texts from Tikal’s Stela 31 that helped connect the two empires. Remarkably, he was able to read the glyph that confirmed scholarly speculation pinpointing the day when a lord from Teotihuacán named Siyah K’ak’, or Fire is Born, arrived at Tikal: January 31, a.d. 378. It is probably no coincidence that the 14th king of Tikal, Chak Tok Ich’aak I, long known as Jaguar Paw, died the same day. The impact that other civilizations have had on the Maya is just beginning to be understood, researchers say. Perhaps the greatest Maya mystery of all is the cause of the civilization’s abrupt decline. The last dated stela erected at Tikal was put up in a.d. 869; the last anywhere in the Maya world, in 909. The causes of what University of Pennsylvania archaeologist Robert Sharer calls “one of the most profound cultural failures in human history” have been debated for a century. The stelae are no help—the collapse seems to have ended most of the carving. Most likely, researchers speculate, a severe drought devastated a society that was already suffering from overpopulation and famine. Tikal still keeps some secrets. Scanning a map of the ruins laid out on his desk, Stuart points to an area of nameless, unexcavated mounds just south of the Lost World. 
“I’ve always been curious about this group,” Stuart says. “You can spend five or six years digging a site and not greatly change our understanding of Classic Maya civilization. What changes it is the fortuitous discovery of a new inscription.” His finger rests on the area. “Who knows what you might find there?”
https://www.smithsonianmag.com/history/see-jewish-life-before-holocaust-newly-released-digital-archive-180952582/
See Jewish Life Before the Holocaust Through a Newly Released Digital Archive
See Jewish Life Before the Holocaust Through a Newly Released Digital Archive In 1935, Roman Vishniac, a Russian-born Jew and heralded photographer, journeyed throughout Eastern Europe with one goal: photograph impoverished Jewish communities. The American Jewish Joint Distribution Committee, his employer, planned to use the images to raise funds for relief efforts, but the photos would become an iconic link to the culture that vanished as a result of the Holocaust. Years earlier, long before his trip through Eastern Europe, Vishniac and his family emigrated from Russia to Berlin, where he built a photo-processing laboratory, pursued his interest in microscopic research, and became an acclaimed street photographer. As Hitler and the Nazi Party rose to power in the 1930s, Vishniac remained in Berlin, but after Kristallnacht in 1938, he initiated plans to leave Germany with his family. In 1939, he spent six weeks in an internment camp in France, ultimately managing to secure release and moving with his family to New York City. After the war, he returned to photograph Jewish communities in displaced persons camps in the late 1940s, as well as those in 1950s New York City. Only 350 of Vishniac’s images were published or printed during his lifetime, though his photo archive of negatives numbers around 9,000. The U.S. Holocaust Memorial Museum (USHMM) and the International Center for Photography (ICP) have teamed up to make the rest of Vishniac’s images available to the public. Last week, they launched an online photo database that includes scans of Vishniac’s prints and negatives—in many cases published for the first time anywhere. “This is an incredibly important body of material. He has this iconic role in Jewish culture, and yet only a handful of his images have been printed or published in his lifetime,” says Maya Benton, who curates the archive for ICP and is working on a book on Vishniac’s work. 
Most of Vishniac’s negatives and printed images lack captions, and information about what’s on each roll of film is sparse. “We don’t have captions or dates or locations for 99 percent of his work,” says Benton. The goal is that by opening the archive up to the public, someone, somewhere might recognize something. “We’re in the decade in which Holocaust survivors are dying out, which is why we felt this urgency and this rush to do this,” says Benton. As people look through the collection, they can make notes on different images, which then go to historians at USHMM to follow up. Searching their own extensive textual and photographic archives, they can track down a name or location that provides a clue to the larger context of an image. “It’s more than identifying a person who may have died. It’s about restoring and preserving their history,” says Judy Cohen, director of the photographic reference collection at the USHMM. Given the museum’s wide audience and slew of daily visitors, it has already had some success identifying individuals, even before the project’s launch. Benton has some personal perspective on the project: Her mother spent her childhood in a displaced persons camp. Benton has been studying Vishniac’s work for at least a decade, and in the course of her analysis she realized that he actually photographed the camp where her mother lived, giving Benton the opportunity to show some images of the camp to her mother. “She remembered the kind of feel of the place,” recalls Benton, who hopes that the archive provokes similar experiences in Jewish households around the world: Younger, digital-savvy grandchildren sitting down with their parents and grandparents to revisit a world that was lost. Vishniac’s negatives, in fact, paint a very different picture than the one we might imagine of Jewish life in Central and Eastern Europe before the war. 
Instead of the solemn images of black-hatted men and schoolboys with curls (payos), they depict theatrical performers, women tending shops, and other everyday scenes—all distinctly relatable. “It shows a very different aspect of Jewish life,” says Benton. “It shows the richness and diversity of that world.” Digitizing the archive also makes it available to researchers around the world for study. Given the breadth of the archive, these could range from historians studying the rise of Nazi power in Berlin to photography experts looking at the documentary movement and comparing Vishniac to more acclaimed photographers such as Dorothea Lange. But in the archives, interspersed with these chronicles of Jewish communities, are photographs of hormones and skin cells. In the 1960s, Vishniac, also a trained biologist, pioneered techniques in photomicroscopy. The ICP team is working to digitize Vishniac’s printed images, films, and correspondence to flesh out the archive. As more archival material is scanned, historians at USHMM will follow more leads and hopefully fill in some blanks. Because, as Benton notes, “as the survivors die out, the weight will fall on photos to tell their stories.” Helen Thompson writes about science and culture for Smithsonian. She's previously written for NPR, National Geographic News, Nature and others.
https://www.smithsonianmag.com/history/see-rare-footage-fdr-speaking-national-institute-health-180952664/
See Rare Footage of F.D.R. Speaking at the National Institute of Health
See Rare Footage of F.D.R. Speaking at the National Institute of Health On October 31, 1940, just days before President Franklin Delano Roosevelt would be elected to an unprecedented third term as President of the United States, he traveled to Bethesda to dedicate the National Cancer Institute and the new campus of what was then the National Institute of Health (N.I.H.), before it would eventually become known in plural form—National Institutes of Health—as multiple units were established over subsequent years. Today, the National Library of Medicine is making the film of Roosevelt’s speech publicly available for the first time, nearly 74 years after the President made his speech. Sound recordings, transcripts, and photographs of this event have been available publicly for many years. Our research suggests, however, that this rare film footage has not been seen publicly since its recording and may no longer exist anywhere else. That late October afternoon, Roosevelt stood on the steps of the new main N.I.H. building, ready to address a crowd of 3,000 people. Still relevant today, in a variety of contexts, are the subjects he discussed: the need for preparedness in light of war and for research into deadly diseases, recent improvements in public health and health care, and hope that the research conducted at NIH would lead to new cures for and even the prevention of disease. The live footage of the speech was given to N.L.M. many years ago by the National Archives and Records Administration. The recording does not appear to have been professionally produced, although news organizations such as CBS were present on that day. The camera is unsteady in places, a hand sweeps across the lens, and the filming starts and stops, though it isn’t known whether this is a result of the original filming or of later editing. 
While we have long been able to hear Roosevelt’s support for public health and medical research, now we can see him state some of his powerful words from this important speech, and truly appreciate the experience of being in the audience on that historic day. The President’s concluding words capture the weight of the moment: “Today the need for the conservation of health and physical fitness is greater than at any time in the nation’s history. In dedicating this Institute, I dedicate it to the underlying philosophy of public health, to the conservation of life, to the wise use of the vital resources of our nation. I voice for America, and for the stricken world, our hopes, our prayers, our faith, in the power of man’s humanity to man.” Five years before Roosevelt’s dedication, in 1935, Luke and Helen Wilson had donated land in Bethesda, Maryland, to the government to be used as the new home of the National Institute of Health. At the dedication, President Roosevelt thanked Mrs. Wilson for the gift she and her husband had made to and for the benefit of the nation, “For the spacious grounds on which these buildings stand we are indebted to Mr. and Mrs. Luke I. Wilson, who wrote me in 1935, asking if part of their estate at Bethesda, Maryland, could be used to the benefit of the people of this nation. I would tell her now as she sits beside me that in their compassion for suffering, their hope for human action to alleviate it, she and her husband symbolized the aspirations of millions of Americans for a cause such as this. And we are very grateful.” The Wilsons donated their land shortly before the President signed the Social Security Act in 1935. The Act contained provisions meant to assist in “establishing and maintaining adequate public health services” throughout the country. 
Roosevelt made certain in his speech to pointedly address those who opposed some of his proposed health care initiatives, stating that “neither the American people nor their government intend to socialize medical practice any more than they plan to socialize industry.” The possibility of the United States entering the war in Europe was also clearly on the President’s mind. In his speech, he tied together the “strategic importance of health” with the need for the nation to be prepared for war, saying, “The total defense that we have heard so much about of late—that total defense which this nation seeks—involves a great deal more than building airplanes and ships and guns and bombs, for we cannot be a strong nation unless we are a healthy nation, and so we must recruit not only men and materials, but also knowledge and science in the service of national strength.” In his remarks, the President singled out the new National Cancer Institute (N.C.I.) that he was dedicating. He praised the Institute, stating “It is promoting and stimulating cancer research throughout the nation; it is bringing to the people of the nation a message of hope because many forms of the disease are not only curable but even preventable. Beyond this, it is doing research here and in many universities to unravel the mysteries of cancer. We can have faith in the ultimate results of these efforts.”Roosevelt lauded the past work of the National Institute of Health and emphasized the need to be vigilant against illnesses from abroad. 
“These buildings, which we dedicate, represent new and improved housing for an institution which has a long and distinguished background of accomplishment in this task of research… Now that we are less than a day by plane from the jungle-type yellow fever of South America, less than two days from the sleeping sickness of equatorial Africa, less than three days from cholera and bubonic plague, the ramparts we watch must be civilian in addition to military.” For their assistance in determining what research suggests to be the uniqueness of this footage, we thank our colleagues in the N.L.M.’s Audiovisual Program and Development Branch of the Lister Hill National Center for Biomedical Communications, the N.I.H. Office of History, and the National Archives and Records Administration. We also thank our colleagues Dr. David Cantor, for the extensive historical research he completed on the subject of F.D.R. and the N.I.H. before we initiated our effort to make this film publicly available, and especially Anatoliy Milihkiker, a contract archives technician in the History of Medicine Division, who recognized the unique content of this film as he undertook a recent survey of our extensive historical audio-visual collections. Rebecca C. Warlow is Head of Images and Archives in the History of Medicine Division at the National Library of Medicine.
https://www.smithsonianmag.com/history/seeking-humanity-al-capone-180960880/
Seeking the Humanity of Al Capone
Seeking the Humanity of Al Capone Al Capone is much more myth than man in the popular imagination. While the notorious gangster of 1920s Prohibition-era Chicago still lingers in our cultural consciousness, this image is one riddled with contradictions: of a mobster and a do-gooder; a man who sprayed silver bullets into the air from his car and helped feed the city’s poor as he orchestrated some of the most cold-blooded murders in Chicago’s history. Although he led the infamous “Chicago Outfit” for only six years, Al Capone has remained permanently enshrined as one of America’s most notorious criminals and still commands our attention almost a century later. National Book Award-winning biographer Deirdre Bair attempts to unravel this complex mythology of Capone in her latest work, Al Capone: His Life, Legacy, and Legend. “This is the story of a ruthless killer, a scofflaw, a keeper of brothels and bordellos, a tax cheat and perpetrator of frauds, a convicted felon, and a mindless, blubbering invalid,” Bair writes. Her biography draws on a rich—and, until recently, untapped—pool of resources: Capone’s remaining living relatives. Using interviews with Capone’s surviving family members, Bair attempts to humanize Capone, mapping out his close and important family bonds to his mother, wife, and son and exploring his later life, during which he developed severe mental impairments—a part of the narrative often excluded from his mythology. Bair interviewed relatives and second- and third-generation Capone family members to try to build a picture that challenged the criminal Capone of popular imagination. 
But not every family member was willing to talk—many had changed their surnames and moved away from Chicago in the generations since the gangster’s death. Some spoke with Bair on the condition of anonymity, and as such, no names are given with some of the quotes Bair sourced. Many grandchildren of Capone’s former cronies were unwilling to speak with Bair, having promised their parents and grandparents to never discuss “business” outside the family. But the stories she extracts from the distant relatives who did talk help demystify many of the highly fabled stories around Capone—especially those that concern his sexual exploits, his kindness and charitableness, and the importance he placed on his family life. Alphonse “Al” Capone was born in Brooklyn, New York, in 1899, the son of Italian immigrants. After being kicked out of school in the sixth grade, he joined one of the borough’s tough teen gangs. At age 15, Capone began working for Johnny Torrio, one of the city’s most notorious Italian-American gang leaders, helping him in his many mob outfits, including brothels and bars. Unlike Capone’s six brothers and two sisters, Al embraced the cultural myth of the American Dream, seeing himself wholly as an American. When anyone called him an “Italian,” writes Bair, he would say, “I’m no Italian—I was born in Brooklyn.” Bair writes that Capone was propelled into the “illegitimate” life out of necessity. His father died when Capone was 21, and he was the child tasked with providing for the family. Capone was fiercely devoted to his mother, writes Bair, calling her on a daily basis while he began a career as a mobster. It was this commitment to his family—especially this love for his mother—that prompted Capone to create a divide between “work” and home life in an effort to protect the welfare of his family and shield them from his growing criminal exploits. 
Capone adopted this approach from his mobster mentor Johnny Torrio, who believed work and family should never mix, telling Capone to “keep your hands clean” and use others to do your “dirty work.” According to Bair, the surviving members of Capone’s family believe that, were it not for his father’s death, Capone might have become the respectable businessman he always aspired to be. “The mantle of criminal greatness was thrust upon his unwilling shoulders,” Bair writes. After his marriage to wife Mae in 1918 and the birth of his only son, Sonny, Capone still remained a notorious womanizer. Bair is able to detail much of this thanks to relatives’ stories about his sexual deeds. This kind of philandering gave Capone syphilis, which he then passed on to his wife. Bair writes Capone did not seek treatment in spite of enduring painful sores, rashes, and regular flu-like symptoms because in doing so, he would then need to tell his wife about his adultery: To admit to having an STD was admitting to adultery itself. Later in life, untreated syphilis proved to be Capone’s undoing, completely deteriorating his mental faculties. After Torrio gave Capone the reins of the organized crime syndicate, the Chicago Outfit, in 1925, Capone conquered the city through a sophisticated network of brothels and speakeasies. By 1929, he had accumulated a net worth of over $40 million—approximately $550 million today—and associations with over 700 murders. Capone also controlled the sale of liquor to over 10,000 speakeasies. “I make money by supplying a public demand,” Capone told a reporter at the time. “If I break the law, my customers…of the best people in Chicago are as guilty as me.” To help maintain his reign, Capone often paid off top city officials, rigged local elections, and sometimes even kidnapped workers and henchmen from rival outfits. But in her book, Bair offers a new history of Capone, and separates fact from fiction in the process. 
For example, she tackles one story claiming Capone kept a 15-year-old female mistress in an apartment during his early years in New York, a tale Bair points out was impossible since Capone could not have afforded to do so, despite numerous biographies that present it as truth. Bair also upholds certain enduring legends, like Capone’s supposed wish that he had started in the milk business before the beer business, since milk was always in demand and far easier to trade in than alcohol in Prohibition Chicago. Further, Bair explores the legend that Capone was the one responsible for putting expiration dates on milk bottles in Chicago, which it turns out has some kernels of truth. Along with his brother, Capone did indeed open his own dairy farm, manufacturing milk that was sold in bottles with expiration dates. The rumor says that Capone pushed for expiration dates because one of his relatives got sick from drinking milk, but Bair, based on conversations with Capone’s descendants, believes it was a first step toward becoming a more legitimate businessman. While the infamous St. Valentine’s Day Massacre of 1929 is part of Capone’s common image—an event whereby he orchestrated the murder of seven rival gang members—Bair argues that it’s his family that defines him. His descendants report that his unwavering and enduring devotion to both mother and wife demonstrates his true persona, an identity they believe has now been completely eclipsed by his gangland legacy. They share that he loved to fish, would joyously sing at family functions, and had an intense passion for writing music. Later in life, Capone’s 11-year prison sentence—ironically handed down for tax evasion rather than for any of the many murders he coordinated—saw him mentally unravel, a result of his untreated syphilis. Capone left prison in 1939 with the mind of a twelve-year-old child. 
Bair shares stories of Capone being cared for by his wife Mae and his brothers after his imprisonment, spending his days at home in pajamas and having imaginary conversations with long-dead colleagues or enemies in the back yard, delusions the entire family often went along with. Capone died of a stroke on January 25, 1947, at age 48. Bair’s Capone is powerfully human, a daunting feat given his infamous pop-culture stature, and her biography reminds us that even though Capone was one of the most notorious mobsters in America’s history, he spent more time in prison than actually running illegal bootlegging operations in Prohibition Chicago, ending his life a “blubbering, babbling” mess. “Was he a mobster? Yes. Was he a monster? No,” one relative tells Bair. Since Capone is such a wealth of contradictions, Bair believes “the only certainty is that as time passes and the man who was Al Capone recedes into history, the legend shows no sign of stopping.” Nathan Smith is a culture and technology writer. His writing has appeared in The Atlantic, Wired and Forbes.
https://www.smithsonianmag.com/history/september-1861-settling-in-for-a-long-war-48371156/
September 1861: Settling in for a Long War
September 1861: Settling in for a Long War Five months into the Civil War—on September 9—Richmond, Virginia’s Daily Dispatch editorialized that the time for debate had passed. “Words are now of no avail: blood is more potent than rhetoric, more profound than logic.” Six days earlier, Confederate forces had invaded Kentucky, drawing that state into the war on the Union side and firming up the border between North and South. But who to trust in the border states? “We have had no success lately, and never can have success, while the enemy know all our plans and dispositions,” wrote Confederate war clerk John Beauchamp Jones on September 24 from Richmond. “Their spies and emissaries here are so many torch-bearers for them.” In Washington, President Lincoln confronted disloyalty even to his north; between the 12th and 17th, he ordered troops in Maryland to arrest 30 secessionists, including members of the state legislature. About the same time, Confederate general Robert E. Lee was waging and losing his first campaign, at Cheat Mountain in Western Virginia. Even soldiers spared direct battle had no easy time. “I must again march without one bite of anything to eat,” the Confederate soldier Cyrus F. Jenkins wrote in his diary from a spot some 80 miles away. “The clouds are flying over us and the rain is falling thick and fast.” Union generals lost a weeklong siege of Lexington, Missouri, but took control of Ship Island, off the Gulf Coast of Mississippi. The island would later serve as a staging ground for the campaign against New Orleans. Although Lincoln had upheld the Fugitive Slave Act in his inaugural address, the runaway slave question remained fraught. How would Union soldiers treat fugitives they encountered? In a letter to a friend, author and abolitionist Lydia Maria Child quoted a Union soldier commanded to return fleeing slaves: “That is an order I will not obey.” Lincoln doubted that he had the power to obliterate slavery by decree. 
In any case, such an act would alienate the crucial border states whose favor he struggled to retain. In late August, Union major general John C. Frémont had issued a sweeping proclamation declaring free the slaves of Confederate sympathizers in Missouri. On September 11, Lincoln ordered Frémont to rescind the order, citing legal questions. (Lincoln’s own more carefully considered proclamation would ripen over the course of the coming year.) For Mary Todd Lincoln, the president’s wife, the war clouded everything. “The weather is so beautiful, why is it, that we cannot feel well,” she wrote to her cousin on the 29th from the White House. “If the country was only peaceful, all would be well.” Ulysses S. Grant, then a brigadier general in the Union Army, had just confided to his sister Mary: “This war...is formidable and I regret to say cannot end so soon as I anticipated at first.” David Zax is a freelance journalist and a contributing editor for Technology Review (where he also pens a gadget blog).
https://www.smithsonianmag.com/history/serial-there-were-these-groundbreaking-examples-serialized-non-fiction-180953378/
Before Serial, There Were These Groundbreaking Examples of Serialized Non-Fiction
Before Serial, There Were These Groundbreaking Examples of Serialized Non-Fiction Serial, a new podcast that spends an entire season focusing on a 15-year-old murder, has taken the world by storm. It is the top-rated podcast on iTunes and each episode has been downloaded or streamed at least 1.2 million times. The popular program has been compared to Truman Capote’s serialized story about a brutal 1959 murder and has even spawned a parody podcast. Produced by the creators of This American Life and hosted by veteran journalist Sarah Koenig, Serial follows Koenig’s re-investigation of the 1999 murder of Baltimore high school student Hae Min Lee. Lee’s ex-boyfriend Adnan Syed was convicted for her murder and is serving a life sentence in a Maryland correctional facility, but still insists that he is innocent. Eight episodes of Serial have aired so far, and it’s far from clear what the conclusion will be. Will Koenig be able to exonerate Adnan? What really happened to Hae? And what is Jay’s role? While the serial format employed by Koenig for her show is perhaps unique to radio, it has a long tradition in print journalism. So if you’re a Serial fan and find the weeklong wait between new episodes to be torture, consider diving into one of these other examples of the genre that some call “non-fiction serial”, many of which had lasting impacts on how the public viewed important issues such as war, the treatment of mental patients, privacy in the modern age, and climate change. In Cold Blood Truman Capote’s chilling tale of the savage 1959 murder-by-shotgun of the Clutter family in Holcomb, Kansas was first published as a four-part serial in The New Yorker. It was released in book form the following year, setting the gold standard of the non-fiction novel. Ten Days in a Madhouse In the 19th century, American journalist Elizabeth Jane Cochrane faked insanity to study a mental institution from within. 
Cochrane got herself committed to Blackwell’s Island Insane Asylum in New York. Writing under the pen name Nellie Bly, Cochrane’s reports of brutality and neglect were initially published as a 17-part series of articles for the New York World; the articles were later compiled into a book, Ten Days in a Mad-House. The asylum began implementing changes almost immediately; when Bly returned to Blackwell’s Island a month later with a grand jury in tow, according to Mental Floss, “many of the abuses [including sanitary conditions and overbearing nurses] had been corrected.” Panic-free GMOs Beginning last summer, Grist began publishing a series that aimed to provide a levelheaded assessment of genetically modified foods. The 29-part series, most of them written by reporter Nathanael Johnson, examined everything from the myths surrounding GM crops to the mixed benefits of biotech seeds for farmers. Black Hawk Down In 1997, The Philadelphia Inquirer published a series of 29 articles by reporter Mark Bowden that documented the Battle of Mogadishu in Somalia, the most intensive close combat in U.S. military history since the Vietnam War. One of the key events captured in the articles was the downing of a pair of U.S. Black Hawk helicopters. To write the articles, Bowden drew upon interviews with the men who fought in Mogadishu, as well as transcripts of military radio transmissions and a review of classified videotape. The series was later published as the book Black Hawk Down: A Story of Modern War, which was then adapted into a critically acclaimed movie. The Climate of Man In 2005, the New Yorker magazine published a three-part series titled The Climate of Man by staff writer Elizabeth Kolbert that examined the issue of climate change, often by visiting the people and places directly affected by it. Kolbert later expanded upon her reporting for the series in a book about climate change called Field Notes from a Catastrophe. 
Seeking a Good Death The 1997 Pulitzer Prize for Explanatory Journalism was awarded to Michael Vitez, a reporter for The Philadelphia Inquirer, for his five-part series on the choices that confronted critically ill patients who sought to die with dignity. The Curve of Binding Energy One of the classic examples of narrative science journalism, John McPhee’s 1973 story about American physicist and prominent nuclear weapon designer Ted Taylor was initially published as a three-part series for the New Yorker. McPhee toured American nuclear institutions with Taylor and showed how easy it would be for a terrorist to steal nuclear material from private industry to create their own atomic bombs. The series’ title, The Curve of Binding Energy, refers to the amount of nuclear binding energy needed to hold atomic nuclei together. The articles were later published as a book under the same name. The Snowden Bombshells Last summer, The Guardian newspaper published a series of articles by journalist Glenn Greenwald, independent filmmaker Laura Poitras, and others that exposed the extent to which the U.S. National Security Agency was violating the privacy rights of Americans. The material used to report the series was provided by former intelligence analyst and exiled whistleblower Edward Snowden. The series of 14 articles won The Guardian a 2014 Pulitzer Prize in the category of Public Service. Ker Than is a freelance science writer living in the Bay Area. He has written for National Geographic, New Scientist, and Popular Science.
https://www.smithsonianmag.com/history/seven-famous-people-who-missed-the-titanic-101902418/
Seven Famous People Who Missed the Titanic
Seven Famous People Who Missed the Titanic The sinking of the Titanic claimed some 1,500 lives, among them a gallery of early 20th-century A-list celebrities. Captains of industry John Jacob Astor IV and Benjamin Guggenheim both went down with the ship, as did Macy’s co-owner Isidor Straus and his wife, Ida, who refused to leave his side. The popular American mystery writer Jacques Futrelle, the American painter and sculptor Francis Millet, and Maj. Archibald Butt, friend and aide to then-President William Howard Taft, were lost as well. But for all the boldface names among the Titanic’s victims, many more might have been aboard, were it not for the vagaries of fate. Among them was the novelist Theodore Dreiser. The writer, then 40, considered returning from his first European holiday aboard the Titanic; an English publisher talked him out of the plan, persuading him that taking another ship would be less expensive. Dreiser was at sea aboard the liner Kroonland when he heard the news. He recalled his reaction the following year in his memoir, A Traveler at Forty: “To think of a ship as immense as the Titanic, new and bright, sinking in endless fathoms of water. And the two thousand passengers routed like rats from their berths only to float helplessly in miles of water, praying and crying!” Greg Daugherty is a magazine editor and writer as well as a frequent contributor to Smithsonian.com. His books include You Can Write for Magazines.
https://www.smithsonianmag.com/history/seven-famous-people-who-missed-the-titanic-101902418/?c=y&navigation=next&page=2
Seven Famous People Who Missed the Titanic
Seven Famous People Who Missed the Titanic Guglielmo Marconi, the Italian inventor, wireless telegraphy pioneer and winner of the 1909 Nobel Prize in Physics, was offered free passage on the Titanic but had taken the Lusitania three days earlier. As his daughter Degna later explained, he had paperwork to do and preferred the public stenographer aboard that vessel. Although Marconi was later grilled by a Senate committee over allegations that his company’s wireless operators had withheld news from the public in order to sell information to the New York Times, he emerged from the disaster as one of its heroes, his invention credited with saving more than 700 lives. Three years later, Marconi would narrowly escape another famous maritime disaster. He was on board the Lusitania in April 1915, on the voyage immediately before it was sunk by a German U-boat in May. Greg Daugherty is a magazine editor and writer as well as a frequent contributor to Smithsonian.com. His books include You Can Write for Magazines.
Alfred Gwynne Vanderbilt

The 34-year-old multimillionaire sportsman, an heir to the Vanderbilt shipping and railroad empire, was returning from a trip to Europe and canceled his passage on the Titanic so late that some early newspaper accounts listed him as being on board. Vanderbilt lived on to become one of the most celebrated casualties of the Lusitania sinking three years later.
https://www.smithsonianmag.com/history/sex-and-space-travel-predictions-from-the-1950s-81300280/
Sex and Space Travel: Predictions from the 1950s
(Illustration by L. Sterne Stevens in the March 1956 issue of Sexology magazine; source: Novak Archive)

In September of 1992 astronauts Jan Davis and Mark Lee became the first married couple to leave the planet together. But NASA didn’t originally plan on it happening that way. NASA had an unwritten rule that married astronauts couldn’t be sent into space together. Davis and Lee had been assigned to the mission in 1989 but were later married in January 1991. After the agency learned of their marriage, NASA took two months to review the situation and concluded that both were too important to the mission (the second flight of Space Shuttle Endeavour) for either of them to be removed. The couple had no children, and NASA explained that if they had, they most certainly wouldn’t have flown together.

(Newspaper clipping: Wisconsin State Journal, June 26, 1992)

Their flight was a minor public relations scandal because of an obvious question that reporters of the time were not shy about asking: would they be having sex in space? The answer from the astronauts and NASA was an unequivocal “no.”

Outside of science fiction, the topic of sex in space has received surprisingly scant attention. But it was science fiction that inspired Dr. Robert S. Richardson to write an article in the March 1956 issue of Sexology: The Magazine of Sex Science, wherein he describes his vision of what sexual relations might look like when space travel is a reality. This was a year and a half before the launch of Sputnik, so the Space Age wasn’t even firing on all thrusters yet. But Dr. Richardson opens his article by discussing his frustration with the fact that sex is never addressed in any of the sci-fi shows on TV. Given the reputation of 1950s broadcasting as a sexless environment — where married couples on programs like I Love Lucy had to sleep in separate beds, and wouldn’t even say the word “pregnant” — Richardson’s surprise comes across as a bit disingenuous.
Nonetheless, Richardson makes his case for what he believes the future of sex in space might look like. From the introduction to the 1956 article:

Recent announcements by the United States and Soviet Governments that they are planning space satellites and space rockets have stimulated universal interest in the problems of space travel. Space voyages to Mars will take a long time, and settlements on the distant planets will be lonely. While much has been written about the various scientific aspects of space travel, this is the first article which deals with the important medical problem: How will the natural sexual needs of early space travelers be met so as to provide a modicum of mental health for the space pioneers?

Perhaps unsurprisingly, Dr. Richardson’s views on women in space aren’t the most enlightened. He writes under the assumption that only men will be astronauts and that these men will have certain carnal needs to be met during long missions in space. Many of Richardson’s ideas about space, and especially Mars, clearly come from the Collier’s series of articles on space travel from 1952 to 1954. Interestingly, Richardson becomes fixated on Mars throughout the article, ignoring the moon — a place humans wouldn’t even sink their boots into until a full 13 years after his article was published. Richardson compares the establishment of an inevitable Martian base to the experience of military men in remote regions of the Arctic. But unlike relatively short tours in Greenland of a year or less, he acknowledges that a trip to Mars would be an adventure of three years or more.

But can healthy young men work efficiently and harmoniously for long without women? Reactions to this question vary widely. There are some who think it outrageous that sex should enter into the question at all. Just forget about the women. Keep busy and you won’t need to worry. Others recognize sex as a disturbing factor, but feel it is not too serious.
In the old days, sailors made long voyages without women and still managed to perform their duties and bring the ship into port. They admit there was sexual over-indulgence soon after the sailors got on shore, but that was only to be expected. The remark heard most often is that the men turn to homosexualism and auto-eroticism during extended voyages. None of these answers meets the problem squarely. They either side-step the issue or suggest some degrading compromise solution.

Richardson’s solution to the problem of loneliness for astronaut men sailing toward Mars is rather offensive, proposing that women tag along as sex objects with a mission to serve the crew (and take dictation when necessary):

In our expedition to Mars, let our healthy young males take along some healthy young females to serve as their sexual partners. (Of course it would also help if they could operate a radio transmitter and take dictation.) These women would accompany them quite openly for this purpose. There would be no secrecy about this. There would be nothing dishonorable about their assignment. They would be women of the kind we ordinarily speak of as “nice girls.”

“But then they wouldn’t be nice girls any more!” people will object. Judged by the arbitrary standards of our present social reference system, they certainly would not. But in our new social reference system they would be nice girls. Or rather, the girls would be the same, but our way of thinking about them would be different. It is possible that ultimately the most important result of space travel will be not what we discover upon the planets, but rather the changes that our widening outlook will effect upon our way of thinking. Will men and women bold enough to venture into space feel that they are still bound by often artificial and outmoded conventions of behavior prevalent upon a planet fifty million miles behind them?
May not men and women upon another world develop a social reference system — shocking as judged by us on earth today — but entirely “moral” according to extra-terrestrial standards? This last bit of speculation — of proposing that on other planets people may develop their own set of cultural and moral standards by which to judge sexual activity — would certainly be an interesting discussion to have, if it weren’t predicated on the notion that women would necessarily be secretaries and sex objects acting at the pleasure of the all-male astronaut crew. As far as we know, no one has yet had sex in space. But when they inevitably do, I suspect neither party will need to supplement their astronautic duties by taking dictation. Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
https://www.smithsonianmag.com/history/sistine-chapel-evolution-new-haven-connecticut-180958499/
The “Sistine Chapel of Evolution” Is in New Haven, Connecticut
When visitors go to the Yale Peabody Museum of Natural History, they are not exactly wrong to think that dinosaurs are the stars of the show. This is, after all, the museum that discovered Stegosaurus, Brontosaurus, Apatosaurus, Allosaurus, Triceratops, Diplodocus and Atlantosaurus, among others. There’s even a 7,350-pound bronze Torosaurus on the sidewalk in front of this red brick Gothic Revival building on the outskirts of downtown New Haven.

It was the Peabody that led the great age of paleontological discovery in the 19th century. It also went on to launch the modern dinosaur renaissance in the late 1960s, setting off a global wave of dinomania and incidentally inspiring the Jurassic Park franchise. And Peabody researchers continue to make groundbreaking discoveries. In 2010, they determined, for the first time, the exact coloration of an entire dinosaur, feather by feather: Anchiornis huxleyi looked like a Las Vegas showgirl crossed with a spangled Hamburg chicken. (The specimen is unfortunately still in China, where it was discovered.)

Plus, the Peabody houses one of the most revered images in all of paleontology: The Age of Reptiles, by Rudolph Zallinger, is a 110-foot-long mural depicting dinosaurs and other life-forms in a 362-million-year panorama of Earth’s history, moving one writer to call the museum “a Sistine Chapel of evolution.”

So why on earth go to the Peabody for any reason other than dinosaurs? One answer: for the fossil mammal and bird discoveries that most visitors miss, but which Charles Darwin himself considered the best evidence for the theory of evolution in his lifetime. These discoveries were largely the work of a brilliant and intensely competitive Yale paleontologist named Othniel Charles Marsh.
Though raised in a poor upstate New York farming family, Marsh was a nephew of George Peabody, a merchant banker and promoter of all things American in mid-19th-century London. Peabody built a vast fortune from scratch and then gave much of it away in his lifetime, with an emphasis on the formal education he lacked. The Yale Peabody Museum of Natural History, founded at his nephew’s urging in 1866 and now celebrating its 150th anniversary, was one result. Peabody’s wealth also enabled Marsh to lead a series of four pioneering Yale expeditions in the early 1870s, traveling via the new transcontinental railroad and on horseback to explore the American West.

(This story is a selection from the April issue of Smithsonian magazine.)

Marsh focused at first not on dinosaurs, then little known, but on a creature of ardent popular and scientific interest: the horse. In January 1870, Thomas Henry Huxley, a British paleontologist nicknamed “Darwin’s Bulldog” for his fierce advocacy of evolutionary theory, used fossils to trace the horse back 60 million years to its supposed origin in Europe. But Marsh and his Yale crews were accumulating a rich fossil record proving, he thought, that the horse had evolved in North America. Huxley was so intrigued that he visited Yale in 1876, intent on seeing the evidence for himself.

The two men spent much of an August week at “hard labor” reviewing fossils. It was a revelation: Huxley would ask to see a specimen illustrating some point about horse evolution, and as Huxley’s son and biographer Leonard later recounted, “Professor Marsh would simply turn to his assistant and bid him fetch box number so and so,” until Huxley finally exclaimed, “I believe you are a magician; whatever I want, you just conjure it up.” Huxley became a ready convert to Marsh’s argument that horses evolved in North America, and at his request, Marsh cobbled together a celebrated—though not particularly striking—illustration.
You can see it now in a display case just past the dinosaurs, in the Peabody’s Hall of Mammals. It’s a lineup of leg bones and molars of different North American species. They show the horse increasing in size and evolving over 50 million years, from Orohippus, with four toes on its front legs, on up to the modern horse with a single hoof—an evolutionary development that allows it to gallop even across hard, flat prairies and deserts. Huxley presented this diagram and outlined the North American story at a lecture that September in New York. He thought Marsh had already discovered enough about the horse “to demonstrate the truth of the evolution hypothesis,” a truth, as the New York Times put it, “which could not be shaken by the raising of side issues.” Huxley also predicted that a more primitive horse would eventually turn up with a fifth toe. He and Marsh had discussed this theoretical “dawn horse,” dubbed Eohippus, and one evening in New Haven, Huxley had sketched a fanciful five-toed horse. Then he’d penciled in an equally fanciful hominid, riding bareback. With a swirling flourish, Marsh had added the caption “Eohippus & Eohomo,” as if horse and cowboy were ambling together out of the sunrise of some ancient American West. Writing a few days after his visit about what he had seen at the Peabody, Huxley remarked, “There is no collection of fossil vertebrates in existence, which can be compared to it.” What caught the attention of Darwin himself, though, wasn’t so much the horses as a pair of late Cretaceous birds. In the early 1870s, Marsh managed to obtain two spectacular fossil birds—Hesperornis and Ichthyornis— from 80 million-year-old deposits in the Smoky Hills region of north-central Kansas. These specimens had heads, unlike the only specimen of the ancient bird Archaeopteryx then known, and these heads had distinctly reptilian teeth for catching hold of fish underwater. 
The discovery, Marsh announced triumphantly, “does much to break down the old distinction between Birds and Reptiles.” In a monograph on the toothed birds of North America, he predicted correctly that Archaeopteryx would also turn out to have had teeth. In 1880, a correspondent was moved to write Marsh, “Your work on these old birds, and on the many fossil animals of North America, has afforded the best support to the theory of Evolution, which has appeared within the last twenty years”—that is, since the publication of On the Origin of Species. The letter was signed, “With cordial thanks, believe me, Yours very sincerely, Charles Darwin.”

Hesperornis and Ichthyornis now occupy a little-noticed display case at the side of the Great Hall of Dinosaurs, overshadowed by the 70-foot-long Brontosaurus hulking nearby and the huge mural overhead. But they are worth a look for one added reason. Marsh eventually published his monograph about the toothed birds through the U.S. Geological Survey (USGS). Much later, in the 1890s, a congressman held up a copy of this book as an instance of taxpayer spending on “atheistic rubbish.” His incredulously repeated phrase—“birds with teeth, birds with teeth!”—helped drive a Congressional attack on the USGS, which was then arguing that scientific mapping of the water supply should shape the settlement of the West. Congress soon slashed USGS funding and overrode its warning that pell-mell settlement would yield “a heritage of conflict and litigation over water rights.” People fighting over water in the drought-stricken American West are still feeling the bite of those “birds with teeth.”

**********

I took a seat on a wooden bench, alone except for a guard, in a room with a dozen or so gigantic dinosaurs on display.
Brontosaurus dominates the scene, and it’s easy enough to see why Marsh gave it a name that means “thunder lizard.” The discovery of such enormous dinosaurs began one day in March 1877 when two scientifically minded friends, on a hike above Morrison, Colorado, suddenly found themselves gawking in silence at an enormous fossil vertebra embedded in stone. It was “so monstrous,” one of them wrote in his journal, “so utterly beyond anything I had ever read or conceived possible that I could hardly believe my eyes.”

Marsh had by then withdrawn from fieldwork, instead using his inherited wealth to deploy hired collectors. He was also deeply engaged in a bitter rivalry, now remembered as “the Bone Wars,” with Edward Drinker Cope at the Academy of Natural Sciences of Philadelphia. Marsh managed to edge out Cope for that huge new specimen, naming it Titanosaurus (later Atlantosaurus). That same year, Marsh’s collectors also found and shipped him the meat-eating Jurassic monster Allosaurus and the plant-eaters Apatosaurus and Stegosaurus.

Visitors to the museum today are liable to run their eyes over the massive bulk of Stegosaurus—which weighed five tons when alive—and notice that its skull seems far too small for an adequate brain. Marsh thought so, too, and conjectured that Stegosaurus must have had a second brain in a large hollow area of its lower vertebrae. His Stegosaurus was long believed to be the inspiration for a celebrated bit of light verse in The Chicago Tribune in 1903, which included these lines:

The creature had two sets of brains—
One in his head (the usual place),
The other at his spinal base.
Thus he could reason a priori
As well as a posteriori.

Although numerous popular books still associate this poem with the Stegosaurus, that connection turns out to be false. In reality, a former student of Marsh’s merely borrowed his two-brain idea and slapped it onto an entirely different dinosaur, Brachiosaurus, at the Field Museum in Chicago.
It was the Brachiosaurus that inspired this verse. But let’s at least credit Stegosaurus with an assist. Credit it, too, with only a single brain, described by one modern paleontologist as roughly “the size and shape of a bent hotdog.”

Nine of Marsh’s dinosaurs turn up in the mural overhead, but only three of Cope’s. (Old rivalries die hard.) Artist Rudolph Zallinger was 23 years old when he began the project in 1942, and he later admitted that he did not know “the front end from the rear end of a dinosaur.” He spent four years on the work, and one art historian called the resulting Garden of Eden for dinosaurs the most important mural since the 15th century. In 1953, Life magazine published a fold-out reprint of the original study of the mural, with a detail of Brontosaurus and Stegosaurus on the cover. The mural thus inspired a generation of future paleontologists. It also caught the attention of a moviemaker in Tokyo, who borrowed heavily from Zallinger’s dinosaurs to put together a new monster—Godzilla.

Zallinger’s mural incorporated the then-current dogma, from O.C. Marsh and others, that dinosaurs were plodding tail-draggers. But in 1964, John Ostrom, a paleontologist at the museum, made a discovery that shattered this stereotype. He and an assistant were out for a walk in Bridger, Montana, at the end of that year’s field season, when they spotted what looked like a hand with an outsized claw eroding out of a rocky slope. It was in fact a foot, and that sharp, sickle-shaped claw, projecting almost five inches from the innermost toe, eventually gave the species its name, Deinonychus, or “terrible claw.”

Studying his find over the next few years, Ostrom began to think that instead of being slow and stupid, Deinonychus “must have been a fleet-footed, highly predaceous, extremely agile and very active animal, sensitive to many stimuli and quick in its responses.” He took this idea an audacious leap forward before the North American Paleontological Convention in 1969.
Evidence suggested, he declared, that many dinosaurs “were characterized by mammalian or avian levels of metabolism.” This idea elicited “shrieks of horror” from traditionalists in the audience, according to the paleontologist Robert Bakker, who had been Ostrom’s undergraduate student at Yale and went on to popularize this new view of dinosaurs. It was the beginning of the modern dinosaur renaissance. The following year, Ostrom began to compare the many similarities between Deinonychus and the ancient bird Archaeopteryx. From that insight, he went on in a series of groundbreaking papers to establish that the bipedal theropod dinosaurs, including Deinonychus, were in fact the ancestors of modern birds. This idea is now so commonplace that researchers debate why birds were the only dinosaurs to survive the mass extinction of 66 million years ago. The novelist Michael Crichton later spent time interviewing Ostrom in person and by phone, paying particular attention to the capabilities of Deinonychus. He later told Ostrom apologetically that his book Jurassic Park would instead feature Velociraptor, a Deinonychus relative, because the name sounded “more dramatic.” Visitors to the Peabody Museum can, however, still see the original Deinonychus model with its arms and legs flung back and out, elbows bent, claws flared. During a recent visit, a former graduate student of Ostrom’s pointed out an intriguing resemblance: If you take those outstretched arms and swing them back just a little farther (with a few small evolutionary adaptations), that hand-snatching gesture becomes the wingbeat of birds. The museum is currently raising funds to undertake a dramatic updating of both the Great Hall of Dinosaurs and the Hall of Mammals. (Brontosaurus will no longer drag its tail and Stegosaurus will do combat with Allosaurus.) But it’s worth going now because the outdated displays and the dinosaur reconstructions are somehow evocative of another era in paleontology. 
When you go, take a look at another fossil most visitors skip past: It’s a Uintathere, a “beast of the Uinta Mountains.” It lived roughly 45 million years ago on the present-day Utah-Wyoming border, and it looked like a rhinoceros, but with long, saber-like upper canines, and three sets of knobs, like the ones on the head of a giraffe, running from its nose to the top of its oddly flattened head. This Uintathere was one of the first reconstructions O.C. Marsh approved for display in the museum. Marsh generally liked to reconstruct fossil animals only on paper, with the actual bones safely stored away for study. So he nervously ordered his preparator to construct a Uintathere entirely out of papier-mâché. Because of the scale of the Uintathere, this required paper with a high fiber content. According to backroom lore, the perfect raw material arrived at the museum one day after Marsh prevailed on friends in high places to provide U.S. currency otherwise destined for destruction. The sign on the display does not say so. But you can pass on the tale to your companions: What you are looking at may be quite literally the first “million-dollar fossil.” Richard Conniff, a Smithsonian contributor since 1982, is the author of seven books about human and animal behavior.
https://www.smithsonianmag.com/history/site-salem-witch-trial-hangings-finally-has-memorial-180964049/
The Site of the Salem Witch Trial Hangings Finally Has a Memorial
Eight years ago, when they bought their house overlooking a wooded ledge in Salem, Massachusetts, Erin O’Connor and her husband, Darren Benedict, had no idea why that parcel stood empty. The scrubby lot lay tucked between houses on Pope Street, within sight of a large Walgreens—nothing much to look at. So when people began to stop by and take pictures of the empty site last winter, they wondered why.

If they’d been there in 1692, they would have known. That’s when the rocky ledge on the parcel next door turned into a site of mass execution—and when the bodies of people hanged as witches were dumped into a low spot beneath the ledge known as “the crevice.” In the night, when the hangings were over, locals heard the sounds of grieving families who snuck over to gather up their dead and secretly bury them elsewhere. But for much of history, the site sat quietly obscured by woods and buildings. A leather tannery and railroad operated nearby, and in recent years, houses surrounded it. And for O’Connor, Benedict and much of Salem, that history has faded despite the town’s outsized reputation.

Now, it will finally be commemorated when Salem mayor Kimberley Driscoll dedicates a memorial below Proctor’s Ledge on July 19. The date coincides with the first of three mass executions there. On the same day in 1692, five women—Sarah Good, Elizabeth Howe, Susannah Martin, Rebecca Nurse, and Sarah Wildes—were hanged from a tree on the ledge, and their bodies fell into a “crevice,” where the memorial now marks their names. Later victims included wealthy landowner John Proctor, killed in August. He had publicly condemned the witch trials and had punished his female servants for claiming to be possessed by witches’ spirits in the hysteria of the day. Proctor’s Ledge is named for his grandson, who bought the land knowing its history.
The Salem witch trials were “the largest and most lethal witch hunt in American history,” wrote historian Emerson “Tad” Baker, a professor at Salem State University, in his 2015 book A Storm of Witchcraft: The Salem Trials and the American Experience. In a June symposium about the trials, Baker spoke about the volatile political and social climate in Salem in the 1690s. At the time, an interim colonial government was in charge and Sir William Phips, the new governor, was considered weak. In response, says Baker, people felt a spiritual decline. “Puritans thought God was telling them something,” he says. Add to this the extreme weather of the “Little Ice Age”—hot dry summers and lethally cold winters—famine, economic failures and frontier wars with the French and Native Americans, and it became a scenario ripe for disaster. Finger-pointing and mass hysteria ensued. During a series of trials, young women accused “witches” of making them contort, writhe and shriek. The accusations were “neighbor on neighbor,” says University of Connecticut geographer Ken Foote. It was an anxious time.

In the 325 years since 19 of the falsely accused were hanged as witches in Salem, the coastal town has never forgotten what happened. (Most of the trial activity took place in Salem. Some of the young accusers lived in Salem Village, later renamed Danvers.) Somehow, the site of the hangings had until now faded from memory, replaced by an obsession with the “witches” themselves that borders on kitsch. Witch tourism gave Salem the moniker “Witch City,” a major economic driver that local officials have long said they value. (Even the police department’s logo includes a witch.) Every Halloween, as many as 250,000 people visit for the event called Haunted Happenings. Revelers dress as zombies and witches. Families take “ghost tours” and wander around a psychic fair, costume balls, and film festivals—all run by a public-private partnership called Destination Salem.
The kinder, gentler form of witch interest dates to the television situation comedy “Bewitched,” which filmed several episodes in town in the 1970s. A statue of the actress Elizabeth Montgomery (who played the witch Samantha Stephens) stands downtown. Other popular sites include the Witch House, the home of trials judge Jonathan Corwin, and the Old Burying Point Cemetery, where tourists visit the grave of the other judge, John Hathorne (ancestor of author Nathaniel Hawthorne). Adding to its rich history, Salem has become a center for thousands of practitioners of the Wiccan faith, which has no relation to the satanic imaginations of 1692. It’s hard to know where the dark history fades and the spiritual or lighthearted steps in.

Though tourists often ask where the hangings took place, they were directed to the wrong place for years. Taxi drivers and, famously, John Lennon and Yoko Ono’s limousine driver, would take them to the top of the place named Gallows Hill, because for years townspeople thought that was the hanging site. Only last year did a group of historians, including Baker, verify that the hangings took place below Gallows Hill, on Proctor’s Ledge, underscoring the earlier conclusion of historian Sidney Perley, who identified the ledge in the early 1900s.

The new memorial, the first of its kind to be built at the execution site, was funded by a community grant and donations from some of the descendants of Salem’s “witches.” (Many descendants belong to a group called the Associated Daughters of Early American Witches.) It incorporates a granite wall and memorial stones bearing the names of the 19 innocents killed, set in a semicircle around a single oak tree, a dominant tree in the colonial landscape (the hangings above were probably from an oak). In 1992, the Salem Award Foundation erected the Salem Witch Trials Memorial adjacent to the Old Burying Ground, a cemetery in town where one of the judges and some other notables are interred.
Visitors leave notes and flowers on commemorative benches, “and I think some of them must think it is just a park,” Baker, the historian, says. Mayor Driscoll said in a release that the new memorial site “presents an opportunity for us to come together as a community, recognize the injustice and tragedy perpetrated against those innocents in 1692, and recommit ourselves to the values of inclusivity and justice.” Baker believes that the memorial could turn Americans toward greater understanding of what a witch hunt truly means in today’s world full of fear of terrorism. “Americans today gaze back at the people of 1692 as a foolish, superstitious, and intolerant lot,” Baker wrote in A Storm of Witchcraft. “Yet that is to dismiss the figure in the mirror.” But not everyone feels unbridled relief at this new awareness. Neighbors of Proctor’s Ledge didn’t know they lived near the exact site of the hangings until last winter, when the city held public hearings to discuss the memorial site. They understood that the site (owned by the city since 1936) “would never be built on because the city owns it,” says O’Connor. “We were a little bummed because they cut down all our trees.” And the Halloween reveling “can be a little crazy,” she says. “My neighbor last year was working at home and people started having a loud séance in her back yard.” Centuries ago, the site of the hangings had been out of the way, if not quiet. The once marshy area lay on the outskirts of town and could be seen from a distance, says Marilynne Roach, a Watertown, Massachusetts, researcher and author of the book Six Women of Salem. In 1692, she says, the ledge overlooked the North River, which at the time made an L-shaped bend heading from the town of Peabody and toward the Atlantic Ocean. By the 19th century, the neighborhood surrounded tanneries (known as “blubber hill”) and other industry. Today, one can drive from there to the Salem town center in about five minutes. 
The choice of that particular spot for the hangings likely served a strategic purpose 325 years ago that sounds pretty ghoulish today: It was public enough so people could watch the executions, Roach says, “but you don’t want it in someone’s backyard. It is a bit out of the way and it is public land. I get the impression that everybody in town who could get away from work would come out and watch these.” Roach credits documented accounts of nearby residents who once stood near the ledge as providing the historic proof that this was the place. Now, the site will once again attract the public—not gawkers this time, but visitors commemorating the witch trials’ innocent victims. Robin Eddy, whose backyard abuts the hanging site, said a neighbor told her as she moved in 21 years ago, "You know, you're living on the site where they threw the witches' bodies after they hung them." She added, "And I'm like, 'Ha ha ha.'" Eddy said that unlike some of her neighbors, "I think it's pretty cool. I think it's amazing. To me, the memorial is…like a hallowed ground. It represents kind of where the human race was at a certain point in history. It makes me think about that and how we never want to go that way again. We need to practice tolerance." The mayor will stand respectfully as she dedicates the memorial below Proctor’s Ledge on the 325th anniversary of five of the hangings. And in a few months, Halloween will come again to Salem. “That’s fun,” Roach says, “but mixing up strolling zombies with the memorial in the middle of town and the burial ground, it confuses the public and it leads to crowds that can damage the real thing. And [it’s] a pain in the neck for people who live in Salem. I kind of avoid Salem in October for the most part."
https://www.smithsonianmag.com/history/speech-brought-india-brink-independence-180964366/
The Speech That Brought India to the Brink of Independence
The Speech That Brought India to the Brink of Independence For more than 200 years, Britain had asserted its iron will over India. From the East India Company levying taxes starting in the 18th century to Britain instituting direct rule over two-thirds of the country in the mid-19th century, India had been extorted for centuries—and with the start of World War II, India was declared to be at war with Germany without any Indian political leaders actually being consulted. The nation would go on to provide 2.3 million soldiers for an army as well as food and other goods to help the Allies defeat the Axis Powers. Much as the Indian National Congress (the largely Hindu public assembly that had some governmental functions) sympathized with defeating fascism, they balked at seeing their country further pillaged for resources. So in 1939, members of the Congress informed Viceroy Lord Linlithgow—the highest-ranking British official in India—they would only support the war effort if Indian independence lay at the end of it. To which Linlithgow issued his own threat: if the Congress didn’t support Britain, Britain would simply turn to, and empower, the Muslim League (a political group that fought to protect the rights of Muslim Indians and later called for a separate nation for Muslims). As Winston Churchill later confessed, “the Hindu-Moslem feud [was] a bulwark of British rule in India.” The Congress could do nothing but acquiesce. But they hadn’t abandoned the fight, especially one of their most notable members: Mohandas “Mahatma” Karamchand Gandhi. The spiritual and political leader first experienced racism decades earlier, as a London-educated lawyer working in colonial South Africa. There, he was thrown off a train for trying to sit in the first class car; the 1893 incident led him to his civil rights work, for which he was repeatedly imprisoned. “I discovered that as a man and as an Indian I had no rights,” Gandhi later said of that period in South Africa. 
“More correctly, I discovered that I had no rights as a man because I was an Indian.” Agitating for change through nonviolence would become Gandhi’s lifelong pursuit. On the eve of World War II, he wrote Hitler twice in hopes of persuading the dictator to avoid total war (it’s impossible to know if Hitler read the letters, as no response was ever sent). And when India was forced to assist the United Kingdom in the fight, Gandhi began a small individual civil disobedience campaign, recruiting political and community leaders for the cause. Although his 1940 effort was disrupted by arrests of the participants, popular opinion in England was largely on Gandhi’s side—U.K. citizens favored Indian independence. By 1942, Prime Minister Churchill felt enough pressure to send Sir Stafford Cripps, a member of the War Cabinet, to discuss a change to India’s political status. But upon learning that Cripps wasn’t actually offering full independence and that current Indian politicians would still have no say in military strategy, the Congress and the Muslim League rejected his proposal—leaving Gandhi open to harness the wave of anti-British sentiment for a new round of protests. The movement, Gandhi decided, would be called “Quit India” to reflect his main demand: that the United Kingdom leave India voluntarily. In a speech at a meeting of the Congress in Bombay at the beginning of August 1942, Gandhi instructed his fellow leaders that this was the moment to seize power: “Here is a mantra, a short one, that I give to you. You may imprint it on your hearts and let every breath of yours give expression to it. The mantra is ‘Do or Die.’ We shall either free India or die in the attempt; we shall not live to see the perpetuation of our slavery. 
Every true Congressman or woman will join the struggle with inflexible determination not to remain alive to see the country in bondage and slavery.” The Congress agreed that Gandhi should lead a nonviolent mass movement and passed their decision as the “Quit India Resolution” on August 8. Gandhi was prepared to give a public address on the subject the very next day, when word came that British authorities were planning on arresting him and other members of the Congress. “They dare not arrest me. I cannot think they will be so foolish. But if they do, it will mean that their days are numbered,” Gandhi said. But late that night, Gandhi and many other members of the Congress were indeed arrested and imprisoned under the Defense of India Rules. The press was forbidden from publishing any part of Gandhi’s speech, supporting the Congress’s call to action, or reporting on measures the British government enacted to suppress the nascent movement. “The resolution said, ‘On the declaration of India’s independence a provisional government will be formed and free India will become an ally of the United Nations.’ This meant unilaterally declaring India’s independence,” writes Pramod Kapoor, author of the forthcoming book Gandhi: An Illustrated Biography, by email. The thought of an unauthorized shift to independence is what so terrified the British. “The intelligence reports the government was getting were equally alarming. The British had at one point even mulled over the possibility of deporting Gandhi to Aden.” On August 10, India’s Secretary of State Leo Amery, working with the War Cabinet and other British leaders, announced the reason for the arrests of Gandhi and the Congress to the press. 
Amery said the Indian leaders planned to incite “strikes, not only in industry and commerce, but in the administration and law courts, schools and colleges, the interruption of traffic and public utility services, the cutting of telegraph and telephone wires, the picketing of troops and recruiting stations… The success of the proposed campaign would paralyze not only the ordinary civil administration of India, but her whole war effort.” In short, the movement would have led to dire calamity if the British government had not detained its leaders. But Amery’s speech, meant to paint the British government in a positive light and vilify the Congress, completely backfired. As historian Paul Greenough writes, “The chief irony of 1942 in India was that the awesome power of the press to inspire united action was unleashed by the British government; the radicalizing text was the composition of Leopold Amery, not Mahatma Gandhi… [the] self-consciously rebellious underground press was never able to duplicate the impact or achieve the degree of mass coordination which Amery’s speech had provoked.” In essence, Amery had provided the blueprints for how to rebel. Civilians attacked railway stations and post offices, fought against police officers and held riots. The police and the British Army in India led a violent crackdown on the rioters, arresting over 100,000 people. Viceroy Lord Linlithgow compared the uprising to the failed Sepoy Rebellion of 1857, when nearly one million Indians and thousands of Europeans were killed. The total civilian deaths after the Quit India protests, however, were closer to 1,000. Still, the underground press did have success in one thing: getting Gandhi’s mantra out to the masses. “Do or die” became the unifying rallying cry for a civil disobedience campaign that spread across the subcontinent and lasted from August 1942 to September 1944. 
Protests erupted from Bombay to Delhi to Bengal; a steel plant closed for 13 days; a strike at a textile factory lasted 3.5 months. Even though Muslim participation in “Quit India” wasn’t as high as that of other groups, supporters of the Muslim League still offered shelter to activists. And, crucially, Indians employed by the British government as police officers and administrative officials turned on their employer. “They gave shelter, provided information and helped monetarily. In fact, the erosion of loyalty to the British Government of its own officers was one of the most striking aspects of the Quit India struggle,” writes Bipan Chandra in India’s Struggle for Independence. Although Gandhi deeply regretted that the movement had turned so violent after his arrest, he and his wife, Kasturba, were both incarcerated in Agha Khan Palace and could do nothing but struggle to survive, writes Kapoor. In February 1943, Gandhi staged a 21-day hunger strike that nearly killed him, but he remained imprisoned. His wife developed bronchitis and suffered several heart attacks behind bars; she would ultimately die there just a month before Gandhi was released in May 1944. The day of Gandhi’s release marked his last ever in an Indian prison, where he had spent a combined total of 2,089 days over the course of his life—nearly six years (and not factoring in the 249 days he was in South African prisons). While the “Quit India” movement ended in late 1944, the momentum it provided in securing the country’s independence proved unstoppable. Three years later, India was independent. And through a successful lobbying effort by the Muslim League, the independent Islamic state of Pakistan was also established along the new sovereign nation’s northwestern border. 
Although some scholars have argued the rebellion was only a small part of Britain’s decision to relinquish the “Crown Jewel” of the colonies—citing the need to rebuild after World War II as a more pressing concern—others, including Kapoor, see the movement as a major turning point. “It was an opportune time in the life of a long freedom struggle,” Kapoor says. “With or without the war, the time was ripe for some sort of intensive movement.” And that movement happened to be “Quit India.” Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. Website: http://www.lboissoneault.com/
https://www.smithsonianmag.com/history/sticky-rice-mortar-view-space-and-more-fun-facts-about-chinas-great-wall-180962197/
Sticky Rice Mortar, the View From Space, and More Fun Facts About China’s Great Wall
Sticky Rice Mortar, the View From Space, and More Fun Facts About China’s Great Wall Ancient work of monumental architecture, Wonder of the World, and protection against—giant lizards? The Great Wall of China is perhaps more powerful as a symbol than a physical structure, but in a new Hollywood blockbuster starring Matt Damon (who weathered some controversy related to whitewashing) the wall is all about fighting off formidable enemies. To celebrate the release of “The Great Wall,” read more about the mammoth structure that inspired the movie. The wall was built over the course of centuries Construction of the wall was first initiated around 220 BC by Emperor Qin Shi Huang, the first emperor of unified China. For centuries, China had been divided into numerous geopolitical factions. This Warring States Period saw plenty of walls constructed to form boundaries between the different groups. With Qin as emperor, the walls between states were removed and some were repurposed to form a border between China and the “barbarians” to the north. Approximately 300,000 captured soldiers and conscripts were forced to complete Qin’s section of the wall, which was mostly made of rammed earth. While Qin was remarkable for starting the wall, the most enduring sections were built during the Ming Dynasty (1368-1644), when Beijing was made the new Chinese capital. This chunk of the wall stretched from the Yalu River (on the border with modern-day North Korea) to Gansu Province hundreds of miles to the west. The Ming wall remains the most famous portion of the structure, with its iconic stone towers and gates around Beijing, Tianjin and Hebei. It isn’t actually one long wall Built by a series of governments over 2,000 years, the wall isn’t one long, unbroken stretch of fearsome architecture. It is actually a chain of different structures, including fortresses, gates, watchtowers and shelters, and there are large gaps between different sections. 
The wall’s official length was released in 2012 by China’s State Administration of Cultural Heritage after a five-year study, putting it at 13,170 miles, but experts point out that this includes sections of the wall that no longer exist. Arthur Waldron, a historian and expert on the Great Wall, says the solid wall is more like 1,700 miles long. The Great Wall wasn’t a great barrier Although giant lizards were never a concern, as they are for Matt Damon and his cohorts in The Great Wall, Chinese governments were very concerned about Mongol raiders—and with good reason, considering how often they invaded. But it turns out the wall wasn’t a very effective way of keeping the invaders out. “While a towering monument to Chinese civilization, it was hardly impregnable,” writes Ishaan Tharoor for the Washington Post. “The Mongols, Manchus and others all breached this great defense and went on to establish their dominion behind its ramparts.” Genghis Khan and Kublai Khan easily broke through the wall in the 13th century, and in September 1550, Mongol raider Altan Khan led tens of thousands of raiders on an attack past the wall, killing thousands of Chinese civilians and plundering the countryside for several weeks before retreating. Depending on which dynasty was in power, the wall wasn’t even all that necessary. “The Tang, who ruled from 618 to 907 AD, built virtually no walls, because the imperial family was part Turkic and skilled in Central Asian warfare and diplomacy,” writes Peter Hessler for the New Yorker. During the Ming Dynasty, the wall was one of three strategies for dealing with the Mongols. The other two included taking the offensive and buying off important leaders with gifts or access to trade. 
It’s a myth that you can see the Great Wall from the Moon (and it’s only barely visible from space) In 1923, National Geographic started one of the most enduring myths about the wall: that it’s “the only work of man’s hands which would be visible to the human eye from the moon.” Neil Armstrong, following his return from the moon in 1969, was asked on a number of occasions whether the wall was visible. But due to the wall’s construction materials, which blend into the terrain around it, the Great Wall has only ever been visible from low orbit (100 miles up) – and even then, the sun has to be in the perfect position to illuminate it and cast shadows. Even China’s own astronaut, Yang Liwei, admitted he couldn’t identify the structure with the naked eye. There’s a secret ingredient that holds the wall together Scientists at Zhejiang University in China were researching the makeup of mortar used for building the Great Wall when they realized something unusual was added to the standard mixture of lime (limestone that has been heated to a high temperature) and water: sticky rice. The mixture made it the world’s first example of composite mortar, including organic and inorganic material. In their tests, the scientists compared the quality of mortar made with and without sticky rice, and found that “sticky rice-lime mortar has more stable physical properties, has greater mechanical storage, and is more compatible, which make it a suitable restoration mortar for ancient masonry.” Using sticky rice as a construction ingredient was one of the greatest innovations of the Ming dynasty, helping their structures (including tombs and pagodas as well) survive earthquakes and the elements, researchers said. People have been pillaging chunks of the Great Wall for decades While it might be a source of national pride today, the Great Wall hasn’t always received so much love. 
Approximately one-third of the structure is crumbling, 20 percent is in “reasonable” condition, and the remaining half has disappeared after centuries of neglect. During the deadly Cultural Revolution (a 10-year movement initiated by Mao Zedong that resulted in the killing of 1.5 million Chinese and the imprisonment and torture of millions more), Chairman Mao and other officials encouraged the dismantling of the wall for use as bricks to build homes. And while it may be state-protected today, farmers living in rural areas continue to use the bricks to build homes and animal pens. Smugglers snuck valuable contraband through border checkpoints along the wall In addition to keeping invaders out, the wall was an ideal checkpoint for letting people in. Nomadic people of the steppe came to the wall to trade horses and leather for manufactured Chinese goods like pottery and clothing. Much like modern TSA agents, Chinese border guards kept records of travelers passing through gates along the wall, checked for contraband, and compared travelers to lists of criminals and smugglers. Among the most famous smugglers were the two legendary monks who hid silkworm eggs in their bamboo staffs, managing to trick the border guards and bring the source of silk to Byzantine emperor Justinian I. The wall is the longest cemetery on Earth Construction workers were a disposable commodity when it came to building the wall. It’s estimated that as many as 400,000 people died building the wall, earning it the sobriquet “longest cemetery on Earth.” Many of the workers who died during the wall’s construction were buried in its foundation. Peasants and soldiers forced into labor suffered under terrible conditions, with insufficient food, steep hillsides and brutal weather. The wall had such a reputation for suffering that it was an indispensable reference in Chinese literature, like in the “Soldier’s Ballad” (200 A.D.) and popular novels of the Ming dynasty. 
It was one pricey wall Even without factoring in the loss of life, the wall was a massive undertaking. Between the cost of labor, the food and dwellings needed to house workers, and the raw materials, the Great Wall was extraordinarily expensive. Often the Chinese people bore the brunt of these expenses, since the government levied higher taxes to pay for the wall and its repairs. During the Ming dynasty, repairs on the west end of the wall cost 470 ounces of silver per kilometer, for a total of 487,500 ounces. Repairs to the east also required further financing. Building extensions to the walls themselves was even more costly: in 1576 these fortifications were projected to cost over 3.3 million ounces of silver, which accounted for more than three-quarters of the government’s annual revenue, writes historian Julia Lovell in The Great Wall: China Against the World, 1000 BC - AD 2000. A graffiti zone for the Great Wall Decorations etched into the Great Wall go back for centuries, including carvings of clouds and lotus blossoms supposedly created by the wives of soldiers constructing the wall under the direction of General Qi Jiguang of the Ming dynasty. But in modern times, graffiti has become a nuisance rather than an expression of art. In 2016, NBA player Bobby Brown of the Houston Rockets came under fire for carving his name into the Great Wall, but plenty of more anonymous tourists have left their marks as well. The problem has become so widespread that Chinese officials have considered setting up a special graffiti section at one of the fighting towers at the Mutianyu section of the wall (about 40 miles north of Beijing), where visitors will be allowed to carve their immortal words. Lorraine Boissoneault is a contributing writer to SmithsonianMag.com covering history and archaeology. She has previously written for The Atlantic, Salon, Nautilus and others. She is also the author of The Last Voyageurs: Retracing La Salle's Journey Across America. 
Website: http://www.lboissoneault.com/
https://www.smithsonianmag.com/history/still-mysterious-death-edgar-allan-poe-180952936/
The (Still) Mysterious Death of Edgar Allan Poe
The (Still) Mysterious Death of Edgar Allan Poe It was raining in Baltimore on October 3, 1849, but that didn't stop Joseph W. Walker, a compositor for the Baltimore Sun, from heading out to Gunner's Hall, a public house bustling with activity. It was Election Day, and Gunner's Hall served as a pop-up polling location for the 4th Ward polls. When Walker arrived at Gunner's Hall, he found a man, delirious and dressed in shabby second-hand clothes, lying in the gutter. The man was semi-conscious, and unable to move, but as Walker approached him, he discovered something unexpected: the man was Edgar Allan Poe. Worried about the health of the addled poet, Walker stopped and asked Poe if he had any acquaintances in Baltimore who might be able to help him. Poe gave Walker the name of Joseph E. Snodgrass, a magazine editor with some medical training. Immediately, Walker penned Snodgrass a letter asking for help. Baltimore City, Oct. 3, 1849 Dear Sir, There is a gentleman, rather the worse for wear, at Ryan's 4th ward polls, who goes under the cognomen of Edgar A. Poe, and who appears in great distress, & he says he is acquainted with you, he is in need of immediate assistance. Yours, in haste, JOS. W. WALKER To Dr. J.E. Snodgrass. On September 27—almost a week earlier—Poe had left Richmond, Virginia bound for Philadelphia to edit a collection of poems for Mrs. St. Leon Loud, a minor figure in American poetry at the time. When Walker found Poe in delirious disarray outside of the polling place, it was the first anyone had heard or seen of the poet since his departure from Richmond. Poe never made it to Philadelphia to attend to his editing business. Nor did he ever make it back to New York, where he had been living, to escort his aunt back to Richmond for his impending wedding. 
Poe was never to leave Baltimore, where he had launched his career in the early 19th century—and in the four days between Walker finding Poe outside the public house and Poe's death on October 7, he never regained enough consciousness to explain how he had come to be found, in soiled clothes not his own, incoherent on the streets. Instead, Poe spent his final days wavering between fits of delirium, gripped by visual hallucinations. The night before his death, according to his attending physician Dr. John J. Moran, Poe repeatedly called out for "Reynolds"—a figure who, to this day, remains a mystery. Poe's death—shrouded in mystery—seems ripped directly from the pages of one of his own works. He had spent years crafting a careful image of a man inspired by adventure and fascinated with enigmas—a poet, a detective, an author, a world traveler who fought in the Greek War of Independence and was held prisoner in Russia. But though his death certificate listed the cause of death as phrenitis, or swelling of the brain, the mysterious circumstances surrounding his death have led many to speculate about the true cause of Poe's demise. "Maybe it’s fitting that since he invented the detective story," says Chris Semtner, curator of the Poe Museum in Richmond, Virginia, "he left us with a real-life mystery." 1. Beating In 1867, one of the first theories to deviate from either phrenitis or alcohol was published by biographer E. Oakes Smith in her article "Autobiographic Notes: Edgar Allan Poe." "At the instigation of a woman," Smith writes, "who considered herself injured by him, he was cruelly beaten, blow upon blow, by a ruffian who knew of no better mode of avenging supposed injuries. It is well known that a brain fever followed. . . ." Other accounts also mention "ruffians" who had beaten Poe senseless before his death. 
Eugene Didier wrote in his 1872 article "The Grave of Poe" that while in Baltimore, Poe ran into some friends from West Point, who prevailed upon him to join them for drinks. Poe, unable to handle liquor, became madly drunk after a single glass of champagne, after which he left his friends to wander the streets. In his drunken state, he "was robbed and beaten by ruffians, and left insensible in the street all night." 2. Cooping Others believe that Poe fell victim to a practice known as cooping, a method of voter fraud practiced by gangs in the 19th century where an unsuspecting victim would be kidnapped, disguised and forced to vote for a specific candidate multiple times under multiple disguised identities. Voter fraud was extremely common in Baltimore around the mid 1800s, and the polling site where Walker found the disheveled Poe was a known place where coopers brought their victims. The fact that Poe was found delirious on election day, then, is no coincidence. Over the years, the cooping theory has come to be one of the more widely accepted explanations for Poe's strange demeanor before his death. Before Prohibition, voters were given alcohol after voting as a sort of reward; had Poe been forced to vote multiple times in a cooping scheme, that might explain his semi-conscious, ragged state. Around the late 1870s, Poe's biographer J.H. Ingram received several letters that blamed Poe's death on a cooping scheme. A letter from William Hand Browne, a member of the faculty at Johns Hopkins, explains that "the general belief here is, that Poe was seized by one of these gangs, (his death happening just at election-time; an election for sheriff took place on Oct. 4th), 'cooped,' stupefied with liquor, dragged out and voted, and then turned adrift to die." 3. Alcohol "A lot of the ideas that have come up over the years have centered around the fact that Poe couldn’t handle alcohol," says Semtner. 
"It has been documented that after a glass of wine he was staggering drunk. His sister had the same problem; it seems to be something hereditary." Months before his death, Poe became a vocal member of the temperance movement, eschewing alcohol, which he'd struggled with all his life. Biographer Susan Archer Talley Weiss recalls, in her biography "The Last Days of Edgar A. Poe," an event, toward the end of Poe's time in Richmond, that might be relevant to theorists who prefer a "death by drinking" demise for Poe. Poe had fallen ill in Richmond, and after making a somewhat miraculous recovery, was told by his attending physician that "another such attack would prove fatal." According to Weiss, Poe replied that "if people would not tempt him, he would not fall," suggesting that the first illness was brought on by a bout of drinking. Those around Poe during his final days seemed convinced that the author did, indeed, fall into that temptation, drinking himself to death. As his close friend J. P. Kennedy wrote on October 10, 1849: "On Tuesday last Edgar A. Poe died in town here at the hospital from the effects of a debauch. . . . He fell in with some companion here who seduced him to the bottle, which it was said he had renounced some time ago. The consequence was fever, delirium, and madness, and in a few days a termination of his sad career in the hospital. Poor Poe! . . . A bright but unsteady light has been awfully quenched." Though the theory that Poe's drinking led to his death fails to explain his five-day disappearance, or his second-hand clothes on October 3, it was nonetheless a popular theory propagated by Snodgrass after Poe's death. Snodgrass, a member of the temperance movement, gave lectures across the country, blaming Poe's death on binge drinking. 
Modern science, however, has thrown a wrench into Snodgrass's talking points: samples of Poe's hair from after his death show low levels of lead, explains Semtner, which is an indication that Poe remained faithful to his vow of sobriety up until his demise. 4. Carbon Monoxide Poisoning In 1999, public health researcher Albert Donnay argued that Poe's death was a result of carbon monoxide poisoning from coal gas that was used for indoor lighting during the 19th century. Donnay took clippings of Poe's hair and tested them for certain heavy metals that would be able to reveal the presence of coal gas. The test was inconclusive, leading biographers and historians to largely discredit Donnay's theory. 5. Heavy Metal Poisoning While Donnay's test didn't reveal levels of heavy metal consistent with carbon monoxide poisoning, the tests did reveal elevated levels of mercury in Poe's system months before his death. According to Semtner, Poe's mercury levels were most likely elevated as a result of a cholera epidemic he'd been exposed to in July of 1849, while in Philadelphia. Poe's doctor prescribed calomel, or mercury chloride. Mercury poisoning, Semtner says, could help explain some of Poe's hallucinations and delirium before his death. However, the levels of mercury found in Poe's hair, even at their highest, are still 30 times below the level consistent with mercury poisoning. 6. Rabies In 1996, Dr. R. Michael Benitez was participating in a clinical pathologic conference where doctors are given patients, along with a list of symptoms, and instructed to diagnose and compare with other doctors as well as the written record. The symptoms of the anonymous patient E.P., "a writer from Richmond" were clear: E.P. had succumbed to rabies. According to E.P.'s supervising physician, Dr. J.J. Moran, E.P. had been admitted to a hospital due to "lethargy and confusion." 
Once admitted, E.P.'s condition began a rapid downward spiral: shortly, the patient was exhibiting delirium, visual hallucinations, wide variations in pulse rate and rapid, shallow breathing. Within four days—the median length of survival after the onset of serious rabies symptoms—E.P. was dead. E.P., Benitez soon found out, wasn't just any author from Richmond. It was Poe whose death the Maryland cardiologist had diagnosed as a clear case of rabies, a fairly common virus in the 19th century. Running counter to any prevailing theories at the time, Benitez's diagnosis ran in the September 1996 issue of the Maryland Medical Journal. As Benitez pointed out in his article, without DNA evidence, it's impossible to say with 100 percent certainty that Poe succumbed to the rabies virus. There are a few kinks in the theory, including no evidence of hydrophobia (those afflicted with rabies develop a fear of water; Poe was reported to have been drinking water at the hospital until his death) nor any evidence of an animal bite (though some with rabies don't remember being bitten by an animal). Still, at the time of the article's publication, Jeff Jerome, curator of the Poe House Museum in Baltimore, agreed with Benitez's diagnosis. "This is the first time since Poe died that a medical person looked at Poe's death without any preconceived notions," Jerome told the Chicago Tribune in October of 1996. "If he knew it was Edgar Allan Poe, he'd think, 'Oh yeah, drugs, alcohol,' and that would influence his decision. Dr. Benitez had no agenda." 7. Brain Tumor One of the most recent theories about Poe's death suggests that the author succumbed to a brain tumor, which influenced his behavior before his death. When Poe died, he was buried, rather unceremoniously, in an unmarked grave in a Baltimore graveyard. Twenty-six years later, a statue was erected, honoring Poe, near the graveyard's entrance. 
Poe's coffin was dug up, and his remains exhumed, in order to be moved to the new place of honor. But more than two decades of buried decay had not been kind to Poe's coffin—or the corpse within it—and the apparatus fell apart as workers tried to move it from one part of the graveyard to another. Little remained of Poe's body, but one worker did remark on a strange feature of Poe's skull: a mass rolling around inside. Newspapers of the day claimed that the clump was Poe's brain, shriveled yet intact after almost three decades in the ground. We know, today, that the mass could not be Poe's brain, which is one of the first parts of the body to rot after death. But Matthew Pearl, an American author who wrote a novel about Poe's death, was nonetheless intrigued by this clump. He contacted a forensic pathologist, who told him that while the clump couldn't be a brain, it could be a brain tumor, which can calcify after death into hard masses. According to Semtner, Pearl isn't the only person to believe Poe suffered from a brain tumor: a New York physician once told Poe that he had a lesion on his brain that caused his adverse reactions to alcohol. 8. Flu A far less sinister theory suggests that Poe merely succumbed to the flu—which might have turned into deadly pneumonia—on his deathbed. As Semtner explains, in the days leading up to Poe's departure from Richmond, the author visited a physician, complaining of illness. "His last night in town, he was very sick, and his [soon-to-be] wife noted that he had a weak pulse, a fever, and she didn’t think he should take the journey to Philadelphia," says Semtner. "He visited a doctor, and the doctor also told him not to travel, that he was too sick." According to newspaper reports from the time, it was raining in Baltimore when Poe was there—which Semtner thinks could explain why Poe was found in clothes not his own. 
"The cold and the rain exasperated the flu he already had," says Semtner, "and maybe that eventually lead to pneumonia. The high fever might account for his hallucinations and his confusion." 9. Murder In his 2000 book Midnight Dreary: The Mysterious Death of Edgar Allan Poe, author John Evangelist Walsh presents yet another theory about Poe's death: that Poe was murdered by the brothers of his wealthy fiancée, Elmira Shelton. Using evidence from newspapers, letters and memoirs, Walsh argues that Poe actually made it to Philadelphia, where he was ambushed by Shelton's three brothers, who warned Poe against marrying their sister. Frightened by the experience, Poe disguised himself in new clothes (accounting for, in Walsh's mind, his second-hand clothing) and hid in Philadelphia for nearly a week, before heading back to Richmond to marry Shelton. Shelton's brothers intercepted Poe in Baltimore, Walsh postulates, beat him, and forced him to drink whiskey, which they knew would send Poe into a deathly sickness. Walsh's theory has gained little traction among Poe historians—or book reviewers; Edwin J. Barton, in a review for the journal American Literature, called Walsh's story "only plausible, not wholly persuasive." "Midnight Dreary is interesting and entertaining," he concluded, "but its value to literary scholars is limited and oblique." --- For Semtner, however, none of the theories fully explain Poe's curious end. "I've never been completely convinced of any one theory, and I believe Poe's cause of death resulted from a combination of factors," he says. "His attending physician is our best source of evidence. If he recorded on the mortality schedule that Poe died of phrenitis, Poe was most likely suffering from encephalitis or meningitis, either of which might explain his symptoms." Natasha Geiling is an online reporter for Smithsonian magazine.
614055bfa74321f359c7bc6c14c3b077
https://www.smithsonianmag.com/history/stopping-nazi-plots-1930s-los-angeles-180966961/
The Nazis’ Plan to Infiltrate Los Angeles And the Man Who Kept Them at Bay
The Nazis’ Plan to Infiltrate Los Angeles And the Man Who Kept Them at Bay Men in armbands stand below an American flag, flanked by Nazi symbols and a portrait of Hitler. In another photograph, swastika flags line Broadway Street in Los Angeles. The cover of historian Steven J. Ross’s new book looks like something straight out of the beloved novel The Man in the High Castle and television series of the same name. But these aren’t doctored images and no, you’re not about to crack open Philip K. Dick’s alternative, dystopian tale. In Hitler in Los Angeles: How Jews Foiled Nazi Plots Against Hollywood and America, Ross, a professor at the University of Southern California, uncovers the fascinating, complex story of how Nazis infiltrated the region and recruited sympathetic Americans to their cause. While American Nazis were working on plans and ideas to subvert the government and carry out acts of anti-Semitic violence, Leon Lewis created a network of spies to stop them. A Jewish lawyer and WWI veteran, Lewis was the founding executive secretary of the Anti-Defamation League. Throughout the 1920s and early '30s, he tracked the rise of fascism in Europe both for the organization and on his own. As Ross related in an interview, “I think it's safe to say nobody was watching Hitler more closely during those years than Lewis.” After Hitler became chancellor of Germany in 1933, Nazi officials sent agents to the United States to start the Friends of New Germany (FNG) organization—later renamed the German American Bund—intended to bolster support overseas. That July, Nazis held a rally in Los Angeles and started meeting and recruiting at their Deutsche Haus headquarters downtown—beginning a cycle Lewis was all too familiar with. 
As Ross writes, “Lewis knew from years of monitoring the foreign press that the Nazi government encouraged Germans living in the United States to form ‘active cells wherever sufficient numbers of Nationalist Socialists can be gathered into proselytizing units.’” Central to the Nazis’ mission was cultivating fifth columnists—“disloyal forces within a nation’s border”—who could be called upon to side with Germany if war began. It was clear to Lewis that it was time to act, but he found the Jewish community divided as to how best to combat rising anti-Semitism, and the U.S. government was more concerned with tracking Communism than fascism. So Lewis organized a spy ring on his own, focusing on the same people the Nazis were hoping to recruit: German-American veterans. Just as Hitler had channeled the frustration of World War I veterans and struggling citizenry in Germany to help elect him, his supporters in Los Angeles hoped to stir up feelings of resentment among those who were disgruntled by cuts to their veteran benefits during the Depression. Southern California was a particularly appealing locus: about one-third of disabled veterans lived there, and the region had 50 German-American organizations with 150,000 members, which the Nazis hoped to unite. Compared to New York City, the port of Los Angeles was largely unguarded, perfect for trafficking in propaganda from Germany. Additionally, the area was ripe for Nazi messaging: it was one of the strongest centers outside of the South for the Ku Klux Klan, with large gatherings held throughout the 1920s. 
But Lewis, who knew a number of German-American vets from his work with the Disabled American Veterans, appealed to his spies’ sense of patriotism. The spies, Ross said, “risked their lives because they believed that when a hate group attacks one group of Americans, it's up to every American to rally to defend them.” And their loyalty to Germany didn’t translate to Hitler; many despised him for what he had done to their ancestral nation. Save for one Jewish spy, Lewis’s network was composed entirely of Gentiles. Initially, Lewis planned to spy just long enough to find evidence to convince local and federal officials of the real danger Nazis posed to Los Angeles. But when he presented his first round of findings, he was met with ambivalence, at best; he discovered a number of L.A. law enforcement personnel were sympathetic to Nazism and fascism—or were members of the groups themselves. Without serious government attention, Lewis realized he would need to keep his operation going. He decided to solicit financial support from Hollywood executives—who were also the targets of some of the unearthed plans and whose industry was at the core of Hitler’s machinations. Before the various theaters of war opened in the late '30s and early '40s, the Nazis trained their eyes on the theaters in Hollywood. Hitler and his chief propagandist, Joseph Goebbels, realized the power of the film industry’s messaging, and they resented the unsavory portrayals of WWI-era Germany. 
Determined to curb negative portrayals of the nation and Nazis, they used their diplomats to pressure American studios to “create understanding and recognition for the Third Reich,” and refused to screen films in Germany that were unfavorable to Hitler and his regime. Lewis’s network of spies, many of whom were trusted by top Bund officials in L.A., reported on and worked to interrupt a wide range of haunting plots, including the lynching of film producers Louis B. Mayer and Samuel Goldwyn and star Charlie Chaplin. One called for using machine guns to kill residents of the Boyle Heights neighborhood (a predominantly Jewish area), and another involved creating a fake fumigation company to surreptitiously kill Jewish families (a chilling precursor to the gas chambers of Nazi concentration camps). Lewis’s spies even uncovered plans to blow up a munitions plant in San Diego and to destroy several docks and warehouses along the coast. There was talk of seizing National Guard armories and setting up a West Coast fortress for Hitler after Germany’s planned invasion and ultimate takeover of the U.S. government. The many plans were drafted by local fascists and Nazis, but the leaders, Ross explained, “would have undoubtedly told officials in Berlin, most likely by handing over sealed letters to the Gestapo officer who accompanied every German vessel that docked in L.A. from 1933 until 1941.” Lewis and his spies were able to break up these plots through a variety of means: by sowing discord between leaders of the Bund, getting certain plotters deported or into legal trouble and fostering a general sense of distrust among members that spies had infiltrated the group. While Ross doesn’t think the Germans would have prevailed in overthrowing the government, he contends that many of the schemes were serious threats. 
“I uncovered so many plots to kill Jews that I absolutely believe, had Leon Lewis' spies not penetrated and foiled every single one of those plots, some of them would have succeeded,” he said. On December 8, 1941—the day after Pearl Harbor and the U.S.’ entrance into the war—when the FBI needed to round up Nazi and fascist sympathizers, Lewis was able to provide crucial information on operations in California. Yet Lewis continued his spy ring even after the U.S. declared war on Germany, because he found a “dramatic rise in anti-Semitism as greater numbers of citizens blamed Jews for leading the nation into war.” His spy operations ceased in 1945, once the war came to a close. At its core, Hitler in Los Angeles subverts the idea that there wasn’t active and significant resistance to Nazism in America before WWII. Even decades later, it’s easy to wonder why more wasn’t done to prevent Hitler’s rise and Nazi atrocities, and to point out the warning signs that now seem obvious. But Ross’s research makes clear there was contemporary understanding and opposition, well before the rest of the U.S. realized the scale of Hitler’s plans, even if the story went untold for so long. The son of Holocaust survivors, Ross said that researching this book has changed how he thinks about resistance: “They stopped this without ever firing a gun, without ever using a weapon. They used the most powerful weapon of all…their brains.” But the book also challenges an idea many Americans take comfort in—that “it can’t happen here.” In a sense, it did happen here: Nazism and fascism found a foothold in 1930s Los Angeles and attracted locals to its cause. And while Lewis’s dedication helped thwart it, it’s alarming to consider that the alternate history wasn’t far off. Anna Diamond is the former assistant editor for Smithsonian magazine.
780762c630e8614d26f2978fb5f96faf
https://www.smithsonianmag.com/history/story-lidice-massacre-180970242/
The Lost Children of the Lidice Massacre
The Lost Children of the Lidice Massacre In 1947, eight-year-old Václav Zelenka returned to the Czech village of Lidice as the last of the town’s lost children. Five years earlier, he and the rest of Lidice’s 503 residents had been viciously attacked by the Nazis, but the young Zelenka had few recollections of the event. He had spent the remainder of World War II living with an adoptive family in Germany, never realizing that he had been stolen from his community in Czechoslovakia. In hindsight, Zelenka was lucky: He was one of only 17 child survivors of the Nazis’ June 10, 1942, massacre, an arbitrary act of violence that ultimately claimed the lives of 340 Lidice residents. Despite his initial reluctance to leave Germany, Zelenka readjusted to his former life—and later became the mayor of the rebuilt town of Lidice. The world first learned about Lidice via a brutally detached Nazi radio announcement broadcast the day after the attack: “All male inhabitants have been shot. The women have been transferred to a concentration camp. The children have been taken to educational centers. All houses of Lidice have been leveled to the ground, and the name of this community has been obliterated.” Although the Nazis hoped to make an example of Lidice by erasing it from history, their bold proclamation, accompanied by ample photographic evidence of the atrocity, infuriated the Allies to such an extent that Frank Knox, secretary of the U.S. Navy, proclaimed, “If future generations ask us what we were fighting for in this war, we shall tell them the story of Lidice.” When news of the Lidice massacre broke, the international community responded with outrage and a promise to keep the town’s memory alive. A small neighborhood in Joliet, Illinois, adopted Lidice’s name, and President Franklin D. Roosevelt released a statement praising the gesture: “The name of Lidice was to be erased from time,” he said. 
“Instead of being killed as the Nazis would have it, Lidice has been given new life.” In the English district of Stoke-on-Trent, Member of Parliament Barnett Stross led a “Lidice Shall Live” campaign and raised money for rebuilding efforts. Artists further immortalized the tragedy in works including poet Edna St. Vincent Millay’s The Massacre of Lidice. In comparison, the Allied response to the Nazis’ Final Solution, which claimed the lives of six million Jews (including 263,000 Czech Jews), was deliberately measured. On December 17, 1942, the U.S., British and other Allied governments issued a statement condemning the Nazis’ annihilation of European Jews, but they were hesitant to overemphasize the Jews’ plight. The people of Lidice were seen as universal victims—peaceful civilians who had the misfortune to witness the Nazis’ disregard for human life firsthand. Europe’s Jewish population represented a far more politically charged demographic. Amidst rising anti-Semitic sentiment and German propaganda accusing the Allies of bowing to “Jewish interests,” Lidice emerged as a neutral, indisputably despicable example of Nazi immorality. Discussion of the Holocaust, on the other hand, raised an entirely separate debate. *** If not for an untimely love letter, Lidice might have escaped the war unscathed. Czechoslovakia was one of the Nazis’ first targets: Germany assumed control of the Sudetenland, a Czech territory inhabited by many ethnic Germans, in 1938, and invaded the remaining Czech lands in March 1939. Lidice, a mining village about 12 miles from Prague, languished under the control of Reinhard Heydrich, a high-ranking SS official and deputy of the Protectorate of Bohemia and Moravia, but did not appear to be in immediate danger. As Heydrich worked to crush the Czech resistance movement, however, the situation grew tenuous. On May 27, 1942, operatives ambushed the hated Nazi; critically wounded, Heydrich died of sepsis on June 4. 
An enraged Adolf Hitler ordered immediate retaliation. He decided to make an example of Lidice because he believed several residents were connected to the Czech resistance. In nearby Kladno, the Gestapo had intercepted a love letter written by a suspected participant in Heydrich’s assassination. The note was addressed to a local factory worker who, upon interrogation, implicated the Horáks, a family living in Lidice. Known Allied sympathizers, the Horáks even had a son fighting in Great Britain’s Czech army, but after investigating the claim, the Nazis found no connection between the family and Heydrich’s death. Hitler, determined to punish the Czech people regardless of their complicity in the underground movement, moved ahead with his plan. Just after midnight on June 10, Nazi officials arrived in Lidice and herded villagers into the main square. Men over the age of 15 were taken to the Horáks’ farmhouse, women and children to a school in Kladno. By afternoon, the Nazis had systematically executed 173 men. Victims were brought out in groups of 10 and lined up against a barn, which had been covered with mattresses to prevent bullets from ricocheting. Officials offered mercy to local priest Josef Stembarka in exchange for calming his congregation, but he refused. “I have lived with my flock,” he said, “and now I will die with it.” Women who refused to leave their husbands were also shot, and men who happened to be away from the village were later found and killed. Determined to obliterate Lidice, the Nazis destroyed every building in sight and even dug up the town’s cemetery. They dumped massacre victims into a mass grave dug by prisoners from Terezin, a nearby concentration camp, and gleefully filmed the aftermath of the annihilation. This footage would soon become Nazi propaganda designed to quell further resistance. In Kladno, the remaining villagers waited for news of their families. 
Pregnant women and babies under the age of one were separated from the others, as were several children with Germanic facial features. No news arrived, but three days after the attack, Nazi officials separated the young from their mothers, assuring all that a reunion would follow relocation. The women boarded trucks bound for Ravensbrück concentration camp, and most of the children left for a camp in Łódź, Poland. The young survivors arrived in Łódź with a message from their Nazi captors: “The children are taking with them only what they wear. No special care is to be provided.” Indeed, the only “care” given at the camp was extensive physical testing. German doctors measured the children’s facial features, identifying those with “Aryan” characteristics as candidates for Germanization—a process in which suitably featured non-German children were adopted by German families. In total, nine children met the criteria for Germanization and were sent to Puschkau, Poland, to learn German and begin the assimilation process. On July 2, the remaining 81 children arrived at Chelmno extermination camp. Historians believe they were killed in mobile gas chambers that same day. By the end of the war, 340 of Lidice’s 503 residents were dead as a direct result of the June 10 massacre. A total of 143 women and 17 children, including those born just after the attack, eventually returned to the ruins of their hometown and began the arduous task of resurrecting the community. Today, Lidice—a small town of about 540 residents, rebuilt alongside a memorial and museum commemorating the tragedy—stands in defiance of the Nazis’ attempted extermination: 82 larger-than-life bronze statues, each representing a lost child of Lidice, greet visitors. Last year, on the 75th anniversary of the tragedy, mourners gathered everywhere from the Czech village itself to an Illinois neighborhood that has borne Lidice’s name since July 1942. 
Anna Hanfová, one of three siblings selected for Germanization, was one of the first lost children to return. She spent the remainder of the war living in eastern Germany but maintained limited contact with her sister Marie and cousin Emilie Frejová, and when Anna returned to Lidice, she led authorities to both relatives’ new German homes. Otto and Freda Kuckuk, a well-to-do couple with strong SS ties, had adopted Frejová. In Witnesses to War, author Michael Leapman writes that Frejová adjusted well, but Marie’s new life was more complicated: Her adoptive family treated her like a slave and convinced her that the Czechs were a subservient race. It took several years for Marie to overcome this indoctrinated belief. Václav, the third sibling, refused to cooperate with his captors; he drifted between children’s homes and incurred brutal punishments for unruly behavior. In late 1945, Josefina Napravilova, a humanitarian who located about 40 lost Czech children during the aftermath of the war, encountered Václav at a displaced persons camp. He was slow to trust her but later dubbed Napravilova his “second mother.” Elizabeth White, a historian at the United States Holocaust Memorial Museum, explains the difficulty of the children’s rehabilitation process, as most selected for Germanization were taken from home at a young age and eventually forgot their Czech heritage. “When [the children] were found and sent back, they didn't remember how to speak Czech,” White says. “One girl’s mother survived Ravensbrück but had tuberculosis and died four months after she came back. At first when they spoke, they had to use a translator.” Martina Lehmannová, director of the Lidice Memorial, says that the Nazis embraced Lidice as a symbol of power. In comparison to many of their crimes, which were largely hidden from the rest of the world, the Nazis publicized the town’s destruction through radio broadcasts and propaganda footage. “They were proud of it,” Lehmannová adds. 
*** As White explains, there were several reasons for the Allies’ relative restraint toward the Holocaust: Nazi propaganda insinuated that the Allies were only fighting the war to protect Jewish interests, and the Allies wanted to refute this claim. In the U.S., anti-Semitic sentiment was on the rise, and many people believed that Roosevelt was overly beholden to the Jews. The Allies also believed that widespread knowledge of the Final Solution would lead to demands for increased immigration quotas, which would aid Jewish refugees but infuriate isolationists and foster further instability. “The Allies emphasized that the Nazis were a threat to all of humanity, that the war was about freedom versus slavery,” White adds. “When they would condemn Nazi atrocities, [they highlighted attacks] against peaceful citizens.” Thanks to the visual evidence provided by the Nazis, the Lidice massacre became a powerful Allied propaganda tool. By focusing on atrocities against all innocent individuals, the Allies spurred patriotism without encouraging claims of their overzealous interest in Jewish affairs. Although the Nazis failed to erase Lidice from history, White says the attack fulfilled at least one intended purpose: “Within Czechoslovakia, [the massacre] really did lead to the breaking of the resistance.” The Nazis’ harsh reprisal may have succeeded in deterring underground activity, but the Czech people did not forget the terrors inflicted at Lidice. As Lehmannová explains, the name of the town is very close to the Czech word lid, which means people, and in the aftermath of the tragedy, Lidice came to represent the Nazis’ crimes against all inhabitants of Czechoslovakia. In 1947, Lidice was reborn after an outpouring of global support. Builders laid the foundation stone of the new village 300 meters from its original location, which now holds a memorial to the murdered townspeople. A garden filled with more than 24,000 donated rose bushes connects new and old. 
“You can taste the feeling of dystopia on the empty space of old Lidice and the feeling of utopia in the new village,” says Lehmannová. Since 1967, Lidice has hosted the International Children’s Exhibition of Fine Arts: Lidice, an annual competition in which youth from all over the world submit art based on themes such as biodiversity, cultural heritage and education. According to Sharon Valášek, Mid-West honorary consul to the Czech Republic, the Lidice massacre “became a symbol of human suffering around the world,” and the exhibition was conceived as a way of having people “think about human suffering in general, not necessarily just related to Lidice.” Today, the thriving Lidice community stands as a testament to its residents’ resilience, but the rebuilding process was far from straightforward. In 1967, reporter Henry Kamm visited the fledgling town and spoke to Ravensbrück survivor Miloslava Žižková. She acknowledged the difficulties of returning to Lidice, noting that there was no school because “we are still missing one generation.” Žižková added, however, that Lidice was home: “This is where we have our roots.” Just outside of the new village, a wooden cross marked the mass grave of Lidice’s murdered residents—including Žižková’s father and grandfather. Here, at least, survivors found a hauntingly tangible explanation for their return. Meilan Solly is Smithsonian magazine's assistant digital editor, humanities. Website: meilansolly.com.
b26d9da82ff5c42b15b7f8890d2cfb24
https://www.smithsonianmag.com/history/surprisingly-intolerant-history-milk-180969056/
The Surprisingly Intolerant History of Milk
The Surprisingly Intolerant History of Milk On May 8, 1858, Frank Leslie’s Illustrated Newspaper ran a scandalous article on a seemingly benign topic: milk. In a 5,000-word exposé, the paper characterized a group of Brooklyn and New York distilleries as “milk murderers” who had distributed “liquid poison” to the unsuspecting masses. “For the midnight assassin, we have the rope and the gallows; for the robber the penitentiary; but for those who murder our children by the thousands we have neither reprobation nor punishment,” wrote the reporter. “They are not penal villains, but licensed traders, and though their traffic is literally in human life the Government seems powerless or unwilling to interfere.” Sold by firms hoping to maximize their profits, so-called “swill milk” came from dairy cows that were fed the steaming remains of grain distillation. These cows lived in nearby stables amid miserable conditions–most only survived for a few months–and produced a sickly, bluish milk. To mask this ghastly color, the distilleries added chalk, eggs, flour, water, molasses, and other substances. Local distributors then purchased this toxic concoction from the distilleries and brazenly marketed it as “Pure Country Milk.” The mendacity of the distilleries and their willingness to take advantage of young children and their families no doubt contributed to the dramatic and theatrical flair of Leslie’s reports. But as Mark Kurlansky points out in his new book Milk!, the controversy is just one episode among many in milk’s long history. Indeed, for Kurlansky, no food invites more clamorous debate. “We’ve been arguing about these issues for 10,000 years,” Kurlansky says. 
“In a lot of cases, it’s because there’s not a hard answer… there’s a conflict of values.” And argue they did: In subsequent reporting, Leslie’s alleged that “the deaths of two-thirds of the children in New York and Brooklyn could be distinctly traced to the use of impure milk” and the normally restrained New York Times wondered how “the 8,000 children that died last year from the poison of swill milk” could not spur public health officials and local leaders into action. It soon became clear that New York wasn’t the only city with problems: Thousands of children from Boston to Chicago to San Francisco died each year from the contaminated swill. Public outcry from these revelations eventually led many distilleries to close their deleterious dairies, or, at the very least, clean up their operations. The drama was also magnified by the fact that raw milk was just becoming popular. For most of history, humans weren’t interested in the direct consumption of animal milk. Instead, the early milkers of the Fertile Crescent transformed it into sour yogurt, butter and cheese; the hot climate caused milk to quickly spoil. Even so, milk was a vital symbol in the mythology of the Sumerians, Greeks and Egyptians. The Fulani of West Africa believed that the world started with a single drop of milk, and in Norse legend, a cow made from thawing frost sustained the world in its earliest days. As Kurlansky points out, milk is even written into the story of our cosmos–our galaxy, after all, is called the Milky Way. But even with these deep cultural connections, milk held a peculiar status among early civilizations. The Greeks castigated barbarians for their gluttonous desire for dairy, and in Rome, milk was widely regarded as low-status food because it was something only farmers drank. 
Northern Europeans would earn similar ridicule for their love of reindeer milk, and Japanese Buddhists later rebuked Europeans as “butter stinkers.” Given the longstanding intolerance, it’s hard to explain why milk became prevalent in Western diets. While medieval Europeans relied on dairy products for their sustenance, raw milk remained dangerous. Feeding children with bottles, something that’s been done since antiquity, was seen as a last resort, according to Kurlansky. There were some efforts to combat the spoilage problem, and enterprising farmers tried to keep milk and cream cold by lowering them into wells. But when farmer Thomas Moore famously built the first refrigerator in 1803, he was interested in storing butter, not milk. It would take both technological innovation and a change in social mores to popularize animal milk. With the growth of cities and the movement of families from rural to urban areas during the 19th century, more women began to work outside the home, and new technologies that mechanized milking allowed access at a lower cost than ever before. Although agriculture lagged behind other industries such as textiles, milk was one of the first foods to be truly affected by industrialization. “It was the age of the Industrial Revolution, where the ethos was to make everything bigger,” Kurlansky says. “You went from small operations to large operations–shops turned into factories–and everything was just going that way.” Skyrocketing production and affordable prices led to the widespread availability of raw milk, but it would take another important breakthrough across the Atlantic to ensure safe consumption: pasteurization. Pioneered by its namesake, Louis Pasteur, in France during the 1860s, pasteurization proved a tough sell in the United States even with the swill milk debacle. 
There was little doubt that the process improved milk safety by eliminating the diseases that led to so many deaths, but consumers complained that pasteurized milk was flavorless. Some officials, including Harvey Wiley, then the director of the U.S. Bureau of Chemistry, also argued that pasteurized milk lost its nutritional qualities. In response, milk distributors introduced alternatives to reassure the public about the safety of milk, most notably the certified milk produced by Fairfield Dairy at the end of the 19th century. However, many consumers were unwilling or unable to pay the high price. The milk question grew to such dire proportions that it even attracted the attention of President Theodore Roosevelt. In 1908, his Surgeon General released a 600-page report that attributed most childhood deaths to impure milk and argued that pasteurization was the best way to address the ongoing public health crisis. “While pasteurization is not the ideal to be sought, practically, it is forced upon us by present conditions,” the authors wrote. “It prevents much sickness and saves many lives.” Despite the mounting scientific evidence, pasteurization still spread slowly. Beyond nutritional concerns, some feared that it was just a superficial intervention. As one commentator noted in a March 1908 issue of Outlook, “Wholesale pasteurization, while lulling consumers into a false sense of security, would vastly increase the burdens of milk inspectors and make their work more difficult if not entirely impossible.” Others bemoaned the high costs of pasteurization and argued that it could lead to other maladies. In Chicago, for example, Alderman Jacob Hey called it “false science” and said it was the cause of rickets and scurvy. As Kurlansky points out, public health explanations did little to satisfy proponents of raw milk who could respond with their own critiques of the system. “Milk, probably more than any other food, is really personal,” Kurlansky says. 
“We're all set up as mammals to have milk as our first nutrition and people are just kind of stuck on that idea.” The discussion around how best to prepare milk even continues today, evinced in the growth of GMO-free products and the resurgence of artisanal industries and local dairies. Kurlansky says that the economics remain a difficult challenge – “it just takes so much money to feed a cow” – but there are new opportunities for the next crop of dairy innovators and entrepreneurs. After millennia of raucous disagreement, though, it’s unlikely that we’ll see any resolution in the near future. After all, raw milk is just one brief episode in a long history of dairy-fueled debate. “The problem with the milk story is that it doesn't have any conclusion,” Kurlansky says. “As it goes along, it just picks up more and more of these controversies. And people are still fighting over milk because it’s just essential to human history.” Daniel Fernandez is an editorial intern at Smithsonian magazine. He studies journalism and history at Northwestern University in Evanston, Illinois.
ff68e868e3cd70d7dc51c95c058d76c7
https://www.smithsonianmag.com/history/sweeping-new-digital-database-emphasizes-enslaved-peoples-individuality-180976513/
Who Were America’s Enslaved? A New Database Humanizes the Names Behind the Numbers
Who Were America’s Enslaved? A New Database Humanizes the Names Behind the Numbers The night before Christmas in 1836, an enslaved man named Jim made final preparations for his escape. As his enslavers, the Roberts family of Charlotte County, Virginia, celebrated the holiday, Jim fled west to Kanawha County, where his wife’s enslaver, Joseph Friend, had recently moved. Two years had passed without Jim’s capture when Thomas Roberts published a runaway ad pledging $200 (around $5,600 today) for the 38- to 40-year-old’s return. “Jim is … six feet or upwards high, tolerably spare made, dark complexion, has rather an unpleasant countenance,” wrote Roberts in the January 5, 1839, issue of the Richmond Enquirer. “[O]ne of his legs is smaller than the other, he limps a little as he walks—he is a good blacksmith, works with his left hand to the hammer.” In his advertisement, Roberts admits that Jim may have obtained free papers, but beyond that, Jim’s fate, and that of his wife, is lost to history. Fragments of stories like Jim’s—of lives lived under duress, in the framework of an inhumane system whose aftershocks continue to shape the United States—are scattered across archives, libraries, museums, historical societies, databases and countless other repositories, many of which remain uncatalogued and undigitized. All too often, scholars pick up loose threads like Jim’s, incomplete narratives that struggle to be sewn together despite the wealth of information available. Enslaved: Peoples of the Historic Slave Trade, a newly launched digital database featuring 613,458 entries (and counting), seeks to streamline the research process by placing dozens of complex datasets in conversation with each other. If, for instance, a user searches for a woman whose transport to the Americas is documented in one database but whose later life is recorded in another, the portal can connect these details and synthesize them. 
“We have these data sets, which have a lot of specific information taken in a particular way, [in] fragments,” says Daryle Williams, a historian at the University of Maryland and one of the project’s principal investigators. “... [If] you put enough fragments together and you put them together by name, by place, by chronology, you begin to have pieces of lives, which were lived in a whole way, even with the violence and the disruptions and the distortions of enslavement itself. We [can] begin then to construct or at least understand a narrative life.” Funded through a $1.5 million grant from the Andrew W. Mellon Foundation, Enslaved.org—described by its creators as a “linked open data platform” featuring information on people, events and places involved in the transatlantic slave trade—marks the culmination of almost ten years of work by Williams and fellow principal investigators Walter Hawthorne, a historian at Michigan State University, and Dean Rehberger, director of Michigan State’s Matrix Center for Digital Humanities & Social Sciences. Originally, the team conceived Enslaved.org as a space to simply house these different datasets, from baptismal records to runaway ads, ship manifests, bills of sale and emancipation documents. But, as Rehberger explains, “It became a project about how we can get datasets to interact with one another so that you can draw broader conclusions about slavery. … We’re going in there and grabbing all that data and trying to make sense of it, not just give [users] a whole long list of things.” The project’s first phase launched earlier this month with searchable data from seven partner portals, including Slave Voyages, the Louisiana Slave Database and Legacies of British Slave-Ownership. Another 30 databases will be added over the next year, and the team expects the site to continue to grow for years to come. 
Museums, libraries, archives, historical societies, genealogy groups and individuals alike are encouraged to submit relevant materials for review and potential inclusion. *** To fulfill the “important obligation” of involving researchers of all types and education levels, the scholars made their platform “as familiar and unintimidating as possible,” according to Williams. Users who arrive without specific research goals in mind can explore records grouped by categories such as ethnicity or age, browse 75 biographies of prominent and lesser-known enslaved and free people, and visualize trends using a customizable dashboard. Researchers, amateur genealogists and curious members of the public, meanwhile, can use Enslaved.org to trace family histories, download peer-reviewed datasets, and craft narratives about some of the 12.5 million enslaved Africans transported to the New World between the 16th and 19th centuries. At its core, says Rehberger, Enslaved.org is a “discovery tool. We want you to be able to find all these different records that have traditionally been out in these silos, and bring them together in the hope that people can then reconstruct what’s there.” Mary N. Elliott, curator of American slavery at the Smithsonian’s National Museum of African American History and Culture, emphasizes the project’s potential to help the public “understand [history] in more nuanced and personalized, humanized ways.” Reflecting on the creation of the museum’s “Slavery and Freedom” exhibition, she recalls, “One of the things that people said was ‘Oh, there’s only so much you can say about the lives of enslaved people during the early period. There’s nothing that they wrote.’” But as both Elliott and the team behind the web portal point out, archival records—when read correctly—can convey a strong sense of lived experiences. 
Some of the sources featured in the database “have the enslaved person speaking, or at least someone writing down what they said, or something close to their physical presence,” Williams says. By weaving these threads of information together, he adds, contemporary observers can gain a sense of everything from enslaved people’s personal sentiments to how the official record may obscure the reality of their lived experiences. Individuals looking for stories of their own family history may end up empty-handed (for now) but still come across records that inform their understanding of the brutal reality of enslavement. If, for instance, someone searching for their great-great uncle Harry comes across a runaway ad for Ned, an enslaved man who lived in the same area around the same time, they might dismiss it as unrelated. “But if you look at Ned’s story, you start to read the record, and you [see] that he has a scar over his eye. He ran away twice before,” Elliott says. “He’s probably running toward his loved ones. … It tells you about how he had the ability to run away twice. And is this plantation near the one my family was enslaved at? And I wonder where he got that scar.” For people to “read the record, in a way that they understand the humanity of African Americans under the most inhumane circumstances,” is key, the curator continues. “You’re not reading it for the sake of reading. You’re really connecting with this … man who [had] something traumatic happen to him within the framework of slavery.” *** Enslaved.org traces its origins to the 2000s, when Hawthorne was researching a book on the flow of enslaved people from two ports in West Africa. Drawing on an archive of Brazilian state inventories, which listed enslaved Africans as property whose value was based on factors such as age and skills, he created a database with demographic information on some 9,000 individuals. 
This broad swath of data allowed the historian to run statistical analyses about patterns of enslavement, including “Where were people coming from? … Can I zero it down to a particular place? What … were they bringing with them across the ocean? What foods did they eat? How did they worship?” Hawthorne adds, “You begin to see people coming [to the Americas] not as generalized Africans, … but as Balanta, as Mandinka, as Fulani, as Hausa, people who come with specific cultural assumptions, with specific religious beliefs. What did they preserve from the place [where] they came? What did they have to abandon based on the conditions in the Americas?” In 2010, Hawthorne partnered with Rehberger and historian Gwendolyn Midlo Hall, who had created a similar portal featuring 107,000 records of enslaved individuals in Louisiana, to build a digital repository for both datasets. Funded through a $99,000 grant from the National Endowment for the Humanities, the resulting project, Slave Biographies: The Atlantic Database Network, laid the groundwork for Enslaved.org, a site capable of not only housing dozens of datasets but also placing them in interaction with each other. A decade ago, computing technology hadn’t advanced enough to interpret data on the scale used by Enslaved.org. Today, however, researchers can use semantic triples—three-part sentences that “define a particular moment,” like “Maria was baptized in 1833” or “Maria got married in 1855,” according to Rehberger—to create vast “triplestores” filled with linked information. Here, the site can parse out Maria, the religious rite (baptism or marriage), and the year as three distinct bits of data. “I often think of … ripping apart the dataset into little bits and pieces of paper, and then taking a thread and trying to thread and bring them back together again,” Rehberger says. 
“That, in a sense, is what we’re trying to do.” *** As Hawthorne notes, the team is still “in the early days of our project.” If an individual enters their family name in the search bar in the near future, they likely won’t find anything. “It’s possible that you will,” he adds, “but certainly as this project grows and expands, as more and more scholars and members of the public contribute, those possibilities [open] up.” Enslaved.org welcomes data compiled by the public, but Williams emphasizes that the researchers aren’t “exactly crowdsourcing.” All submissions will undergo two levels of review; scholars can also submit their datasets to the portal’s peer-reviewed Journal of Slavery and Data Preservation. Another option for individuals with an interest in unearthing these kinds of hidden histories is to volunteer at local historical associations and museums, which can then collaborate directly with the Enslaved.org team. The project’s launch earlier this month arrives at a pivotal point in the nation’s history. “We’re in a moment right now, of interest in slavery and slave histories and slave names, slave biographies,” Williams says. “It’s also a social and racial justice moment, … a family history, genealogy curiosity moment.” One of Enslaved.org’s strengths, says Elliott, is its ability to map current events onto the past. Though the database’s focus is enslaved people, it also contains information on enslavers and individuals who participated in the historical slave trade. Slavery involved “all these different actors,” the curator explains. “And that’s vastly important, because it’s so easy for people to segregate this history. But … you cannot look at a bill of sale and [say] it’s only a black person on that document. Guess who signed it? The seller and the purchaser. 
[And] there’s a witness.” By focusing on individuals rather than the overwhelming—and often unfathomable—numbers that tend to dominate discussions of slavery, the team hopes to restore once-anonymous figures’ identities and deepen the public’s understanding of the transatlantic slave trade. “There’s a lot of power to reading about individuals as opposed to populations of people,” Hawthorne says. “If you look through the datasets, every single entry is a named individual. And there’s a lot of power to that, to thinking about Atlantic slavery, slavery in the American South, as being about individuals, about individual struggles under this incredibly violent institution.” Meilan Solly is Smithsonian magazine's assistant digital editor, humanities. Website: meilansolly.com.
e278edb5e1b659dc16544d788a071041
https://www.smithsonianmag.com/history/synchronized-swimming-has-history-dates-back-ancient-rome-180960108/
Synchronized Swimming Has a History That Dates Back to Ancient Rome
Synchronized Swimming Has a History That Dates Back to Ancient Rome Most people think of synchronized swimming, which gained Olympic status in 1984, as a newcomer sport that dates back only as far as Esther Williams' midcentury movies. But the aquatic precursors of synchronized swimming are nearly as old as the Olympics themselves. Ancient Rome’s gladiatorial contests are well known for their excessive and gruesome displays, but their aquatic spectacles may have been even more over the top. Rulers as early as Julius Caesar commandeered lakes (or dug them) and flooded amphitheaters to stage reenactments of large naval battles— called naumachiae—in which prisoners were forced to fight one another to the death, or drown trying. The naumachiae were such elaborate productions that they were only performed at the command of the emperor, but there is evidence that other—less macabre—types of aquatic performances took place during the Roman era, including an ancient forerunner to modern synchronized swimming. The first-century A.D. poet Martial wrote a series of epigrams about the early spectacles in the Colosseum, in which he described a group of women who played the role of Nereids, or water nymphs, during an aquatic performance in the flooded amphitheater. They dove, swam and created elaborate formations and nautical shapes in the water, such as the outline or form of a trident, an anchor and a ship with billowing sails. Since the women were portraying water nymphs, they probably performed nude, says Kathleen Coleman, James Loeb Professor of the Classics at Harvard University, who has translated and written commentaries on Martial’s work. Yet, she says, “There was a stigma attached to displaying one’s body in public, so the women performing in these games were likely to have been of lowly status, probably slaves.” Regardless of their social rank, Martial was clearly impressed with the performance. 
“Who designed such amazing tricks in the limpid waves?” he asks near the end of the epigram. He concludes that it must have been Thetis herself—the mythological leader of the nymphs—who taught “these feats” to her fellow-Nereids. Fast forward to the 19th century and naval battle re-enactments appear again, this time at the Sadler’s Wells Theater in England, which featured a 90-by-45 foot tank of water for staging “aqua dramas.” Productions included a dramatization of the late-18th-century Siege of Gibraltar, complete with gunboats and floating batteries, and a play about the sea-god Neptune, who actually rode his seahorse-drawn chariot through a waterfall cascading over the back of the stage. Over the course of the 1800s, a number of circuses in Europe, such as the Nouveau Cirque in Paris and Blackpool Tower Circus in England, added aquatic acts to their programs. These were not tent shows, but elegant, permanent structures, sometimes called the “people’s palaces,” with sinking stages or center rings that could be lined with rubber and filled with enough water to accommodate small boats or a group of swimmers. In England, these Victorian swimmers were often part of a performing circuit of professional "natationists" who demonstrated "ornamental" swimming, which involved displays of aquatic stunts, such as somersaults, sculling, treading water and swimming with arms and legs bound. They waltzed and swam in glass tanks at music halls and aquariums, and often opened their acts with underwater parlor tricks like smoking or eating while submerged. Though these acts were first performed by men, female swimmers soon came to be favored by audiences. Manchester (U.K.) 
Metropolitan University's sports and leisure historian, Dave Day, who has written extensively on the subject, points out that swimming, "packaged as entertainment," gave a small group of young, working-class women the opportunity to make a living, not only as performers, but also as swimming instructors for other women. But as more women in England learned to swim, the novelty of their acts wore off. In the United States, however, the idea of a female aquatic performer still seemed quite avant-garde when Australian champion swimmer Annette Kellerman launched her vaudeville career in New York in 1908. Billed as the "Diving Venus" and often considered the mother of synchronized swimming, Kellerman wove together displays of diving, swimming and dancing, which The New York Times called "art in the making." Kellerman's career—which included starring roles in mermaid and aquatic-themed silent films and lecturing to female audiences about the importance of getting fit and wearing sensible clothing—reached its pinnacle when she, and a supporting cast of 200 mermaids, replaced prima-ballerina Pavlova as the headline act at the New York Hippodrome in 1917. While Kellerman was promoting swimming as a way to maintain health and beauty, the American Red Cross, which had grown concerned about high drowning rates across the country, turned to water pageants as an innovative way to increase public interest in swimming and water safety. These events, which featured swimming, acting, music, life-saving demonstrations or some combination of these, became increasingly popular during the 1920s. Clubs for water pageantry, water ballet and "rhythmic" swimming—along with clubs for competitive diving and swimming—started popping up in every pocket of America. One such group, the University of Chicago Tarpon Club, under the direction of Katharine Curtis, had begun experimenting with using music not just as background, but as a way to synchronize swimmers with a beat and with one another. 
In 1934, the club, under the name Modern Mermaids, performed to the accompaniment of a 12-piece band at the Century of Progress World's Fair in Chicago. It was here that "synchronized swimming" got its name when announcer Norman Ross used the phrase to describe the performance of the 60 swimmers.  By the end of the decade, Curtis had overseen the first competition between teams doing this type of swimming and written its first rulebook, effectively turning water ballet into the sport of synchronized swimming. While Curtis, a physical education instructor, was busy moving aquatic performance in the direction of competitive sport, American impresario Billy Rose saw a golden opportunity to link the already popular Ziegfeld-esque “girl show” with the rising interest in water-based entertainment. In 1937, he produced the Great Lakes Aquacade on the Cleveland waterfront, featuring—according to the souvenir program—"the glamour of diving and swimming mermaids in water ballets of breath-taking beauty and rhythm." The show was such a success that Rose produced two additional Aquacades in New York and San Francisco, where Esther Williams was his star mermaid. Following the show, Williams became an international swimming sensation through her starring roles in MGM's aquamusicals, featuring water ballets elaborately choreographed by Busby Berkeley. Though competitive synchronized swimming—which gained momentum during the middle of the century—began to look less and less like Williams' water ballets, her movies did help spread interest in the sport. Since its 1984 Olympic induction, synchronized swimming has moved farther from its entertainment past, becoming ever "faster, higher, and stronger," and has proven itself to be a serious athletic event. 
But regardless of its roots, and regardless of how it has evolved, the fact that synchronized swimming remains a spectator favorite—it was one of the first sporting events to sell out in Rio—just goes to show that audiences still haven't lost that ancient appetite for aquatic spectacle. If synchronized swimming looks easy, the athletes are doing their jobs. Though it is a grueling sport that requires tremendous strength, flexibility, and endurance—all delivered with absolute precision while upside down and in the deep end—synchronized swimmers are expected to maintain "an illusion of ease," according to the rulebook issued by FINA, the governing body of swimming, diving, water polo, synchronized swimming and open water swimming. Olympic synchronized swimming includes both duet and team events, with scores from technical and free routines combined to calculate a final rank. Routines are scored for execution, difficulty and artistic impression, with judges watching not only for perfect synchronization and execution, both above and below the surface, but also for swimmers' bodies to be high above the water, for constant movement across the pool, for teams to swim in sharp but quickly changing formations, and for the choreography to express the mood of the music. The United States and Canada were the sport's early leaders, but Russia—with its rich traditions in dance and acrobatics, combined with its stringent athletic discipline—has risen to dominance in recent years, winning every gold Olympic medal of the 21st century and contributing to the ever-changing look of the sport. Russia, followed by China, remains the team to watch in Rio this year, while the U.S. is hoping for a win from American duet pair Mariya Koroleva and Anita Alvarez. Vicki Valosik is a writer and synchronized swimmer based in Washington, D.C. She is currently working on a book on the history of synchronized swimming and aquatic performance. 
Her writing has appeared in The Atlantic, American Scholar, Slate, Philadelphia Inquirer, and Washingtonian Magazine, among others. Find her at vickivalosik.net.
98f820f1cf1d5c9b4d404d793d47746d
https://www.smithsonianmag.com/history/telemedicine-predicted-in-1925-124140942/?no-ist=
Telemedicine Predicted in 1925
Telemedicine Predicted in 1925 The 1920s was an incredible decade of advancement for communications technology. Radio was finally being realized as a broadcast medium, talkies were transforming the film industry, and inventors were tinkering with the earliest forms of television. People of the 1920s recognized that big changes were ahead, and no one relished guessing what those changes might be more than Hugo Gernsback. Gernsback was a pioneer in both radio and publishing, always pushing the boundaries of what the public might expect of their technological future. In 1905 (just a year after emigrating to the U.S. from Luxembourg at the age of 20) Gernsback designed the first home radio set and started the first mail-order radio business in the world. The radio was called the Telimco Wireless and was advertised in magazines like Scientific American for $7.50 (about $180 today). In 1908 Gernsback put out the world’s first radio magazine, Modern Electrics. Distributed by the American News Company, Modern Electrics was a huge hit and was said to be profitable from its first issue. In 1909 he opened the first radio storefront in New York, supplementing his mail-order radio sales by selling radio parts to amateur radio operators in the city. In 1913 Gernsback started publishing a magazine called Electrical Experimenter, which in 1920 became known as Science and Invention. In the February 1925 issue of Science and Invention Gernsback wrote an article that would combine his fascination with the future of radio communications and predict a device for the year 1975 that we still don’t see in any practical household form today. Gernsback’s device was called the “teledactyl” and would allow doctors to not only see their patients through a viewscreen, but also touch them from miles away with spindly robot arms. He effectively predicted telemedicine, though with a weirder twist than we see implemented in 2012. 
From the February 1925 issue of Science and Invention: The Teledactyl (Tele, far; Dactyl, finger — from the Greek) is a future instrument by which it will be possible for us to “feel at a distance.” This idea is not at all impossible, for the instrument can be built today with means available right now. It is simply the well known telautograph, translated into radio terms, with additional refinements. The doctor of the future, by means of this instrument, will be able to feel his patient, as it were, at a distance….The doctor manipulates his controls, which are then manipulated at the patient’s room in exactly the same manner. The doctor sees what is going on in the patient’s room by means of a television screen. The doctor of the future examines a patient (1925) Quite impressively, the teledactyl was imagined as a sensory feedback device, which allowed the doctor to not only manipulate his instruments from afar, but feel resistance. Here we see the doctor of the future at work, feeling the distant patient’s arm. Every move that the doctor makes with the controls is duplicated by radio at a distance. Whenever the patient’s teledactyl meets with resistance, the doctor’s distant controls meet with the same resistance. The distant controls are sensitive to sound and heat, all important to future diagnosis. Gernsback positions his predictions about telemedicine within the rapidly changing communications landscape of the 1920s: As our civilization progresses we find it more and more necessary to act at a distance. Instead of visiting our friends, we now telephone them. Instead of going to a concert, we listen to it by radio. Soon, by means of television, we can stay right at home and view a theatrical performance, hearing and seeing it. This, however, is far from sufficient. As we progress, we find our duties are multiplied and we have less and less time to transport our physical bodies in order to transact business, to amuse ourselves, and so on. 
The busy doctor, fifty years hence, will not be able to visit his patients as he does now. It takes too much time, and he can only, at best, see a limited number today. Whereas the services of a really big doctor are so important that he should never have to leave his office; on the other hand, his patients cannot always come to him. This is where the teledactyl and diagnosis by radio comes in. It wasn’t just the field of medicine that was going to be revolutionized by this new device. Other practical uses would involve seeing and signing important documents from a distance: The man of 1975 signs important documents by videophone (1925) Here we see the man of the future signing a check or document at a distance. By moving the control, it goes through exactly the same motions as he would in signing the document. He sees what he is doing by means of the radio teleview in front of him. The bank or other official holds the document in front of a receiving teledactyl, to which is attached a pen or other writing instrument. The document is thus signed. This diagram also explained how the teledactyl worked: Diagram explaining how the teledactyl was supposed to work (1925) Interestingly, we’d see this idea for telemedicine pop up again in 1990s concept videos from AT&T and Pacific Bell. A year after this article was released Gernsback began publishing Amazing Stories, the first magazine that was devoted entirely to science fiction. Gernsback published a number of different magazines throughout his life, but I’d argue that none were filled with more rich, retro-future goodness than Science and Invention. Matt Novak is the author of the Paleofuture blog, which can now be found on Gizmodo.
078e50cb016dbd065de1c0cbf99983e6
https://www.smithsonianmag.com/history/telling-forgotten-stories-everyday-americans-revolutionary-war-180962618/
Telling the Forgotten Stories of the Everyday Americans of the Revolutionary War
Telling the Forgotten Stories of the Everyday Americans of the Revolutionary War In a darkened theater, a traveling 19th-century entertainer uses a crankie, a moving paper panorama with back-lit shadow puppets, to introduce five 18th-century characters—a Catawba Indian, an Irish immigrant woman whose family fought on opposite sides in the revolution, a Continental Army soldier, a witness to the 1770 Boston Massacre and a free black man who fought in a key Revolutionary War battle. This is Liberty Fever, the movie visitors see when they enter the new American Revolution Museum at Yorktown. When screen testers commented that its portrayal of the Revolutionary War was “politically correct,” Peter Armstrong, the museum’s senior director of operations and education, told them that was absolutely the intention. "There was a conscious decision to ask, 'How do we ensure those people watching this movie feel connected with these individuals?'," Armstrong says. He wanted the people in the film to mirror the people in the audience—and for their diverse stories to share center stage. Those ordinary people, not beloved artifacts, form the heart of the museum where small stories loom large, coursing through the galleries like so many streams flowing into the same revolutionary river. When visitors tap on an 80-inch-tall interactive screen, “Personal Stories of the Revolution,” in one gallery, they see the tales of 20 characters like Peter Harris, the Catawba Indian they first met in Liberty Fever. An actor portraying Harris tells how he fought and was wounded in 1779 at the Battle of Stono Ferry in South Carolina. There’s the story of David Fanning, a Loyalist who fought for the British in North Carolina and then switched sides at the urging of his sister. And there’s Esther De Berdt Reed, a Philadelphia woman who raised $300,000 to provide shirts and other supplies to the Continental Army. 
There’s even Trip, the Wheaten Terrier who belonged to Isabella Ferguson, the Irish immigrant to South Carolina who appears in Liberty Fever. The revolution split her family, as it did so many. "I'm a rebel. Glory is in the name," Ferguson told her brother-in-law, who fought for the British, in a story documented in an 1848 book, Revolutionary Women in the War for American Independence. "My brother's a rebel, and the dog, Trip, is a rebel, too." Heather Hower, the museum's media project manager who helped create the exhibit, watches a family listening to Ferguson’s story and smiles. "That's exactly what we intended," she says. “We want visitors to make a personal connection.” "We're telling the stories of ordinary people in an extraordinary time," says Armstrong. "Here at Yorktown is where the subjects of a king become citizens of a nation." Stories like that of 16-year-old Jon Harrington, whose mother woke him so he could grab his fife and witness the first shots at Lexington and Concord. Or Sarah Osborn Benjamin, who traveled with the Continental Army and delivered food to the troops during the siege at Yorktown. Or James Lafayette, the slave who was freed to fight and became a spy whose intelligence proved key to victory at Yorktown. The $50 million museum, not far from where Lieutenant General Charles Cornwallis surrendered to George Washington on October 19, 1781, opens on March 23 with 13 days of festivities, one for each colony. The museum replaces the 40-year-old Yorktown Victory Center, which opened in 1976 as part of bicentennial celebrations, and features an expanded outdoor living history area. It’s not alone. The American Revolution Museum is mere miles from Colonial Williamsburg, Jamestown and a gaggle of battlefields and other revolutionary attractions in the region.  
Its debut comes just weeks before another long-anticipated museum opens in Philadelphia, the Museum of the American Revolution, which boasts a 3,000-piece collection of revolutionary artifacts including George Washington’s headquarters tent from Valley Forge. To help lure visitors to Yorktown, museum officials turned to Armstrong, who arrived three years ago after a decade at the United Kingdom's National Museum of Arms and Armour. The arms museum, he notes, was “taxonomic” and able to display only about 10 percent of an extensive collection of artifacts, which isn’t all that uncommon among museums. At the American Revolution Museum, artifacts, such as one of the earliest portraits of an African slave and a rare July 1776 broadside of the Declaration of Independence, were collected to tell stories that enhance the experience. Armstrong trained in the theater and uses storytelling to bring history to life. Museums today, he says, need to find a way to connect emotionally in a world where facts are at everyone's fingertips. "What is it that made these individuals decide they could join together and take on the most powerful nation in the world? What is this concept of liberty and freedom?" Armstrong asks. "Why did the guy sitting on his farm in Pennsylvania decide to take up arms and potentially lose his life? It seems to me to be a very emotional response and if you want to understand that emotional response, you need to understand that guy in Pennsylvania. It all sounds very highbrow, but to be honest, it's just common sense. People want to know about people." The museum’s curators and researchers began with a long list of individuals that evolved over years, Hower says. The goal? Make people fall in love with the true stories of individuals. Legends and myths that could not be documented, like the story of Molly Pitcher, were discarded. For quotes in films and exhibits, the team relied on diaries and pension applications.
The stories of Peter Harris and Sarah Osborn Benjamin were fleshed out by pension depositions they filed with the Veterans Administration. A portrait of Reed was tracked to descendants in New York using ancestry registries and photographed for the exhibit. Storytelling abounds throughout the museum, from artifacts to interactive screens and short films. Visitors can use a mobile app to explore the galleries through the perspectives of patriots, Loyalists, children, women and figures like the Marquis de Lafayette, Alexander Hamilton and George Washington. Social media is part of the experience, too. Visitors learning about the American Revolution through the eyes of children, for instance, can take a photo in the gallery featuring the story of James Forten, an African-American who at 14 joined a privateer fighting the British. With the app, they can superimpose his clothing onto the photo, then share their revolutionary selfie. "We’re trying to make connections in different ways," Hower says. "It's about relevance. Why are these people important to me today?" Those connections continue at an expansive living history area. It features a replica Army encampment laid out according to the principles of Major General Friedrich von Steuben, the Prussian credited with shaping the Continental Army into fighting form. Adjacent to the camp is a farm with a residence, bake house and slave quarters based on the property of Edward Moss, who lived nearby in colonial times. Outside, visitors can help with an artillery firing. They may muster for drills. They might weed the garden. And if their timing is right, they get a chance to sample the tarts or pies made with ingredients and period tools from recipes by Amelia Simmons or Hannah Glasse, who wrote contemporary colonial cookbooks. The experience ends with a bang. Visitors finish up with a 180-degree, 71-foot-wide, 4D "Siege of Yorktown." 
Inside the small theater, benches shake, winds blow, smoke clouds your vision and the smell of coffee and gunpowder fills the air. For Armstrong, it’s more than a theatrical experience—it’s another path into the past through the lives of ordinary people. “Let's face facts, 80 to 90 percent of the people who come to a museum are just there for a good day out," he says. "You want to be with somebody who is just like you. The more we can make it so you can associate with the individual, the better you understand the story.” Jim Morrison is a freelance writer whose stories, reported from two dozen countries, have appeared in numerous publications including Smithsonian.com, the New York Times, and National Wildlife.
https://www.smithsonianmag.com/history/ten-notable-apocalypses-that-obviously-didnt-happen-9126331/?no-ist=&page=2
Ten Notable Apocalypses That (Obviously) Didn’t Happen
1. The First Warnings From Assyria An Assyrian clay tablet dating to around 2800 B.C. bears the inscription: “Our Earth is degenerate in these later days; there are signs that the world is speedily coming to an end; bribery and corruption are common; children no longer obey their parents; every man wants to write a book and the end of the world is evidently approaching.” The world didn’t end (just look around), and despite the plague of corruption and petulant teenagers, four centuries later the Assyrians would establish an empire that eventually encompassed most of the Near East. The Assyrian Empire came to an abrupt end in 612 B.C., when its capital was attacked by the Babylonian army. Still, by the standards of ancient empires, 18 centuries wasn’t such a bad run. 2. Crusaders’ Concerns Pope Innocent III relied upon apocalyptic theology in his efforts to rally Europe to launch a fifth crusade to capture Jerusalem and the rest of the Holy Land from the Ayyubid Empire. He identified the rise of Islam as the reign of the Antichrist—whose defeat would usher in the Second Coming. In 1213, Innocent III wrote: “A son of perdition has arisen, the false prophet Muhammed, who has seduced many men from the truth by worldly enticements and the pleasures of the flesh… we nevertheless put our trust in the Lord who has already given us a sign that good is to come, that the end of this beast is approaching, whose number, according to the Revelation of Saint John, will end in 666 years, of which already nearly 600 have passed.” The predicted date was 1284. Seven years later, the last crusader kingdom fell, when the Sultan Khalil conquered the city of Acre, in present-day Israel. The rest of the world, however, remained intact. 3. Botticelli Paints His Fears The Renaissance is remembered as a golden age of art and learning, but the era also marked a resurgence in apocalyptic prophecies. The reason? 
“Advances in timekeeping and in astronomy encouraged standardization of the calendar,” writes David Nirenberg, a professor of medieval history at the University of Chicago, “while a string of calamities (from the European point of view), such as the Turkish conquest of Constantinople… fomented a new numerological apocalyptic interest.” Expectations of the apocalypse found their expression in the art of the period—most famously in The Mystical Nativity, painted by Italian Renaissance master Sandro Botticelli. The lower part of the painting depicts several small devils wedged under rocks or pinned to the ground, while a Greek inscription offers this gloomy prediction: “I, Sandro, painted this picture at the end of the year 1500 in the troubles of Italy in the half time after the time according to the eleventh chapter of St. John in the second woe of the Apocalypse in the loosing of the devil for three and a half years. Then he will be chained in the twelfth chapter and we shall see him trodden down as in this picture.” (That would place the apocalypse at around A.D. 1504.) Art historians believe that Botticelli was influenced by the sermons of Girolamo Savonarola—a Dominican monk who urged both rich and poor alike to repent for their sins and renounce worldly pleasures. Certain that the apocalypse was near, Savonarola predicted, “the sword of the Lord will come upon the earth swiftly and soon” in the form of war, pestilence and famine. 4. The Germanic Flood That Never Came In 1499, the German mathematician and astronomer Johannes Stöffler predicted that a vast flood would engulf the world on February 20, 1524. (His calculations foretold 20 planetary conjunctions during this year—16 of which would take place in a “watery sign,” a.k.a. Pisces.) In Europe, more than 100 different pamphlets were published endorsing Stöffler’s doomsday prophecy. 
Business boomed for boat-builders, not least for German nobleman Count von Iggleheim, who constructed a three-story ark on the Rhine. Although 1524 was a drought year in Europe, a light rain did fall on the designated day. Crowds of people—hoping to gain a seat on Iggleheim’s ark—began to riot. Hundreds were killed and the count was stoned to death. Stöffler later recalculated the actual date to be 1528, but by then his reputation as a soothsayer had been ruined. That’s kind of a shame because, according to a story told in 1558 by German historian Hieronymus Wolf, Stöffler once predicted that his life would be endangered by a “falling body.” He chose to spend that day indoors, where, during a discussion with friends, Stöffler reached to grab a book from a shelf, which came loose and smashed him on the head, seriously injuring him. 5. Black Skies Over New England At 9 a.m. on May 19, 1780, the sky over New England was enveloped in darkness. An 1881 article in Harper’s Magazine stated that, “Birds went to roost, cocks crowed at mid-day as at midnight, and the animals were plainly terrified.” The unnatural gloom is believed to have been caused by smoke from forest fires, possibly coupled with heavy fog. But at the time, some feared the worst. “People [came] out wringing their hands and howling, the Day of Judgment is come,” recalled a Revolutionary War fifer. The “Dark Day” ended at midnight, when the stars once again became visible in the night sky. But lingering concerns about a pending apocalypse prompted some people to seek out an obscure Christian sect—the Shakers—who had recently settled near Albany, New York. A splinter of the Quaker movement, the Shakers preached complete celibacy as the true path to redemption. The Shakers knew an opportunity when they saw one and embarked on a 26-month mission throughout New England, which brought them hundreds of converts. 
The most famous individual to emerge from the “Dark Day” was Abraham Davenport, a member of the Connecticut legislature, which was in session when the sky blackened. Members of the legislature, fearing the apocalypse had come, moved for adjournment. Davenport is said to have responded: “The day of judgment is either approaching, or it is not. If it is not, there is no cause of an adjournment; if it is, I choose to be found doing my duty. I wish therefore that candles may be brought.” The New England poet John Greenleaf Whittier commemorated Davenport in a poem first published in the Atlantic Monthly in 1866. 6. Finding Omens in the Great Pyramid of Giza A.D. 1881 was a banner year for apocalyptic expectations. For starters, there was the prediction of “Mother Shipton,” a 16th-century British soothsayer whose prophecies were first published in 1641. A later edition, published in 1862, included the prediction: “The world to an end shall come; in eighteen hundred and eighty one.” However, the book’s author, Charles Hindley, admitted that this and other prophecies (including the invention of the telegraph and the steam engine) were added as a hoax in an apparent attempt to boost book sales. 
Writing in an 1881 edition of Harper’s Magazine, an unnamed author lamented, “I fear it will be impossible… to deliver the English masses from this unhappy piece of miseducation.” However, on a more hopeful note, the article added: “I am assured by friends of mine employed in the British Museum that for months that institution has been fairly besieged by people anxious to know if there be any such manuscript as that referred to, or if the predictions are genuine.” Nonetheless, the 1911 edition of Encyclopaedia Britannica noted that the 1881 end-of-the-world prophecy was “the cause of the most poignant alarm throughout rural England in that year, the people deserting their houses, and spending the night in prayer in the fields, churches and chapels.” Supporting “evidence” for an apocalypse in 1881 came from an unlikely source: the Great Pyramid of Giza. Charles Piazzi Smyth, the Astronomer Royal for Scotland, became convinced that the pyramid had been built not by the Egyptians but by an Old Testament patriarch (perhaps Noah) under divine guidance. As such, Smyth saw theological implications in just about every measurement of the Great Pyramid, including a calculation for the End of Days. Smyth’s research was satirized in a January 5, 1881, column in the New York Times: “In the great gallery of the pyramid… there are precisely eighteen hundred and eighty-one notches… hence if the pyramid is trustworthy and really knows its business, we have arrived at the last year of the earth. There are a vast number of people who believe in this remarkable theory of the pyramid, and they are one and all perfectly sure that the pyramid cannot tell a lie… in case they should happen to be disappointed and to be under the unpleasant necessity of making New Year’s calls in the snow on the First of January 1882, they will probably blaspheme the pyramid and lose all faith in man and stones.” 7. 
Beware of Halley’s Comet Comets have long been viewed as portents of doom—and the reappearance of Halley’s comet in 1910 was no exception. Early that year, British and Irish writers opined that the comet was a harbinger of a forthcoming invasion by Germany. Some Parisians blamed the comet for a massive flood of the Seine River that devastated their city. But full-fledged panic would erupt when Chicago’s Yerkes Observatory announced in February 1910 that it had detected a poisonous gas called cyanogen in Halley’s tail. The New York Times reported that the noted French astronomer Camille Flammarion believed the gas “would impregnate the atmosphere and possibly snuff out all life on the planet.” Most scientists sought to reassure the public. The famed astronomer Percival Lowell explained that the gases making up Halley’s tail were “so rarefied as to be thinner than any vacuum.” But the damage had already been done. People rushed to purchase gas masks and “comet pills.” The New York Times reported that “terror occasioned by the near approach of Halley’s comet has seized hold of a large part of the population of Chicago.” Likewise, the Atlanta Constitution reported that people in Georgia were preparing safe rooms and covering even keyholes with paper. (One man, the paper said, had “armed himself with a gallon of whiskey” and requested that friends lower him to the bottom of a dry well, 40 feet deep.) After Halley’s comet passed by the Earth in May, the Chicago Tribune announced (unnecessarily) “We’re Still Here.” Not everyone, however, was caught up in the apocalyptic frenzy. Rooftop “comet parties” were all the rage in cities throughout the United States. 8. 
Planets Align, Nothing Happens In 1974, John Gribbin and Stephen Plagemann wrote a best-selling book, The Jupiter Effect, warning that in March 1982, an alignment of the major planets on the same side of the Sun would trigger a series of cosmic events, culminating in an earthquake along the San Andreas fault that would wipe out Los Angeles. The book had an aura of credibility, since both authors were Cambridge-educated astrophysicists and Gribbin was an editor at the prestigious science magazine Nature. The scientists claimed that the combined gravitational force of the planets (especially dense ones, such as Jupiter and Saturn) would exert tidal forces on the Sun, causing an increase in sunspot activity that would douse the earth with high-speed particles, which, in turn, would cause abrupt changes to our planet’s rotation, leading to earthquakes. Several scientists criticized The Jupiter Effect, saying its argument was based on a tissue-thin chain of suppositions. (Seismologist Charles Richter of Caltech called the thesis “pure astrology in disguise.”) Still, the book spooked people worldwide—thanks, in part, to the endorsement of other doomsayers such as Hal Lindsey (author of the best-selling 1970s book, The Late Great Planet Earth) who, in 1980, wrote that earthquakes across the planet would trigger meltdowns at nuclear power plants and would smash dams, causing massive floods. As the dreaded date approached, panicked city residents bombarded Los Angeles’ Griffith Observatory with phone calls. 
Elsewhere, the San Diego Vista Press reported on March 10, 1982: “We've literally had people ask, ‘Should I sell my house and move away?’ said Kevin Atkins of Gates Planetarium [in Denver, Colorado]… One small Christian sect in the Philippines is building a maze of padded cubicles and trying out padded suits in readiness for disasters.” Even Beijing’s newspaper, The People’s Daily, sought to assure readers that “there is no regular cause-effect relation at all between this astronomical phenomenon and natural disasters like earthquakes.” One year after the non-doomsday event, Gribbin and Plagemann published The Jupiter Effect Reconsidered. It was also a best-seller. 9. The Y2K Panic At least during this apocalyptic scare, there was someone to blame: Over the decades, computer programmers had used two digits, rather than four, to represent years. As such, computers would allegedly go haywire on January 1, 2000, since the dumb machines would not be able to make sense of the year “00”—and thus the dreaded “Y2K Bug” was born. Some pundits defended the programmers, noting that their actions had been a logical way to conserve precious computer memory and save money. Others were less flattering. “What led to the Y2K Bug was not arrogant indifference to the future,” wrote Brian Haynes in The Sciences magazine. “On the contrary, it was an excess of modesty. (‘No way my code will still be running 30 years out.’) The programmers could not envision that their hurried hacks and kludges would become the next generation’s ‘legacy systems.’” A September 1999 poll conducted by the Wall Street Journal found that 9 percent of Americans believed Microsoft was hiding the solution to the problem. The Independent newspaper warned of possible “nuclear war,” caused by glitches in early-warning systems; the International Monetary Fund predicted economic chaos in developing nations; Federal Reserve Chairman Alan Greenspan worried that panic over the Bug would prompt U.S. 
businesses to stockpile goods, leading to widespread shortages, and CNN reported that the U.S. milk supply would dry up because dairy farm equipment might malfunction. Still, panic over the Y2K Bug never quite reached the fever pitch that many anticipated. A Gallup Poll reported that by mid-December 1999, only 3 percent of Americans anticipated “major problems,” compared with 34 percent the year before. Billions of dollars were spent worldwide to fix the Y2K Bug, and debate still rages over how much of that spending was necessary. 10. A Man-Made Black Hole? Ever since the early 1990s, the media has reported that the Large Hadron Collider (LHC) could potentially create a black hole that would swallow the Earth. The LHC—which was switched on in September 2008—is 17 miles in circumference and buried 570 feet beneath the Alps on the Swiss-French border. The collider has the capacity to smash together proton beams at velocities up to 99.99 percent of the speed of light. In doing so, it can simulate the conditions and energies that existed shortly after the start of the Big Bang—thereby providing insights into critical questions as to how our universe was formed. Still, some skeptics worry that the high-energy collision of protons could create micro black holes. One reason this doomsday rumor persists is that quantum physicists have a tendency never to say never. As long as certain physical laws are obeyed, potential events are placed in the rather broad category of “non-zero” probability. Or, as Amherst physicist Kannan Jagannathan explains: “If something is not forbidden, it is compulsory… In an infinite universe, even things of low probability must occur (actually infinitely often).” However, by that same standard, Jagannathan adds, quantum physics dictates that it is theoretically possible to turn on your kitchen faucet and have a dragon pop out. And that explains why physicists (with the possible exception of those who are dragon-phobic) are not terribly worried. 
“The world is constantly bombarded by energetic cosmic rays from the depths of space, some of them inducing particle collisions thousands of times more powerful than those that will be produced by the LHC,” says Stéphane Coutu, a professor of physics at Penn State. “If these collisions could create black holes, it would have happened by now.” Meanwhile, technical difficulties prompted the LHC to be shut down after just nine days. Operations are scheduled to slowly resume in late 2009 and early 2010. If the world does end, check this Web site for updates. Mark Strauss is the Departments Editor at Air & Space magazine.